On 9/10/2018 9:43 AM, Joakim wrote:
> Yes, I know, these devices won't replace your quad-core Xeon workstation
> with 32-64 GBs of RAM anytime soon, but most people don't need anywhere
> near that level of compute. That's why PC sales keep dropping while
> mobile sales are now 6-7X that per year:

I'm all for supporting modern open CPU architectures. At the same time, I fear that the specific trend you're describing here (people ditching PCs for cellphones/tablets) is effectively a reversal of the PC revolution.
For the last 30+ years, people have benefited from "trickle-down computing". They had access to PCs that were equivalent to the cutting-edge servers of 6-7 years prior. They had the ability to choose their operating system, expand and upgrade their hardware, and install any software they wanted.

All of this is breaking down right now. Intel got lazy without competition, and high-end CPU architectures stagnated. All the cutting-edge computing is done on NVidia cards today. It requires hundreds of gigabytes of RAM, tens of terabytes of data, and specialized computing libraries. I very much doubt this will "trickle down" to mobile in the foreseeable future. Heck, most developer laptops today have no CUDA capabilities to speak of.

Moreover, mobile devices are locked down by default, and breaking out of those walled gardens is no trivial task. IIRC, Apple has an official policy of not allowing programming tools in their App Store. Alan Kay had to personally petition Steve Jobs to allow Scratch to be distributed, so that kids could learn programming. I believe the general policy is still in place. Android is better, but it's still a horror to do real work on compared to any PC OS. Fine, say you've rooted it and installed some compilers. How will you share your software with fellow Android users?

In essence, we are seeing the rapid widening of two digital divides. The first one is between users and average developers. The second one is between average developers and researchers at companies like Google. I very much doubt that we will see an equivalent of today's high-end machine learning server on a user's desk, let alone in anyone's pocket, within 7 years.

My only hope is that newer AMD processors and the popularity of VR rigs may help narrow these divides.