Intel is cutting 12,000 workers as it faces the financial consequences of underestimating a profound shift in computing from desktop computers to pocket-sized devices. And more trouble may lie ahead. The rate at which Intel makes technological advances suddenly seems to be slowing, and other looming trends, including artificial intelligence and perhaps virtual reality, look set to benefit a different kind of computer architecture.
The company is also finding that the pace at which it can double the number of transistors on its chips, a trend dubbed Moore’s Law after Intel co-founder Gordon Moore, is slowing down. And while Intel says it will refocus its attention on cloud computing and devices for the Internet of Things, it risks missing out on several up-and-coming opportunities.
Last week, I spent a few days at a developer conference in San Jose organized by Nvidia, a chip company that makes graphics processing units, or GPUs. These chips are especially good at the kind of parallel computation companies are harnessing for deep learning (a powerful form of machine learning), and they are also geared toward rendering the highly realistic 3-D environments needed for virtual reality. Indeed, the Nvidia event was filled with demos of self-driving cars, deep-learning systems, and virtual-reality headsets.
So beyond cutting jobs, Intel may need to think about how it can feed the industry’s appetite for AI and VR if it doesn’t want to miss the next big shift in how we use computers.