Apple’s making its own GPU to control its own destiny

Brian Barrett, Wired:

Graphics processors underpin virtually all of the features and experiences that today’s tech companies scramble to lead in.

Machine learning? GPUs. Augmented and virtual reality? Likewise. And while it often gets lost in the conversation around glitzier use cases, good old-fashioned, high-resolution gaming horsepower leans heavily on GPUs too.

We saw this with the CPU: Apple wants to own this part of the hardware because they believe they can innovate faster than anyone else - and they don't want to share those gains.

On Augmented Reality, I previously wrote the following in response to a Bloomberg article noting that the next iPhone would likely have depth-sensing capability based on PrimeSense technology:

And of course, to place objects in a scene and have them persist, you need some sort of mapping technology - both hardware to map the environment and software to make sense of what's mapped.

A sensor that can map the external environment is a key component required for true Augmented Reality, and one that is missing from the iPhone. What we've seen so far with AR apps on the iPhone are image-recognition technologies and overlays, or apps that use the compass + GPS for a "good enough" experience. But what they haven't been able to do is persist objects spatially with precision - that is, place a 3D object in a fixed position and have it viewable from any angle, using the iPhone as a viewport onto the scene.
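To make "persisting objects spatially" concrete, here is a minimal, purely illustrative Swift sketch (not from the original posts) that assumes ARKit-style world tracking: it pins an anchor at a fixed point in the world, attaches a cube to it, and lets the device act as a viewport onto the scene from any angle.

```swift
import UIKit
import SceneKit
import ARKit
import simd

// Minimal sketch: anchor a cube at a fixed world position so it stays
// put as the device moves. Class name and placement values are illustrative.
class ViewerController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking re-estimates the device pose each frame,
        // which is what keeps the anchor fixed in space.
        let config = ARWorldTrackingConfiguration()
        sceneView.session.run(config)

        // Place an anchor half a meter in front of the initial camera pose.
        var transform = matrix_identity_float4x4
        transform.columns.3.z = -0.5
        sceneView.session.add(anchor: ARAnchor(transform: transform))
    }

    // Attach visible geometry to the anchor; the cube remains at its
    // world position and can be viewed from any angle.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        return SCNNode(geometry: box)
    }
}
```

The key point is that persistence comes from the anchor in world space, not from a screen overlay: the system's mapping of the environment is what lets the same cube be seen from different positions as the phone moves around it.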

Disclaimer: At the time of this writing, I own Apple stock.