Mark Gurman of Bloomberg writes a double-hype article featuring Apple and Augmented Reality.
Hundreds of engineers are now devoted to the cause, including some on the iPhone camera team who are working on AR-related features for the iPhone, according to one of the people.
I would be surprised if Apple didn't have an army of people working on AR features for the iPhone.
One of the features Apple is exploring is the ability to take a picture and then change the depth of the photograph or the depth of specific objects in the picture later; another would isolate an object in the image, such as a person's head, and allow it to be tilted 180 degrees.
This sounds like the light-field technology popularized by Lytro: the camera senses both the intensity of the light and the angle at which it arrives. Coupled with software, that lets you change the depth of field of an image after capture. Cool tech, and I can't wait to see it in the iPhone.
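To make the idea concrete, here is a minimal sketch of light-field refocusing, assuming the shift-and-sum technique plenoptic cameras like Lytro's use: each sub-aperture view (u, v) is shifted in proportion to its position in the aperture and the views are averaged, which synthetically moves the focal plane after the shot is taken. The data here is a tiny synthetic light field, not real camera output.

```python
import numpy as np

def refocus(light_field, alpha):
    """Shift-and-sum refocusing.

    light_field: array of shape (U, V, H, W), one H x W image per
    sub-aperture view (u, v). alpha selects the synthetic focal plane.
    """
    U, V, H, W = light_field.shape
    out = np.zeros((H, W))
    for u in range(U):
        for v in range(V):
            # Shift each view proportionally to its offset from the
            # aperture center, then accumulate.
            du = int(round(alpha * (u - U // 2)))
            dv = int(round(alpha * (v - V // 2)))
            out += np.roll(light_field[u, v], shift=(du, dv), axis=(0, 1))
    return out / (U * V)

# Synthetic light field: a single bright point whose apparent position
# shifts with viewpoint (parallax), as an out-of-focus point would.
lf = np.zeros((3, 3, 16, 16))
for u in range(3):
    for v in range(3):
        lf[u, v, 8 - (u - 1), 8 - (v - 1)] = 1.0

blurred = refocus(lf, alpha=0.0)  # views disagree: energy spread out
sharp = refocus(lf, alpha=1.0)    # shifts cancel the parallax
print(sharp[8, 8])  # all nine views align at (8, 8), so this prints 1.0
```

With `alpha=1.0` the per-view shifts exactly undo the parallax, so the point snaps back into focus; with `alpha=0.0` it stays smeared across nine pixels.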
A different feature in development would use augmented reality to place virtual effects and objects on a person, much the way Snapchat works.
I think "on a person" is a little misleading. I doubt Apple is going the Snapchat route; this sounds more like HoloLens, where you can place an object at a particular location and have it persist "in place" as the phone moves around.
The iPhone camera features would probably rely on a technology known as depth sensing and use algorithms created by PrimeSense, an Israeli company acquired in 2013.
And of course, to place objects in a scene and have them persist, you need some sort of mapping technology: hardware to scan the environment and software to make sense of what's mapped.
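The reason an object "persists in place" is simple once the device can track its own pose: the object's pose is stored once in world coordinates, and every frame it is re-expressed relative to the moving camera. A hypothetical sketch (the names `world_T_anchor` and so on are illustrative, not any real AR API):

```python
import numpy as np

def pose(R, t):
    """Build a 4x4 rigid transform from a 3x3 rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation by theta radians about the z axis."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# The virtual object is anchored 2 m in front of where the camera started.
# This pose is stored once, in world coordinates, and never changes.
world_T_anchor = pose(np.eye(3), np.array([0.0, 0.0, 2.0]))

# The phone moves: tracking reports a new camera pose in world coordinates
# (here, stepped 0.5 m along x and turned 45 degrees about z).
world_T_camera = pose(rot_z(np.pi / 4), np.array([0.5, 0.0, 0.0]))

# To render, re-express the fixed anchor in the current camera frame.
camera_T_anchor = np.linalg.inv(world_T_camera) @ world_T_anchor
print(camera_T_anchor[:3, 3])  # anchor's position as seen from the camera
```

Because only `world_T_camera` changes from frame to frame, the object stays glued to its spot in the room no matter how the phone moves; the hard part, and what the mapping hardware and software provide, is producing that camera pose reliably.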
Adding credibility to these claims, both Microsoft (with the Windows 10 Creators Update) and Google are aiming in this direction. Here is Google's Project Tango, which was shown at Google I/O in the middle of last year. The next question: will we see this sort of AR capability, that is, spatial mapping and object detection, in the 10th-anniversary iPhone?
Disclaimer: At the time of this writing, I own stock in Apple.