Next AR-focused iPhone Could Have Rear-Facing 3-D Sensor
Apple is reportedly working on a rear-facing 3-D sensor system for the iPhone in 2019, part of an effort to turn the handset into an augmented-reality device.
The current iPhone X uses a TrueDepth sensor system on the front of the device. That system projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach, which calculates how long a laser pulse takes to bounce off surrounding objects to build a three-dimensional picture of the environment, unnamed sources told Bloomberg.
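The principle behind time-of-flight sensing is simple geometry: light travels out to an object and back, so halving the round trip gives the distance. The sketch below is purely illustrative, not Apple's implementation; the function name and the example timing are assumptions chosen to show the calculation.

```swift
// Illustrative time-of-flight calculation (not Apple's code).
// A ToF sensor times a light pulse's round trip to a surface
// and halves the path length to get the distance.
let speedOfLight = 299_792_458.0            // metres per second

func distance(fromRoundTripTime t: Double) -> Double {
    // The pulse travels out and back, so divide the path by two.
    return speedOfLight * t / 2.0
}

// Example: a 10-nanosecond round trip corresponds to roughly 1.5 m.
print(distance(fromRoundTripTime: 10e-9))   // ≈ 1.499 m
```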
The company is expected to keep the TrueDepth system, so future iPhones will have both front- and rear-facing 3-D sensing capabilities.
Apple declined to comment.
The addition of a rear-facing sensor would enable more augmented-reality applications on the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself. "We're already seeing things that will transform the way you work, play, connect and learn," he said on Apple's latest earnings call. "AR is going to change the way we use technology forever."
Apple has released ARKit, software that makes it easier for developers to build AR apps for the iPhone. The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real objects. A rear-facing 3-D sensor would help remedy that.
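For a sense of what that plane detection looks like in practice, the sketch below configures an ARKit world-tracking session to look for horizontal surfaces, the case the article says the tool handles well. The ARKit calls (ARWorldTrackingConfiguration, ARSessionDelegate, ARPlaneAnchor) are the framework's own; the class name and the logging are illustrative assumptions.

```swift
import ARKit

// Minimal sketch of ARKit horizontal plane detection.
// Vertical planes, which the article notes ARKit struggles with,
// were not detectable in the initial release.
class PlaneFinder: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = .horizontal   // flat surfaces only
        session.delegate = self
        session.run(config)
    }

    // ARKit reports each detected surface as an ARPlaneAnchor.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let plane as ARPlaneAnchor in anchors {
            print("Found plane with extent:", plane.extent)
        }
    }
}
```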
Google has also been working with Infineon on depth perception as part of its AR development push, Project Tango, unveiled in 2014. The Infineon chip is already used in Lenovo's Phab 2 Pro and Asustek's ZenFone AR, both of which run on Google's Android operating system.