Apple is developing 3D depth sensing technology for the rear-facing cameras in its 2019 iPhones, according to a new report from Bloomberg on Tuesday. The 3D sensor system would be different from the one found in the iPhone X's front-facing camera, and is said to be the next big step in turning the smartphone into a leading augmented reality device.
Apple is evaluating a technology different from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, according to the people cited in the report. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates how long a laser pulse takes to bounce off surrounding objects, building a three-dimensional picture of the environment.
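The underlying principle is simple: because the speed of light is known, a measured round-trip time maps directly to a distance. Here is a minimal sketch of that conversion, assuming an idealized sensor (the function name is ours, not any Apple API):

```swift
import Foundation

let speedOfLight = 299_792_458.0  // meters per second

/// Idealized time-of-flight conversion: the emitted pulse travels to the
/// object and back, so the measured round-trip time covers twice the depth.
func depth(fromRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2
}

// A round trip of ~6.67 nanoseconds corresponds to an object about 1 meter away.
print(depth(fromRoundTripTime: 6.67e-9))  // ≈ 1.0 (meters)
```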
The existing TrueDepth camera would continue to be used on the front of future iPhones to power Face ID, while the new system would bring the more advanced time-of-flight 3D sensing capability to the rear camera, according to the sources cited. Discussions with potential suppliers, including Infineon, Sony, STMicroelectronics, and Panasonic, are reportedly already underway. Testing is said to still be in the early stages, and the technology could end up not being used in the phones at all.
With the release of iOS 11, Apple introduced the ARKit software framework, which lets iPhone developers build augmented reality experiences into their apps. The addition of a rear-facing 3D sensor could theoretically improve the ability of virtual objects to interact with real-world environments and enhance the illusion of solidity.
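For context, ARKit today infers surfaces from camera imagery and motion data alone; a rear depth sensor would give it a direct measurement to work with. A minimal sketch of the kind of plane-detection session ARKit already supports (the view controller and cube setup are illustrative, not from the report):

```swift
import UIKit
import ARKit
import SceneKit

// Minimal ARKit plane-detection session: detect horizontal surfaces
// and anchor a virtual cube to each one.
class ARViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)

        // World tracking estimates device position; plane detection
        // infers flat surfaces from the camera feed and motion data.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // Called when ARKit adds an anchor; place a 10 cm cube on detected planes.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                            length: 0.1, chamferRadius: 0))
        cube.position = SCNVector3(0, 0.05, 0)  // rest on the plane's surface
        node.addChildNode(cube)
    }
}
```

A per-pixel depth map from a time-of-flight sensor could, in principle, make such surface estimates faster and more accurate, and allow virtual objects to be convincingly occluded by real ones.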
Apple was reportedly beset with production problems when making the sensor in the iPhone X's front-facing camera, because the components used in the sensor array have to be assembled with a very high degree of accuracy. According to Bloomberg, while the time-of-flight technology uses a more advanced image sensor than the existing one in the iPhone X, it does not require the same level of precision during assembly. That fact alone could make a rear-facing 3D sensor easier to produce at high volume.
Late last month, oft-reliable KGI Securities analyst Ming-Chi Kuo claimed that Apple is unlikely to expand its front-facing 3D sensing system to the rear-facing camera module on iPhones released in 2018. Kuo said the iPhone X's 3D sensing capabilities are already at least one year ahead of Android smartphones, so he believes Apple's focus with next year's iPhone models will be on ensuring an on-time launch with adequate supply.
Top Rated Comments
Example: For a depth resolution of 1 cm you'd need to be able to measure time differences of 33.3 picoseconds (i.e. 0.0000000000333 seconds, or 33.3 trillionths of a second). But 1 cm for facial recognition is too coarse; they'd need a fraction of this.
It's been done (https://en.wikipedia.org/wiki/Time-of-flight_camera#Direct_Time-of-Flight_imagers) in high-tech systems that take up more space than a whole iPhone.
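The arithmetic above checks out as the one-way travel time for light across 1 cm; since a time-of-flight sensor measures the round trip, the shift in measured time for a 1 cm depth step is roughly twice that. A quick check (variable names are ours):

```swift
import Foundation

let c = 299_792_458.0  // speed of light, m/s

// One-way travel time for light across a 1 cm depth step.
let oneWay = 0.01 / c        // ≈ 3.34e-11 s, i.e. ~33.4 picoseconds

// A time-of-flight sensor measures the round trip, so a 1 cm depth
// difference shifts the measured time by about twice that.
let roundTrip = 2 * oneWay   // ≈ 6.67e-11 s, i.e. ~66.7 picoseconds

print(oneWay * 1e12, roundTrip * 1e12)  // ~33.4, ~66.7 (picoseconds)
```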
Good luck, guinea pigs! I’ll be buying the iPhone after next, as all the kinks in the rear camera will have been ironed out by then.
Edit: I thought I was obviously tongue-in-cheek but I guess not. So here’s the tag: /s
These types of comments are so common here that people just assume you’re being serious. Lol.