Two Apple Rumor reports focus on the iPhone 12's upgraded TrueDepth camera & the new Time-of-Flight 3D backside camera
A Chinese rumor today claims that people familiar with the upcoming iPhone 12 say that at least one new iPhone this year will be equipped with a rear-facing 3D depth-sensing lens known as ToF (Time of Flight). In 2019 Patently Apple posted a report titled "LG Innotek begins production of iPhone Camera Modules with Apple's Time-of-Flight Camera Module being Postponed to 2020."
The report further noted that "In addition, it is understood that this camera will include a system of lasers, sensors and software. By emitting light to measure the distance between the mobile phone and the object, it can bring new photographic and video effects, and improve and enhance Augmented Reality (AR) experiences."
Apple's new backside camera will reportedly be branded the "World Facing" camera, and Apple developers have reportedly been debugging it for at least two years.
In addition to gaining a backside 3D sensing lens, Apple's TrueDepth camera for Face ID will reportedly be faster and more accurate on the iPhone 12.
Lastly, the rumor suggests that the iPhone 12 may introduce a new design, expand its memory to 6GB of RAM and deliver an unprecedentedly powerful A14 Bionic chip.
In a second rumor report, Fast Company reported yesterday that the updated Face ID camera (TrueDepth camera) may "create a better-looking bokeh effect by more accurately distinguishing between foreground and background layers, and perhaps adding more depth layers to blur or focus. It might become possible to adjust which layers of a photo are blurry and which are focused after the fact in editing mode."
The latter feature was first introduced by Lytro, which Google acquired in 2018. Google has yet to bring it to market, though it may do so later this year. Apple delivering such a feature would be timely and appreciated by future iPhone 12 customers.
Andre Wong, Lumentum's VP of 3D Sensing, stated for the Fast Company report that "When you use AR apps without depth information, it's a bit glitchy and not as powerful as it ultimately could be. Now that ARKit and (Google's) ARCore have both been out for some time now, you'll see new AR apps coming out that are more accurate in the way they place objects within a space." For more on the Fast Company report, click here.