
Kuo: Apple unlikely to integrate rear-facing 3D sensor in 2019 iPhone


Contrary to industry expectations, Apple analyst Ming-Chi Kuo believes the company will not integrate a rear-side time-of-flight (TOF) solution in its 2019 iPhone lineup, saying the technology is not yet ready for the augmented reality revolution.

In a note to investors seen by AppleInsider, Ming-Chi Kuo says industry analysts expect Apple to incorporate rear-side TOF as it looks to develop next-generation augmented reality experiences. For example, Apple is thought to be developing an AR version of Apple Maps, potentially for use with a rumored AR headset.

According to Kuo, the distance and depth information provided by existing rear-side TOF hardware is insufficient for creating the "revolutionary AR experience" that Apple is presumably working toward.

The analyst believes a comprehensive AR ecosystem is one that integrates 5G connectivity, AR glass (a wearable, head-mounted device) and a "more powerful Apple Maps database" that includes appropriate distance and depth information. It appears Kuo, like others, assumes Apple Maps will be marketed as a "killer app" for Apple's next-gen AR experience.

Additionally, TOF technology does not improve photo-taking functionality, a major consideration for a company that touts its handsets as the best portable cameras in the world.

As such, Kuo says Apple will likely forgo rear-side TOF in 2019, instead relying on a dual-camera system first introduced with iPhone 7 Plus in 2016.

"We believe that iPhone's dual-camera can simulate and offer enough distance/depth information necessary for photo-taking; it is therefore unnecessary for the 2H19 new iPhone models to be equipped with a rear-side ToF," Kuo says.
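The depth estimation Kuo refers to works by triangulation: two horizontally offset lenses see the same subject at slightly different pixel positions, and that disparity is inversely proportional to distance. As a rough illustration (not Apple's actual pipeline, and with hypothetical camera figures), the core relation is depth = focal length × baseline ÷ disparity:

```python
def stereo_depth(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Estimate subject distance from the pixel disparity between two
    horizontally offset cameras: depth = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Hypothetical figures: a ~2,800 px focal length, ~10 mm lens separation,
# and a 28 px disparity place the subject about 1 metre away.
print(stereo_depth(2800, 0.010, 28))
```

The small baseline of a phone's lens pair is one reason stereo depth degrades quickly with distance, which is adequate for portrait-mode photography but a limitation for room-scale AR.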

Rumors of a rear-facing TrueDepth-style camera date back to last July, when reports claimed Apple planned to debut a rear-facing VCSEL system for AR applications and faster camera autofocus. That solution was due to arrive in what would become iPhone X, but Apple's flagship smartphone uses a single VCSEL module in its front-facing TrueDepth camera array.

Unlike TrueDepth, which measures distortion in structured light, a TOF system calculates the time it takes pulses of light to travel to and from a target. Such systems allow for extremely accurate depth mapping and can therefore assist in AR applications.
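The underlying arithmetic of a TOF sensor is straightforward: the emitted pulse travels to the target and back, so the one-way distance is half the round-trip time multiplied by the speed of light. A minimal sketch (illustrative only, not any vendor's implementation):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_s: float) -> float:
    """One-way distance to a target from a time-of-flight pulse:
    the light travels out and back, so distance = c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_s / 2

# A pulse returning after roughly 6.67 nanoseconds indicates a target
# about one metre away.
print(round(tof_distance(6.67e-9), 3))
```

The timing scales involved (nanoseconds per metre) are why TOF hardware needs very fast, precise sensors, and why Kuo can argue the current generation falls short of the accuracy a "revolutionary AR experience" would demand.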

The July rumor was followed by a second report in November claiming much the same, while analysts jumped on the bandwagon in February.



12 Comments

Soli 9 Years · 9981 comments

Why does this guy get any mention on tech forums. I can easily point to a dozen commenters on this forum that I'd rather get opinions about what Apple will likely do in the near future.

ols 6 Years · 51 comments

Does it make sense to integrate the rear-facing camera with face-id? What is the use-case?
IMHO none.

avon b7 20 Years · 8046 comments

ols said:
Does it make sense to integrate the rear-facing camera with face-id? What is the use-case?
IMHO none.

Its purpose would not be Face ID.

These solutions normally have a primary consideration. Distance. Front facing Face ID was designed with short range in mind.

Rear facing depth sensing would need to cover longer distances.

JFC_PA 7 Years · 947 comments

ols said:
Does it make sense to integrate the rear-facing camera with face-id? What is the use-case?
IMHO none.

I gather from the article, not face I. D. but advanced AR applications. There’s a mention of future generation Apple Maps. 

tomasulu 10 Years · 62 comments

Soli said:
Why does this guy get any mention on tech forums. I can easily point to a dozen commenters on this forum that I'd rather get opinions about what Apple will likely do in the near future.

So just leave already. Why are you still here? This is a site on Apple rumors and this is the guy who’s by far the most accurate analyst. I appreciate reading about his prognostication on Apple.