Apple's 'iPhone 8' to feature rear-facing 3D laser for AR and faster autofocus, report says

By Mikey Campbell

Apple is reportedly working to implement a specialized rear-facing laser system in its upcoming "iPhone 8" that would facilitate augmented reality applications, like those built with ARKit, and enable faster, more accurate autofocus.

'iPhone 8' concept rendering by Marek Weidlich.

Citing a source familiar with Apple's plans, Fast Company reports the company is developing a VCSEL laser system for integration in a new iPhone model set for debut this fall. The rumored "iPhone 8," which is expected to become Apple's next flagship smartphone, is a likely candidate for deployment, the source said.

Building on past rumors regarding a front-facing 3D-sensing camera, today's report claims Apple will apply VCSEL technology to the rear-facing shooter. The system, which calculates distance to a target using light pulses and time of flight (TOF) measurements, would allow for extremely accurate depth mapping, a plus for AR applications.
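For a rough sense of the math behind time of flight, distance follows directly from the round-trip travel time of a light pulse: d = c × t / 2. The Swift sketch below is purely illustrative of that principle, not Apple's implementation; repeating the measurement across a grid of emitter and receiver elements is what would produce a per-pixel depth map.

```swift
import Foundation

/// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

/// Converts a measured round-trip time (in seconds) for a reflected
/// light pulse into a distance estimate in meters. The pulse travels
/// to the target and back, so the one-way distance is c * t / 2.
func distance(forRoundTripTime t: TimeInterval) -> Double {
    return speedOfLight * t / 2.0
}

// Example: a pulse returning after roughly 13.3 nanoseconds
// corresponds to a subject about 2 meters away.
let depth = distance(forRoundTripTime: 13.3e-9)
print(String(format: "Estimated depth: %.2f m", depth))
```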

Currently, Apple's ARKit relies on complex algorithms derived from optical information provided by iPhone's iSight camera.
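Concretely, an ARKit session today infers pose and sparse scene geometry from camera frames plus motion sensor data, exposing the feature points it extracts on each ARFrame. The minimal sketch below (assuming an iOS app with camera permission) shows the kind of camera-derived data a dedicated depth sensor could replace or densify.

```swift
import ARKit

// Minimal world-tracking setup: ARKit estimates device pose and a
// sparse point cloud purely from camera imagery and motion sensors.
class ARDepthObserver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        session.run(ARWorldTrackingConfiguration())
    }

    // Called once per camera frame; rawFeaturePoints is the sparse
    // point cloud ARKit derives from the 2D image alone.
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        if let points = frame.rawFeaturePoints {
            print("Tracked \(points.count) feature points")
        }
    }
}
```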

Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and a specialized receiver alongside the traditional RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

Fast Company says the rear-facing laser will also aid in faster and more accurate autofocus capabilities. Similar systems have been employed in digital SLRs and compact cameras for years, but have only recently made their way to small form factor devices like smartphones.

Apple has in the past relied on focus technology that comes with camera modules provided by third-party suppliers like Sony. Most recently, the company added phase detection autofocus, dubbed "Focus Pixels" in Apple-speak, to iPhone with the iPhone 6 series in 2014.

Phase detection systems achieve focus by detecting and comparing two or more sets of incident light rays. Laser systems, on the other hand, gauge scene depth directly by timing how long a light pulse takes to travel to and from a target object.
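To illustrate how a directly measured depth could drive focus, the hypothetical helper below maps a distance estimate onto AVFoundation's normalized lens position and locks focus there, skipping a contrast- or phase-detection search. The distance-to-lens-position mapping is an illustrative placeholder, not a real device calibration, and is not how Apple's system is known to work.

```swift
import AVFoundation

/// Illustrative only: maps a measured subject distance (meters) to
/// AVFoundation's normalized lens position (0.0 = closest focus,
/// 1.0 = furthest). A real mapping depends on per-module calibration.
func lensPosition(forDistance meters: Float) -> Float {
    let nearLimit: Float = 0.1   // assumed closest focusing distance
    let farLimit: Float = 5.0    // beyond this, treat as infinity
    let clamped = min(max(meters, nearLimit), farLimit)
    return (clamped - nearLimit) / (farLimit - nearLimit)
}

func focus(device: AVCaptureDevice, atMeasuredDistance meters: Float) throws {
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }
    // Lock focus at the computed lens position instead of hunting.
    device.setFocusModeLocked(lensPosition: lensPosition(forDistance: meters),
                              completionHandler: nil)
}
```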

As for suppliers, Apple has tapped Lumentum to provide most of the VCSEL lasers, with the remainder to be produced by Finisar and II-VI, the source said. The time of flight sensor is expected to come from STMicro, Infineon or AMS. As it does with other major operating components, Apple could purchase the part in module form from LG Innotek, STMicro, AMS or Foxconn, the report said.

While it would be a boon for ARKit, Apple's first official push into the AR space, the supposed 3D sensor might not be ready in time for a 2017 launch. Engineers are currently working to integrate the component, but inclusion is not guaranteed for "iPhone 8," the source implied.