Apple's new iPhone 12 Pro models feature a LiDAR sensor that will significantly enhance their augmented reality and photography capabilities.
First introduced on the iPad Pro, the LiDAR sensor is a laser-based, time-of-flight system that allows a device to quickly calculate the distance to an object by measuring how long emitted light takes to bounce back.
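As a rough illustration of the time-of-flight principle (a simplified sketch, not Apple's actual signal-processing pipeline), the distance to an object can be estimated from the round-trip time of an emitted light pulse:

```swift
// Simplified sketch of the time-of-flight principle: distance is derived
// from how long an emitted light pulse takes to bounce back to the sensor.
import Foundation

let speedOfLight = 299_792_458.0  // meters per second

/// Estimates distance from the measured round-trip time of a light pulse.
func distance(fromRoundTripTime seconds: Double) -> Double {
    // The pulse travels to the object and back, so halve the total path.
    return speedOfLight * seconds / 2.0
}

// Example: a pulse returning after ~33 nanoseconds implies an object ~5 m away.
let meters = distance(fromRoundTripTime: 33e-9)
print(String(format: "%.2f m", meters))  // ≈ 4.95 m
```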
On the iPhone 12 Pro and iPhone 12 Pro Max, the LiDAR sensor will enhance Apple's ARKit, as well as photo and video capture. The company says it offers "the ability to measure light distance and use pixel depth information of a scene."
For example, the LiDAR sensor can create precise depth maps of an environment or scene. That will allow for "instant AR" experiences, more realistic AR scenes, and "endless opportunities" for developers to build on, as the sketch below illustrates.
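For developers, that depth data is exposed through ARKit's scene-depth frame semantics on LiDAR-equipped devices. The sketch below is a minimal example of reading per-pixel depth maps; error handling and rendering are omitted.

```swift
// Minimal sketch of reading LiDAR-derived depth maps through ARKit's
// scene-depth API on a LiDAR-equipped device (iOS 14+).
import ARKit

final class DepthReceiver: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let configuration = ARWorldTrackingConfiguration()
        // Scene depth is only available on devices with the LiDAR sensor.
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        configuration.frameSemantics.insert(.sceneDepth)
        session.delegate = self
        session.run(configuration)
    }

    // Called each frame; sceneDepth carries a per-pixel depth map (in meters).
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depth = frame.sceneDepth else { return }
        let depthMap: CVPixelBuffer = depth.depthMap
        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        print("Received \(width)x\(height) depth map")
    }
}
```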
The LiDAR sensor can also enhance autofocus in low light, with Apple claiming up to 6x faster focus times for both photos and video. It also enables Night mode portraits, which Apple says produce a "beautiful low-light bokeh effect."
There's also a good chance that LiDAR will play a role in Apple's own proprietary AR features, such as indoor navigation and item tracking, alongside its Ultra Wideband technology.