Snapchat on Tuesday confirmed that it will be one of the first third-party developers to take advantage of the new LiDAR sensor on iPhone 12 Pro models.
The LiDAR sensor, first introduced on the iPad Pro lineup, brings a suite of new augmented reality and photographic capabilities to the iPhone 12 Pro and iPhone 12 Pro Max. Apple, for example, said it will allow for up to 6x faster autofocus in low-light scenes.
Combined with new machine learning capabilities and Apple's AR frameworks, the LiDAR sensor will also significantly enhance AR experiences from third-party developers. And, on Tuesday, Snapchat announced that it plans to unveil a Lens specifically made for iPhone 12 Pro devices.
Apple even gave a sneak preview of the new Snapchat capabilities during its keynote Tuesday morning. The new filter, which could indicate what Snapchat has in store, can be seen at the 59:41 mark in Apple's keynote video.
Snapchat later confirmed to TechCrunch that the Lens in Apple's announcement is the same one it plans to launch later in 2020.
The LiDAR sensor itself is a time-of-flight system that can accurately create a depth map of an environment using lasers. The result is faster and much more accurate AR, as well as new opportunities to use Apple's ARKit in creative ways.
Apple's new iPhone 12 Pro and iPhone 12 Pro Max models with the LiDAR sensor start at $999 and $1,099, respectively. The iPhone 12 Pro will be available for preorder at 5 a.m. on Friday, Oct. 16, while the 12 Pro Max goes up for preorder on Nov. 6.