The iPhones launching in 2020 may feature a Time of Flight sensor, which helps determine distances between objects and the lens.
According to DigiTimes, Apple is in talks with Lumentum, the supplier of the lasers used in Face ID, about bringing a Time of Flight sensor to the rear cameras of its 2020 iPhones. Time of Flight scans in a similar way to Face ID, but rather than being used for close-range identification, it helps the iPhone focus faster when taking regular photos.
A Time of Flight 3D sensor scans the area in front of the camera and calculates the distance between each object and the lens by emitting a pulse of light and timing how long its reflection takes to return. Knowing those distances speeds up focusing and may also help with augmented reality.
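The underlying arithmetic is simple. Here is a minimal sketch in Swift, assuming the textbook pulse-timing model rather than anything specific to Apple's hardware:

```swift
// Distance from round-trip time: the pulse travels out to the object
// and back, so the one-way distance is half the total path.
let speedOfLight = 299_792_458.0  // metres per second

func distance(forRoundTripTime seconds: Double) -> Double {
    return speedOfLight * seconds / 2.0
}

// A 10-nanosecond round trip corresponds to roughly 1.5 metres.
print(distance(forRoundTripTime: 10e-9))  // ≈ 1.499
```

Repeating that measurement across a grid of points yields a depth map of the scene, which is how the sensor can help with both focusing and augmented reality.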
Apple has previously been reported to be planning to include Time of Flight in its 2020 iPhones. As long ago as 2017, the company was said to be investigating the use of such a laser-based system.
Assorted rumors dating back to 2017 suggested the feature would arrive in the 2019 iPhones. Most recently, analyst Ming-Chi Kuo claimed that Apple will use methods such as this to improve the Face ID systems in the 2019 iPhones and iPads.
Whenever it arrives, a Time of Flight sensor will be based on a VCSEL, or vertical-cavity surface-emitting laser, and the DigiTimes report says it will be made by US firm Lumentum. DigiTimes also claims that this is related to Apple's ongoing augmented reality work, despite recently claiming that the company had abandoned its AR glasses plans.
DigiTimes has a decent track record when it comes to Apple's suppliers, but a poor one when it predicts the features of Apple's products.
5 Comments
It's mind-boggling that sensors and computers are now precise and powerful enough to compute distance by timing the "echoes" of a beam of light. If something is 5 feet away, the round trip for a beam of light takes about 10 ns (10 billionths of a second, or 1 * 10**-8 s); the round trip for something 5.1 feet away takes about 1.02 * 10**-8 s--a difference of 2 ten-billionths of a second. And yet a tiny handheld computer can identify such a difference? Wow.
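A quick back-of-the-envelope sketch in Swift bears those numbers out (purely illustrative; real ToF hardware typically measures the phase shift of modulated light rather than timing single pulses directly):

```swift
let c = 299_792_458.0       // speed of light, m/s
let metresPerFoot = 0.3048

// Round-trip time for a light pulse to an object and back.
func roundTripTime(feet: Double) -> Double {
    return 2 * feet * metresPerFoot / c
}

let near = roundTripTime(feet: 5.0)  // ≈ 1.017e-8 s, about 10 ns
let far = roundTripTime(feet: 5.1)   // ≈ 1.037e-8 s
print(far - near)                    // ≈ 2.03e-10 s, about 0.2 ns
```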
My experience with newly announced camera (or other) capabilities like these is that in reality they never live up to expectations. I thought the new Measure app would be handy and useful. Tried it out, and it simply didn't work. Had a project once, and decided to try it again. No dice. Spent more time fumbling and fiddling than I would have by simply fetching my tape measure. Delete.
The camera on my XR is great. It does take nice pictures, and that is evident when I look back at old pics in the Photos app (when I can figure out how to make that work...). But when they hyped the bokeh stuff and post-pic editing I got real excited...until I discovered "portrait" mode only works on people. I don't take pics of people. I hate people. Most of my portraits are of my dogs. I love my dogs. Yes, there's an app for that, but that's more fumbling around and fiddling. Eh.
I sort of gave up on Siri. A few very narrow useful functions. Why oh why, Siri, can't you display the route I *always, always* take, the one that doesn't include the toll lanes I never, ever use?
Yes, YMMV. And yeah, the fact that most of this doesn't work for me is probably my fault.
The iPhone 12 is shaping up to be a legendary release. A quick checklist, shall we? 5G, 3D ToF, USB-C, a smaller notch, the return of Touch ID, a 6.7-inch Max. Am I missing anything? Next year, always next year.
Apple's strategy seems to be to implement the hardware and software for future AR systems in the phones and get them 'battle hardened' in whatever applications can be found for them on mobile. I'm very curious what use cases the ToF camera would have; ARKit seems to do rather well without it. Maybe it could fix the holes (literally) in the depth camera's lighting effects.