Analyst Ming-Chi Kuo now claims that the forthcoming Apple AR headset will track hand movements, allowing the device to interpret gestures.
In his second Apple AR note in as many days, respected analyst Ming-Chi Kuo details more of the sensors expected in the first of Apple's forthcoming headsets.
"Gesture control and object detection are critical human-machine UI designs of Apple's AR/MR headset," writes Kuo in a note for investors. "Apple's AR/MR headset is equipped with more 3D sensing modules than iPhones."
"We predict that Apple's AR/MR headset will have four sets of 3D sensing (vs. one to two sets for iPhone/high-end smartphones)," he continues. "We predict that the structured light of the AR/MR headset can detect not only the position change of the user or other people's hand and object in front of the user's eyes but also the dynamic detail change of the hand."
Saying that this "dynamic detail change" is similar to how Face ID can detect changes of expression, Kuo claims that capturing "the details of hand movement can provide a more intuitive and vivid human-machine UI."
There is a difference between Face ID and this hand tracking, though.
"Although both adopt structured light, the distance between the hand (user or other people's) and the object detected by the headset device needs to be longer than that seen by iPhone's Face ID," continues Kuo, "so the structured light power consumption of the headset device is higher."
"We predict that the detection distance of Apple's AR/MR headset with structured light is 100-200% farther than the detection distance of the iPhone Face ID," he says. "To increase the field of view (FOV) for gesture detection, we predict that the Apple AR/MR headset will be equipped with three sets of ToFs [Time of Flight] to detect hand movement trajectories with low latency requirements."
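Kuo's two claims fit together: a longer detection range implies a higher-power structured-light emitter. A rough back-of-envelope sketch, assuming the per-pixel return signal falls off with the inverse square of distance (a simplification; real sensing budgets also depend on emitter optics, sensor sensitivity, and ambient light), shows why "100-200% farther" is costly:

```python
def relative_power(range_multiplier: float) -> float:
    """Emitter power needed, relative to baseline, to keep the same
    per-pixel return signal at a longer range.

    Assumes inverse-square falloff, so power scales with range squared.
    """
    return range_multiplier ** 2


# Kuo's claim: detection distance 100-200% farther than iPhone Face ID,
# i.e. roughly 2x to 3x the range.
print(relative_power(2.0))  # 4.0 -> ~4x the emitter power at 2x range
print(relative_power(3.0))  # 9.0 -> ~9x at 3x range
```

Under that assumption, even the low end of Kuo's range estimate would mean several times Face ID's emitter power, which is consistent with his point about the headset's higher structured-light power consumption.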
Kuo's focus is on the impact these components and requirements will have on companies that his clients may choose to invest in. So he reports that the firms WIN semi and Lumentum are believed to be the main providers for the headset's VCSELs (Vertical Cavity Surface-Emitting Laser) sensors.
However, he also backs up his previous report that Apple ultimately intends the AR headset to replace the iPhone.
"The innovative human-machine UI for headset devices requires the integration of many technologies," he writes. "It will be critical for headsets to replace the existing consumer electronic products with displays in the next ten years."
I love VR/AR headsets but they won't be replacing handheld devices for a long, long time, if ever. If they got small and light enough, they could replace your computer screen and iPad, but the iPhone is in a different category. I keep mine in my back pocket when I am doing yard work, which includes swinging a pickaxe. No headset will replace it as I require some degree of eye protection and won't risk damaging either my eyes or an expensive gadget. This is just one personal example but there are many others. Can you use an AR headset while driving? I am guessing not, or at least not until your car drives itself. Let's take a look at the technological roadblocks preventing constant use of an AR headset:
Currently VR headsets are so big and heavy that they become uncomfortable to wear after an hour or two.
Even if an AR headset were smaller, it would still need to be powered by batteries. Where do they go? In your pocket with a cord dangling down your neck?
You can shrink the vision system but what about the rest of the electronics? It all generates heat, so how do you dissipate it if the electronics are built into a pair of glasses?
AR is great but for the best virtual experiences, you need full VR. Will the AR glasses have a full VR mode or will we need another headset for that?
Many people are very sensitive to differences between motion they see with their eyes and what they feel with their sense of balance. How do you do VR without making people sick?
Finally, the cameras. You can't have AR without some cameras to help with the tracking and scene analysis. People threw a hissy fit over the original Google Glass AR glasses because they had cameras. There was at least one fight over them. Is everyone ready to accept a camera on everyone's face constantly recording and sending the video back to Apple's servers? What happens when Apple starts scanning those videos for anything illegal, to protect kids or something?
FaceID & detailed face tracking, LiDAR, head tracking (AirPods) & spatial audio, etc. are each individually awesome and valuable, but I’m excited to experience them (and of course more) work together on this!