The latest Apple Vision Pro developer beta includes a video tutorial on how to set up the view of your eyes that gets shown to people on the exterior display.
Apple wants Vision Pro to appear, to anyone near the wearer, as if the wearer is looking through the headset at them. In reality, the wearer's eyes are completely covered by the headset, and a recording of them is shown on the front display.
Now, in the latest developer beta, Apple has added a tutorial that covers setting up this face and eye recording for that external display. The tutorial is new, and so is the process itself, which did not exist when Apple Vision Pro was first announced.
New video tutorial showing Persona Enrollment for Apple Vision Pro added in visionOS beta 6! The enrollment uses the EyeSight display to guide the user. pic.twitter.com/cGfsdTuIaY
— M1 (@M1Astra) November 14, 2023
Apple calls this setup a Persona Enrollment, and it's similar in principle to how users first configure Face ID. With the headset held so its forward cameras face the user, Vision Pro prompts the owner to look left, look right, and so on.
Apple's tutorial video, which does not appear to shorten the sequence, shows the process taking around 30 seconds to complete.
The video was spotted in the developer beta by Twitter/X user @M1Astra.
This text has been updated showing new information regarding Apple Vision Pro Light Seal cushions.
"Try the thicker N+ or W+ Light Seal cushion in the Apple Vision Pro packaging so your eyes are farther from the displays."
"Try the thinner N or W Light Seal cushion in the https://t.co/5pcCRBgLdC pic.twitter.com/17W8zOLoMx
— M1 (@M1Astra) November 14, 2023
At the same time, @M1Astra spotted a text list of Vision Pro suggestions. Most significantly, they include a prompt to "try the thicker N+ or W+ Light Seal cushion... so your eyes are farther from the displays."
Apple continues to say that Vision Pro is on track for release early in 2024.
1 Comment
Surprised that there were so few emotions recorded in that enrollment video.
Also curious about the OLED display for EyeSight. It looks low resolution, but probably isn't. The low-resolution look comes from having to split the display into a whole lot of slices for the lenticular lens.
One thing I've been pondering is that EyeSight is a third display on the VP. So, if the M2 can only drive two displays, and those are the two 4.5K-ish internal displays at up to 100 Hz, what's driving the EyeSight display? The R1 chip?