
A unique attack could steal your passwords from Apple Vision Pro

Apple Vision Pro's eye-tracking technology offers a new way to type, but researchers have shown it can be exploited to steal sensitive information. Here's what you need to know to protect your data.

New technologies always come with new vulnerabilities. One such vulnerability is exploited by GAZEploit, an attack that exposes users to potential privacy breaches on Apple Vision Pro FaceTime calls.

GAZEploit, developed by researchers from the University of Florida, CertiK Skyfall Team, and Texas Tech University, uses eye-tracking data in virtual reality to guess what a user is typing.

When users don a virtual or mixed reality device, like the Apple Vision Pro, they can type by looking at keys on a virtual keyboard. Instead of pressing physical buttons, the device tracks eye movements to determine the selected letters or numbers.

[Diagram: Overview of the attack, showing facial biometric extraction, eye aspect ratio, gaze estimation, and keystroke inference, with charts for click events and keystroke sessions, concluding with keyboard mapping and key-guess inference]

The virtual keyboard is where GAZEploit comes in. It analyzes the data from eye movements and guesses what the user is typing.

GAZEploit works by recording the eye movements of the user's virtual avatar. It focuses on the eye aspect ratio (EAR), which measures how wide the eyes are open, and on gaze estimation, which tracks exactly where the user is looking on the screen.

By analyzing these factors, hackers can determine when the user is typing and even pinpoint the specific keys they're selecting.
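To make the EAR metric concrete, here is a minimal sketch of the standard eye-aspect-ratio formula from facial-landmark research (Soukupová and Čech's blink-detection formulation), computed from six landmark points around one eye. The researchers' exact feature extraction from the avatar video may differ, so treat this as an illustration of the signal rather than their pipeline.

import numpy as np

def eye_aspect_ratio(eye):
    # eye: a (6, 2) array of (x, y) landmark coordinates ordered p1..p6
    # around the eye. A lower EAR means the eye is more closed, as in a blink.
    a = np.linalg.norm(eye[1] - eye[5])   # vertical distance ||p2 - p6||
    b = np.linalg.norm(eye[2] - eye[4])   # vertical distance ||p3 - p5||
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal distance ||p1 - p4||
    return (a + b) / (2.0 * c)

# Example: a wide-open eye produces a larger EAR than a nearly closed one
open_eye = np.array([[0, 0], [2, 2], [4, 2], [6, 0], [4, -2], [2, -2]], dtype=float)
print(eye_aspect_ratio(open_eye))

Tracking how this ratio changes frame to frame is what lets an attacker spot blinks and blink-rate changes from nothing more than the avatar's rendered face.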

When users type in VR, their eyes move in a particular way and blink less often. GAZEploit detects this and uses a machine learning model called a recurrent neural network (RNN) to analyze these eye patterns.

The researchers trained the RNN with data from 30 different people and got it to accurately identify typing sessions 98% of the time.
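To illustrate the kind of classifier being described, here is a hedged sketch of a small recurrent network that scores a short window of per-frame eye features (EAR plus gaze coordinates) as typing or not typing. The feature set, layer sizes, and window length are illustrative assumptions, not the paper's actual model.

import torch
import torch.nn as nn

class TypingSessionDetector(nn.Module):
    # Sketch only: flags whether a window of per-frame eye features
    # (here assumed to be [EAR, gaze_x, gaze_y]) belongs to a typing session.
    def __init__(self, n_features=3, hidden=64):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):
        # x: (batch, frames, n_features)
        _, h = self.rnn(x)                        # final hidden state summarizes the window
        return torch.sigmoid(self.head(h[-1]))    # probability the window is typing

# Example: score a hypothetical 60-frame (roughly two-second) window of features
model = TypingSessionDetector()
window = torch.randn(1, 60, 3)
print(model(window).item())

In practice such a model would be trained on labeled recordings of typing versus non-typing activity, which is what the researchers' 30-participant dataset provided.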

Guessing the right keystrokes

Once a typing session is identified, GAZEploit predicts the keystrokes by analyzing rapid eye movements, called saccades, followed by pauses, or fixations, when the eyes settle on a key. The attack matches these eye movements to the layout of a virtual keyboard, figuring out the letters or numbers being typed.

GAZEploit can accurately identify the selected keys by calculating the gaze's stability during fixations. In their tests, the researchers reported 85.9% accuracy in predicting individual keystrokes and nearly perfect 96.8% recall in recognizing typing activity.
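Conceptually, this step boils down to two operations: grouping gaze samples into fixations, then snapping each fixation to the nearest key on the known virtual keyboard layout. The sketch below uses a simple dispersion-based fixation detector and a hypothetical set of key coordinates; the thresholds and layout are assumptions for illustration, not the researchers' values.

import numpy as np

# Hypothetical key centers on a normalized keyboard plane (x, y in [0, 1])
KEY_CENTERS = {
    "q": (0.05, 0.2), "w": (0.15, 0.2), "e": (0.25, 0.2),   # top row continues...
    "a": (0.10, 0.5), "s": (0.20, 0.5), "d": (0.30, 0.5),   # home row continues...
}

def detect_fixations(gaze, max_dispersion=0.02, min_frames=6):
    # Group consecutive gaze samples into fixations: the gaze must stay
    # within a small region for a minimum number of frames.
    fixations, start = [], 0
    for i in range(1, len(gaze) + 1):
        window = gaze[start:i]
        if np.ptp(window, axis=0).max() > max_dispersion:
            if i - 1 - start >= min_frames:
                fixations.append(window[:-1].mean(axis=0))
            start = i - 1
    if len(gaze) - start >= min_frames:
        fixations.append(gaze[start:].mean(axis=0))
    return fixations

def nearest_key(point):
    # Map a fixation centroid to the closest key center on the layout.
    return min(KEY_CENTERS, key=lambda k: np.hypot(point[0] - KEY_CENTERS[k][0],
                                                   point[1] - KEY_CENTERS[k][1]))

# Example: a synthetic gaze trace that dwells near "q" and then near "s"
trace = np.array([[0.051, 0.201]] * 8 + [[0.199, 0.502]] * 8)
print([nearest_key(f) for f in detect_fixations(trace)])    # ['q', 's']

The real attack also has to estimate the virtual keyboard's position and size from the video before this mapping can be applied, but the nearest-key idea is the core of the inference step.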

Since the attack can be carried out remotely, attackers only need access to video footage of the avatar to analyze eye movements and infer what is being typed.

Remote access means that even in everyday scenarios such as virtual meetings, video calls, or live streaming, personal information like passwords or sensitive messages could be compromised without the user's knowledge.

How to protect yourself from GAZEploit

To protect against potential attacks like GAZEploit, users should take several precautions. First, they should avoid entering sensitive information, such as passwords or personal data, using eye-tracking methods in virtual reality (VR) environments.

Instead, it's safer to use physical keyboards or other secure input methods. Keeping software updated is also crucial, as Apple often releases security patches to fix vulnerabilities.

Finally, adjusting privacy settings on VR/MR devices to limit or disable eye-tracking when not needed can further reduce exposure to risks.



7 Comments

mknelson 1148 comments · 9 Years

The method makes sense, but where is GAZEploit running? Is it on your Vision Pro (and shouldn't privacy settings limit access to the avatar?) or is it running on another device you are communicating with?

gatorguy 24627 comments · 13 Years

mknelson said:
The method makes sense, but where is GAZEploit running? Is it on your Vision Pro (and shouldn't privacy settings limit access to the avatar?) or is it running on another device you are communicating with?

No, it's remote monitoring of your virtual Persona. You can watch how it works in the video already linked in the AppleInsider article:
https://www.youtube.com/watch?v=DPYT8IH-R18

Fidonet127 598 comments · 5 Years

Apple already fixed this. 1.3 turns off the persona when typing.

gatorguy 24627 comments · 13 Years

Fidonet127 said:
Apple already fixed this. 1.3 turns off the persona when typing.

It was never going to be a huge thing to fix anyway. Something as simple as dumping every other frame or something to that effect would eliminate it. 

blurpbleepbloop 202 comments · 18 Years

Since Apple already manipulates one’s gaze (at least on iOS) so that it appears as though one is looking right at one’s FaceTime caller, why not just fix the pupils (or otherwise naturally animate them) while one has a virtual keyboard active?