
Hands on with Apple's FaceTime Attention Correction feature in iOS 13

FaceTime Attention Correction fixes your gaze


In the latest iOS 13 beta, Apple is leaning on its augmented reality prowess to fix a common eye contact problem during FaceTime calls. AppleInsider goes hands-on with the new feature to find out how it works.

When making a FaceTime call, a user naturally wants to look at the screen — at the person they are conversing with. On the other end, the recipient of the call just sees the caller looking down.

When the caller looks at the camera, the recipient sees the caller looking them in the eye, a much more natural point of view. However, the caller is then no longer looking at the recipient on the screen, and can only rely on peripheral vision to catch the other person's reactions rather than seeing them clearly.

You can see this for yourself: open the Camera app, look directly at the screen and take a picture, then look at the camera and take another.

FaceTime Attention Correction toggle in Settings

With the third beta of iOS 13, Apple added a new toggle for FaceTime Attention Correction. It aims to make it appear as though you are looking directly at a friend during a FaceTime call when you're actually looking at the screen.

An iPhone's TrueDepth Camera System

The feature uses the TrueDepth camera system, the same hardware behind Animoji, unlocking the phone with Face ID, and the augmented reality effects found in FaceTime.

Taking a look behind the scenes, the new eye contact functionality is powered, of course, by Apple's ARKit framework. It creates a 3D face map and depth map of the user, determines where the eyes are, and adjusts them accordingly.
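Apple hasn't published how Attention Correction is implemented, but ARKit's face tracking does expose the kind of per-frame eye data such an effect could build on. A minimal sketch, purely illustrative, using real ARFaceAnchor properties:

```swift
import ARKit

// Illustrative only: Apple has not documented Attention Correction's internals.
// ARKit's face tracking delivers a face anchor per frame with eye transforms
// and a gaze estimate, which a gaze-adjustment effect could draw on.
class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARFaceTrackingConfiguration()
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            // Transforms of each eye relative to the face anchor,
            // plus an estimate of the point the user is looking at.
            let leftEye = face.leftEyeTransform
            let rightEye = face.rightEyeTransform
            let gazeTarget = face.lookAtPoint   // in face-anchor space
            // A correction effect could warp the eye regions of the video
            // frame so the gaze vector points toward the camera instead.
            _ = (leftEye, rightEye, gazeTarget)
        }
    }
}
```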

We don't know with certainty, but Apple is likely using ARKit 3 APIs, which would explain why this functionality is limited to the iPhone XS and iPhone XS Max and absent on the iPhone X, as those APIs aren't supported on earlier models. It doesn't explain the lack of support on the iPhone XR, however, which also uses Apple's beefy A12 Bionic processor.
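For what it's worth, the public API only lets an app ask whether TrueDepth face tracking is available at all; it doesn't reflect Apple's finer-grained gating of this feature to specific models. A hypothetical check:

```swift
import ARKit

// This only confirms a TrueDepth camera is present (iPhone X and later);
// it returns true on the iPhone X and XR too, so Apple's XS/XS Max cutoff
// for Attention Correction must be enforced elsewhere.
if ARFaceTrackingConfiguration.isSupported {
    print("Face tracking available")
} else {
    print("Face tracking unavailable on this device")
}
```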

Slight warping caused by the attention correction feature's augmented reality processing

If you hold a straight object in front of your eyes while on camera, you can see the slight distortion that is a byproduct of the augmented reality processing. There is a bit of a curve around the eyes and even the nose.

The whole thing looks very natural, and most people would never notice it without prior knowledge of the feature. It simply looks like the caller is looking right at you, instead of at your nose or your chin.

FaceTime Attention Correction example

Here you can see us looking at the screen and then at the camera within the Camera app; when looking at the screen, it appears we are looking down. In a FaceTime call with the feature enabled, you can see that we appear to be looking slightly upwards when we look directly at the camera.



10 Comments

titantiger 297 comments · 14 Years

It's really cool and I was all excited until I realized it had to have the camera/sensor array from the XS and XS Max. Guess I'll just have to wait until it makes sense money-wise to upgrade from my 8 Plus.

mjtomlin 2690 comments · 20 Years


You don't have to license anything if you can prove you're using your own proprietary implementation. You can't patent ideas, you patent the implementation of the idea.

sflagel 867 comments · 11 Years

It seems then that the adjustment is done on the sender's phone, using the TrueDepth camera, but arguably, a modification could also be done at the receiving end, although maybe not as natural looking?

currentinterest 206 comments · 12 Years

If the adjustment is on the sender’s end, then the receiver may not need the XS or XS Max. Has anyone tried that out?