
Apple fine-tuning accuracy of AR movement and touch detection

Apple wants the connection between real objects and Apple AR ones to be seamless, so that when a user taps a specific virtual control, the system gets it right every time.

Many of Apple's AR patents and patent applications have to do with what can be presented to a user, and what the user can do with it. Some, for example, propose that any surface of any object in the real world could become a touch control, as far as the wearer of "Apple Glass" is concerned.

What all of these require, though, is accuracy. When the user sees a virtual object and believes they've picked it up, Apple AR has to know that. It has to know that for certain so that it can animate the virtual object in its new position.

"Positioning a user-controlled spatial selector based on extremity tracking information and eye tracking information," is a newly-revealed patent application that is concerned with exactly this. It proposes to be better than present AR and VR systems by using two different methods to determine precisely where a user is touching.

"[Current] applications do not provide a mechanism for accurately determining a virtual spatial location of a virtual contact between a virtual object and a user-controlled spatial selector," says Apple. "Current applications also do not provide a mechanism for accurately determining when the virtual contact occurs."

"For example, some systems utilize extremity tracking to estimate a position of a user's extremities relative to the virtual object," it continues. "However, the estimate provided by the extremity tracking is inaccurate, and therefore the assessment as to whether the user is selecting the virtual object is likewise inaccurate."

Extremity tracking can be done by, for instance, accelerometers within gloves, but even then it's still approximate. Apple wants to start with any kind of extremity tracker, but then supplement it with eye tracking, too.

Apple uses eye tracking extensively in its patents for "Apple Glass," so this patent application does not go into detail about how it works. Rather, it shows how eye and gaze tracking can assist in making accurate spatial calculations about where a user's hand or fingers are.

The reasoning is that if you're reaching to pick up a virtual cup, or you want to press a virtual button, you're almost certainly looking at it. That isn't always the case, but in the vast majority of instances, as your finger moves toward something, your eyes are watching it.

"[For example, a proposed] electronic device performs an extremity tracking function with respect to the finger of the right hand in order to determine a first candidate virtual spatial location of the CGR [Computer Generated Reality] environment," says Apple. This first candidate is effectively the first guess as to where a virtual object is, albeit a quite accurate one.

Apple emphasizes that by itself, this is not enough. There are disparities that result "from inherent inaccuracies of the extremity tracking function." If the user is trying to tap on a virtual control, warns Apple, these inaccuracies could mean that "the electronic device may perform an operation that is contrary to the intended operation."

Under this proposal, just about any surface could be made to show any controls.

It's one thing to display a virtual control, another to be certain when a user has touched it.

So Apple proposes that extremity tracking still be used, but that the determination of where an object and the user's hand are in relation to each other draws on two systems. It "utilizes a weighted combination of the extremity tracking function and the eye tracking function."
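As a rough illustration of what a "weighted combination" might mean in practice, here is a short Swift sketch that blends the two estimates of the contact point according to how much each is trusted. The function, its parameters, and the example weights are assumptions made for this article; the patent application doesn't spell out a formula.

```swift
import simd

// A minimal sketch of a "weighted combination": blending the hand-tracking
// estimate and the gaze-derived estimate of the contact point in proportion
// to how much each is trusted. The weights are illustrative assumptions.
func fusedContactPoint(handEstimate: SIMD3<Float>, handWeight: Float,
                       gazeEstimate: SIMD3<Float>, gazeWeight: Float) -> SIMD3<Float> {
    // Normalize by the total weight so the result stays between the two estimates.
    (handEstimate * handWeight + gazeEstimate * gazeWeight) / (handWeight + gazeWeight)
}

// Example: lean on the gaze-derived point slightly more than the hand tracker.
let fused = fusedContactPoint(handEstimate: SIMD3<Float>(0.12, -0.05, -0.40), handWeight: 0.4,
                              gazeEstimate: SIMD3<Float>(0.10, -0.04, -0.41), gazeWeight: 0.6)
print(fused)
```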

Rather than simply adding the eye tracking information to make the system more accurate, Apple AR has to weight the combination of the two. There are unlikely to be cases where extremity tracking alone is more accurate than the weighted combination, but there is one key case where the system should set the extremity data aside entirely.

"As another example, the electronic device discards extremity tracking information when the eye tracking data indicates that the user is looking in the periphery," continues Apple, "which may indicate that the user likely does not want to select a virtual affordance at that point in time."

The patent application is credited to three inventors. They include Aaron Mackay Burns, whose previous related work includes a granted patent regarding virtual reality input.



