Apple has considered ways to detect the eye movements of the wearer of a head-mounted display, such as the company's long-rumored augmented reality headset, using a system of cameras and mirrors to work out the direction of the user's gaze while they look at virtual environments.
The patent application, titled "Eye Tracking System," describes one method of monitoring how the user looks at a display positioned very close to their eyes, as in VR and AR headsets. While the technology does exist in other devices, those systems typically rely on a camera and an IR emitter pointed at the user's face, or on a mirror reflecting the light at a 90-degree angle. The extra space required in front of the user for either approach isn't available in compact, component-filled headsets.
Apple's solution is to mount the infrared emitter, cameras, and other equipment to the sides of the device rather than directly in front of the eyes. The components are aimed at a "hot mirror," a type of dielectric mirror that reflects infrared light while allowing visible light to pass through, which sits with other optical lenses between the eyes and the display panel.
Using the hot mirror, the infrared emitter bounces light into the user's eye. The IR light reflected back from the eye strikes the hot mirror again, which returns it to the eye-tracking camera.
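The folded light path relies on ordinary specular reflection. As a rough illustration only (the function and the 45-degree mirror angle are assumptions for the sketch, not details from the filing), an incoming ray direction d reflects about the mirror's unit normal n as r = d − 2(d·n)n:

```python
import math

# Minimal sketch of a ray reflecting off a hot mirror, assuming
# idealized geometry (not taken from Apple's filing).
def reflect(d, n):
    """Reflect direction vector d about unit normal n: r = d - 2(d.n)n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2 * dot * ni for di, ni in zip(d, n))

# An IR ray travelling straight at a mirror tilted 45 degrees is
# folded 90 degrees, toward a camera mounted off to the side.
s = math.sqrt(2) / 2
folded = reflect((0.0, 0.0, -1.0), (0.0, s, s))
```

This is how a side-mounted camera can observe the eye without sitting in front of it: the mirror folds the IR path while visible light from the display passes straight through.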
Mounting the tracking hardware to the side of or below the user's gaze and relying on hot mirrors means the components do not obscure the user's view of the display panel. Mounting them closer to the face also avoids extending how far the headset protrudes from the user's head, minimizing the extra pressure applied to the face during use.
Adding an eye-tracking system to a VR or AR headset would offer a few benefits to the experience. By knowing the point of the user's gaze, a computer rendering the virtual scene can alter what is being shown to give a more realistic depth of field effect.
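As a hypothetical sketch of that depth-of-field idea (the function name and constants are illustrative, not from the filing), a renderer could blur each object in proportion to how far its depth sits from the surface the user is fixating:

```python
# Hypothetical gaze-contingent depth of field: objects whose depth is
# far from the fixated surface receive more blur, up to a cap.
# All names and values are illustrative assumptions.
def blur_radius(object_depth, focus_depth, max_blur=8.0, falloff=0.5):
    """Blur in pixels, growing with depth distance from the gaze point."""
    diff = abs(object_depth - focus_depth)
    return min(max_blur, falloff * diff)
```

An object at the fixated depth stays sharp, while distant background geometry is softened, mimicking how a real eye focuses.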
It can also help enable gaze-based interaction, such as content in a game performing an action when a player looks at it, or navigating and selecting options in a graphical user interface. Other information, such as pupil dilation and eyelid closure, could also be tracked and used, with the patent application suggesting the "creation of eye image animations used for avatars in a VR/AR environment."
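One common pattern for gaze-based selection is dwell activation: an item triggers once the gaze point has rested inside its bounds long enough. A minimal sketch, assuming a 2D screen-space gaze point and an illustrative 0.6-second threshold (none of these names or values come from the filing):

```python
# Hedged sketch of dwell-based gaze selection. The class, threshold,
# and coordinate model are illustrative assumptions.
DWELL_SECONDS = 0.6

class GazeTarget:
    def __init__(self, name, x, y, w, h):
        self.name, self.x, self.y, self.w, self.h = name, x, y, w, h
        self.dwell = 0.0  # seconds the gaze has rested on this target

    def contains(self, gx, gy):
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def update(targets, gx, gy, dt):
    """Advance dwell timers; return the target activated this frame, if any."""
    activated = None
    for t in targets:
        if t.contains(gx, gy):
            t.dwell += dt
            if t.dwell >= DWELL_SECONDS:
                activated, t.dwell = t, 0.0
        else:
            t.dwell = 0.0  # gaze left the target; reset its timer
    return activated
```

Fed one gaze sample per frame, a target activates only after the gaze has lingered on it, which helps avoid the "Midas touch" problem of everything the user glances at being selected.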
Published on Thursday by the United States Patent and Trademark Office, the patent application was filed by Apple on October 19, 2017. The inventors are listed as Kathrin Berkner-Cieslicki, Ricardo J. Motta, Se Hoon Lim, Minwoog Kim, Kenichi Saito, Branko Petljanski, Jason C. Sauers, and Yoshikazu Shinohara.
Apple regularly applies for patents, filing ideas with the USPTO tens or hundreds of times a week, and in many cases the company doesn't commercialize the concept. As a result, there is no guarantee any given application will make an appearance in a future Apple product or service.
The patent application is the latest in a number of similar filings relating to the field of AR and VR for Apple, but aside from producing ARKit for iOS developers to create AR applications, the company has not yet produced any commercial AR or VR hardware. This hasn't stopped rumors of an Apple-produced headset from circulating, with some reports suggesting AR-equipped smart glasses could arrive on the market in 2020.
In response, CEO Tim Cook advised in October that the technology to produce such a device "doesn't exist to do that in a quality way" at this time, citing challenges in display technology and hardware placement. Preferring to provide a great experience over being the first to release smart glasses, Cook warned "anything you would see on the market anytime soon would not be something any of us would be satisfied with, nor do I think the vast majority of people would be satisfied."
Comments
Apple is doing glasses. Let everybody else make boxes for your face, isolating you from the world. Apple is going to make something that connects you to the world.
Eye tracking will close the loop between you, exactly what you look at, and the computational network. Everything in the environment can be labeled.
It could even let you isolate a word in a text, or a letter in a word. Three-dimensional text — a completely different way to read.
Ricardo J. Motta used to work at Nvidia as their technical leader for the camera organization. Sehoon Lim now works at Microsoft as an optical engineer with expertise in designing compact camera systems and processing frameworks. Branko Petljanski is an Apple camera hardware & systems engineer. Jason C. Sauers has tons of patents under his name at Apple: https://patents.justia.com/inventor/jason-c-sauers. Yoshikazu Shinohara is a camera engineer at Apple and also has a lot of patents: https://patents.justia.com/inventor/yoshikazu-shinohara. Surprised not to see any former SMI employees.
Auto-focus glasses. Multi-billion dollar idea using most of the tech available today.
Eye tracking to know what you are looking at, plus two small cameras on each edge of the glasses for range determination. The trick is the lens material. 50% of people over 40 would eat them up.