Eye dominance tracking could give Apple's VR and AR headsets an advantage
Apple's efforts to produce augmented reality or virtual reality devices could benefit from knowing which of the user's eyes is dominant, a piece of information that could refine how eye tracking systems work and make the overall experience better for the wearer.
Ocular dominance, the tendency to favor the visual input of one eye over the other, is an important element of understanding how someone processes what they see. Depending on which eye the brain prefers as its main source of information, a person may perceive a physical object or scene differently from someone with the opposite dominance.
The subject is especially important in the growing fields of virtual reality and augmented reality, as games and applications present each eye with a slightly different view of a digital scene or item, giving the stereoscopic effect of viewing the virtual environment as if it were real.
Knowing which eye is dominant could offer a number of benefits to applications, such as improving eye tracking by weighting towards the positions looked at by the dominant eye rather than the other. It would also help in cases where a user's vision is not typical, such as those with a lazy eye, since systems that do not track dominance may not take such impairments into account when rendering the scene.
It could also be feasible for a developer to make a VR or AR system render more detail in the images seen by the dominant eye than in those seen by the other, saving processing resources. For example, depth of field effects driven by eye tracking could follow the eyeline of the dominant eye.
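The resource-saving idea above can be sketched very simply: render the dominant eye's view at full resolution and the non-dominant eye's at a reduced scale. This is an illustrative sketch only; the function name and scale factors are invented, not taken from any Apple filing.

```python
# Hypothetical per-eye resolution weighting: the dominant eye gets a full-
# resolution render, the other eye a cheaper, scaled-down one. The 0.7
# reduction factor is an assumption chosen for illustration.

def render_scales(dominant_eye, full=1.0, reduced=0.7):
    """Return (left_scale, right_scale) resolution multipliers per eye."""
    return (full, reduced) if dominant_eye == "left" else (reduced, full)

print(render_scales("right"))  # (0.7, 1.0)
```

In a real renderer these multipliers would feed into each eye's render-target size, analogous to how foveated rendering trades peripheral detail for performance.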
In a patent application published by the US Patent and Trademark Office on Thursday, titled "Eye tracking system and method to detect the dominant eye," Apple describes using images and eye tracking to score each of a user's eyes for accuracy. The eye with the higher accuracy score is deemed dominant, or, if the scores are relatively similar, the eyes can be deemed equally dominant.
The filing suggests a series of images could be displayed to each eye, with deviations from a stared-at point monitored for each eye, and the measurements then used to determine a score. The measurements can be compiled into an "eye dominance characterization" of the user, which can then be used to adjust the images sent to each eye.
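The scoring scheme described above can be sketched as follows: show each eye a set of calibration targets, measure how far each eye's gaze lands from the target, and score the eye with the smaller average deviation as dominant. All function names, the scoring formula, and the tolerance threshold here are assumptions for illustration, not details from the patent.

```python
import math

def angular_deviation(gaze, target):
    """Angle in degrees between a measured gaze vector and a target vector."""
    dot = sum(g * t for g, t in zip(gaze, target))
    norm = math.sqrt(sum(g * g for g in gaze)) * math.sqrt(sum(t * t for t in target))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def accuracy_score(samples):
    """Higher score means a smaller mean deviation across calibration targets."""
    mean_dev = sum(angular_deviation(g, t) for g, t in samples) / len(samples)
    return 1.0 / (1.0 + mean_dev)

def characterize_dominance(left_samples, right_samples, tolerance=0.05):
    """Classify dominance, or call it equal when the scores are close."""
    left, right = accuracy_score(left_samples), accuracy_score(right_samples)
    if abs(left - right) <= tolerance:
        return "equal"
    return "left" if left > right else "right"

# Simulated calibration: the left eye drifts further from each target
# than the right eye, so the right eye should score as dominant.
targets = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.0, 0.1, 1.0)]
left = [((t[0] + 0.08, t[1] + 0.05, 1.0), t) for t in targets]
right = [((t[0] + 0.01, t[1], 1.0), t) for t in targets]
print(characterize_dominance(left, right))  # right
```

The "equal" branch mirrors the filing's note that eyes with relatively similar scores can be deemed equally dominant.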
Using multiple points, Apple's system could feasibly calculate the 3D position and orientation of each eye, along with each eye's gaze vector, further refining calculations of gaze in general as well as of each eye's overall accuracy.
Once dominance is determined, the processing systems for AR and VR applications can weight decisions towards eye tracking measurements from the dominant eye over the weaker one. For example, a graphical user interface that uses eye tracking for selection may see two separate items selected, one by each eye, but will only use the position indicated by the dominant eye.
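One way to picture that weighting: blend the two eyes' screen-space gaze points, heavily favoring the dominant eye, and hit-test the blended point against the interface. This is a minimal sketch under assumed names and an assumed 80/20 weighting; the patent does not specify these details.

```python
def combined_gaze_point(left_pt, right_pt, dominant="right", weight=0.8):
    """Blend the two eyes' (x, y) gaze points, favoring the dominant eye."""
    wl, wr = (weight, 1 - weight) if dominant == "left" else (1 - weight, weight)
    return (wl * left_pt[0] + wr * right_pt[0], wl * left_pt[1] + wr * right_pt[1])

def pick_element(point, elements):
    """Return the element whose rectangle (x, y, width, height) contains the point."""
    for name, (x, y, w, h) in elements.items():
        if x <= point[0] <= x + w and y <= point[1] <= y + h:
            return name
    return None

# The left eye rests over "Cancel" while the right eye rests over "Save";
# with right-eye dominance, the blended point falls on "Save".
elements = {"Save": (120, 30, 60, 40), "Cancel": (40, 30, 60, 40)}
print(pick_element(combined_gaze_point((70, 50), (150, 50), dominant="right"), elements))  # Save
```

Flipping the dominance to "left" moves the blended point over "Cancel" instead, which is the conflict resolution the paragraph above describes.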
Apple files patent applications on a weekly basis, and while they do indicate areas of interest for the company's research efforts, they are no guarantee that such technologies or ideas will make their way into future products or services.
Apple is believed to have been working on some form of VR or AR device for a number of years, efforts which may come to fruition in 2020 with a pair of AR-based smart glasses. It has collected many patents for related technologies over the years, and has even taken public steps into the field, including demonstrating VR on macOS and introducing ARKit for iOS.
For eye tracking in particular, Apple helped itself along by acquiring SensoMotoric Instruments, a German hardware company specializing in the field, in 2017.
In other patent filings, Apple has looked into ways to build eye tracking systems that can be mounted close to the user's face rather than directly ahead of it, making them more feasible for use in headsets. A 2018 patent application describes a "hot mirror" that reflects infrared light while allowing visible light to pass through, and the system could also feasibly detect other elements such as pupil dilation and eyelid closure.
In 2015, Apple was granted a patent for gaze-based eye tracking that could allow a person to control an iOS or macOS device by looking at the screen, a system that also mitigates the Troxler Effect, which can interfere with vision-based interactions.