VR and AR headsets could offer high screen refresh rates, Apple proposes, by pairing a "foveated display" with an eye-tracking system that monitors the user's gaze and concentrates rendering effort only where the user is actively looking.
One of the key elements of a virtual reality or augmented reality headset is a display with a high refresh rate. Screens that update quickly respond faster to user motion, minimizing the nausea that can occur when the image lags behind head movements, and further selling the illusion that the virtual environment or digital object exists in the real world.
One of the problems with advancing headset display technology is the expectation of ever-higher resolutions. Adding more pixels to a display means more elements must be updated on each refresh, and more data must be produced by the host device when rendering the scene.
In a patent application published by the US Patent and Trademark Office on Thursday, the filing for a "Foveated Display" aims to solve these problems by offering two different streams of data for a display to use, consisting of high-resolution and low-resolution imagery.
The core concept of a foveated display is that the screen does not need high-resolution imagery everywhere, only where the user is looking. If the gaze position can be determined, the display can show the resource-heavy high-resolution picture in the user's direct view, then use lightweight low-resolution data for the remainder of the screen.
As the rest of the display would take advantage of the user's peripheral vision, which doesn't require detail, this can significantly cut down on the amount of work that needs to be accomplished each time the display needs refreshing.
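As a rough illustration of those savings, the arithmetic can be sketched in a few lines. The figures below are hypothetical, not from the filing: a square eyepiece, a full-resolution foveal square around the gaze point, and quarter-resolution (2x2 downsampled) rendering in the periphery.

```python
# Illustrative sketch, not Apple's implementation: estimate how much
# rendering work a foveated display saves versus rendering every pixel
# at full resolution.

def foveated_pixel_cost(width, height, fovea_size, downsample=2):
    """Return (foveated_cost, full_cost) as rendered-pixel counts."""
    full_cost = width * height
    fovea_cost = fovea_size * fovea_size             # rendered 1:1
    periphery_pixels = full_cost - fovea_cost
    # Each downsample x downsample block is rendered as a single pixel
    periphery_cost = periphery_pixels // (downsample * downsample)
    return fovea_cost + periphery_cost, full_cost

foveated, full = foveated_pixel_cost(4000, 4000, 800)
print(f"foveated: {foveated:,} pixels vs full: {full:,} "
      f"({100 * foveated / full:.0f}% of the work)")
# → foveated: 4,480,000 pixels vs full: 16,000,000 (28% of the work)
```

Even with these made-up numbers, rendering the periphery at a quarter of the pixel density cuts the per-frame workload to roughly a quarter of the full-resolution cost.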
In Apple's solution, a gaze-tracking system is used to find out the point on the screen the user's eyes are trained on. Knowing this data, a graphics processing unit then renders a high-resolution image for part of the scene where the user is looking, as well as a low-resolution version for the remainder of the picture.
Using timing controller circuitry and column driver circuitry, the timing controller can feed image data to the column drivers, which then apply the changes to the display. This circuitry also switches between two buffers, one holding high-resolution data and one holding low-resolution data, with each employed for its respective region of the display.
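The two-buffer idea can be sketched in software terms, with plain Python lists standing in for the display buffers. The buffer sizes, the rectangular foveal region, and the nearest-neighbour upscaling here are illustrative assumptions, not details from the patent: each output pixel is drawn from the high-resolution buffer inside the gaze region, and from the upscaled low-resolution buffer everywhere else.

```python
# Minimal sketch of compositing a frame from two buffers, as the filing's
# circuitry is described as doing in hardware. All parameters are
# hypothetical.

def compose_frame(high_buf, low_buf, fovea, scale):
    """high_buf: full-res 2D list; low_buf: downsampled 2D list;
    fovea: (x0, y0, x1, y1) rectangle in full-res coordinates;
    scale: downsampling factor of low_buf (nearest-neighbour upscale)."""
    x0, y0, x1, y1 = fovea
    out = []
    for y, row in enumerate(high_buf):
        out_row = []
        for x in range(len(row)):
            if x0 <= x < x1 and y0 <= y < y1:
                out_row.append(high_buf[y][x])                   # in-gaze pixel
            else:
                out_row.append(low_buf[y // scale][x // scale])  # periphery
        out.append(out_row)
    return out

# 4x4 "display": high buffer holds distinct values, low buffer one value per 2x2
high = [[10 * y + x for x in range(4)] for y in range(4)]
low = [[99, 98], [97, 96]]
frame = compose_frame(high, low, fovea=(1, 1, 3, 3), scale=2)
```

Switching buffers per region rather than per pixel is presumably what makes this cheap to do in display circuitry, but the per-pixel version above shows the same end result.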
Interpolation and filter circuitry could be used to alter the pixel data before it is applied, for example in areas where low-resolution data sits alongside the high-resolution version, to even out any apparent seams. Two-dimensional spatial filters could also be applied to the low-resolution data buffer.
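One way such seam smoothing could work, sketched in one dimension: linearly interpolate between low-resolution and high-resolution pixel values across a narrow transition band at the edge of the foveal region. The band width and blend weights here are illustrative assumptions, not values from the filing.

```python
# Hypothetical cross-fade at the boundary between the low-resolution
# periphery and the high-resolution foveal region.

def blend_edge(high_vals, low_vals, band):
    """Cross-fade from low to high over the first `band` samples."""
    out = []
    for i, (h, l) in enumerate(zip(high_vals, low_vals)):
        w = min(1.0, i / band)          # 0.0 at the seam, 1.0 inside the fovea
        out.append(l * (1 - w) + h * w)
    return out

print(blend_edge([100, 100, 100, 100], [0, 0, 0, 0], band=2))
# → [0.0, 50.0, 100.0, 100.0]
```

Rather than an abrupt jump from blurry to sharp pixels, the transition is spread over a few samples, which is the kind of effect the described interpolation circuitry would produce.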
Apple files numerous patent applications with the USPTO on a weekly basis, and while a filing does show areas of interest for the company, it doesn't guarantee the technology will appear in a future product or service.
Apple has long been rumored to be working on a VR or AR headset, one that could launch as early as 2020 or 2021, according to some analysts. The device, possibly under the codename "T288," may feature 8K-resolution eyepieces and use the 60-gigahertz WiGig wireless networking system.
There have also been a number of patents and applications relating to headsets that have surfaced over the years, including for thermal regulation, fitting the headset to the user's head, and even for glasses that hold an iPhone as a display.
Related earlier filings for the latest discovery include a previous "Predictive, Foveated Virtual Reality System" that used a similar method of different-resolution video feeds and selective rendering to minimize latency to the user. Apple has also explored a variety of eye tracking systems, including some using a "hot mirror" allowing the components to be close to the user's face rather than further away, making the headset more comfortable to wear for longer periods.