One of Apple's Mixed Reality Headset Teams is Focusing their Work on Advanced 'Gaze Prediction'
Earlier this morning Patently Apple posted a report titled "A new Apple Patent Focuses on Optical Modules for a Future Head-Mounted Display." In a second patent application published by the US Patent & Trademark Office today, we learn more about Apple's gaze tracking and prediction system for a head-mounted display device using high-end foveated displays that could be used to play video games and more.
Gaze tracking is key to a head-mounted display, and Apple has published a series of patents on the subject for their future headset (01, 02, 03 and 04 and more). No one patent holds all of the keys to a gaze tracking system, and Apple has various teams working on this to ensure that they get the best system for the right device and application.
The focus and priority of today's patent is on "Gaze Prediction," which is mentioned 24 times within Apple's 27 patent claims alone.
Apple notes in their application that electronic devices may be provided with displays and gaze tracking systems. In certain types of electronic devices, it may be desirable to display images for users over a wide angle of view. Displays that cover wide angles of view at high resolutions and/or visual quality may consume relatively large amounts of image data and rendering power, and may therefore impose bandwidth, power, and computational burdens on the electronic devices.
These bandwidth, power, and computational burdens may be reduced by using a display scheme in which high resolution (and/or high quality) images are displayed in alignment with the user's current point of gaze and in which low resolution images are displayed in the user's peripheral vision.
The term "quality" may refer to the rendering condition (e.g., better texture rendering or triangulation) or the compression condition (e.g., more or less intense quantization). It is possible to exhibit different levels of rendering or compression quality at the same resolution. Display schemes such as these may sometimes be referred to as foveated display schemes.
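As a rough illustration of the foveated scheme described above, the sketch below assigns one of three quality tiers to a pixel based on its distance from the current point of gaze. The tier names, radii, and pixel units are hypothetical, not taken from the patent:

```python
import math

def foveation_level(pixel, gaze, fovea_radius=200, mid_radius=500):
    """Pick a render-quality tier from a pixel's distance to the gaze point.

    Tiers and radii (in pixels) are illustrative assumptions, not the
    patent's actual parameters.
    """
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    distance = math.hypot(dx, dy)
    if distance <= fovea_radius:
        return "full"    # full resolution/quality aligned with the point of gaze
    if distance <= mid_radius:
        return "medium"  # e.g. coarser texture rendering or triangulation
    return "low"         # heavily compressed, low-resolution periphery
```

Note that, per the patent's definition of "quality," the "medium" tier could keep the same resolution as "full" while applying more aggressive compression or simpler triangulation.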
The user's point of gaze can be tracked using gaze tracking (eye tracking) systems. The gaze tracking systems may gather information on a user's eyes such as information on the location of the centers of a user's pupils and information on corneal reflection locations (also known as Purkinje images), from which we can infer the direction in which the user is currently gazing.
The direction in which the user is currently gazing can be used to determine the location on the display where the user is focused (the user's on-screen point of gaze). The user's point of regard on the display, together with the gaze direction, can be used as an input to foveated display schemes to help align the high-resolution and/or high-quality image regions with the user's current point of regard.
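Mapping a gaze direction to an on-screen point can be sketched as a ray-plane intersection: cast a ray from the eye along the estimated gaze direction and find where it meets the display plane. The coordinate frame, units, and flat-plane assumption here are all simplifications for illustration, not the patent's method:

```python
import math

def point_of_gaze(eye_pos, gaze_dir, screen_z=0.5):
    """Intersect a gaze ray with a flat display plane at z = screen_z.

    eye_pos is (x, y, z), gaze_dir is an (x, y, z) direction vector;
    units and the planar-screen model are illustrative assumptions.
    Returns the on-screen (x, y) point of gaze.
    """
    gx, gy, gz = gaze_dir
    norm = math.sqrt(gx * gx + gy * gy + gz * gz)
    gx, gy, gz = gx / norm, gy / norm, gz / norm  # normalize the direction
    t = (screen_z - eye_pos[2]) / gz              # ray parameter at the plane
    return eye_pos[0] + t * gx, eye_pos[1] + t * gy
```

In a real headset the "screen" is viewed through lenses, so a production system would map the gaze ray through the optical model rather than a flat plane.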
Electronic devices that include foveated displays and gaze tracking systems may include, for example, head-mounted displays, see-through augmented-reality glasses, cellular telephones, tablet computers, head-up displays in vehicles and other environments, laptop computers, desktop computers, televisions, wristwatches, and other suitable electronic equipment.
At one point in the patent it states that "Display #14 [of FIG. 1] may be used to display content to a user for a wholly or partially simulated environment," like a video game. Apple notes that, just like a video game controller, a user's gestures, screen taps or mouse clicks, a user's "gaze on the display such as tilting, turning or rotating the head can be used as a form of user input to system 10," which is the HMD of FIG. 1 below.
Apple's patent FIG. 1 below is a schematic diagram of an illustrative electronic device with a foveated display, while FIG. 2 is a diagram of an illustrative gaze tracking system; FIG. 5 is a flow chart of illustrative steps for operating a foveated display with gaze prediction.
Apple notes in their filing that the human gaze is subject to rapid, jerky eye movements shifting from one fixation point to another, movements known as saccades. Saccadic movements of the human eye can make it more challenging for the graphics processing unit (FIG. 1 #102 above) to render foveated image data that keeps up with the user's actual point of gaze.
Apple's gaze prediction system #100, presented in patent FIG. 1 above, helps the gaze tracking system #16 predict the saccadic landing position (the user's final point of gaze) during a saccade. The gaze prediction system predicts the saccadic landing position before the gaze tracking system identifies the user's actual final point of gaze at the end of the saccade.
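One well-known property of saccades that a predictor could exploit is the "main sequence": saccade amplitude scales roughly with peak eye velocity. The sketch below uses that heuristic to extrapolate a landing point from the saccade's direction and peak velocity. The linear model, the `slope` constant, and the function itself are illustrative assumptions, not the mechanism Apple's patent claims:

```python
import math

def predict_landing(start, current, peak_velocity, slope=0.022):
    """Rough saccadic-landing sketch using a main-sequence heuristic.

    Assumes saccade amplitude (deg) ~ slope * peak angular velocity (deg/s);
    'slope' is an illustrative constant, not a value from the patent.
    start/current are (x, y) gaze samples in degrees of visual angle.
    """
    dx, dy = current[0] - start[0], current[1] - start[1]
    norm = math.hypot(dx, dy) or 1.0            # avoid division by zero
    ux, uy = dx / norm, dy / norm               # direction of the saccade so far
    amplitude = slope * peak_velocity           # predicted total saccade size
    return start[0] + amplitude * ux, start[1] + amplitude * uy
```

The point of such a predictor is latency: the renderer can move the high-resolution region to the predicted landing position before the eye actually arrives there, so the foveated image is already aligned when the saccade ends.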
Apple's patent application 20190339770, published today by the U.S. Patent Office, was filed back in Q2 2019, with some work on this invention dating back to Q2 2018. Engineers wanting to drill down further into the details of this invention can do so here.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.
The inventors listed on the patent application include:

Andrew Watson: Chief Vision Scientist
Alex Berardino, Ph.D: Display Vision Engineer
Nicolas Bonnier: Manager, Display, Color and Image Processing. Worked on the HDR display for iPhone X
Mehmet N. Ağaoğlu: Visual Experience Engineer.
Yashas Rai: Visual-Perception & Data Researcher
Elijah Kleeman: Unable to find a LinkedIn profile.