Five Mixed Reality Headset Patents were published today covering Gaze Tracking, Lenticular Displays and more
Today the US Patent & Trademark Office published another series of patent applications from Apple that relate to their future Mixed Reality Headset. The first in this series relates to gaze tracking, and in particular to systems, methods, and devices that track gaze using the direction of a light source that produces one or more glints on the surface of the eye. The other patent applications cover Multimodal Inputs; Presenting Synthesized Reality Companion Content; Optical Film Arrangements; and Pixel Arrangements.
Some existing gaze tracking systems use light reflected off of the surface of the eye to estimate gaze directions. Such techniques may estimate the user's gaze direction using multiple glints to identify locations along the user's gaze (e.g., pupil center, eye center, and cornea center) or by identifying the user's eye shape, position, and orientation. Existing techniques may be unable to determine gaze direction from a single glint and may not be as accurate or efficient as desired.
Glint-Based Gaze Tracking Using Directional Light Sources
Apple's patent covers various implementations that include devices, systems, and methods that determine a gaze direction based on a cornea center and either (a) a pupil center or (b) an eyeball center.
Apple states that in various implementations, gaze tracking is used to enable user interaction, provide foveated rendering, or reduce geometric distortion. A gaze tracking system includes a sensor and a processor that performs gaze tracking on data received from the sensor regarding light from a light source reflected off the eye of a user. In various implementations, the sensor is an event camera with light sensors at multiple locations; when a particular light sensor detects a change in light intensity, the camera generates an event message indicating that sensor's location.
An event camera may include or be referred to as a dynamic vision sensor (DVS), a silicon retina, an event-based camera, or a frame-less camera. Thus, the event camera generates (and transmits) data regarding changes in light intensity as opposed to a larger amount of data regarding absolute intensity at each light sensor. Further, because data is generated when intensity changes, in various implementations, the light source is configured to emit light with modulating intensity.
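To make the event-camera idea concrete, here is a minimal sketch, assuming a hypothetical event format, of what such a sparse stream of intensity-change messages might look like and how it could be grouped for glint detection; none of the names below come from Apple's filing.

```python
from dataclasses import dataclass

# Hypothetical sketch of the event-message idea described in the patent:
# each light sensor reports only when it detects a change in intensity,
# so downstream code receives a sparse stream of (location, polarity, time)
# events rather than full frames. All names here are illustrative.

@dataclass
class GlintEvent:
    x: int           # column of the light sensor that fired
    y: int           # row of the light sensor that fired
    polarity: int    # +1 for an intensity increase, -1 for a decrease
    timestamp_us: int

def accumulate_glint_events(events, window_us=1000):
    """Group events into short time windows so a glint's pixel
    footprint can be estimated from the sparse event stream."""
    windows = {}
    for e in events:
        key = e.timestamp_us // window_us
        windows.setdefault(key, []).append((e.x, e.y, e.polarity))
    return windows

events = [GlintEvent(120, 84, +1, 10), GlintEvent(121, 84, +1, 240), GlintEvent(98, 60, -1, 1450)]
print(accumulate_glint_events(events))  # {0: [(120, 84, 1), (121, 84, 1)], 1: [(98, 60, -1)]}
```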
Apple's patent FIG. 6 below is a functional block diagram illustrating gaze tracking using a diffuse light source, while FIG. 5 is a flowchart representation of a method of gaze tracking.
Apple's patent FIG. 12 below is a functional block diagram illustrating gaze tracking based on the eyeball center.
In some implementations, gaze tracking may be performed based on an assumption (or a measurement) that the eyeball center does not move, so that the gaze direction may be determined without detecting the pupil. Apple's patent FIG. 12 above illustrates gaze tracking based on the eyeball center in accordance with some implementations. In this example, the eyeball center (#1310) is determined by fitting a sphere to the cornea center's movements. The gaze direction (#1350) may then be determined as the vector from the eyeball center through the cornea center (#920).
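The geometry described here lends itself to a short numerical sketch: fit a sphere to a series of estimated cornea-center positions to recover the eyeball center, then take the unit vector from the eyeball center through the current cornea center as the gaze direction. This is an illustrative approximation, not Apple's implementation, and all function names and values are hypothetical.

```python
import numpy as np

# A minimal sketch (not Apple's implementation) of the geometry described:
# estimate the eyeball center by least-squares fitting a sphere to a series
# of estimated cornea-center positions, then take the gaze direction as the
# unit vector from the eyeball center through the current cornea center.

def fit_sphere_center(points):
    """Least-squares sphere fit: |p - c|^2 = r^2 rearranges to the
    linear system 2*p.c + (r^2 - |c|^2) = |p|^2."""
    p = np.asarray(points, dtype=float)            # shape (N, 3)
    A = np.hstack([2.0 * p, np.ones((len(p), 1))])
    b = np.sum(p * p, axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3]                                   # eyeball-center estimate

def gaze_direction(eyeball_center, cornea_center):
    v = np.asarray(cornea_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return v / np.linalg.norm(v)

# Example: cornea centers sampled as the eye rotates about a fixed center.
cornea_samples = [[0.0, 0.0, 8.0], [2.0, 0.0, 7.75], [0.0, 2.0, 7.75], [-2.0, 1.0, 7.68]]
center = fit_sphere_center(cornea_samples)
print(gaze_direction(center, cornea_samples[0]))
```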
In some implementations, two scanners may be used to determine the cornea center, and the gaze direction is then determined using either the pupil center or the eyeball center.
For example, FIG. 12 is a functional block diagram illustrating gaze tracking using two variable-angle light sources (e.g., scanners 916a and 916b) at known positions. The cornea center is determined based on the glints from the light sources directed by the scanners and the scanner angles measured by encoders.
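As a rough illustration of how two angle-encoded beams at known positions could localize a point in 3D, the sketch below intersects the two scanner rays and returns the midpoint of their common perpendicular. A real system would also account for the reflection off the cornea surface, so treat this purely as a simplified triangulation example with hypothetical scanner placements and encoder readings.

```python
import numpy as np

# A simplified triangulation sketch, not the method claimed in the patent:
# two scanners at known positions steer beams whose angles are read from
# encoders. Treating each beam as a ray, the point closest to both rays
# (midpoint of the common perpendicular) gives a rough 3D estimate that a
# real system would refine with a corneal reflection model.

def ray_from_scanner(position, yaw, pitch):
    """Convert encoder angles (radians) into a ray origin and unit direction."""
    d = np.array([np.cos(pitch) * np.sin(yaw),
                  np.sin(pitch),
                  np.cos(pitch) * np.cos(yaw)])
    return np.asarray(position, dtype=float), d / np.linalg.norm(d)

def closest_point_between_rays(p1, d1, p2, d2):
    """Midpoint of the shortest segment connecting two non-parallel rays."""
    w0 = p1 - p2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    return 0.5 * ((p1 + s * d1) + (p2 + t * d2))

# Hypothetical scanner placements (mm) and encoder readings.
p1, d1 = ray_from_scanner([-30.0, 0.0, 0.0], yaw=np.radians(25), pitch=np.radians(2))
p2, d2 = ray_from_scanner([30.0, 0.0, 0.0], yaw=np.radians(-25), pitch=np.radians(2))
print(closest_point_between_rays(p1, d1, p2, d2))
```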
For finer details, review Apple's patent application number 20210068652 titled "Glint-Based Gaze Tracking Using Directional Light Sources."
Other Headset Related Patent Applications Published Today
01: Multimodal Inputs for Computer-Generated Reality
Apple's patent FIG. 3A above illustrates an example of facial expression tracking to initiate computer-generated reality recording; FIGS. 3B and 3C illustrate examples of tracking a gaze direction to initiate computer-generated reality recording.
02: Method and Device for Presenting Synthesized Reality Companion Content
03: Optical Film Arrangements for Electronic Device Displays
A lenticular display may be formed with convex curvature. The display may include a lenticular lens film whose lenses extend across the length of the display and are configured to enable stereoscopic viewing.
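As a generic illustration of why a lenticular lens film enables stereoscopic viewing, the sketch below assigns interleaved display columns to left-eye and right-eye viewing zones. The pixels-per-lens value and the interleaving scheme are assumptions for illustration, not details from Apple's filing.

```python
# A generic illustration (not from Apple's filing) of how a lenticular lens
# film yields stereoscopic viewing: each lens covers a small group of pixel
# columns and directs each column toward a different viewing angle, so the
# left and right eyes see interleaved images.

PIXELS_PER_LENS = 4  # hypothetical number of display columns under one lens

def view_index_for_column(column: int) -> int:
    """Which of the PIXELS_PER_LENS viewing zones a display column feeds."""
    return column % PIXELS_PER_LENS

def interleave_stereo_row(left_row, right_row):
    """Interleave a row of left-eye and right-eye pixels so that half the
    viewing zones carry the left image and half carry the right image."""
    out = []
    for col in range(len(left_row)):
        zone = view_index_for_column(col)
        out.append(left_row[col] if zone < PIXELS_PER_LENS // 2 else right_row[col])
    return out

print(interleave_stereo_row(list("LLLLLLLL"), list("RRRRRRRR")))
# ['L', 'L', 'R', 'R', 'L', 'L', 'R', 'R']
```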
04: Pixel Arrangements for Electronic Device Displays
This is a second patent application relating to lenticular displays. FIG. 13 of this filing illustrates a state diagram showing how the display may be operated in a two-dimensional display mode and a three-dimensional display mode.
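A two-mode state diagram like this can be sketched as a minimal mode switch; the trigger condition used below (availability of stereo content) is an assumption for illustration and is not taken from the filing.

```python
from enum import Enum

# A minimal sketch of the two-state mode switch the state diagram describes;
# the trigger names and behavior here are illustrative, not from the filing.

class DisplayMode(Enum):
    TWO_D = "2D"      # all viewing zones show the same image
    THREE_D = "3D"    # viewing zones show interleaved stereo images

def next_mode(current: DisplayMode, stereo_content_available: bool) -> DisplayMode:
    """Switch to the 3D mode only when stereo content is available."""
    if current is DisplayMode.TWO_D and stereo_content_available:
        return DisplayMode.THREE_D
    if current is DisplayMode.THREE_D and not stereo_content_available:
        return DisplayMode.TWO_D
    return current

mode = DisplayMode.TWO_D
mode = next_mode(mode, stereo_content_available=True)
print(mode)  # DisplayMode.THREE_D
```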
Considering that these are patent applications, the timing of such products and/or features coming to market is unknown at this time.