
Apple Patent reveals next-gen Remote Sensor Fusion Eye Tracking Technology for advanced Head Mounted Display

(Cover image: eye tracking)

 

Today the US Patent & Trademark Office published a patent application from Apple that relates to remote eye tracking for electronic devices, and in particular, to systems, methods, and devices for providing remote eye tracking for electronic devices that move relative to the eye.

 

HTC's Vive Pro Eye headset was one of the first to market last year with eye tracking for VR. Engadget's video review is provided below.

 

 

In the video, Engadget Senior Editor Jessica Conditt notes that eye tracking technology takes advantage of foveated rendering, which is something we've covered in a number of patent reports: 01, 02 & 03. Apple's patent relates to next-gen remote eye tracking technology.

 

Patent Background

 

Related-art eye tracking falls into two different types. The first type is mounted eye tracking, which includes a sensor that physically moves along with the user (e.g., with the eyeball). For example, a head mounted display (HMD) moves with the user and can provide eye tracking.

 

The second type of eye tracking is remote eye tracking that includes a sensor that physically moves with respect to the user (e.g., separate from or independently of the user).

 

Some implementations of the second type of remote eye tracking use two infrared (IR) light sources (e.g., active illumination) separated by a minimum baseline distance to create separate cornea reflections (e.g., separate, detectable glints on the cornea).

 

These remote eye tracking approaches require the extrinsic parameters of both (i) the illumination sources and (ii) the sensors to be known. Existing computing systems, sensors and applications do not adequately provide remote eye tracking for electronic devices that move relative to the user.

 

Apple's patent filing includes devices, systems, and methods that perform remote eye tracking for electronic devices that move relative to the user.

 

Sensor Fusion Eye Tracking

 

Apple's invention covers remote eye tracking that determines gaze direction by identifying two locations in a 3D coordinate system along a gaze direction (e.g., a cornea center and an eyeball-rotation center) using a single active illumination source and depth information.

 

In some implementations, a first location (e.g., the cornea center) is determined using a glint based on the active illumination source and depth information from a depth sensor and the second location (e.g., eyeball-rotation center) is determined using an RGB sensor (e.g., ambient light) and depth information.
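To make the geometry concrete, here is a minimal sketch of how a detected 2D feature (a glint, or an eye-region feature found in the RGB image) plus a depth reading could be lifted into a 3D coordinate system using a standard pinhole camera model. The function name, intrinsics, and pixel/depth values below are illustrative assumptions, not details taken from Apple's filing.

```python
import numpy as np

def pixel_to_3d(u, v, depth_m, fx, fy, cx, cy):
    """Back-project a pixel (u, v) with a metric depth value into a 3D point
    in the camera's coordinate system using a simple pinhole model."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return np.array([x, y, depth_m])

# Hypothetical example: a glint detected at pixel (642, 388) at 0.55 m depth,
# using made-up intrinsics for an RGB/depth sensor pair.
fx, fy, cx, cy = 600.0, 600.0, 640.0, 360.0
cornea_point = pixel_to_3d(642, 388, 0.55, fx, fy, cx, cy)

# An eyeball-related feature found in the RGB image could be lifted to 3D the
# same way, using its own pixel location and depth value.
eye_feature_point = pixel_to_3d(650, 392, 0.562, fx, fy, cx, cy)
```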

 

In some implementations, a single sensor using the same active illumination source determines the first location (e.g., the cornea center) and the second location (e.g., eyeball-rotation center), and the single sensor determines both depth information and glint information. In some implementations, remote eye tracking is provided by mobile electronic devices.

 

In some implementations, remote eye tracking determines a head pose in a 3D coordinate system, determines a position (e.g., eyeball rotation center) of the eye in the 3D coordinate system, and then identifies a spatial relationship between the head pose and the position of the eye. In some implementations, the spatial relationship is uniquely determined (e.g., user specific transformation).
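As a rough illustration of that user-specific spatial relationship, the sketch below expresses the eyeball-rotation center in the head's own coordinate frame, given a head pose (rotation plus position). Once determined, that offset is approximately constant for a given user and can be reused. The function and variable names are assumptions for illustration only.

```python
import numpy as np

def eye_offset_in_head_frame(head_rotation, head_position, eye_center_world):
    """Express the eyeball-rotation center in the head's coordinate frame.
    head_rotation: 3x3 rotation matrix of the head pose (head -> world).
    head_position: 3-vector, head origin in the 3D coordinate system.
    eye_center_world: 3-vector, eyeball-rotation center in the same system."""
    R = np.asarray(head_rotation, dtype=float)
    p_head = np.asarray(head_position, dtype=float)
    p_eye = np.asarray(eye_center_world, dtype=float)
    return R.T @ (p_eye - p_head)
```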

 

Some implementations detect a first attribute of the eye and determine a first location associated with that attribute in a three-dimensional (3D) coordinate system based on depth information from a depth sensor.

 

Various implementations detect a second attribute of the eye based on a glint resulting from light of the illumination source reflecting off a cornea of the eye. These implementations determine a second location associated with the second attribute in the 3D coordinate system based on the depth information from the depth sensor, and determine a gaze direction in the 3D coordinate system based on the first location and the second location.
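Given those two 3D locations, the gaze direction is simply the unit vector from the eyeball-rotation center through the cornea center. A minimal sketch, with illustrative naming:

```python
import numpy as np

def gaze_direction(eyeball_center, cornea_center):
    """Gaze direction as the unit vector from the eyeball-rotation center
    through the cornea center, both in the same 3D coordinate system."""
    d = np.asarray(cornea_center, dtype=float) - np.asarray(eyeball_center, dtype=float)
    return d / np.linalg.norm(d)
```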

 

Apple's patent FIG. 3 below is a block diagram depicting a 3D representation of an eyeball that illustrates an example eyeball-modeling implementation used in remote eye tracking; FIG. 4 is a flowchart showing an example method for remote eye tracking; and FIG. 5B is a block diagram showing an example of the imaging arrays used to collect information for remote eye tracking in an electronic device.

 

(Patent FIGS. 3, 4 & 5B: Apple HMD next-gen remote eye tracking)

 

Beyond eye tracking for an HMD, Apple's patent filing touches on eye tracking for Macs as well. One use of remote eye tracking is to identify a Point Of Regard (POR) on a device in the direction of the user's gaze, e.g., where the gaze direction intersects the display of the device. A POR can be used to facilitate user interaction with the device. For example, a system may detect that the user's gaze has reached the bottom of the display and, in response, automatically scroll down to display more content to the user. This applies to Macs, as illustrated in part in patent FIG. 7 below.

 

(Patent FIG. 7: iMac using eye tracking)
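A simple way to picture the POR is as the intersection of the gaze ray with the display plane; the sketch below also includes a hypothetical auto-scroll trigger that fires when the POR nears the bottom of the screen. The function names, threshold, and plane representation are assumptions for illustration, not details from the patent.

```python
import numpy as np

def point_of_regard(eye_origin, gaze_dir, plane_point, plane_normal):
    """Intersect the gaze ray with the display plane to get the Point Of Regard.
    Returns None if the gaze is (nearly) parallel to, or points away from, the display."""
    eye_origin = np.asarray(eye_origin, dtype=float)
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)

    denom = np.dot(gaze_dir, plane_normal)
    if abs(denom) < 1e-6:
        return None
    t = np.dot(plane_point - eye_origin, plane_normal) / denom
    if t < 0:  # display is behind the user
        return None
    return eye_origin + t * gaze_dir

# Hypothetical auto-scroll trigger: scroll when the POR falls within the
# bottom 10% of the display, expressed here as a y-coordinate threshold.
def should_scroll(por_y, display_bottom_y, display_height):
    return por_y is not None and por_y > display_bottom_y - 0.10 * display_height
```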

 

In addition, POR enabled by various implementations could support a privacy mode where a portion of the viewed content is left unchanged while all other portions of the display are scrambled (e.g., when reading text, scrambling all words except the word being looked at).
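As a hypothetical illustration of such a privacy mode, the sketch below scrambles every word except the one whose on-screen bounding box contains the POR. The data layout and function name are assumptions made for this example only.

```python
import random

def privacy_render(words, word_bounds, por_xy):
    """Return display text where only the word under the Point Of Regard keeps
    its original characters; every other word is scrambled.
    words: list of strings; word_bounds: list of (x0, y0, x1, y1) boxes in
    display coordinates; por_xy: (x, y) Point Of Regard in the same coordinates."""
    def contains(box, p):
        x0, y0, x1, y1 = box
        return x0 <= p[0] <= x1 and y0 <= p[1] <= y1

    out = []
    for word, box in zip(words, word_bounds):
        if contains(box, por_xy):
            out.append(word)  # the looked-at word stays readable
        else:
            out.append("".join(random.sample(word, len(word))))  # scrambled
    return " ".join(out)
```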

 

Apple's patent application 20200104589, published Thursday by the U.S. Patent Office, was filed back in Q3 2019. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time. While the inventors came to Apple through the 2017 acquisition of SMI (SensoMotoric Instruments), the application itself was filed in Q3 2019, well after their arrival at Apple.

 

Apple Inventors

 

Hao Qin: Computer Vision Engineer. Hao came to Apple through the SMI (SensoMotoric Instruments) acquisition in 2017.

 

Tom Sengelaub: Engineering Manager. Tom also joined Apple through the SMI acquisition.

 

SMI was ahead of the curve on gaze and eye tracking technology for smartglasses. The 2015 video below has AMD Rose showing off their eye tracking technology. One can only imagine the progress being made at present.

 

 

