
A Major Gaze-Tracking Patent published last week was written by one of Apple's Mixed Reality, Computer Vision Superstars

(Image: eye gaze tracking)

 

Last Thursday, the US Patent & Trademark Office published a patent application from Apple that relates to assessing the characteristics of the pupils of a user's eyes, and in particular to systems, methods, and devices that assess pupil characteristics using light reflected off the eyes, with multiple glints identifying the eye's shape, position, and orientation in order to estimate the user's gaze direction. In various implementations, pupil characteristic assessment is used to facilitate gaze tracking, which may in turn be used to enable user interaction.

 

Last week Apple analyst Ming-Chi Kuo stated that "Apple's future headset will include innovative human-machine UI technologies such as gesture control, object detection, eye/gaze tracking, iris recognition, voice control, skin detection, expression detection, and spatial detection."

 

Patently Apple has covered a number of eye and gaze tracking patents (01, 02, 03, 04 & 05) and it's clear that this technology will play an important role in Apple's future mixed reality headsets. Apple fans will appreciate this technology once they're able to see it in action. For now, Apple's patent on this technology is very technical and likely to be more appreciated by developers and engineers in this field.

 

Apple notes in their patent filing that existing pupil detection techniques may use an image sensor that integrates light intensity over an exposure period to produce greyscale images and then attempt to detect the pupil in those images. The pupil is detected based on the greyscale contrast between the pupil region and the surrounding iris region, so such techniques rely on there being significant contrast between the pupil and the iris. They may be less accurate or efficient in circumstances where that contrast is weak. This is what Apple's invention sets out to remedy.
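To make that baseline concrete, here is a minimal sketch of the contrast-based approach the patent critiques, assuming a greyscale eye image as a NumPy array; the function name and threshold are illustrative assumptions, not details from the patent.

```python
import numpy as np

def detect_pupil_by_contrast(grey, dark_threshold=40):
    """Locate the pupil as the centroid of sufficiently dark pixels.

    grey: H x W uint8 greyscale eye image. Illustrative sketch only:
    it fails when the iris is nearly as dark as the pupil, which is
    the low-contrast weakness the invention targets.
    """
    mask = grey < dark_threshold        # pupil pixels are the darkest
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                     # no dark region found
    return (xs.mean(), ys.mean())       # approximate pupil center
```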

 

Pupil Assessment using Modulated On-Axis Illumination

 

Apple's invention covers a pupil characteristic assessing system that includes a light source, an image sensor, and a processor that performs pupil characteristic assessments on data received from the image sensor regarding light from the light source reflected off an eye of the user. In various implementations, a pupil characteristic is determined using on-axis illumination from the light source so that light from the light source is reflected off the retina of the eye to produce a bright pupil-type light pattern in the data obtained by the image sensor.

 

The light may be modulated or otherwise pulsed at a known frequency, and frequency segmentation may be used to distinguish reflections off the retina through the pupil from reflections of light from other light sources. In some implementations, the image sensor is a frame-based camera, and the method subtracts each image from the next in a sequence of images to identify light pulses that occur at that frequency in the images.
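Here is a rough sketch of how that frame-subtraction idea could work, assuming the light source toggles on and off between consecutive frames; the function name, threshold, and voting scheme are our own illustrative assumptions, not details from the patent.

```python
import numpy as np

def segment_pulsed_light(frames, diff_threshold=25):
    """Return a mask of pixels that blink at the pulse frequency.

    frames: list of greyscale images (H x W uint8 arrays) captured while
    the on-axis light source toggles on/off between frames.
    """
    votes = np.zeros(frames[0].shape, dtype=np.int32)
    for prev, curr in zip(frames, frames[1:]):
        # Subtracting consecutive frames cancels steady ambient light;
        # only light pulsing at the modulation frequency survives.
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        votes += diff > diff_threshold
    # Keep pixels that changed in nearly every frame pair: these track
    # the retro-reflection through the pupil, not other light sources.
    return votes >= len(frames) - 2
```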

 

In some implementations, the image sensor is an event camera, and the amount of time between events corresponding to light reflected off the retina and through the pupil is used to determine which events occur at the frequency.

 

The event camera has light sensors at multiple respective locations. When a particular light sensor detects a change in light intensity, it generates an event message indicating the location of that sensor.

 

An event camera may include or be referred to as a dynamic vision sensor (DVS), a silicon retina, an event-based camera, or a frame-less camera. Thus, the event camera generates (and transmits) data regarding changes in light intensity as opposed to a larger amount of data regarding absolute intensity at each light sensor.
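A minimal sketch of what that could look like follows, with a hypothetical event-message format (the field names are illustrative, not taken from the patent or any specific DVS driver) and a filter that keeps pixels whose inter-event timing matches the pulse period.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class EventMessage:
    x: int          # pixel column of the sensor that fired
    y: int          # pixel row of the sensor that fired
    polarity: bool  # True = brightness increase, False = decrease
    t_us: float     # timestamp in microseconds

def pixels_at_frequency(events, pulse_hz, tol=0.1):
    """Return pixel locations whose inter-event timing matches pulse_hz.

    A pixel watching the retro-reflection through the pupil fires an
    'on' event each time the modulated source turns on, so the time
    between successive 'on' events at that pixel is ~1/pulse_hz.
    """
    period_us = 1e6 / pulse_hz
    last_on = {}
    matches = defaultdict(int)
    for ev in events:
        if not ev.polarity:
            continue
        key = (ev.x, ev.y)
        if key in last_on:
            dt = ev.t_us - last_on[key]
            if abs(dt - period_us) < tol * period_us:
                matches[key] += 1
        last_on[key] = ev.t_us
    # Pixels with several period-matched intervals are likely watching
    # the retina reflection rather than other light sources.
    return {k for k, n in matches.items() if n >= 3}
```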

 

Apple's patent FIG. 4 below is a block diagram of an exploded view of a head-mounted device (HMD); FIG. 7 is a functional block diagram illustrating the differences between a bright pupil effect and a dark pupil effect.

 

(Apple patent figures)

 

More specifically, the housing (#401) of FIG. 4 also houses a pupil assessment system that includes one or more light sources (#422), an image sensor (#424), and a controller (#480). The one or more light sources emit light (e.g., a directional beam) toward the user's eye, which reflects the light so that it can be detected by the image sensor. Based on the reflections, the controller can determine pupil characteristics of the user.

 

For example, the controller can determine a pupil center, a pupil size, a gaze direction, or a point of regard. Thus, in various implementations, light is emitted by the one or more light sources, reflects off the user's eye, and is detected by the image sensor. In various implementations, the light from the user's eye is reflected off a hot mirror or passed through an eyepiece before reaching the sensor.
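As a rough illustration of the first two of those outputs, the sketch below estimates a pupil center and diameter from a bright-pupil pixel mask like the ones produced by the earlier sketches; modeling the blob as a circle is our simplifying assumption, not the patent's method.

```python
import numpy as np

def pupil_center_and_size(pupil_mask):
    """Estimate the pupil center and diameter from a boolean pixel mask.

    pupil_mask: H x W boolean array marking bright-pupil pixels
    (e.g., the output of segment_pulsed_light above).
    """
    ys, xs = np.nonzero(pupil_mask)
    if xs.size == 0:
        return None
    center = (float(xs.mean()), float(ys.mean()))
    # Model the bright-pupil blob as a circle: area = pi * (d/2)^2.
    diameter_px = 2.0 * np.sqrt(xs.size / np.pi)
    return center, diameter_px
```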

 

Apple's patent FIG. 8 below is a functional block diagram illustrating the use of a beam splitter to provide approximately on-axis illumination; FIG. 9 is a functional block diagram illustrating the use of a light source ring near the optics of a light sensor to provide approximately on-axis illumination; and FIG. 10 is a functional block diagram illustrating the light source ring of FIG. 9.

 

(Apple patent figures)

 

Apple's patent FIG. 11 above is a functional block diagram illustrating the combination of pupil detection based on on-axis illumination with eye characteristic detection based on off-axis illumination; FIG. 12 illustrates a collection of event camera data obtained during on-axis illumination; and FIG. 13 illustrates a close-up view of a portion of the event camera data of FIG. 12.

 

For deeper details behind Apple's patent application number 20210378509, click here.

 

The patent application lists Walter Nistico as the sole inventor. Nistico is an Engineering Manager for Deep Learning and Computer Vision at Apple. He came to Apple 4.5 years ago with the acquisition of SMI SensoMotoric Instruments GmbH. Nistico's LinkedIn profile states that he designed and shipped seven products that redefined the state of the art in eye tracking and are used by major universities and research institutions. Prior to joining Apple, Nistico also worked on VR and AR projects with Google, Qualcomm, Sony, Nvidia, and Intel.

 

Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.

 

