
Apple files patent for Mapping a portion of the Retina and Tracking the Gaze of the user based on Enrollment Data

[Image: Apple Vision Pro graphic]

Patently Apple began covering future eye-tracking systems back in 2013, a decade ago, when Apple acquired the Israeli firm PrimeSense and gained its patents, some of which covered eye- and head-tracking systems. Over the years there have been dozens of patents covering eye-tracking. At times I'd think that this was another decade out; it was just too science-fiction-like a technology. So when Apple introduced Vision Pro, my jaw dropped. It was mind-boggling to see how advanced Apple's eye-tracking was.

Today the US Patent & Trademark Office published a patent application from Apple that once again dives deep into accommodation tracking based on retinal imaging, along with an enrollment system that tests parameters of the eye when setting up Apple Vision Pro.

In Apple's patent background they note that existing eye-tracking techniques analyze glints that are reflected off of a user's eye and captured via an image sensor. Some head-mounted systems may include eye-tracking techniques that analyze glints using light projected from light sources located at an edge of a device (e.g., the frame of a pair of glasses). Such eye-tracking systems may lack accuracy in determining the depth of the viewer's gaze and may be unable to track the user's gaze depth in real-time. Accordingly, it may be desirable to provide a means of efficiently determining precisely which part of the scene (which distance, or "depth") the user is concentrating on when assessing an eye characteristic (e.g., gaze direction, eye orientation, identifying an iris of the eye, etc.) for head-mountable systems.

Accommodation Tracking based on Retinal-Imaging

Apple's patent covers various implementations of devices, systems, and methods that determine and track an eye characteristic (e.g., eye accommodation distance or depth) based on retinal imaging. Changes in eye accommodation induce two effects in the retinal image: scaling and defocus. In some aspects, a method acquires a retinal image, obtains an enrollment image, and determines user accommodation with respect to the enrollment image based on blurring (e.g., a Point Spread Function (PSF)) and/or geometric scaling.
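To make the two cues concrete, here is a minimal, hypothetical sketch of comparing a live retinal image against an enrollment image. The metrics below (gradient-energy blur proxy, moment-based scale) are illustrative stand-ins; the patent itself only names PSF-based defocus and geometric scaling, not a specific algorithm.

```python
import numpy as np

def estimate_blur_sigma(image):
    """Rough blur proxy: inverse of gradient energy (sharper image
    -> more gradient energy -> smaller blur estimate)."""
    gy, gx = np.gradient(image.astype(float))
    grad_energy = np.mean(gx ** 2 + gy ** 2)
    return 1.0 / (grad_energy + 1e-9)

def estimate_scale(live, enrolled):
    """Toy scale estimate via the spatial spread of bright pixels;
    a real system would use feature registration."""
    def spread(img):
        ys, xs = np.nonzero(img > img.mean())
        return np.std(xs) + np.std(ys)
    return spread(live) / (spread(enrolled) + 1e-9)

def accommodation_cue(live, enrolled):
    """Combine the two cues the patent describes: geometric scaling
    and defocus (blur) relative to the enrollment image."""
    scale = estimate_scale(live, enrolled)
    blur_ratio = estimate_blur_sigma(live) / estimate_blur_sigma(enrolled)
    return {"scale": scale, "blur_ratio": blur_ratio}
```

A defocused live image yields a `blur_ratio` above 1.0 relative to the sharp enrollment image, signaling an accommodation change.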

Accommodation may be used to improve eye tracking and enhance an extended reality (XR) experience by enabling determination (with increased accuracy) of which part of the scene (e.g., which depth) the user is concentrated on and how well the user accommodates.

Additionally, tracking accommodation can be used to better understand the user’s precise behavior in real-time and adjust the XR experience accordingly. In some aspects, tracking accommodation can be used to adjust the perceived depth of content (e.g., virtual content) to the depth towards which the user is currently accommodated.
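One way such real-time depth adjustment might work is to smooth the noisy per-frame accommodation estimates before using them to place virtual content. This is a minimal sketch under that assumption; the gain `alpha` and the exponential filter are hypothetical, not from the patent.

```python
def smooth_render_depth(depth_samples, alpha=0.3):
    """Exponentially smooth a stream of per-frame accommodation depths
    (in meters) before using them to set virtual-content depth.
    alpha is a hypothetical smoothing gain (0..1)."""
    depth = depth_samples[0]
    out = []
    for d in depth_samples[1:]:
        depth = alpha * d + (1 - alpha) * depth
        out.append(depth)
    return out
```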

In general, one innovative aspect of this patent filing can be embodied in methods that include the following actions, at an electronic device having a processor:

• producing light that reflects off a retina of an eye;

• receiving an image of a portion of the retina from an image sensor, the image corresponding to a plurality of reflections of the light scattered from the retina of the eye;

• obtaining a representation of the eye corresponding to a first accommodative state, the representation representing at least some of the portion of the retina; and

• tracking an eye characteristic based on a comparison of the image of the portion of the retina with the representation of the eye.

In some aspects, the representation of the eye includes a map of the at least some of the portion of the retina.

In some aspects, generating the map of the at least some of the portion of the retina includes obtaining enrollment images of the eye of a user while the user (i) accommodates the eye to an enrollment depth, and (ii) scans through a gaze angle space representative of a defined field of view, and generating the map of the at least some of the portion of the retina based on combining at least a portion of two or more of the enrollment images of the eye.
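The enrollment map described above can be pictured as a mosaic: each retinal patch captured at a known gaze angle is placed into a shared canvas and overlaps are averaged. The sketch below is a toy illustration of that idea; the linear gaze-angle-to-pixel mapping and all sizes are hypothetical assumptions, not values from the patent.

```python
import numpy as np

def build_retina_map(enrollment_images, gaze_angles, patch=32, map_size=256):
    """Combine enrollment patches captured at known gaze angles into a
    single retina map, averaging wherever patches overlap."""
    mosaic = np.zeros((map_size, map_size))
    counts = np.zeros((map_size, map_size))
    for img, (ax, ay) in zip(enrollment_images, gaze_angles):
        # Hypothetical mapping: 1 degree of gaze ~ 4 pixels of retinal shift.
        cx = int(map_size / 2 + 4 * ax)
        cy = int(map_size / 2 + 4 * ay)
        y0, x0 = cy - patch // 2, cx - patch // 2
        mosaic[y0:y0 + patch, x0:x0 + patch] += img
        counts[y0:y0 + patch, x0:x0 + patch] += 1
    # Average overlapping contributions; leave untouched pixels at zero.
    return np.divide(mosaic, counts, out=np.zeros_like(mosaic), where=counts > 0)
```

Scanning the gaze through a defined field of view during enrollment fills in the mosaic patch by patch, yielding the map later used for comparison.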

In some aspects, tracking the eye characteristic based on the comparison of the image of the portion of the retina with the representation of the eye includes estimating a degree of defocus of a feature. In some aspects, estimating the degree of defocus of the feature is based on focus pixels.

In some aspects, tracking the eye characteristic based on the comparison of the image of the portion of the retina with the representation of the eye includes sharpening the quality of the image of the portion of the retina, and determining an amount of defocus based on a lens movement required to sharpen the quality of the image of the portion of the retina.
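That "lens movement required to sharpen" idea can be sketched as a focus sweep: re-render the retinal image at a series of candidate lens powers and report the power that maximizes sharpness. The `refocus` callable below is a hypothetical stand-in for a tunable-lens model; the patent does not specify this search procedure.

```python
import numpy as np

def gradient_energy(img):
    """Sharpness score: mean squared image gradient."""
    gy, gx = np.gradient(img.astype(float))
    return float(np.mean(gx ** 2 + gy ** 2))

def defocus_from_refocus(image, refocus, powers):
    """Sweep candidate lens powers (diopters), re-image at each, and
    return the power that yields the sharpest result. That lens
    adjustment approximates the eye's current defocus."""
    scores = [(p, gradient_energy(refocus(image, p))) for p in powers]
    best_power, _ = max(scores, key=lambda s: s[1])
    return best_power
```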

Apple's patent FIG. 8 below is a block diagram of an example head-mounted device (HMD/Vision Pro) that incorporates their advanced eye-tracking system;  FIG. 2 illustrates an example eye-tracking system; FIG. 6 is a flowchart representation of a method for an enrollment process for tracking an eye characteristic.

[Image: Advanced eye-tracking patent figures]

Apple's patent FIG. 2 illustrates an example environment #200 of an eye-tracking system. The system in the example environment uses a light source #210, such as an LED ring that produces IR light (e.g., light source #34 on an HMD or iPad).

Additionally, the eye-tracking system includes an image sensor #220 (e.g., to observe light scattered off of the retina of the user's eye) in order to acquire an image #230 of the retina #47. As illustrated, the acquired image provides a view of the blood vessels of the eye.

Additionally, or alternatively, in some implementations, the acquired image may provide additional information other than blood vessels such as other detectable retinal features. In some implementations, as illustrated, the image sensor #220 is embedded within or in line with the light source #210 (e.g., an LED ring on an HMD).

In some implementations, as illustrated in FIG. 2, the light source #210 (e.g., an LED or the like) illuminates a surface of the retina of the eye of the user as the user is accommodating his or her sight (e.g., viewing angle α #202a). The image sensor #220 then acquires retina-based gaze-tracking images of the retina as the light is reflected off of a surface of the retina (e.g., portion #235 of image #230 shows the location viewing angle α #202b associated with the viewing angle α #202a).

For example, during an enrollment process, the user may be instructed to focus their gaze on a particular location off in the distance of the display (e.g., focus at a location that is 1.5 m away). For example, the particular location may be on the display of the device. If the user is wearing the device on his or her head (e.g., an HMD), then the location may appear on the display at a very far-away distance (e.g., stare at a small point such that the gaze is looking out to infinity). The light waves from the light source are then reflected off of the retina of the eye and detected by a detector (e.g., image sensor) to acquire image data of the retinal surface. The lens of the image sensor may be focused to infinity such that, when combined with the eye's optics, the retinal surface is sharply imaged onto the image sensor (e.g., when the eye is focused to infinity, which is the default for a relaxed, healthy eye).
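The enrollment depths above map directly onto accommodative demand: in standard optics, demand in diopters is the reciprocal of the viewing distance in meters, so "focused at infinity" corresponds to 0 D and a 1.5 m target to roughly 0.67 D. A trivial helper makes the arithmetic explicit (the function name is ours, not the patent's):

```python
def accommodation_diopters(distance_m):
    """Accommodative demand in diopters: 1 / distance (meters).
    Focusing 'at infinity' corresponds to 0 D."""
    if distance_m == float("inf"):
        return 0.0
    return 1.0 / distance_m
```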

Apple's patent FIG. 3 below illustrates an optical effect of user accommodation for an example eye-tracking system; FIG. 4 illustrates a system flow diagram for tracking an eye characteristic.

[Image: Apple patent advanced eye-tracking and enrollment system for Vision Pro]

To review the full details, see patent application 20230309824. You can find a treasure trove of eye/gaze-tracking patents in our HMD archives.

Some of the Team Members on this Apple Project

• Itai Afek: Physicist, Electro-optics R&D at Apple Israel
• Ariel Lipson: Optics Engineering Team Lead at Apple Israel
• Roei Remez: Senior Optical Engineer at Apple Israel

 

