A new patent application from Apple this week describes Vision Pro’s Real-Time Retinal Imaging System that detects a user’s Focus
I’ve stated many times in the past that Apple’s Vision Pro’s eye-tracking technology is one of the key technologies that propels Apple’s Spatial Computer above all other headsets. This week the US Patent & Trademark Office published yet another patent application from Apple, this one relating specifically to systems and methods that provide Vision Pro with retinal imaging-based accommodation detection (i.e., focus detection).
In Apple’s patent background, they note that various eye tracking techniques exist. For example, some gaze tracking techniques are based on detecting reflections on outer portions of the eye. Light is directed toward an eye to cause detectable reflections on the pupil and cornea, and these reflections are tracked by a camera to determine gaze direction, for example, by determining a vector between the cornea and pupil that corresponds to gaze direction. Existing eye tracking techniques, however, may not provide adequate tracking of eye accommodation (i.e., focus).
Retinal Imaging-Based Eye Accommodation Detection
Apple’s patent application covers systems, methods, and devices that use a retinal imaging technique to assess a user's eye accommodation (i.e., focus) during use of an electronic device, e.g., in real time.
One or more light sources direct light toward one or more spots on the retina that are detectable via a sensor. The size and shape of the spot(s) depend on the eye's accommodation/focus and thus can be used to identify an accommodation/focus change or to measure the eye's accommodation/focus. Eye accommodation may be determined in real time while a user is viewing, interacting with, or otherwise experiencing electronic content via the electronic device.
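The core relationship here, spot size as a proxy for accommodation, could in principle be reduced to a per-user calibration. As a minimal sketch, the linear model and coefficient values below are illustrative assumptions, not details from the patent:

```python
# Hypothetical sketch: map a measured retinal spot diameter (in pixels)
# to an accommodation estimate (in diopters) via a per-user linear
# calibration. The linear form and the default coefficients are
# assumptions for illustration only, not Apple's method.

def accommodation_from_spot_diameter(diameter_px: float,
                                     slope: float = -0.05,
                                     intercept: float = 2.0) -> float:
    """Return an accommodation estimate in diopters from a spot
    diameter in pixels, using an assumed linear calibration model."""
    return slope * diameter_px + intercept
```

In practice, the slope and intercept would have to come from a calibration procedure in which the user focuses at known distances while the spot is measured.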
Some implementations involve a method performed via a processor executing instructions stored in a non-transitory memory of an electronic device. Such a method may involve, at an electronic device having a processor and a light source, producing light (e.g., infrared or visible light) using the light source to direct light toward a spot on a retina of an eye.
For example, one or more collimated illuminators may be used to direct light toward one or more spots on the retina. The light sources may be oriented in an off-axis manner relative to an axis of the eye. The method receives sensor data (e.g., an image) at a sensor (e.g., a camera). The sensor data corresponds to the illuminated spot on the retina. For example, the sensor data may include one or more camera images having image portions, e.g., pixels that correspond to a retinal appearance including light reflected at an illuminated spot.
The method determines an eye accommodation characteristic based on the sensor data. In some implementations, the eye accommodation characteristic is determined based on a size or a position of the illuminated spot on the retina. Determining the eye accommodation characteristic may involve detecting a change in accommodation or obtaining an estimate (e.g., a numerical value representing the accommodation of the eye). In some implementations, determining the eye accommodation characteristic involves comparing an image of the retina to a previous image of the retina to identify a change in size and/or position of an illuminated spot.
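A minimal sketch of that frame-to-frame comparison might look like the following; the normalized grayscale input, the brightness threshold, and the relative tolerance are all assumptions for illustration (the patent does not specify them):

```python
import numpy as np

# Illustrative sketch: detect an accommodation change by comparing the
# illuminated spot's area across consecutive retinal images. The
# threshold and tolerance values below are assumptions, not patent details.

def spot_area(image: np.ndarray, threshold: float = 0.5) -> int:
    """Count pixels brighter than the threshold: a crude proxy for the
    illuminated spot's size in a normalized (0..1) grayscale image."""
    return int(np.count_nonzero(image > threshold))

def accommodation_changed(prev: np.ndarray, curr: np.ndarray,
                          rel_tolerance: float = 0.1) -> bool:
    """Flag an accommodation change when the spot area differs between
    frames by more than the relative tolerance."""
    a_prev, a_curr = spot_area(prev), spot_area(curr)
    if a_prev == 0:
        return a_curr > 0
    return abs(a_curr - a_prev) / a_prev > rel_tolerance
```

For example, a spot that grows from 16 to 64 bright pixels between consecutive frames would be flagged as an accommodation change under this sketch.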
In some implementations, a spot is identified (e.g., its size, shape, and/or position) and/or distinguished from other less-illuminated portions of the retina. A spot may be identified via an algorithm (e.g., based on a threshold) and/or using a machine learning (ML) model. An ML model may be used to assess/compare the size and/or position of the spot and/or a relationship between multiple spots.
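A threshold-based spot identification step of the kind described could be sketched roughly as follows; the threshold value and the returned features (pixel size and centroid position) are assumptions chosen for illustration:

```python
import numpy as np

# Illustrative sketch of threshold-based spot identification: segment
# the bright spot from a normalized (0..1) grayscale retinal image and
# report its pixel size and centroid. The threshold is an assumed value.

def identify_spot(retina: np.ndarray, threshold: float = 0.6):
    """Return the spot's pixel size and (row, col) centroid, or None
    if no pixel exceeds the brightness threshold."""
    mask = retina > threshold
    size = int(mask.sum())
    if size == 0:
        return None
    rows, cols = np.nonzero(mask)
    return {"size": size, "centroid": (float(rows.mean()), float(cols.mean()))}
```

An ML model, by contrast, would replace the fixed threshold with a learned segmentation or regression step, as the patent suggests, but the interface (image in, spot features out) could remain the same.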
Apple’s patent FIG. 8 below is a block diagram of an example head-mounted device (HMD) which we now know is Vision Pro; FIG. 3 illustrates one or more illuminators directing light toward a spot on a retina of an eye in a first accommodation/focus state; FIG. 4 illustrates an image of the retina including the spot illuminated in FIG. 3.
Apple’s patent FIG. 5 above illustrates the one or more illuminators of FIG. 3 directing light toward a spot on the retina of the eye in a second accommodation/focus state; FIG. 6 illustrates an image of the retina including the spot illuminated in FIG. 5.
In some implementations, Vision Pro includes an eye tracking system for detecting eye position and eye movements. The eye tracking system may include one or more illuminators (e.g., one or more infrared (IR) light-emitting diodes (LEDs)) that emit light that is reflected off of the user’s eye and captured via a sensor (e.g., a near-IR (NIR) camera).
In one example, the one or more illuminators of Vision Pro may emit NIR light directed toward one or more spots on the retina of the eye of the user and a sensor captures images of the eye of the user including the illuminated spot. In some implementations, images captured by the eye tracking system may be analyzed to detect position and movements of the eyes of the user to detect other information about the eyes such as gaze direction, and/or to detect accommodation/focus of the eye. Moreover, the point of gaze estimated from the eye tracking images may enable gaze-based interaction with content.
For the patent’s full details, review patent application "20230329549."