A new Apple Patent covers another aspect of their Advanced Eye Tracking System for both Vision Pro & Future Smartglasses
On Tuesday, Patently Apple posted a granted patent report titled "Apple has Won a Key Patent that relates to the Vision Pro's Eye Tracking System." That patent specifically covered "Sensor Fusion Eye Tracking." Today the US Patent & Trademark Office published a new patent application from Apple titled "Eye Tracking using Efficient Image Capture and Vergence and Inter-Pupillary Distance History." The invention relates to an HMD in the form of Apple Vision Pro as well as future smartglasses.
Apple's invention covers implementations of tracking an eye characteristic (e.g., gaze direction or pupil position) of a user's eyes by staggering the image capture of the two eyes and using a predicted relationship between them to estimate each eye's characteristic between its own capture times.
Images of a user's eyes are captured in a staggered manner, meaning that the images of the second eye are captured between the capture times of the images of the first eye, and vice versa.
Some implementations provide a device that tracks the gaze directions of a user's two eyes by staggering the image capture of each eye and using vergence history to predict the other eye's gaze direction for intermediate frames between captures.
For example, to determine gaze directions for both eyes at a rate of N fps, staggered images of each eye may be captured at N/2 fps, with the captures of the first and second eyes offset by 1/N seconds; the intermediate gaze directions between frames for each eye may then be predicted based on the other eye's gaze direction at each intermediate frame time and a predicted vergence at that time.
In one example, a device includes one or more image sensors configured to stagger the capture of images of a first eye and a second eye of a user: images of each eye are captured at approximately a first frame rate, with the images of the second eye captured between the capture times of the images of the first eye.
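The staggered-capture-plus-vergence idea can be sketched in code. The following is a minimal illustration of the technique as described, not Apple's implementation: the class name, the 1-D horizontal gaze angles, and the linear-extrapolation vergence model are all assumptions.

```python
from collections import deque

class StaggeredGazeTracker:
    """Sketch: each camera runs at half the target rate, offset in time.
    Whenever one eye is freshly measured, the unmeasured eye's gaze is
    predicted from the measurement plus a vergence estimate extrapolated
    from recent vergence history."""

    def __init__(self, history_len=4):
        # Recent (time, vergence) samples, where vergence = left - right
        # horizontal gaze angle (a simplifying assumption).
        self.vergence_hist = deque(maxlen=history_len)

    def on_capture(self, t, eye, gaze_angle, other_eye_last_angle):
        # Update vergence history using the fresh measurement and the
        # other eye's most recent measured angle.
        if eye == "left":
            vergence = gaze_angle - other_eye_last_angle
        else:
            vergence = other_eye_last_angle - gaze_angle
        self.vergence_hist.append((t, vergence))

    def predict_vergence(self, t):
        # Linear extrapolation from the two most recent samples;
        # hold the last value when history is short.
        if len(self.vergence_hist) < 2:
            return self.vergence_hist[-1][1]
        (t0, v0), (t1, v1) = self.vergence_hist[-2], self.vergence_hist[-1]
        slope = (v1 - v0) / (t1 - t0)
        return v1 + slope * (t - t1)

    def predict_other_eye(self, t, measured_eye, measured_angle):
        # Since vergence = left - right, the unmeasured eye's angle
        # follows directly from the measured one.
        v = self.predict_vergence(t)
        return measured_angle + v if measured_eye == "right" else measured_angle - v
```

With constant vergence, feeding alternating left/right measurements lets the tracker fill in the unmeasured eye at every intermediate frame time, which is the effect the patent describes.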
Some implementations provide a device that tracks the pupil positions of a user's two eyes by staggering the image capture of each eye and using the instantaneous inter-pupillary distance (IPD) to predict the other eye's pupil position for intermediate frames.
For example, to determine pupil positions for both eyes at a rate of N fps, staggered images of each eye may be captured at N/2 fps, and the intermediate pupil positions between frames for each eye may be predicted based on the other eye's pupil position at each intermediate frame time and a predicted instantaneous IPD at that time.
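The pupil-position variant can be sketched similarly. This is an illustrative assumption-laden example, not Apple's method: it uses 1-D pupil x-positions in a head-fixed coordinate frame, a simple alternating capture schedule, and a hold-last IPD prediction.

```python
def capture_schedule(n_fps, n_frames):
    """Interleaved capture times: each camera effectively runs at
    n_fps/2, offset by 1/n_fps, so combined updates arrive at n_fps."""
    step = 1.0 / n_fps
    return [(k * step, "left" if k % 2 == 0 else "right")
            for k in range(n_frames)]

def predict_pupil_x(measured_x, measured_eye, ipd):
    """Predict the unmeasured eye's pupil x-position from the measured
    eye's position and a predicted instantaneous IPD.
    Convention (assumed): left pupil at smaller x, ipd = right_x - left_x."""
    return measured_x + ipd if measured_eye == "left" else measured_x - ipd
```

For a 60 fps combined update rate, `capture_schedule(60, ...)` alternates eyes every 1/60 s while each camera only captures at 30 fps, and `predict_pupil_x` fills in the other eye at each intermediate frame.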
Apple's patent FIG. 1 below illustrates an example device (#100) for tracking a user's two eyes (#125a and #125b) using two eye cameras (#120a and #120b), one camera for each eye.
The device (e.g., an eyeglasses device or other head-mounted device (HMD)) includes the two eye cameras (#120a & #120b), two illuminators (#122a & #122b), and two content viewing portions (#130a & #130b). These components may be embedded within or attached to a housing or other portion of device #100. For example, if device #100 is an eyeglasses device, the two eye cameras and illuminators may be embedded in a frame portion (not shown) of the eyeglasses device that surrounds or supports the two content viewing portions.
Apple's patent FIG. 2 above illustrates an example timeline of capturing eye images for eye tracking.
Apple's patent FIG. 7 below illustrates an example monocular horizontal gaze angle; FIG. 8 illustrates an example horizontal vergence; FIG. 9 illustrates example monocular vertical gaze angles used to determine a vertical vergence.
Apple's patent FIG. 10 above is a flowchart representation of a method of tracking an eye characteristic; FIG. 11 is a block diagram of components of the exemplary device of FIG. 1.
To dive deeper into Apple's invention, review Apple's patent application number 20230239586.
Two of the Team Members on this Apple Project
- Mehmet N. Ağaoğlu: Human Vision Scientist/Display Engineer
- Andrew B. Watson: Chief Vision Scientist