
A new Apple Patent does a Deep Dive into the inner workings of their future AR Glasses

(Cover image: Apple smartglasses)

Today the US Patent & Trademark Office published a patent application from Apple that relates to systems, methods and devices for detecting the eye gaze direction of users. The focus device of the patent is described as "AR Glasses" or "XR Glasses" that include a transparent display through which the user views the physical environment, while other content is presented via retinal projection technology that projects graphical images within the user's view or directly onto the retina.

In Apple's patent background they note that existing eye-tracking techniques analyze glints that are reflected off of a user's eye and captured via an image sensor. Some head mounted systems may include eye-tracking techniques using near-infrared (nIR) illuminators and standard multi-pixel complementary metal oxide semiconductor (CMOS) cameras pointed at each eye. The combination of nIR illuminators and CMOS cameras may provide accurate and high frame rate (e.g., greater than 15 frames per second) tracking of each eye.

However, this typical nIR+CMOS configuration has a baseline power draw that doesn't scale well as the frame rate is reduced, and keeping the CMOS imager and nIR LEDs on all the time isn't feasible in very low-power applications. It may therefore be desirable to provide an efficient eye tracking system that doesn't need to continually track across the entire possible gaze space, saving power while still assessing an eye characteristic (e.g., gaze direction, eye orientation, identifying the iris of the eye, etc.) for head-mountable systems.

Retinal Reflection Tracking for Gaze Alignment

Apple's invention covers devices, systems and methods that detect eye gaze direction of a user that is approximately oriented towards a target area (e.g., a hot zone/corner).

Determining that an eye is gazing at a target area (e.g., a hot zone/corner) is based on detecting that a reflection of light off of an eye has certain properties, e.g., a spectral property indicative of a redeye-type reflection that is aligned approximately towards the target area.

An illuminator and/or detector may be optically aligned with an approximate direction from the eye to the target area, e.g., by positioning the illuminator/detector in and/or behind the target area of the display/lens or using an optical waveguide in the target area of the lens.

The detector may be a near infrared (nIR) transceiver sensitive to spectral reflections at narrow co-axial angles. The detector may use low-power hardware (e.g., a photodiode paired with a lens) that can be active when a high-power eye tracking hardware (e.g., a multi-pixel complementary metal oxide semiconductor (CMOS)) is off. For example, a user may glance at a hot corner to trigger an action, e.g., initiate an extended reality (XR) experience that uses full eye tracking, provide a response to a notification, and the like. The intended action may be initiated if the gaze satisfies a criterion, e.g., a glance lasting more than a threshold amount of time.
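As a rough illustration of that low-power trigger, here's a minimal Swift sketch of the dwell logic: a photodiode sample stream is checked against a reflection threshold, and the action only fires once the glance persists past a dwell threshold. The type names and thresholds are our own assumptions, not Apple's implementation.

```swift
import Foundation

/// Hypothetical sketch of the low-power trigger: a single photodiode aligned
/// with the hot-corner target area is polled while the full CMOS eye tracker
/// stays off; the action fires only after the redeye-type reflection has been
/// seen continuously for a dwell threshold.
struct HotCornerTrigger {
    /// Minimum reflection strength taken to indicate a co-axial (redeye-type) return.
    let reflectionThreshold: Double
    /// How long the glance must persist before the action is initiated.
    let dwellThreshold: TimeInterval

    var glanceStart: Date? = nil

    /// Feed one photodiode sample; returns true when the dwell criterion is met.
    mutating func process(sample: Double, at time: Date) -> Bool {
        guard sample >= reflectionThreshold else {
            glanceStart = nil            // reflection lost: reset the dwell timer
            return false
        }
        if let start = glanceStart {
            return time.timeIntervalSince(start) >= dwellThreshold
        } else {
            glanceStart = time           // first sample above the threshold
            return false
        }
    }
}
```

When `process` returns true, the caller would wake the high-power eye tracker, respond to a notification, or launch an XR experience, per the examples above.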

In general, one innovative aspect of the subject matter described in this patent can be embodied in methods that include the actions of, at an electronic device having a processor and a display, producing a reflection by directing light towards an eye using an illuminator, receiving sensor data from a sensor, wherein a direction of sensing by the sensor and a direction from the eye to a target area are approximately aligned, determining a reflective property of the reflection based on the sensor data, detecting that a gaze direction of the eye is approximately oriented towards the target area based on the reflective property, and initiating an action based on detecting that the gaze direction is approximately oriented towards the target area.

In some aspects, Apple notes that an action is initiated based on whether the gaze direction satisfies a criterion. In some aspects, the action provides a user response to a notification. In some aspects, the action includes initiating an extended reality (XR) experience.
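Pulling the claimed steps together, here's a minimal Swift sketch of the pipeline (illuminate, sense, check the reflective property, then act). The protocol and parameter names are hypothetical stand-ins for hardware the filing doesn't describe in code form.

```swift
/// A minimal sketch of the claimed method's steps, with hypothetical
/// protocols standing in for the illuminator and sensor hardware.
protocol Illuminator { func emitTowardEye() }
protocol ReflectionSensor { func read() -> Double }   // sensing axis ≈ eye-to-target direction

enum GazeAction { case respondToNotification, startXRExperience }

func detectHotCornerGaze(illuminator: Illuminator,
                         sensor: ReflectionSensor,
                         reflectivityThreshold: Double,
                         action: GazeAction,
                         perform: (GazeAction) -> Void) {
    illuminator.emitTowardEye()                       // produce a reflection off the eye
    let reflectivity = sensor.read()                  // receive sensor data
    let gazeAtTarget = reflectivity >= reflectivityThreshold  // reflective property check
    if gazeAtTarget {                                 // gaze ≈ oriented toward the target area
        perform(action)                               // initiate the action
    }
}
```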

Apple's patent FIG. 2A below illustrates AR or XR glasses that include image sensors and projectors; FIG. 2B illustrates zones of a transparent substrate. In particular, FIG. 2B illustrates an example of eight defined "hot corner zones" that the processes described herein may use as hot spot detection areas (e.g., zones/areas around the display FOV #230).

The example zones include the Superior Temporal (ST) zone #231, the Superior (S) zone #232, the Superior Nasal (SN) zone #233, the Temporal zone #234, the Nasal zone #235, the Inferior Temporal zone #236, the Inferior zone #237, and the Inferior Nasal zone #238.
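For readers who like to see the layout in code form, the eight zones map naturally onto a simple enumeration keyed by the patent's reference numerals; this is an illustrative sketch, not anything from the filing.

```swift
/// Sketch of the eight hot-corner zones named in FIG. 2B, keyed by the
/// patent's reference numerals (an illustrative mapping only).
enum HotCornerZone: Int, CaseIterable {
    case superiorTemporal = 231
    case superior         = 232
    case superiorNasal    = 233
    case temporal         = 234
    case nasal            = 235
    case inferiorTemporal = 236
    case inferior         = 237
    case inferiorNasal    = 238
}
```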

Apple's patent FIG. 2C below illustrates an example hot corner detection in a zone of the AR Glasses. In particular, FIG. 2C illustrates an eye gaze location at target area #262 of a user in the ST zone #231 of the lens #215 of the device as illustrated by the hot zone area #260.

In an exemplary implementation, the hot-corners may be implemented using arrays of ultrasonic sensors, photo-sensitive detectors, or electro-oculography sensors distributed around the rim of the frame #212 of the device. The eye postures and gestures that may be utilized by this subsystem may be defined by fixations, saccades, or a combination thereof. In this way, the sensors found in the AR Glasses (e.g., detector #220) are able to provide an estimate of eye gaze velocity and gaze angles (e.g., angle θ).
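The patent doesn't spell out how fixations and saccades would be classified; a common approach is a simple velocity threshold (I-VT), sketched below purely as an assumption of how gaze-velocity estimates could feed the hot-corner subsystem.

```swift
/// Velocity-threshold (I-VT) classification of gaze samples into fixations
/// and saccades. The 30 deg/s threshold is a typical textbook value, not a
/// figure from Apple's filing.
enum EyeEvent { case fixation, saccade }

func classify(gazeAngles: [Double],          // angle θ samples, in degrees
              sampleRateHz: Double,
              saccadeVelocityThreshold: Double = 30.0) -> [EyeEvent] {
    guard gazeAngles.count > 1 else { return [] }
    return zip(gazeAngles, gazeAngles.dropFirst()).map { prev, next in
        let velocity = abs(next - prev) * sampleRateHz   // deg/s between successive samples
        return velocity > saccadeVelocityThreshold ? .saccade : .fixation
    }
}
```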

(Patent figures: FIGS. 2A, 2B, 3B, 4B and 5A-5C)

Apple's patent FIG. 3B above illustrates an example environment #300B of a hot-corner eye-tracking system. The hot-corner eye-tracking system uses a light source #320, such as a near-infrared (nIR) illuminator that produces IR light at an emission angle #322, and a light sensor #330 (e.g., a low-power sensor such as a photodiode).

Apple's patent FIG. 4B above illustrates an example hot-corner eye tracking system utilizing a transparent substrate, such as an optical waveguide #410 that includes an integrated sensor #420 capable of measuring co-axial reflections from the human eye fundus. In some implementations, the sensor may be a single sensor that covers multiple zones, as illustrated. Additionally, or alternatively, individual sensors (e.g., sensor #420) may be included for each zone.

Apple's patent FIGS. 5A-5C above illustrate an example user experience for detecting an eye gaze direction through or on a lens #502. For example, FIGS. 5A-5C illustrate a user looking at or near a display of a device and initiating applications on a user interface based on their gaze towards a particular area (e.g., towards a hot corner). FIG. 5A illustrates a user glancing at a particular area to initiate a clock application, while FIG. 5C shows a series of apps that a user can open using eye-gaze techniques.
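Reusing the HotCornerZone enumeration sketched earlier, the FIGS. 5A-5C experience could be modeled as a simple glance-to-app lookup; the zone-to-app assignments below are hypothetical and only mirror the clock example shown in FIG. 5A.

```swift
/// Illustrative mapping from a detected hot-corner glance to an app launch,
/// loosely mirroring the user experience in FIGS. 5A-5C (assignments are hypothetical).
func appToLaunch(for zone: HotCornerZone) -> String? {
    switch zone {
    case .superiorTemporal: return "Clock"        // FIG. 5A: a glance opens a clock app
    case .superiorNasal:    return "Notifications"
    case .inferiorTemporal: return "Messages"
    default:                return nil            // zones with no app assigned
    }
}
```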

Lastly, Apple notes that each lens may be configured as a stack that includes a bias (+/−) for prescription lenses, a waveguide for housing or embedding a plurality of IR light sources and transparent conductors, and the like.

To review Apple's invention in full, see patent application 20230359273.

Team Members on this Apple Project 

  • Arthur Zhang: Senior Manager - System Architecture, Vision Products Group
  • Tong Chen: Optical Sensing Engineer
  • Sid Hazra: Engineer
  • Nicholas Soldner: Architect

 

