Apple won a patent for Advanced Controls related to their Vision Pro Headset that include Eye Gaze, Siri, Touch & more
This week the U.S. Patent and Trademark Office officially granted Apple a patent that revealed what was coming to Apple's future headset in terms of user interfaces that combine eye gaze and touch controls with hand and body gestures and even Siri. Prior to Apple introducing the Vision Pro, this was one of the most descriptive patents on the subject to date.
Apple's granted patent covers techniques for interacting with a Head Mounted Device (HMD) using eye gaze / eye tracking. A user will be able to use their eyes to interact with user interface objects displayed on the HMD display. The techniques provide a more natural and efficient interface by, in some exemplary embodiments, allowing a user to operate the device using primarily eye gazes and eye gestures (e.g., eye movement, blinks, and stares).
Techniques are also described for using eye gaze to quickly designate an initial position (e.g., for selecting or placing an object) and then moving the designated position without using eye gaze, as precisely locating the designated position can be difficult using eye gaze due to uncertainty and instability of the position of a user's eye gaze.
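For illustration only, here is a minimal Swift sketch of that two-stage idea, using hypothetical types and names rather than Apple's actual API: gaze designates the initial position, and touch input then refines it without relying on the noisy gaze signal.

```swift
import simd

/// Hypothetical sketch of the two-stage placement described above:
/// gaze designates an initial position; touch input then refines it
/// without relying on the (unstable) gaze signal.
struct GazePlacement {
    /// Current designated position in the scene, if any.
    private(set) var designatedPosition: SIMD3<Float>?

    /// Stage 1: snap the designated position to the current gaze hit point.
    mutating func designate(atGazeHit gazeHit: SIMD3<Float>) {
        designatedPosition = gazeHit
    }

    /// Stage 2: nudge the designated position by a touch-derived delta,
    /// ignoring gaze so small eye jitter doesn't move the target.
    mutating func refine(byTouchDelta delta: SIMD3<Float>) {
        guard let position = designatedPosition else { return }
        designatedPosition = position + delta
    }

    /// Cancel the designation (e.g., on an exit button or touch liftoff).
    mutating func clear() {
        designatedPosition = nil
    }
}
```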
Eye tracking techniques will later be applied to conventional user interfaces on devices such as Mac desktop computers, MacBooks, iPads, and iPhones. The techniques are also advantageous for computer-generated reality (including virtual reality and mixed reality) devices and applications.
As illustrated in patent FIGS. 19C and 19D below, the patent describes interacting with a virtual room #1902 and moving an object #1908 using gaze controls and touch controls that are built into the side surface of the HMD.
The HMD includes sensor(s) configured to detect various types of user inputs, including (but not limited to) eye gestures, hand and body gestures, and voice inputs. In some embodiments, the input device includes a controller configured to receive button inputs (e.g., up, down, left, right, enter, etc.).
The virtual room/environment #1902 includes a stack of photos #1908, which includes individual photos #1908a-1908e, lying on a table. Gaze #1906, seen in view #1902b, indicates that the user is looking at the stack of photos.
Designation of photo #1908a below is indicated by focus indicator #1914, which includes a bold border around photo #1908a. In some embodiments, the focus indicator includes a pointer, cursor, dot, sphere, highlighting, outline, or ghost image that visually identifies the designated object. In some embodiments, the HMD un-designates the photo and returns the photos to the table in response to receiving further input (e.g., selection of an exit button or liftoff of a touch).
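As a rough, hypothetical Swift sketch of the flow in FIGS. 19C and 19D (the names and types are illustrative, not from the patent or any Apple framework): gaze over the stack designates a photo and drives the focus indicator, while an exit button or touch liftoff un-designates it.

```swift
/// Hypothetical inputs relevant to designating and releasing a photo.
enum SelectionInput {
    case gazeOver(objectID: String)
    case touchLiftoff
    case exitButton
}

struct PhotoSelection {
    /// Identifier of the currently designated photo, if any.
    private(set) var focusedPhotoID: String?

    /// Whether the UI should draw a focus indicator (bold border, outline,
    /// highlight, etc.) around the designated photo.
    var showsFocusIndicator: Bool { focusedPhotoID != nil }

    mutating func handle(_ input: SelectionInput) {
        switch input {
        case .gazeOver(let objectID):
            focusedPhotoID = objectID          // designate the photo under gaze
        case .touchLiftoff, .exitButton:
            focusedPhotoID = nil               // return the photos to the table
        }
    }
}
```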
Other Moving Object Patent Figures
In the patent figures below, Apple illustrates how gaze and touch controls will be used to move items within a virtual environment, such as paintings/photos and even a coffee mug object.
Later in the patent, Apple notes that the HMD system includes image sensor(s), which optionally include one or more visible light image sensors, such as charge-coupled device (CCD) sensors and/or complementary metal-oxide-semiconductor (CMOS) sensors, operable to obtain images of physical objects from the real environment.
Image sensor(s) also optionally include one or more infrared (IR) sensor(s), such as a passive IR sensor or an active IR sensor, for detecting infrared light from the real environment. For example, an active IR sensor includes an IR emitter, such as an IR dot emitter, for emitting infrared light into the real environment. Image sensor(s) also optionally include one or more event camera(s) configured to capture movement of physical objects in the real environment.
Image sensor(s) also optionally include one or more depth sensor(s) configured to detect the distance of physical objects from the HMD system. In some embodiments, the HMD uses CCD sensors, event cameras, and depth sensors in combination to detect the physical environment around the HMD.
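Purely as an illustrative sketch of that sensor mix (all types and names below are hypothetical, not the patent's or Apple's), the combination might be modeled along these lines in Swift:

```swift
/// Hypothetical placeholders for the three sensor streams the patent mentions.
struct VisibleLightFrame { /* pixel data from a CCD/CMOS sensor */ }
struct EventStream       { /* brightness-change events capturing motion */ }
struct DepthMap          { /* per-pixel distances to physical objects */ }

/// One combined view of the physical environment around the HMD.
struct EnvironmentSnapshot {
    let frame: VisibleLightFrame
    let events: EventStream
    let depth: DepthMap
}

/// Combine the latest reading from each sensor into a single snapshot that
/// downstream tasks (tracking, placement, occlusion) can consume together.
func fuse(frame: VisibleLightFrame,
          events: EventStream,
          depth: DepthMap) -> EnvironmentSnapshot {
    EnvironmentSnapshot(frame: frame, events: events, depth: depth)
}
```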
To review the mountainous amount of detail behind this invention, see Apple's granted patent 11714592. This week's granted patent was an update; Apple began work on this patent back in 2017. It's one of Apple's 5,000 patents covering Apple Vision Pro.
Other Granted Patents related to XR Environments:
Granted Patent 11715271: XR Preferred Movement Along Planes
Granted Patent 11714519: Moving About A Setting