Apple won a patent for In-Air Gestures for Vision Pro that reveals possible future gestures like 2-Hand Gestures, Head Nods+
User Interface Response Based On Gaze-holding Event Assessment
Various implementations disclosed in Apple's granted patent include devices, systems, and methods that assess user interactions to trigger user interface responses. Apple notes that using gaze-holding events can facilitate accurate input responses based on associating gestures with gazes.
In one example, this involves associating single-hand gestures, such as a pinch, a spreading of all five fingers on one hand, or a multi-finger swipe gesture, with users intentionally gazing at user interface (UI) objects, while not associating such activities with objects that happen to be gazed upon during saccade-related or other unintentional behaviors.
In another example, this involves associating multi-hand gestures, such as both hands pinching at the same time or the hands moving away from one another, with users intentionally gazing at UI objects, while not associating such activities with objects that happen to be gazed upon during saccade-related or other unintentional behaviors.
In another example, this involves associating head movement, such as nodding, shaking, or tilting of the head, with users intentionally gazing at UI objects, while not associating such activities with objects that happen to be gazed upon during saccade-related or other unintentional behavior.
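To make the idea concrete, here is a minimal sketch in Swift of the kind of filtering the patent describes: a gesture is only routed to a UI object when the user's gaze on that object is an intentional hold rather than a saccade. The type and case names are hypothetical and do not come from any Apple API.

```swift
import Foundation

// Hypothetical gesture categories mirroring the examples in the patent text.
enum AirGesture {
    case pinch                 // single-hand pinch
    case fiveFingerSpread      // spreading all five fingers on one hand
    case multiFingerSwipe      // multi-finger swipe
    case twoHandPinch          // both hands pinching at the same time
    case twoHandSpread         // hands moving away from one another
    case headNod, headShake, headTilt
}

// A gaze is either an intentional hold on a UI object or a
// saccade/other unintentional movement.
enum GazeState {
    case holding(targetID: String)   // intentionally fixated on a UI object
    case saccade                     // rapid, unintentional eye movement
}

// Only forward a gesture to the gazed-upon object when the gaze is a hold.
func target(for gesture: AirGesture, gaze: GazeState) -> String? {
    switch gaze {
    case .holding(let targetID):
        return targetID        // the gesture acts on this UI object
    case .saccade:
        return nil             // ignore objects merely passed over by a saccade
    }
}
```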
It should be noted that nodding gestures were recently introduced with Apple's latest AirPods Pro 2.
In some implementations, a gaze is associated with one or more of a hand gesture, head gesture, torso-based gesture, arm gesture, leg gesture, or whole-body movement (e.g., associating a gaze with a combined hand/head gesture).
A gaze may additionally, or alternatively, be associated with input provided via a physical device, such as a keyboard, mouse, hand-held controller, watch, etc.
In some implementations, gaze-holding events are used to associate a non-eye-based user activity, such as a hand or head gesture, with an eye-based activity, such as the user gazing at a particular user interface component displayed within a view of a three-dimensional (3D) environment. For example, a user's pinching hand gesture may be associated with the user gazing at a particular user interface component, such as a button, at around the same time (e.g., within a threshold amount of time) as the pinching hand gesture is made.
These associated behaviors (e.g., the pinch and the gaze at the button) may then be interpreted as user input (e.g., user input selecting or otherwise acting upon that user interface component).
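As a rough illustration of that association step, the sketch below pairs a pinch timestamp with a gaze-holding event that overlaps it within a small time window. The types, names, and the 0.3-second window are assumptions for illustration, not values taken from the patent.

```swift
import Foundation

// Hypothetical event records; timestamps are in seconds.
struct GazeHoldingEvent {
    let targetID: String       // UI element being looked at (e.g., a button)
    let start: TimeInterval
    let end: TimeInterval
}

struct PinchEvent {
    let time: TimeInterval
}

// Associate a pinch with a gaze-holding event that occurred at roughly the
// same time (within `window` seconds), then treat the pair as a selection.
func selection(for pinch: PinchEvent,
               holds: [GazeHoldingEvent],
               window: TimeInterval = 0.3) -> String? {
    let candidate = holds
        .filter { pinch.time >= $0.start - window && pinch.time <= $0.end + window }
        .max(by: { $0.end < $1.end })          // prefer the most recent hold
    return candidate?.targetID                  // UI element to select, if any
}
```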
Apple's patent FIGS. 3A, 3B, and 3C illustrate hand engagement, indirect selection, and indirect gestures based on hand and gaze, in accordance with some implementations; FIGS. 4A, 4B, 4C, and 4D illustrate various anomalies associated with a user's gaze direction relative to a user interface element; FIG. 5 illustrates an exemplary interaction tracking flow.
Apple's patent FIG. 6 above illustrates associating a pinch with a gaze event on a chart showing gaze velocity over time. Other patent figures illustrate associating multiple pinches with gaze events on such a chart, and associating a pinch with a gaze-holding event rather than a saccadic event.
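The velocity chart hints at how gaze-holding events might be separated from saccades in the first place. The sketch below applies a simple velocity threshold to a stream of gaze samples; the 100 deg/s threshold is an illustrative assumption, not a figure from the patent.

```swift
import Foundation

// One gaze sample: time in seconds, angular gaze velocity in degrees/second.
struct GazeSample {
    let time: TimeInterval
    let velocity: Double
}

// A very rough classifier: runs of low-velocity samples are treated as
// gaze-holding intervals; high-velocity samples end the current hold.
func gazeHoldingIntervals(samples: [GazeSample],
                          saccadeThreshold: Double = 100) -> [ClosedRange<TimeInterval>] {
    var intervals: [ClosedRange<TimeInterval>] = []
    var holdStart: TimeInterval? = nil

    for sample in samples {
        if sample.velocity < saccadeThreshold {
            // Still (or newly) holding: remember where the hold began.
            if holdStart == nil { holdStart = sample.time }
        } else if let start = holdStart {
            // Velocity spiked: the hold ended as this saccade began.
            intervals.append(start...sample.time)
            holdStart = nil
        }
    }
    if let start = holdStart, let last = samples.last {
        intervals.append(start...last.time)
    }
    return intervals
}
```

In this framing, only a gesture whose timestamp falls within (or near) one of these gaze-holding intervals would be routed to the gazed-at UI element.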
Numerous types of electronic systems may allow a user to sense or interact with an XR environment. A non-exhaustive list of examples includes lenses having integrated display capability to be placed on a user's eyes (e.g., contact lenses), heads-up displays (HUDs), projection-based systems, head mountable systems, windows or windshields having integrated display technology, headphones/earphones, input systems with or without haptic feedback (e.g., handheld or wearable controllers), smartphones, tablets, desktop/laptop computers, and speaker arrays. Head mountable systems may include an opaque display and one or more speakers. Other head mountable systems may be configured to receive an opaque external display, such as that of a smartphone.
This is a patent that has been partially fulfilled. To review the full details of this invention, check out granted patent 12,099,653.