A new Apple patent describes in depth the use of In-Air Gesturing to control a new kind of Mixed Reality Headset Interface
Today the US Patent & Trademark Office published a patent application from Apple that relates to systems, methods, and GUIs designed to provide improved ways for an electronic device to interact with and manipulate objects in a three-dimensional environment, especially in the context of a future Mixed Reality Headset.
Apple notes that in some embodiments, an electronic device, such as a Mixed Reality (MR) headset, facilitates interactions with selectable user interface elements. In some embodiments, the device presents one or more selectable user interface elements in a three-dimensional environment. In response to detecting the gaze of the user directed to a respective selectable user interface element, the electronic device updates the appearance of the selectable user interface element.
In some embodiments, the MR headset selects a user interface element and performs an associated action in response to a user input that includes one or more of detecting gaze of the user and detecting the user performing a predetermined in-air gesture with their hand. Enhancing interactions with selectable user interface elements in this way provides efficient and intuitive ways of making selections and performing actions with an electronic device.
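The gaze-plus-pinch selection flow described above could be sketched roughly as follows. This is purely illustrative; the class and method names (`UIElement`, `GazePinchSelector`, `on_gaze`, `on_pinch`) are assumptions for the sketch and not Apple's API:

```python
# Illustrative sketch of gaze-directed highlighting plus an in-air pinch
# gesture to select. All names here are hypothetical, not from the patent.
from dataclasses import dataclass


@dataclass
class UIElement:
    name: str
    highlighted: bool = False   # appearance updated when gazed at
    activated: bool = False     # set when selected via gesture


class GazePinchSelector:
    """Highlights the element under the user's gaze; a pinch while
    gazing selects that element and performs its associated action."""

    def __init__(self, elements):
        self.elements = {e.name: e for e in elements}
        self.gazed = None

    def on_gaze(self, name):
        # Update appearance: highlight only the currently gazed element.
        for e in self.elements.values():
            e.highlighted = False
        self.gazed = self.elements.get(name) if name else None
        if self.gazed:
            self.gazed.highlighted = True

    def on_pinch(self):
        # A predetermined in-air hand gesture selects the gazed element.
        if self.gazed:
            self.gazed.activated = True
            return self.gazed.name
        return None
```

A pinch with no element under the gaze simply does nothing, which matches the idea that gaze disambiguates which element an in-air gesture targets.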
In some embodiments, an MR headset enhances interactions with a virtual slider user interface element. In some embodiments, the slider user interface element includes an indication of the current input state of the slider user interface. In some embodiments, in response to detecting the gaze of the user on the slider user interface element, the electronic device updates the slider user interface element to include indications of a plurality of available input states of the slider user interface element.
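That slider behavior — showing only the current input state until gaze lands on it, then revealing every available state — might look like this minimal sketch (names and structure are assumptions, not the patent's implementation):

```python
# Hypothetical sketch: a slider that reveals its available input states
# only while the user's gaze is directed at it.
class GazeSlider:
    def __init__(self, states, current=0):
        self.states = list(states)   # all available input states
        self.index = current         # index of the current input state
        self.gazed = False

    def on_gaze(self, gazed):
        self.gazed = gazed

    def visible_indications(self):
        # Without gaze: only the current state is indicated.
        # With gaze: indications for all available states appear.
        return self.states if self.gazed else [self.states[self.index]]

    def select(self, value):
        # Selecting one of the revealed states updates the slider.
        if self.gazed and value in self.states:
            self.index = self.states.index(value)
```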
In some embodiments, an HMD moves virtual objects in a three-dimensional environment and facilitates accessing actions associated with virtual objects. In some embodiments, the device displays a user interface element associated with a virtual object in a virtual environment.
In some embodiments, in response to detecting a first input directed towards the user interface element, the electronic device initiates a process for moving the associated virtual object in the virtual environment.
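In outline, that move interaction is a small state machine: an input on the object's handle element begins the move, subsequent hand motion translates the object, and releasing ends it. A rough sketch under those assumptions (all names hypothetical):

```python
# Illustrative sketch of the move interaction: a first input on a handle
# UI element starts the move; hand motion then translates the object.
class MovableObject:
    def __init__(self, position=(0.0, 0.0, 0.0)):
        self.position = position
        self.moving = False

    def begin_move(self):
        # First input directed at the handle initiates the move process.
        self.moving = True

    def update(self, delta):
        # While moving, apply the hand's displacement to the object.
        if self.moving:
            self.position = tuple(p + d for p, d in zip(self.position, delta))

    def end_move(self):
        self.moving = False
```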
Apple's patent FIG. 5 below is a block diagram illustrating an eye tracking unit of a computer system that is configured to capture gaze inputs of the user; FIG. 7A illustrates the electronic device detecting the user's gaze and, via animation, showing which element the gaze has chosen to activate, move or delete on a user interface.
Embodiments of the gaze tracking system as illustrated in FIG. 5 above may, for example, be used in computer-generated reality, virtual reality, and/or mixed reality applications to provide computer-generated reality, virtual reality, augmented reality, and/or augmented virtuality experiences to the user.
In some embodiments, the eye tracking device is part of a head-mounted device that includes a display (e.g., display #510), two eye lenses (e.g., eye lens(es) #520), eye tracking cameras (e.g., eye tracking camera(s) #540), and light sources (e.g., light sources #530, such as IR or NIR LEDs), all mounted in a wearable housing.
Apple's patent FIG. 6 below illustrates a glint-assisted gaze tracking pipeline.
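The core idea behind glint-assisted gaze tracking is that the offset between the pupil center and a corneal glint (the reflection of an IR/NIR LED in the cornea) can be mapped to a gaze point on the display via per-user calibration. The toy function below illustrates only that mapping step; the gains, offsets, and linear model are assumptions for illustration, and a real pipeline like the one in FIG. 6 involves far more:

```python
# Toy illustration of the pupil-glint mapping at the heart of
# glint-assisted gaze tracking. Gains/offsets are made-up calibration
# values, not anything from Apple's patent.
def gaze_point(pupil, glint, gain=(1200.0, 900.0), offset=(960.0, 540.0)):
    """Map a pupil-glint vector (camera pixels) to display coordinates.

    pupil, glint: (x, y) centers detected in the eye-camera image.
    gain, offset: per-user calibration parameters (assumed linear model).
    """
    dx, dy = pupil[0] - glint[0], pupil[1] - glint[1]
    return (offset[0] + gain[0] * dx, offset[1] + gain[1] * dy)
```

When the pupil and glint coincide, the model returns the calibrated center of the display; any pupil displacement relative to the glint shifts the estimated gaze point proportionally.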
Apple's patent FIGS. 9A-9D illustrate examples of how an HMD enhances interactions with slider user interface elements using in-air gestures combined with gaze/eye tracking.
In Apple's patent FIG. 11A below we see that a user wearing an HMD will be able to grab objects presented on a GUI and move them by using various in-air gestures such as pinch and others.
While Apple's patent illustrates some of its in-air gesturing examples in use with an iPad or iPhone, in-air gesturing is more applicable to a Mixed Reality Headset, where a user doesn't have access to a mouse or Magic Trackpad to manipulate objects on screen.
Considering that in-air gesturing will use next-generation gaze and/or eye tracking technology to help users navigate a new kind of user interface, the details are described in depth. To review Apple's patent application 20220121344, click here.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.