Apple reveals more about Mixed Reality Headset GUIs and how to control menus with Eye Tracking & in-air Micro Gestures
Today the US Patent & Trademark Office published a patent application from Apple that relates to computer systems with a display generation component and one or more input devices that provide computer-generated experiences, including but not limited to electronic devices that provide virtual reality and mixed reality experiences via a display. In large part, Apple's invention dives into a future mixed reality headset that will work with micro hand gestures and eye tracking, which could be useful for playing video games, navigating menus, and controlling media playback.
While displaying a three-dimensional environment, a computer system detects a hand at a first position that corresponds to a portion of the three-dimensional environment. In response to detecting the hand at the first position: in accordance with a determination that the hand is being held in a first predefined configuration, the computer system displays a visual indication of a first operation context for gesture input using hand gestures in the three-dimensional environment; and in accordance with a determination that the hand is not being held in the first predefined configuration, the computer system forgoes display of the visual indication.
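To make the claim language concrete, here is a minimal Swift sketch of that gating logic. The type and member names are purely illustrative assumptions, not Apple's code or API.

```swift
// A minimal sketch (not Apple's code) of the ready-state gating described above.
enum HandConfiguration {
    case readyState   // e.g., thumb resting against the side of the index finger
    case other
}

struct GestureContextIndicator {
    private(set) var isVisible = false

    // Called when hand tracking detects the hand at a position that
    // corresponds to a portion of the three-dimensional environment.
    mutating func handDetected(in configuration: HandConfiguration) {
        switch configuration {
        case .readyState:
            // First predefined configuration: display the visual indication
            // of the operation context for gesture input.
            isVisible = true
        case .other:
            // Otherwise: forgo displaying the visual indication.
            isVisible = false
        }
    }
}
```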
In some embodiments, a computer system allows a user to use micro-gestures performed with small movements of fingers relative to other fingers or parts of the same hand to interact with a three-dimensional environment (e.g., a virtual or mixed reality environment).
The micro-gestures are detected using cameras (e.g., cameras integrated with a head-mounted device or installed away from the user (e.g., in a CGR room)), e.g., as opposed to touch-sensitive surfaces or other physical controllers.
Different movements and locations of the micro-gestures and various movement parameters are used to determine the operations that are performed in the three-dimensional environment. Using cameras to capture the micro-gestures that drive interaction with the three-dimensional environment allows the user to move freely about the physical environment without being encumbered by physical input equipment, which lets the user explore the three-dimensional environment more naturally and efficiently.
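As a rough illustration of how movement parameters might select among operations, consider this hedged Swift sketch; the specific gestures and mappings are invented for illustration and are not taken from the patent.

```swift
// Hypothetical mapping from micro-gesture movements and their parameters
// to operations in the three-dimensional environment.
enum MicroGesture {
    case thumbTap(indexSegment: Int)   // which index-finger segment was tapped
    case thumbSwipe(distance: Float)   // how far the thumb traveled, in meters
}

func operation(for gesture: MicroGesture) -> String {
    switch gesture {
    case .thumbTap(let segment):
        // Tap location selects among targets, e.g., menu items.
        return "activate menu item \(segment)"
    case .thumbSwipe(let distance):
        // Swipe distance scales the effect, e.g., a media playback scrub.
        return "scrub media playback by \(Int(distance * 1_000)) ms"
    }
}
```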
In addition, micro-gestures are discreet and unobtrusive, making them suitable for interactions that may occur in public and/or require decorum.
The ready state configuration of the hand is used by a computer system as an indication that the user intends to interact with it in a predefined operation context that is different from the currently displayed one. For example, the predefined operation context may be one or more interactions with the device that are outside of the currently displayed application (e.g., a game, communication session, media playback session, navigation, etc.).
Apple's patent FIG. 4 below illustrates hand tracking unit #243. In some embodiments, an eye tracking unit is configured to track the position and movement of the user's gaze with respect to the user's hand. Patent FIG. 4 further includes a schematic representation of a depth map #410 captured by the image sensors #404.
Patent FIG. 4 also schematically illustrates a hand skeleton #414 that a controller ultimately extracts from the depth map of the hand. In FIG. 4, the skeleton is superimposed on a hand background #416 that has been segmented from the original depth map.
In some embodiments, key feature points of the hand (e.g., points corresponding to knuckles, finger tips, center of the palm, end of the hand connecting to wrist, etc.) and optionally on the wrist or arm connected to the hand are identified and located on the hand skeleton #414.
In some embodiments, the locations and movements of these key feature points over multiple image frames are used by controller #110 to determine the hand gestures performed by the hand or the current state of the hand.
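A hedged Swift sketch of how such key feature points might be consumed frame over frame follows; the joint set, the pinch threshold, and the frame count are assumptions for illustration only, not values from the patent.

```swift
import simd

// Hypothetical per-frame hand skeleton reduced to a few key feature points.
struct HandSkeletonFrame {
    let thumbTip: SIMD3<Float>
    let indexTip: SIMD3<Float>
    let palmCenter: SIMD3<Float>
    let wrist: SIMD3<Float>
}

// Report a pinch when the thumb and index tips remain within a small
// distance of each other for several consecutive frames.
func detectsPinch(in frames: [HandSkeletonFrame],
                  threshold: Float = 0.015,  // ~1.5 cm, assumed
                  minFrames: Int = 3) -> Bool {
    guard frames.count >= minFrames else { return false }
    return frames.suffix(minFrames).allSatisfy {
        simd_distance($0.thumbTip, $0.indexTip) < threshold
    }
}
```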
Apple's patent FIG. 7A above presents block diagrams illustrating user interactions with a three-dimensional environment.
Apple's patent FIG. 7B below illustrates an example user interface context showing menu #7170, which includes user interface objects #7172-7194. The menu is displayed in a mixed reality environment (e.g., floating in the air or overlaying a physical object in a three-dimensional environment, and corresponding to operations associated with the mixed reality environment or with the physical object). In FIG. 7C, menu #7170 is displayed by the display of a device (e.g., device #7100 or an HMD) with (e.g., overlaying) at least a portion of a view of a physical environment captured by one or more rear-facing cameras of device #7100. In some embodiments, the menu is displayed on a transparent or semi-transparent display of a device (e.g., a heads-up display or an HMD) through which the physical environment is visible.
Apple's patent FIG. 7G above illustrates example gestures performed with a hand in a ready state and example responses of a displayed three-dimensional environment that are dependent on a user's gaze.
In some embodiments, when the user's gaze is directed to a virtual object in a three-dimensional environment that is responsive to gesture inputs, a visual indication of one or more interaction options available for that virtual object is displayed only if the user's hand is also found to be in a predefined ready state for providing gesture inputs.
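In code, that gaze-plus-ready-state gating might look like the following sketch; again, the names are hypothetical and not Apple's API.

```swift
// Interaction options appear only when both conditions hold: the gaze
// target responds to gestures AND the hand is in the ready state.
struct VirtualObject {
    let name: String
    let respondsToGestures: Bool
}

func shouldShowInteractionOptions(gazeTarget: VirtualObject?,
                                  handInReadyState: Bool) -> Bool {
    guard let target = gazeTarget, target.respondsToGestures else {
        return false
    }
    return handInReadyState
}
```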
Apple's patent FIG. 5 below is a block diagram illustrating an eye tracking unit of a computer system that is configured to capture gaze inputs of the user.
Apple's patent application 20210096726, titled "Devices, Methods, and Graphical User Interfaces for Interacting with Three-Dimensional Environments," is a massive filing. Those involved in the fields of AR, VR and MR, or simply fans of Apple's coming headset, can dive in and review the full patent here.