Apple has won another Key Hand-and-Eye Tracking System Patent relating to Interacting with 3D Environments
Yesterday Patently Apple posted a report titled "Apple's Mixed Reality Headset will work with a 3D-Like version of iOS, use Patented Eye and Hand-Tracking technologies and more." The report cited Bloomberg's Mark Gurman, who noted that "Apple’s goal is to bring something new to the table. Eye and hand-tracking capabilities (covered in patents 01 and 02) will be a major selling point for the device, according to people familiar with the product." Today, Apple was granted yet another crucial patent relating to hand- and eye-tracking technology.
GUIs for Interacting with 3D Environments
Apple's patent generally relates to computer systems with a display generation component and one or more input devices that provide computer-generated reality (CGR) experiences, including but not limited to electronic devices that provide virtual reality and mixed reality experiences via a display, such as a Mixed Reality Headset. A key aspect of the patent focuses on hand- and eye-tracking systems.
Apple's patent FIG. 1 below is a block diagram illustrating an operating environment of a computer system for providing CGR experiences; FIG. 4 is a block diagram illustrating a hand tracking unit of a computer system that is configured to capture gesture inputs of the user. Although hand-tracking began with a focus on desktops and televisions, with PrimeSense supplying the depth-sensing technology behind Microsoft's Kinect for Xbox, Apple acquired PrimeSense, and the technology will be used in its future HMD and handheld devices.
Apple's patent FIG. 5 above is a block diagram illustrating an eye tracking unit of a computer system (HMD) that is configured to capture gaze inputs of the user.
Apple's patent FIG. 6 below is a flowchart illustrating a glint-assisted gaze tracking pipeline.
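For readers curious about what a glint-assisted pipeline like the one in FIG. 6 generally entails, the sketch below walks through the typical stages: infrared LEDs produce bright reflections (glints) on the cornea, the system locates the pupil center and those glints in each eye image, and the pupil-to-glint offset combined with a per-user calibration yields a gaze direction. The type names, stages, and constants here are illustrative assumptions for this article, not Apple's implementation or any shipping API.

```swift
import simd

// Minimal sketch of a glint-assisted gaze tracking pipeline.
// All type names, stages, and constants are illustrative assumptions,
// not taken from Apple's patent or any shipping API.

struct EyeImage { /* raw IR camera frame of one eye (placeholder) */ }

struct EyeFeatures {
    var pupilCenter: SIMD2<Float>   // pupil center in image coordinates
    var glints: [SIMD2<Float>]      // corneal reflections of the IR LEDs
}

struct GazeEstimate {
    var direction: SIMD3<Float>     // unit gaze vector in head coordinates
    var confidence: Float
}

enum TrackingState { case tracking, reacquiring }

final class GlintAssistedGazeTracker {
    private(set) var state: TrackingState = .reacquiring

    // One pass of the pipeline: detect features, then estimate gaze.
    func process(_ frame: EyeImage) -> GazeEstimate? {
        guard let features = detectFeatures(in: frame) else {
            // Lost the pupil or the glints: fall back to full re-detection
            // on the next frame instead of tracking from the last position.
            state = .reacquiring
            return nil
        }
        state = .tracking
        return estimateGaze(from: features)
    }

    private func detectFeatures(in frame: EyeImage) -> EyeFeatures? {
        // Placeholder: a real system segments the pupil and locates the
        // IR LED glints on the cornea in each eye image.
        return nil
    }

    private func estimateGaze(from features: EyeFeatures) -> GazeEstimate {
        // Placeholder geometry: the pupil-to-glint offset, mapped through a
        // per-user calibration, approximates the gaze direction.
        let offset = features.pupilCenter - (features.glints.first ?? features.pupilCenter)
        let direction = simd_normalize(SIMD3<Float>(offset.x, offset.y, 1))
        return GazeEstimate(direction: direction, confidence: features.glints.isEmpty ? 0.2 : 0.8)
    }
}
```

Splitting the pipeline into a detection stage and an estimation stage, with an explicit tracking/reacquiring state, mirrors the flowchart structure of FIG. 6, where the system tracks features frame to frame while it can and drops back to full detection when it loses them.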
Beyond its hand- and eye-tracking foundation, the patent takes a deep dive into interacting with 3D environments. Apple notes that computer systems with display generation components are provided with improved methods and interfaces for interacting with a three-dimensional environment and for facilitating the user's use of the computer systems during that interaction, thereby increasing the effectiveness, efficiency, and user safety and satisfaction of such computer systems.
Such methods and interfaces may complement or replace conventional methods for interacting with a three-dimensional environment and facilitating the user's use of the computer systems while doing so.
In some embodiments, the computer system changes the level of immersion with which a computer-generated experience (e.g., visual experience, audio-visual experience, virtual reality experience, augmented reality experience, etc.) is presented to a user in accordance with biometric data corresponding to the user.
For example, after the computer-generated experience has started, and as the user adjusts their physical and emotional states, whether proactively or under the influence of the computer-generated content, the computer system may detect changes in the biometric data (e.g., heart rate, blood pressure, breathing rate, etc.) corresponding to the user. Depending on how those changes compare against the respective sets of preset criteria associated with different levels of immersion, the computer system increases or decreases the level of immersion with which the computer-generated experience is provided. It may do so, for example, by changing the visual prominence (e.g., spatial extent, visual depth, color saturation, visual contrast, etc.) of the virtual content relative to that of the representation of the physical environment, such as by enhancing the complexity, spatial extent, and/or visual characteristics of the virtual content, and/or by reducing the visual clarity, blur radius, opacity, color saturation, etc. of the representation of the physical environment.
Adjusting the level of immersion with which a computer-generated experience is provided to a user based on changes in the biometric data corresponding to the user helps the computer system to provide a smoother transition between a less immersive experience and a more immersive experience that better corresponds to the perceptive state of the user for the computer-generated experience, thereby reducing user confusion and improving the efficacy of the computer-generated experience.
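To make that mechanism concrete, here is a minimal sketch of how such a biometric-driven immersion controller could be structured. The thresholds, type names, and update rule below are assumptions made for illustration; the patent does not disclose concrete values or code.

```swift
// Illustrative sketch of immersion-level adjustment driven by biometric data.
// The thresholds, property names, and update rule are assumptions made for
// this example; the patent does not specify concrete values or APIs.

struct BiometricSample {
    var heartRate: Double        // beats per minute
    var breathingRate: Double    // breaths per minute
}

enum ImmersionLevel: Int {
    case passthrough = 0, mixed = 1, fullVirtual = 2
}

final class ImmersionController {
    private(set) var level: ImmersionLevel = .passthrough

    // Hypothetical preset criteria: a calmer reading allows a deeper level,
    // an elevated reading steps the experience back toward passthrough.
    func update(with sample: BiometricSample) {
        let isCalm = sample.heartRate < 75 && sample.breathingRate < 14
        let isElevated = sample.heartRate > 100 || sample.breathingRate > 20

        if isCalm, let next = ImmersionLevel(rawValue: level.rawValue + 1) {
            level = next          // increase visual prominence of virtual content
        } else if isElevated, let previous = ImmersionLevel(rawValue: level.rawValue - 1) {
            level = previous      // restore prominence of the physical environment
        }
        // Otherwise the current level is kept, stepping one level at a time
        // rather than jumping, so transitions stay smooth for the user.
    }
}

// Example: a calm reading nudges the experience one level deeper.
let controller = ImmersionController()
controller.update(with: BiometricSample(heartRate: 68, breathingRate: 12))
print(controller.level)   // mixed
```

Stepping the level one increment per reading, rather than jumping straight to full immersion, is one simple way to get the gradual transition the patent describes between a less immersive and a more immersive experience.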
For more details, review Apple's granted patent US 11562528 B2.