Apple has Won a Patent relating to Hand Tracking and Finger Gesturing, a Major Part of Apple Vision Pro
Back in 2017, Apple acquired a patent from Montreal-based Vrvana relating to hand tracking technologies. Today, the U.S. Patent and Trademark Office officially granted Apple a patent covering a major feature of the Apple Vision Pro: Hand Tracking and Finger Gesturing.
Apple's granted patent notes that the quality of immersion depends on several important factors, for instance display characteristics such as image quality, frame rate, pixel resolution, high dynamic range (HDR), persistence and the screen-door effect (i.e., the visible lines between pixels on the screen).
The quality of the immersive experience decreases when the displayed field of view is too narrow or when the various tracking functions are slow and/or inaccurate, leading to disorientation and nausea, otherwise known as simulation sickness.
Immersion is also impacted by the camera system performance such as the image quality (noise, dynamic range, resolution, absence of artifacts) and the coherence between the virtual graphics (3D modeling, textures and lighting) and the pass-through images. In mixed reality (MR), virtual elements are composited in real-time into the real-world environment seen by the user. Physical interaction between the virtual elements and real-world surfaces and objects can be simulated and displayed in real-time.
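To make that compositing step concrete, here is a minimal sketch, assuming simple per-pixel depth testing and alpha blending, of how a rendered virtual layer could be merged into a pass-through camera frame. It illustrates the general idea only, not code from Apple's patent, and the function and array names are made up for the example.

```python
import numpy as np

def composite_mixed_reality(passthrough_rgb, passthrough_depth,
                            virtual_rgb, virtual_depth, virtual_alpha):
    """Blend a rendered virtual layer into a pass-through camera frame.

    A virtual pixel is kept only where the virtual surface is closer to the
    camera than the real-world surface, which simulates basic occlusion.
    Inputs are H x W (x 3 for RGB) NumPy arrays; depths are in meters.
    """
    # Virtual content wins only where it sits in front of the real surface.
    occluding = (virtual_depth < passthrough_depth) & (virtual_alpha > 0)

    # Alpha-blend in the winning region, keep the camera image elsewhere.
    alpha = np.where(occluding, virtual_alpha, 0.0)[..., None]
    blended = alpha * virtual_rgb + (1.0 - alpha) * passthrough_rgb
    return blended.astype(passthrough_rgb.dtype)
```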
Tracking of various elements is generally recognized as an essential prerequisite for achieving a high-end VR and MR application experience. Among these elements, positional head tracking, user body tracking and environment tracking play a key role in achieving great immersion.
Positional head tracking (referred to as positional tracking from here on), which aims to estimate the position and orientation of the HMD in an environment, has to be both low latency and accurate. This is because the rendered graphics must closely match the user's head motion to produce strong immersion in VR, and because virtual content must be correctly aligned with the real world in MR.
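One common way to meet the low-latency requirement, offered here only as an illustrative aside rather than anything claimed in the patent, is to extrapolate the head pose forward by the expected render-to-display delay using the latest motion estimates. The sketch below assumes a quaternion orientation, body-frame angular velocity, and made-up variable names.

```python
import numpy as np

def predict_head_pose(position, velocity, orientation_quat, angular_velocity, latency_s):
    """Extrapolate head position and orientation a few milliseconds ahead so
    the displayed frame better matches where the head will be at display time.

    orientation_quat is [w, x, y, z]; angular_velocity is rad/s in the body frame.
    """
    predicted_position = position + velocity * latency_s

    # One small-angle integration step of the quaternion kinematics q' = 0.5 * Omega(w) * q.
    wx, wy, wz = angular_velocity
    omega = np.array([[0.0, -wx, -wy, -wz],
                      [wx,  0.0,  wz, -wy],
                      [wy, -wz,  0.0,  wx],
                      [wz,  wy, -wx,  0.0]])
    predicted_quat = orientation_quat + 0.5 * latency_s * (omega @ orientation_quat)
    return predicted_position, predicted_quat / np.linalg.norm(predicted_quat)
```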
User body tracking estimates the position and orientation of the user's body (in particular, but not limited to, hands and fingers) relative to the HMD. It can provide, in both VR and MR, a means of user input (e.g. hand gestures) enabling interaction with virtual elements. While some positional tracking methods can be used for hand tracking as well (e.g. an IR camera with an array of LEDs on hand-held controllers), other methods take advantage of a smaller analysis space, typically within one meter from the HMD, to increase the robustness of the hand and finger tracking algorithms.
For instance, close-range Time-of-Flight (ToF) cameras can be integrated with or in the HMD. These cameras can yield a depth map of the hands from which a skeletal model of the hands can be constructed. Another approach uses an IR LED flood light together with cameras to segment out and estimate 3D points on the hands and fingers.
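As a simplified illustration of that close-range idea, the sketch below keeps only the ToF depth samples within roughly one meter of the headset and treats the nearest surviving point as a crude fingertip candidate. This is a hypothetical example, not the patent's algorithm; the one-meter band and helper names are assumptions.

```python
import numpy as np

def segment_hand_points(depth_map, min_range_m=0.1, max_range_m=1.0):
    """Keep only close-range samples of a ToF depth map, where hands are
    expected to appear (roughly within one meter of the HMD).

    depth_map: H x W array of depths in meters (NaN or 0 where invalid).
    Returns an N x 3 array of (x, y, depth) points.
    """
    valid = np.isfinite(depth_map) & (depth_map > min_range_m) & (depth_map < max_range_m)
    ys, xs = np.nonzero(valid)
    return np.column_stack([xs, ys, depth_map[ys, xs]])

def nearest_point(points):
    """Return the closest segmented point, a crude stand-in for a fingertip
    before a full skeletal hand model is fitted to the depth data."""
    return points[np.argmin(points[:, 2])] if len(points) else None
```

A real pipeline would then fit a skeletal hand model to the segmented points, as the patent describes, rather than stopping at a single nearest point.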
Apple's patent FIG. 1 below is a schematic representation of a user wearing a head-mounted display (HMD) provided with several cameras and infrared (IR) emitters; FIG. 2A is a schematic top view of an exemplary embodiment of the optics, display and cameras used to achieve both virtual and mixed reality.
Apple's patent FIG. 4A below shows the front view of a first exemplary embodiment of the HMD device, with two RGB cameras optimized for pass-through purposes (MR) and two IR cameras that provide visual data for tracking; FIG. 5 is a flow diagram of the processing steps to achieve VR with positional and user body tracking; FIG. 8 is a flow diagram of an exemplary process to achieve user body tracking, including gesture recognition.
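Since the FIG. 8 flow ends in gesture recognition, here is a minimal sketch of the kind of test such a stage might run once a skeletal hand model exists: flagging a pinch when the thumb and index fingertips nearly touch. The joint names and the 2 cm threshold are assumptions for illustration, not details taken from the patent.

```python
import numpy as np

PINCH_THRESHOLD_M = 0.02  # assumed 2 cm fingertip separation for a "pinch"

def is_pinching(skeleton):
    """Return True when the thumb and index fingertips are nearly touching.

    skeleton: dict mapping joint names to 3D positions (meters) in the HMD
    frame, as produced by a hand-tracking stage; joint names are illustrative.
    """
    thumb_tip = np.asarray(skeleton["thumb_tip"])
    index_tip = np.asarray(skeleton["index_tip"])
    return np.linalg.norm(thumb_tip - index_tip) < PINCH_THRESHOLD_M
```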
For more details, review Apple's granted patent 11693242.