Apple invents an eye and extremity tracking system for HMDs that will allow users to accurately touch buttons or icons in VR Worlds+
Today the US Patent & Trademark Office published a patent application from Apple that primarily relates to a future mixed reality headset with newly integrated extremity- and eye-tracking capabilities, so that when a user in a virtual world or in augmented reality mode touches a button in a game or chooses an icon on an iPad, the accuracy of the touch is guaranteed. Hand tracking alone has proven to be far less accurate when trying to pinpoint an exact position. Apple's patent walks us through their new solution. Apple also introduces a unique privacy feature built right into their eye-tracking system.
Apple notes in their summary that some systems utilize hand tracking in order to determine a spatial selection point within an augmented reality (AR) or virtual reality (VR) environment. For example, the hand tracking determines a hand tracking region associated with the user's hand, and the system may select the center point of that region as the spatial selection point.
However, because of inaccuracies of the hand tracking, the spatial selection point often does not correspond to the actual location of the hand of the user. In some circumstances, the offset between the spatial selection point and the actual location results in the system erring in determining which, of a plurality of displayed virtual elements, the user is attempting to select.
By contrast, various implementations disclosed include positioning a user-controlled spatial selector within a computer-generated reality (CGR) environment based on extremity tracking information and eye tracking information.
By using both the extremity tracking information and the eye tracking information, an electronic device makes a more accurate assessment as to whether the user is selecting a virtual object than previous systems that do not utilize eye tracking.
In some implementations, the electronic device determines respective confidence levels associated with extremity tracking and eye tracking, and uses the respective confidence levels to position the user-controlled spatial selector.
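Apple doesn't publish the underlying math, but the described behavior maps naturally onto a confidence-weighted blend of the two subsystems' candidate points. The following is a minimal sketch, assuming each subsystem reports a 3D candidate position and a 0-to-1 confidence (the names here are hypothetical, not Apple's):

```swift
import simd

/// A candidate selection point from one tracking subsystem, paired
/// with that subsystem's confidence in the range 0...1.
struct TrackingCandidate {
    let position: SIMD3<Float>  // position in CGR-environment coordinates
    let confidence: Float
}

/// Blends the extremity-tracking and eye-tracking candidates into a
/// single user-controlled spatial selector, weighting each candidate
/// by its relative confidence.
func fuseSelectionPoint(extremity: TrackingCandidate,
                        gaze: TrackingCandidate) -> SIMD3<Float> {
    let total = extremity.confidence + gaze.confidence
    guard total > 0 else { return extremity.position }  // fall back to hand tracking alone
    return (extremity.confidence / total) * extremity.position +
           (gaze.confidence / total) * gaze.position
}
```

When eye tracking is unreliable (say, the user is only glancing at the icon from their periphery), its weight shrinks and the selector leans on the hand position, and vice versa.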
For example, the eye tracking characteristics include characteristics of the CGR environment (e.g., the brightness level of the CGR environment, the contrast level between the CGR environment and a CGR object, the likelihood that the CGR object is selected), whether the user is looking in their periphery, historical data of the user's eye gaze location, and/or the like.
In some implementations, the electronic device determines extremity tracking characteristics (e.g., the user is holding a pencil, the user's extremity is shaky) and uses those characteristics to determine a confidence level associated with the extremity tracking data.
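Again, the patent doesn't disclose how these characteristics are scored. Purely as an illustration of the idea, a per-subsystem confidence could be derived from the listed characteristics like this (the structures, weights, and penalties below are invented for the sketch):

```swift
/// Gaze-side characteristics of the kind the patent lists.
struct EyeTrackingCharacteristics {
    var environmentBrightness: Float   // 0 (dark) ... 1 (bright)
    var objectContrast: Float          // contrast between the CGR object and the scene
    var isLookingInPeriphery: Bool     // peripheral gaze is assumed less reliable
}

/// Extremity-side characteristics of the kind the patent lists.
struct ExtremityTrackingCharacteristics {
    var isHoldingPencil: Bool          // a held tool may occlude hand landmarks (assumption)
    var tremorLevel: Float             // 0 (steady) ... 1 (very shaky)
}

func eyeTrackingConfidence(_ c: EyeTrackingCharacteristics) -> Float {
    var score = 0.5 * c.environmentBrightness + 0.5 * c.objectContrast
    if c.isLookingInPeriphery { score *= 0.6 }     // penalize peripheral gaze
    return min(max(score, 0), 1)
}

func extremityTrackingConfidence(_ c: ExtremityTrackingCharacteristics) -> Float {
    var score = 1 - c.tremorLevel                  // the shakier the hand, the lower the trust
    if c.isHoldingPencil { score *= 0.8 }          // occlusion penalty (assumption)
    return min(max(score, 0), 1)
}
```

These scores are exactly what the fusion sketch above consumes as `confidence` values.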
Apple's patent FIG. 1 below is a block diagram of an example of a portable multifunction device such as an iPad that includes new eye-tracking sensors and an eye-tracking controller along with a new extremity-tracking sensor and its associated controller; FIG. 3B is an example of positioning a user-controlled spatial selector within a CGR environment based on extremity tracking and eye tracking in accordance with some implementations.
In Apple's patent FIG. 3D above, we see an exploded view of the icons on the iPad shown in FIG. 3B. The iPad (or HMD electronic device #320) implements an eye tracking function with respect to the finger (#331) of the right hand (#314) in order to determine a second candidate virtual spatial location (#342) within the CGR environment. The system is meant to accurately assess which icon (or other affordance, like a button in a game) is being chosen by the user while wearing an HMD.
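Mapping the fused selection point to a specific icon is then a hit test. A simple nearest-affordance test, purely illustrative (the `Affordance` type and spherical hit volumes are assumptions, not from the patent):

```swift
import simd

struct Affordance {
    let name: String
    let center: SIMD3<Float>
    let radius: Float   // simple spherical hit volume around the icon
}

/// Returns the affordance whose hit volume contains the selector,
/// preferring the one whose center is nearest, or nil if none is hit.
func hitTest(selector: SIMD3<Float>, affordances: [Affordance]) -> Affordance? {
    affordances
        .filter { simd_distance(selector, $0.center) <= $0.radius }
        .min { simd_distance(selector, $0.center) < simd_distance(selector, $1.center) }
}
```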
Apple notes that the eye tracking sensor(s) detect the eye gaze of a user of the iPad (electronic device) and generate eye tracking data indicative of that gaze. In various implementations, the eye tracking data includes data indicative of a fixation point (e.g., point of regard) of the user on a display panel, such as a display panel within a head-mountable device (HMD), a head-mountable enclosure, or within a heads-up display.
The extremity tracking sensor obtains extremity tracking data indicative of a position of an extremity of a user. For example, in some implementations, the extremity tracking sensor corresponds to a hand tracking sensor that obtains hand tracking data indicative of a position of a hand or a finger of a user within a CGR environment.
In some implementations, the extremity tracking sensor utilizes computer vision techniques to estimate the pose of the extremity based on camera images.
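The patent doesn't name a specific computer vision technique, but on today's Apple platforms the Vision framework's hand-pose request is one plausible way to recover a fingertip position from camera images. A sketch of that approach (Vision is our choice here, not something the patent specifies):

```swift
import Vision
import CoreGraphics

/// Estimates the index fingertip's normalized 2D position and confidence
/// in a camera frame using Vision's built-in hand-pose detector.
func indexFingertip(in image: CGImage) throws -> (location: CGPoint, confidence: Float)? {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])

    guard let hand = request.results?.first else { return nil }
    let tip = try hand.recognizedPoint(.indexTip)
    guard tip.confidence > 0.3 else { return nil }   // arbitrary cutoff for this sketch
    // Vision reports normalized coordinates with a lower-left origin.
    return (tip.location, tip.confidence)
}
```

A production pose estimator would also have to lift this 2D point into the 3D CGR coordinate space (e.g., via depth data), which is beyond this sketch.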
Apple later notes that in some implementations, the electronic device #320 shown in FIG. 3B corresponds to a head-mountable device (HMD) that includes an integrated display (e.g., a built-in display) that presents a CGR environment, such as an AR environment or a VR environment. In some implementations, the electronic device #320 includes a head-mountable enclosure.
In various implementations, the head-mountable enclosure includes an attachment region to which another device with a display can be attached.
In various implementations, the head-mountable enclosure is shaped to form a receptacle for receiving another device that includes a display (e.g., the electronic device 220 illustrated in FIG. 2). For example, in some implementations, the electronic device slides/snaps into or otherwise attaches to the head-mountable enclosure.
While Apple will eventually have both high-end and low-end HMD solutions, the latter, which Apple is referring to above, relates to the patent figure below, which comes from previous patents like this one.
Apple's patent FIG. 4 below is an example of a block diagram of a system #410 for positioning a user-controlled spatial selector within a CGR environment based on extremity tracking and eye tracking.
Apple's patent FIG. 5 above is a flow diagram of a method #500 of positioning a user-controlled spatial selector within a CGR environment based on extremity tracking and eye tracking. In various implementations, the method or portions thereof are performed by an HMD or other device (iPad, iPhone).
Privacy Baked Right into Future Eye-Tracking Systems
It was both surprising and interesting to learn that Apple's position on privacy will extend to their eye tracking system. Apple notes the following:
"In various implementations, the electronic device includes a privacy subsystem [#170 of FIG. 1 above] that includes one or more privacy setting filters associated with user information, such as user information included in the eye gaze data and/or body position data associated with a user.
In some implementations, the privacy subsystem selectively prevents and/or limits the electronic device or portions thereof from obtaining and/or transmitting the user information.
To this end, the privacy subsystem receives user preferences and/or selections from the user in response to prompting the user for the same. In some implementations, the privacy subsystem prevents the electronic device from obtaining and/or transmitting the user information unless and until the privacy subsystem obtains informed consent from the user.
In some implementations, the privacy subsystem anonymizes (e.g., scrambles or obscures) certain types of user information. For example, the privacy subsystem receives user inputs designating which types of user information the privacy subsystem anonymizes.
As another example, the privacy subsystem anonymizes certain types of user information likely to include sensitive and/or identifying information, independent of user designation (e.g., automatically)."
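Apple gives no implementation details for the privacy subsystem, but the quoted behavior, a consent gate plus per-category anonymization, is easy to picture. A toy sketch with invented types:

```swift
import CoreGraphics

/// Categories of user information the patent's privacy subsystem filters.
enum UserInfoCategory: Hashable {
    case eyeGaze
    case bodyPosition
}

struct TrackingSample {
    var category: UserInfoCategory
    var point: CGPoint      // e.g., a fixation point on the display
}

/// Toy privacy subsystem: withholds data until informed consent is given
/// and coarsens (anonymizes) categories flagged as sensitive.
struct PrivacySubsystem {
    var hasInformedConsent = false
    var anonymizedCategories: Set<UserInfoCategory> = [.eyeGaze]

    func filter(_ sample: TrackingSample) -> TrackingSample? {
        guard hasInformedConsent else { return nil }   // no consent, no data leaves the device
        guard anonymizedCategories.contains(sample.category) else { return sample }
        var scrubbed = sample
        // Quantize the point to a coarse grid so it no longer reveals precise gaze.
        scrubbed.point = CGPoint(x: (sample.point.x / 50).rounded() * 50,
                                 y: (sample.point.y / 50).rounded() * 50)
        return scrubbed
    }
}
```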
Apple's patent application number 20210216146 was published today by the U.S. Patent Office. The patent was originally filed in January 2021.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.