Hand-Tracking Technology for Controlling Future MR Headset User Interfaces is Closer to Market than You Think
On Wednesday May 11, Patently Apple posted a report titled "A new Google patent reveals future AR Glasses will work in sync with accessory devices to capture In-Air Gestures to control UIs+." Our report noted that next-gen Mixed Reality headsets and AR smartglasses being developed by Apple, Google, Facebook and many other players have a common feature in development relating to hand-tracking and interfaces that could be used in sync with in-air hand or finger gestures to control UIs, move virtual objects and more.
To date, Patently Apple has covered 59 Apple patents on in-air gestures. One of Apple's latest patents on this front was covered last month in a report titled "A new Apple Patent deeply describes the use of In-Air Gesturing to control a new kind of Mixed Reality Headset Interface." The technology is closer to reality than most may think.
Last weekend I discovered two Facebook patents showing some of the work their engineers are doing, but more importantly, I stumbled on a developer video showcasing a Facebook SDK for hand tracking. The snippet below will help you visualize, beyond mere patent graphics, how far along hand tracking really is for controlling next-gen user interfaces on future MR headsets and beyond.
Note that in the bottom-right corner of the video you'll see the developer moving his hands, while in the larger display area his hand-tracked virtual hands interact with various parts of a potential headset user interface.
If you want to see more, check out the full video by Dilmer Valecillos here.
While Apple never releases prototype videos of what they're working on, you can be sure that Apple's engineers are far beyond filing mere patents and have working hand-tracking models in development as well. It'll be really exciting to see this unfold sometime in the future.
One of the two new Facebook patents published by the U.S. Patent Office is titled "Hand Presence over Keyboard Inclusiveness," dated April 21, 2022. Here, Facebook engineers show their invention of hand tracking specifically for use with a virtual keyboard, which will require incredible accuracy and dexterity.
Facebook's patent FIGS. 5A and 5B below illustrate examples of a user's hands #210 displayed over a rendered virtual keyboard #420 in a virtual reality environment #500. In patent FIG. 5B the display states: "Hello, I'm typing in VR!"
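To get a feel for what typing on a tracked virtual keyboard involves, here is a minimal sketch of one piece of the problem: mapping a tracked fingertip position on the keyboard plane to a key. The layout, dimensions, and function names are our assumptions for illustration, not anything from Facebook's patent.

```python
from typing import Optional

# Hypothetical three-row layout; (0, 0) is the top-left corner of the 'q' key.
KEY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]
KEY_SIZE = 0.019  # meters per key, roughly a physical keyboard's pitch


def fingertip_to_key(x: float, z: float) -> Optional[str]:
    """Return the key under a fingertip at (x, z) on the virtual keyboard
    plane, or None if the fingertip is outside the keyboard."""
    row = int(z // KEY_SIZE)
    if not 0 <= row < len(KEY_ROWS):
        return None
    # Each lower row is offset half a key to the right, as on real keyboards.
    col = int((x - row * KEY_SIZE / 2) // KEY_SIZE)
    if not 0 <= col < len(KEY_ROWS[row]):
        return None
    return KEY_ROWS[row][col]
```

A real system would add per-finger tracking, a press-detection threshold along the axis toward the keyboard plane, and smoothing of the noisy hand-pose estimate, which is where the "incredible accuracy" comes in.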
See Facebook's patent application 20220121343 for details.
Facebook's second patent application, filed in Europe last week and titled "Artificial Reality System having a sliding menu," covers an AR headset menu UI that could be manipulated with hand-tracked gestures to choose menu items.
Facebook's patent abstract states: "An artificial reality system is described that renders, presents, and controls user interface elements within an artificial reality environment, and performs actions in response to one or more detected gestures of the user. The artificial reality system can include a menu that can be activated and interacted with using one hand. In response to detecting a menu activation gesture performed using one hand, the artificial reality system can cause a menu to be rendered. A menu sliding gesture (e.g., horizontal motion) of the hand can be used to cause a slidably engageable user interface (UI) element to move along a horizontal dimension of the menu while horizontal positioning of the UI menu is held constant. Motion of the hand orthogonal to the menu sliding gesture (e.g., non-horizontal motion) can cause the menu to be repositioned. The implementation of the artificial reality system does not require use of both hands or use of other input devices in order to interact with the artificial reality system."
In Facebook's patent FIG. 2 below we see how a camera (#210) mounted on an HMD is used to track a user's hand gestures, which work in sync with an AR user interface on the inside display of the HMD; FIG. 6 is a flowchart illustrating operations of an example method for positioning and interacting with a UI menu.
In Facebook's patent FIGS. 7G and 7E below we see a couple of examples of a user manipulating a menu, selecting a "Settings" and/or a "Social" media button with finger gestures captured by the HMD's camera system.
You can review the European patent filing EP3980870 here for more details.
One of the key battles in the next generation of head-mounted devices will be in providing users with a sophisticated menu for controlling content and games using eye and hand tracking. Apple, Facebook, Sony and others are deeply engaged in developing such systems, and the video of Facebook's SDK really helps us see how cool this could be in the future.
When you look back at the original iPhone's evolution from 2007 until now, you know Apple's HMD will likewise have a long evolutionary line of developments. A new report posted Friday by the Korea Herald covered Samsung Research chief Sebastian Seung describing their initial work on 6G technology. The 6G era, likely a decade out, will be critical for XR (Extended Reality), holograms and digital replicas.
Seung stated that in order "To enable such services, 6G should support a tremendous amount of real time data processing, hyperfast data rate and extremely low latency." Seung added that now is the right time to start preparing for 6G, echoing Samsung's white papers, which suggest that 6G will be 50 times faster than 5G with one tenth the latency.
In the shorter term, the industry is racing to develop and deliver next-gen eye and hand-tracking technologies that take the HMD experience to the next level. Controlling HMD-specific user interfaces will enhance immersive gaming while allowing users to engage with social media and even perform work tasks like typing on virtual computer devices.