A New Apple Glass Patent Describes Some of the Benefits of AR Glasses and the Use of Hand Gesturing to Control a UI and More
Last week the US Patent & Trademark Office published a patent application from Apple that relates to their future augmented reality glasses, which will provide users with information visible only while wearing them. For example, a user visiting a museum could be presented with information about a painting or sculpture that is visible only through Apple's AR Glasses. In another example, a poster in a theater lobby may promote coming attractions. The poster may contain information that the AR Glasses could use to present the user with a trailer of the upcoming movie. That's one side of the patent. The other side relates to how Apple's AR Glasses will work with hand gesturing and more.
Apple clarifies that optical see-through HMDs are provided as wearable glasses with separate transparent lenses for each eye.
Hand Gesturing and More
Apple's patent-pending invention focuses on hand gesturing as one of the ways a user will be able to communicate with Apple Glasses.
In the big picture, Apple's invention covers devices, systems, and methods for providing computer-generated reality (CGR) content that includes virtual content displayed based on the surface of a specifically detected object, such as the user's hand.
The virtual content is seen (e.g., by the user) in combination with the physical environment. The visibility of the virtual content is improved in these and other circumstances by selectively displaying the virtual content in positions relative to surfaces having characteristics that may make the virtual content easier to see or understand. For example, displaying the virtual content over the relatively consistent and flat surface of the user's hand may make a virtual object easier to see than displaying the virtual content in front of a tree of variable color and moving leaves.
Some implementations detect and track the hand's position with respect to the HMD using a sensor located at or in communication with the HMD. In various implementations, detecting and tracking the hand's position involves correlating a hand color or a hand model to a depth map or the like to detect and track the hand shape and pose (e.g., orientation and position) with respect to the HMD.
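The patent doesn't spell out the tracking math, but the general idea of recovering a hand's position and orientation from a depth map can be sketched as follows. This is a minimal, hypothetical illustration (not Apple's implementation): it segments the pixels in a near depth band as the hand candidate, then estimates a 2D pose from the centroid (position) and the principal axis of the pixel cloud (orientation). All function names, depth thresholds, and the synthetic depth map are assumptions for illustration.

```python
import numpy as np

def detect_hand_pose(depth_map, near=0.2, far=0.6):
    """Hypothetical simplification of correlating a hand model to a
    depth map: segment pixels within a near depth band, then estimate
    position (centroid) and orientation (principal-axis angle via PCA)."""
    mask = (depth_map > near) & (depth_map < far)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None  # no hand candidate in the depth band
    pts = np.stack([xs, ys], axis=1).astype(float)
    centroid = pts.mean(axis=0)
    # The principal axis of the segmented point cloud approximates
    # the hand's in-plane orientation.
    cov = np.cov((pts - centroid).T)
    eigvals, eigvecs = np.linalg.eigh(cov)
    major = eigvecs[:, np.argmax(eigvals)]
    angle = np.degrees(np.arctan2(major[1], major[0]))
    return {"position": centroid, "orientation_deg": angle, "pixels": xs.size}

# Synthetic depth map: background at 1.0 m, a horizontal "hand" blob at 0.4 m.
depth = np.full((100, 100), 1.0)
depth[45:55, 20:80] = 0.4
pose = detect_hand_pose(depth)
```

A real system would match a full articulated hand model rather than a flat blob, but the segment-then-fit structure is the same.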
Apple's patent goes into great detail about the use of hand gestures to communicate with the glasses. Apple notes at one point that "In some implementations, hand gestures are used to initiate or terminate the functionality at the HMD of display of virtual content at the user's hand. For example, rotation of a closed fist clockwise or counterclockwise turns on or off display of virtual content at the user's hand at the HMD."
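The fist-rotation example above amounts to a tiny gesture-driven state machine. A minimal sketch, assuming hypothetical gesture labels (these names are illustrative, not an Apple API):

```python
from dataclasses import dataclass

@dataclass
class VirtualContentToggle:
    """Sketch of the patent's described behavior: rotating a closed
    fist clockwise or counterclockwise toggles display of virtual
    content at the user's hand. Gesture names are hypothetical."""
    visible: bool = False

    def on_gesture(self, gesture: str) -> bool:
        if gesture in ("fist_rotate_cw", "fist_rotate_ccw"):
            self.visible = not self.visible
        return self.visible

toggle = VirtualContentToggle()
toggle.on_gesture("fist_rotate_cw")   # content now shown at the hand
toggle.on_gesture("fist_rotate_ccw")  # content hidden again
```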
In some implementations, the user can control (e.g., allow or prevent) displaying of available virtual content at the HMD user's hand based on the physical environment (e.g., display advertising when shopping or museum/historical information when sightseeing).
In some implementations, the user controls selection among available virtual content for display at the user's hand based on the physical environment. For example, only advertising directed to food or restaurants is allowed to be displayed at the user's hand when walking down the street.
In some implementations, individual CGR virtual content visible to the HMD user can be manipulated as desired. For example, selecting (e.g., encircling with the fingers) virtual content visible at the HMD re-displays that selected virtual content at the HMD user's hand. Alternatively, a hand gesture of "grabbing" visible virtual content with the left hand re-displays that virtual content at the user's right hand.
The user may view and otherwise experience a CGR-based user interface that allows the user to select, place, move, and otherwise present virtual content in a CGR environment, for example, based on the virtual content's location, via hand gestures, voice commands, input device inputs, etc.
Some communication can be displayed on the palm of a user's hand. In one instance, as noted in patent FIGS. 4A and 4B below, we see the user being able to communicate using a card held in the user's hand.
As shown in FIG. 4A, an object #414 and the user's arm #410 holding the object are depicted in (and may be detected in) the physical environment #405. In FIG. 4B, virtual content #470 is displayed to the user in front of the object in the CGR environment #430.
In one example, the virtual content may identify or provide information about something in the physical environment. In another example, the virtual content may relate to something unrelated to the physical environment. In one example, upon receipt of virtual content (e.g., a birthday icon), that virtual content is displayed to the user in front of the object in the CGR environment.
Apple's patent FIG. 5 below is a flowchart illustrating an exemplary method of determining a suitable background condition or background location in the current physical environment to overlay virtual content in CGR environments according to some implementations.
Beyond the focus on using a user's hand, Apple adds that "A person may sense and/or interact with a CGR object using any one of their senses, including sight, sound, touch, taste, and smell. For example, a person may sense and/or interact with audio objects that create a 3D or spatial audio environment that provides the perception of point audio sources in 3D space." While fascinating, Apple unfortunately failed to elaborate on what an "audio object" actually translates to.
If you're interested in Apple Glasses, then there are a lot of details worth exploring in Apple's patent application number 20200401804 titled "Virtual Content Positioned based on Detected Object." Apple's patent was published last Thursday by the U.S. Patent Office. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.
Anselm Grundhöfer: Engineering Manager has been with Apple for 2.6 years. Prior to Apple, Grundhöfer spent 7 years with Disney with 2 years spent as the Principal Research Engineer, Head of Projection Technology.
Rahul Nair: Prototyping Engineer