Apple Invents a Version of Apple Watch that is able to Understand Hand Gestures in VR Games & more
Yesterday the US Patent & Trademark Office published a patent application from Apple that describes an Apple Watch configured to capture one or more images of a user's veins, automatically determine a gesture and/or finger positions from those images, and accept the gesture and/or finger positions as input without requiring touch or voice. The hand gestures could be used for various applications, including VR games.
According to Apple's patent application, Apple Watch could one day include one or more sensors (e.g., a camera) that capture images of a user's hand, convert the image(s) into digital representations, and correlate the digital representations of the veins with one or more hand poses.
From the pose(s), Apple Watch could determine the user's hand movements (e.g., finger movements) and, from those movements, derive one or more gestures and/or finger positions.
Apple Watch would then interpret the gestures and/or finger positions as one or more input commands and perform an operation based on the input command(s).
By detecting movements of the user's veins and associating those movements with input commands, Apple Watch would gain another means of receiving user input in addition to, or instead of, voice and touch input, for example.
Examples of the disclosure include using the user input commands in virtual reality (VR) (including augmented reality (AR) and mixed reality (MR)) applications.
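To make that processing chain more concrete, here is a minimal Swift sketch of the image-to-pose-to-gesture-to-command pipeline the filing describes. All of the type and function names (VeinImage, HandPose, VeinGesturePipeline, etc.) and the placeholder heuristics are hypothetical illustrations, not APIs or algorithms taken from the patent or from watchOS.

```swift
import Foundation

// Hypothetical digital representation of one vein image captured by a watch sensor.
struct VeinImage {
    let timestamp: TimeInterval
    let pixels: [UInt8]          // flattened grayscale IR frame
}

// A hand pose inferred from the vein pattern, simplified here to per-finger flexion.
struct HandPose {
    let fingerFlexion: [Double]  // 0.0 = fully extended, 1.0 = fully curled
}

enum Gesture {
    case fist, openHand, unknown
}

enum InputCommand {
    case makeCall, launchApp(name: String), none
}

// Sketch of the stages described in the filing: image -> pose -> gesture -> command.
struct VeinGesturePipeline {
    // Stage 1: correlate a vein image with a hand pose (a toy heuristic stands in for a trained model).
    func estimatePose(from image: VeinImage) -> HandPose {
        let mean = Double(image.pixels.reduce(0) { $0 + Int($1) }) / Double(max(image.pixels.count, 1))
        let flexion = min(max(mean / 255.0, 0.0), 1.0)
        return HandPose(fingerFlexion: Array(repeating: flexion, count: 5))
    }

    // Stage 2: derive a gesture from how the pose changed between two frames.
    func classifyGesture(previous: HandPose, current: HandPose) -> Gesture {
        let delta = zip(previous.fingerFlexion, current.fingerFlexion)
            .map { $1 - $0 }
            .reduce(0, +) / Double(current.fingerFlexion.count)
        switch delta {
        case ..<(-0.2): return .openHand   // fingers extended
        case 0.2...:    return .fist       // fingers curled
        default:        return .unknown
        }
    }

    // Stage 3: map the recognized gesture to an input command the watch could act on.
    func command(for gesture: Gesture) -> InputCommand {
        switch gesture {
        case .fist:     return .makeCall
        case .openHand: return .launchApp(name: "Workout")
        case .unknown:  return .none
        }
    }
}
```

In a real device each stage would presumably be driven by trained models rather than the toy heuristics above; the sketch only shows how the stages would hand data to one another.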
Apple's patent FIG. 5C below illustrates a perspective view of an Apple Watch having one or more sensors located on the strap; FIGS. 6B-6C illustrate exemplary digital representations of images of the user's veins.
Apple's invention further includes systems and methods for scanning the user's veins using near-IR and/or IR sensors. The vein scans could be used, for example, to detect one or more gestures and/or finger positions, including gestures that require neither touch nor audible input.
Apple further notes that gestures could trigger Apple Watch functions such as making a phone call, launching an application, performing an operation associated with an application, recording a new gesture and/or finger positions, displaying a message on the display, and interacting with virtual objects in virtual reality (VR) applications and games.
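That list is essentially a gesture-to-function dispatch table. The short Swift sketch below shows one way such a table could be kept, including the "recording a new gesture" case; the GestureAction cases, identifiers, and the register method are assumptions made for the example, not details from the filing.

```swift
import Foundation

// Hypothetical actions corresponding to the watch functions listed in the filing.
enum GestureAction {
    case makeCall(number: String)
    case launchApp(bundleID: String)
    case showMessage(String)
    case interactWithVirtualObject(id: Int)
}

// A simple registry that maps named gestures to actions and lets the user add new ones.
final class GestureRegistry {
    private var bindings: [String: GestureAction] = [
        "fist": .makeCall(number: "555-0100"),
        "open_hand": .launchApp(bundleID: "com.example.workout"),
        "pinch": .interactWithVirtualObject(id: 42),
    ]

    // Bind a freshly recorded gesture name to an action ("recording a new gesture").
    func register(gestureNamed name: String, action: GestureAction) {
        bindings[name] = action
    }

    // Look up the action associated with a recognized gesture, if any.
    func action(for gestureName: String) -> GestureAction? {
        return bindings[gestureName]
    }
}

// Example: register a custom gesture and dispatch it.
let registry = GestureRegistry()
registry.register(gestureNamed: "double_tap_fingers",
                  action: .showMessage("Custom gesture recognized"))
if let action = registry.action(for: "double_tap_fingers") {
    print(action)
}
```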
Interestingly, one of the inventors listed on this invention is Michael Brennan, a Machine Learning Software Engineer / CoreML / Game Technologies Engineer.
Apple's patent FIGS. 4A and 4B above present top views of an exemplary user's hand #401: the palmar side, shown as hand 401A with a plurality of veins 403A, and the dorsal side, shown as hand 401B with a plurality of veins 403B.
One or more portable electronic devices could utilize one or more sensors (e.g., a camera) to capture a plurality of images of the user's veins. In some examples, the plurality of images could be taken at different times (e.g., consecutive time frames). The device could correlate the veins shown in the plurality of images with the user's joints and one or more poses (e.g., hand poses).
From the pose(s), the device could determine the user's hand movement (e.g., finger movements), match that movement to one or more gestures and/or finger positions (e.g., by comparing it to a statistical model), and perform one or more device functions (e.g., making a phone call) associated with the determined gesture(s) and/or finger positions.
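The filing only says the movement is compared "to a statistical model" without naming one. As a rough stand-in, the Swift sketch below matches an observed sequence of finger-flexion values against stored gesture templates using a simple nearest-template distance; the template data, threshold, and flexion encoding are assumptions made purely for illustration.

```swift
import Foundation

// A gesture template: a short sequence of per-frame average finger flexion (0...1).
struct GestureTemplate {
    let name: String
    let flexionSequence: [Double]
}

// Mean squared distance between an observed sequence and a template of equal length.
func distance(_ observed: [Double], _ template: [Double]) -> Double {
    guard observed.count == template.count, !observed.isEmpty else { return .infinity }
    let squared = zip(observed, template).map { ($0 - $1) * ($0 - $1) }
    return squared.reduce(0, +) / Double(observed.count)
}

// Return the best-matching template name, or nil if nothing is close enough.
func matchGesture(observed: [Double],
                  templates: [GestureTemplate],
                  threshold: Double = 0.05) -> String? {
    let best = templates
        .map { (name: $0.name, score: distance(observed, $0.flexionSequence)) }
        .min { $0.score < $1.score }
    guard let candidate = best, candidate.score <= threshold else { return nil }
    return candidate.name
}

// Example with made-up templates: curling the fingers ("fist") vs. extending them ("open_hand").
let templates = [
    GestureTemplate(name: "fist",      flexionSequence: [0.2, 0.5, 0.8, 0.9]),
    GestureTemplate(name: "open_hand", flexionSequence: [0.9, 0.6, 0.3, 0.1]),
]
let observed = [0.25, 0.55, 0.75, 0.85]
print(matchGesture(observed: observed, templates: templates) ?? "no match")  // prints "fist"
```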
Apple's patent FIG. 5A below illustrates a top view of the underside of an Apple Watch used to determine a PPG (photoplethysmogram) signal. Apple Watch could include new light emitters #506 and #516 and a light sensor #504. One or more light emitter-light sensor pairs could additionally or alternatively be used to capture one or more images of the user's hand (e.g., user's hand #401 illustrated in FIGS. 4A-4B).
Apple's patent FIG. 7B above is a flow chart of an exemplary process for predicting one or more gestures and/or finger positions and recording the prediction.
Apple's patent application, published yesterday by the U.S. Patent Office, was originally filed back in Q3 2018. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.
For more sophisticated VR games, Apple has also invented specialized VR gloves, covered yesterday in our IP report titled "Apple invents advanced VR Gloves for Games, Education and Military Training," which you can review here.