
Google invents a new 'Soli' radar-based solution that will recognize a user's hand gestures to control a future Google OS for AR/XR HMDs

(Cover image: Google smartglasses patent report - Patently Mobile)

After Apple introduced their advanced spatial computer branded 'Apple Vision Pro,' Google, Qualcomm and Samsung went into high gear to put together a new ecosystem built around an Android-based OS, advanced chips and hardware to compete with Apple Vision Pro so as not to allow it to dominate this space in 2024 and beyond. The alliance was first announced, in a rushed way, in January 2023.

(Image: XR headset alliance partners)

Then in late June, Google shut down their smartglasses team, known as "Iris," so as not to be seen as a hardware competitor to Android partners developing next-gen XR headsets.

Although Google shut down the hardware side of their smartglasses team, that doesn't mean that their associated intellectual property just vanishes into thin air. Google will either sell their XR/VR-related patents to partners or roll the concepts they've developed into features for their yet-unannounced XR/VR OS for HMDs that will surface at some point in 2024 (or later).

One of the killer features of Apple Vision Pro is the use of simple hand gestures to control aspects of visionOS. Cameras built into the bottom area of the headset are able to read the user's hand gestures to move objects, scroll a webpage, tap on an icon to open an app and so forth, as noted in part below.

(Image: Apple WWDC23 hand gesture recognition)

This type of feature can't be brought to smartglasses without making them overly bulky with integrated cameras, so Google's engineering teams decided to use a form of their 'Soli' radar technology to assist users. The user's hand gestures could be interpreted by a smartwatch that then sends signals/commands to the smartglasses OS, enabling interactions such as clicking on an icon to open an app, scrolling a webpage and so forth.
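To make that division of labor concrete, here is a minimal sketch of how a watch-side classifier could forward gesture commands to a glasses-side UI. Everything in it is hypothetical: the gesture labels, the GestureCommand message, the 'send' transport and the 'ui' object are placeholders, since the patent does not disclose an actual API.

```python
import json
import time
from dataclasses import dataclass, asdict
from enum import Enum


class Gesture(Enum):
    """Gesture classes a watch-side classifier might emit (hypothetical set)."""
    PINCH = "pinch"            # e.g., mapped to "tap the focused icon"
    SWIPE_UP = "swipe_up"      # e.g., mapped to "scroll up"
    SWIPE_DOWN = "swipe_down"  # e.g., mapped to "scroll down"


@dataclass
class GestureCommand:
    """Message the watch could send to the smartglasses OS over BLE or Wi-Fi."""
    gesture: str
    confidence: float
    timestamp_ms: int


def on_gesture_classified(gesture: Gesture, confidence: float, send) -> None:
    """Watch side: package a classified wrist gesture and forward it to the glasses."""
    if confidence < 0.8:  # drop low-confidence detections
        return
    cmd = GestureCommand(gesture.value, confidence, int(time.time() * 1000))
    send(json.dumps(asdict(cmd)))  # 'send' stands in for whatever transport links the devices


# Glasses side: translate incoming commands into UI actions on the smartglasses OS.
ACTIONS = {
    "pinch": lambda ui: ui.tap_focused_icon(),
    "swipe_up": lambda ui: ui.scroll(+1),
    "swipe_down": lambda ui: ui.scroll(-1),
}


def handle_command(raw: str, ui) -> None:
    """Glasses side: dispatch a received command to the matching UI action."""
    cmd = json.loads(raw)
    action = ACTIONS.get(cmd["gesture"])
    if action:
        action(ui)
```

The key design point this illustrates is that the glasses never see raw radar data; they only receive small, already-classified commands, which keeps the headgear itself light.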

Hand Gesture Recognition Based on Detected Wrist Muscular Movements

Google's patent background notes that some augmented and virtual reality systems interpret gestures from users as commands for interacting with virtual objects. Along these lines, transmitters and receivers of electromagnetic radiation used in some augmented and virtual reality systems can track movements of the hand, arm, wrist, or other body parts that form gestures. These movements may be represented to a processor running an augmented or virtual reality system as waveforms generated by a receiver in response to radiation from the transmitter that has reflected off those body parts.

Examples of transmitters used in conventional augmented and virtual reality systems include wearable devices and electromyograms (EMGs). However, EMGs require bulky devices worn on the arm that are cumbersome to calibrate and can be too sensitive to contact. In addition, such conventional systems may expose a user's personally identifiable information (PII) and use an excessive amount of system power.

In contrast to the conventional approaches to solving the above-described technical problems, a technical solution to the above-described technical problems includes detecting and classifying inner-wrist muscle motions at a user's wrist using micron-resolution radar sensors. 

For example, a user of an AR system may wear a band or smartwatch around their wrist. When the user makes a gesture to manipulate a virtual object in the AR system as seen in a head-mounted display (HMD), muscles and ligaments in the user's wrist make small movements on the order of 1-3 mm.

The band contains a small radar device that has a transmitter and a number of receivers (e.g., three) of electromagnetic (EM) radiation on a chip (e.g., a Soli chip); this chip sits a small distance from the wrist. The EM radiation has a small wavelength, such as millimeter wave, so that such small movements are detectable.
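A quick back-of-the-envelope calculation shows why a millimeter-wave carrier makes such tiny motions visible. Assuming the roughly 60 GHz carrier and 4.5 GHz sweep that the patent's example numbers cite (see the next paragraph), a 1 mm change in the radar-to-muscle distance shifts the round-trip phase of the reflection by well over 100 degrees, even though the range bins themselves are only centimeters wide.

```python
import math

C = 3.0e8            # speed of light, m/s
F_CARRIER = 60e9     # ~60 GHz carrier from the patent's example numbers (assumption)
BANDWIDTH = 4.5e9    # ~4.5 GHz FMCW sweep from the patent's example numbers (assumption)

wavelength = C / F_CARRIER              # ~5 mm
range_resolution = C / (2 * BANDWIDTH)  # ~3.3 cm: far too coarse to "see" 1-3 mm directly


def round_trip_phase_shift(displacement_m: float) -> float:
    """Phase change (radians) of the reflected signal for a small change in target distance."""
    return 4 * math.pi * displacement_m / wavelength


for d_mm in (1, 2, 3):  # wrist muscle/ligament movements on the order of 1-3 mm
    phi = round_trip_phase_shift(d_mm / 1000)
    print(f"{d_mm} mm of motion -> {phi:.2f} rad ({math.degrees(phi):.0f} deg) phase shift")
```

In other words, the coarse range resolution matters less than the large, easily measurable phase swing per millimeter of motion, which is what lets a classifier pick out 1-3 mm muscle movements.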

The EM radiation is emitted as a chirped frequency-modulated continuous wave (FMCW) in bursts of about 30 chirps each, with a starting frequency of about 60 GHz and a bandwidth of about 4.5 GHz. This radiation reflects off the wrist muscles and ligaments and is received by the receivers on the chip in the band. The received reflected signal, or signal samples, are then sent to processing circuitry for classification to identify the wrist movement as a gesture. The AR system, having identified the gesture, then performs a virtual object manipulation operation to manipulate a virtual object based on the gesture.
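Read as a pipeline, the paragraph above amounts to: emit a burst of roughly 30 FMCW chirps, collect the reflections on the three receivers, hand the samples to a classifier, and let the AR system act on the predicted gesture. The sketch below mirrors that flow; the array shapes, the sample count per chirp, the range-Doppler features and the stand-in classifier are all assumptions rather than details taken from Google's patent.

```python
import numpy as np

N_RECEIVERS = 3          # one transmitter, three receivers on the chip
CHIRPS_PER_BURST = 30    # ~30 chirps per burst, per the patent's example numbers
SAMPLES_PER_CHIRP = 64   # assumed ADC samples per chirp (not specified in the patent)


def capture_burst() -> np.ndarray:
    """Placeholder for reading one burst of reflected-signal samples from the radar.
    Returns an array shaped (receivers, chirps, samples); random data stands in here."""
    return np.random.randn(N_RECEIVERS, CHIRPS_PER_BURST, SAMPLES_PER_CHIRP)


def extract_features(burst: np.ndarray) -> np.ndarray:
    """Toy feature extraction: a range FFT along each chirp, a Doppler FFT across chirps,
    then the magnitudes flattened into one feature vector for the classifier."""
    range_fft = np.fft.fft(burst, axis=2)
    doppler_fft = np.fft.fft(range_fft, axis=1)
    return np.abs(doppler_fft).ravel()


def classify_gesture(features: np.ndarray) -> str:
    """Stand-in for the trained wrist-movement classifier (e.g., a small neural network)."""
    return "pinch"  # hypothetical label


def manipulate_virtual_object(gesture: str) -> None:
    """Stand-in for the AR system's response once the gesture has been identified."""
    print(f"AR system applies '{gesture}' to the focused virtual object")


if __name__ == "__main__":
    burst = capture_burst()
    manipulate_virtual_object(classify_gesture(extract_features(burst)))
```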

Google's patent FIG. 1 below is a diagram that illustrates an example scenario involving a user of an AR system using a wristband for detecting small wrist muscle movements; FIG. 1B is a diagram that illustrates an example configuration of an FMCW radar interacting with a human wrist.

(Patent figures: Google's gesture patent for smartglasses using radar)

Google's patent FIG. 2 above is a diagram that illustrates an example electronic environment for implementing the technical solution described in Google's patent; FIG. 4 is a flow chart that illustrates an example process for calibrating microphones of a microphone array according to the technical solution.

For more details, review Google's patent that was published on December 7th here.



