
7 AirPods Max patents surfaced Thursday + 2 interesting Google patents covering Hand Tracking & Skin Interfaces for Wearables


 

On Thursday, the US Patent & Trademark Office published a series of Apple patent applications covering AirPods Max that headphone engineers and technophiles may enjoy reviewing. In addition, for those who like reading about new computer technologies in general, we discovered two Google patents this morning covering new skin interfaces for wearables and scalable real-time hand tracking. Apple also has many patents on the future use of in-air hand-gesture input for Macs and mixed reality headsets, so on that front both tech companies are attempting to make hand-gesture controls viable. Google's first attempt at this, using its Soli radar technology, flopped and was dropped, with Google vowing that it would return in the future. Today we get to see an update to its hand-tracking technology.

 

7 AirPods Max Patents

 

For Apple fans and consumers in general, YouTube reviews of Apple's AirPods Max cover most of the points they want to know about. But for headphone engineers and die-hard techies, the more technical detail they can find, the better. So, for that niche, the US Patent Office published seven AirPods Max patent applications, linked below.

 

The one that most caught my interest is the first one listed below, titled "Headphone Earcup with Adsorptive Material." I couldn't find any review or technical report covering this feature, so I wonder if it's something that Apple may be considering adopting in the future.

 

Apple's patent application 20220086561, titled "Headphone Earcup with Adsorptive Material," relates to a headphone earcup that uses an adsorptive material to acoustically enlarge the earcup's acoustic cavity, and more.

 

Patent FIG. 1 below: In one aspect, adsorbent member #120 may be positioned within earcup cushion #112. In this position, the adsorbent member may help to dampen the standing waves #105B within the cushion region of cavity #104. Representatively, the adsorbent member could be an adsorbent material (e.g., zeolite) embedded within the foam material of the earcup cushion.

 

(Apple patent FIG. 1)
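The effect Apple describes can be illustrated with the classic closed-box resonance formula: raising a sealed cavity's effective volume lowers the system resonance, so an adsorbent that increases effective compliance behaves like a physically larger earcup. The driver values and the 30% compliance gain below are hypothetical, chosen only to show the direction of the effect, not figures from Apple's filing:

```python
import math

def closed_box_resonance(f_s, v_as, v_cavity):
    """System resonance of a driver sealed in a cavity (closed-box formula).

    f_s      : driver free-air resonance (Hz)
    v_as     : driver equivalent compliance volume (litres)
    v_cavity : effective acoustic volume of the cavity (litres)
    """
    return f_s * math.sqrt(1.0 + v_as / v_cavity)

# Illustrative numbers only -- not taken from Apple's filing.
f_s, v_as, v_cup = 80.0, 0.05, 0.02
f_plain = closed_box_resonance(f_s, v_as, v_cup)

# An adsorbent such as zeolite raises the cavity's effective compliance,
# acting like a larger virtual volume; assume a hypothetical 30% gain.
f_adsorbent = closed_box_resonance(f_s, v_as, v_cup * 1.3)

print(f"plain cavity:   {f_plain:.1f} Hz")
print(f"with adsorbent: {f_adsorbent:.1f} Hz  (lower => acoustically larger)")
```

A lower in-cup resonance is exactly what a larger enclosure would deliver, which is presumably why the patent speaks of "acoustically enlarging" the cavity.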

 

The second AirPods Max invention can be found under patent application 20220084494, titled "Headphone with Multiple Reference Microphones ANC and Transparency." The patent relates to headphone audio systems, and more particularly to headphones having digital audio signal processing for acoustic noise cancellation (ANC) and transparency using multiple reference microphones in a single earcup.
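As a rough illustration of what multiple reference microphones buy a feedforward ANC system, here is a minimal sketch using one normalized-LMS adaptive filter per reference mic, with the filter outputs summed into a single anti-noise signal. The mixing weights, filter length, and step size are assumptions for the toy example, not values from Apple's filing:

```python
import numpy as np

def multi_ref_lms_anc(refs, noise_at_error_mic, taps=16, mu=0.05, eps=1e-8):
    """Toy feedforward ANC: one adaptive NLMS FIR filter per reference mic.

    refs               : (n_mics, n_samples) array of reference-mic signals
    noise_at_error_mic : (n_samples,) noise as heard at the error microphone
    Returns the residual signal left after the summed anti-noise is applied.
    """
    n_mics, n = refs.shape
    w = np.zeros((n_mics, taps))                     # one FIR filter per mic
    residual = np.zeros(n)
    for t in range(taps - 1, n):
        # Latest `taps` samples from every reference mic, newest first.
        frames = refs[:, t - taps + 1:t + 1][:, ::-1]
        anti_noise = np.sum(w * frames)              # summed filter outputs
        e = noise_at_error_mic[t] - anti_noise       # residual at error mic
        residual[t] = e
        w += (mu * e / (np.sum(frames * frames) + eps)) * frames  # NLMS step
    return residual

# Toy scenario: error-mic noise is a fixed mixture of the two references.
rng = np.random.default_rng(0)
refs = rng.standard_normal((2, 8000))
noise = 0.7 * refs[0] + 0.3 * refs[1]
res = multi_ref_lms_anc(refs, noise)
print("residual power:", float(np.mean(res[-1000:] ** 2)))  # near zero
```

With two reference mics the filters can cancel a noise field that no single mic observes completely, which is the intuition behind using multiple references in one earcup.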

 

The third AirPods Max invention can be found under patent application 20220084495, titled "Headphone with Multiple Reference Microphones and Oversight of ANC and Transparency."

 

The other four AirPods Max patents cover: Headphone Earcup Structure; Earpiece with Cushion Retention; Support Structure for Earpiece Cushion; and lastly, Headphones with Off-Center Pivoting Earpiece.

 

Google Patent: "Skin Interface for Wearables: Sensor Fusion to Improve Signal Quality."

 

The first of two Google patents of interest this week was published in Europe under WO2022046047, titled "Skin Interface for Wearables: Sensor Fusion to Improve Signal Quality." Google's Pixel Buds A-Series dropped the swipe-based touch controls on the buds for raising or lowering volume.

 

Google's latest patent addresses this issue with a new skin interface wherein the user is able to swipe left-right and up-down on skin areas close to the ear, or on the left and right sides of a future Google Pixel Watch, as highlighted in the patent figures below.

 

(Google skin-interface patent FIGS. 2, 3, 5a and 6a)

 

Google's invention provides systems and methods for determining an input command based on a gesture of a user using one or more accelerometers within the wearable device. The gesture may be a swipe gesture or a tap gesture performed on the skin of the body of the user in a region near a wearable device.

 

The gesture may create a mechanical wave that propagates through the portion of a body of the user between an input region and the wearable device. The one or more accelerometers may detect movement of the device due to the mechanical wave and determine a type of gesture based on the detected movement.

 

One aspect of the disclosure includes a wearable electronic device comprising one or more accelerometers and one or more processors in communication with the one or more accelerometers. The one or more processors may be configured to receive, at the one or more accelerometers, an input based on a gesture of a user on a region of the user’s skin near the wearable electronic device, detect, based on the received input, a movement of the device, and determine, based on the movement of the device, an input command. At least one of the one or more accelerometers may be an inertial measurement unit ("IMU") accelerometer. At least one of the one or more accelerometers may be a voice accelerometer. The gesture may be a swipe gesture or a tap gesture.

 

The one or more processors may be further configured to compare the detected movement of the device and one or more stored waveforms, and determine, based on the comparison of the detected movement and the one or more stored waveforms, a type of gesture. The input may be a mechanical wave that propagates through a portion of a body of the user between an input region and the wearable device. The mechanical wave may be based on an external force exerted on the body of the user. For more technical detail, review Google's patent filing here.
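The "compare the detected movement against stored waveforms" step could be sketched as simple template matching by normalized correlation: whichever stored gesture waveform correlates best with the accelerometer trace wins. The waveform shapes below are hypothetical stand-ins for a tap (sharp transient) and a swipe (slow hump), not Google's data:

```python
import numpy as np

def classify_gesture(movement, templates):
    """Match an accelerometer waveform against stored gesture templates.

    movement  : 1-D array, the detected movement waveform
    templates : dict of name -> 1-D template waveform (same length)
    Returns the template name with the highest normalized correlation.
    """
    def normalize(x):
        x = np.asarray(x, dtype=float)
        x = x - x.mean()
        return x / (np.linalg.norm(x) + 1e-12)
    m = normalize(movement)
    scores = {name: float(np.dot(m, normalize(t)))
              for name, t in templates.items()}
    return max(scores, key=scores.get)

# Hypothetical stored waveforms (illustrative shapes, not Google's data).
t = np.linspace(0.0, 1.0, 64)
templates = {
    "tap": np.exp(-((t - 0.5) ** 2) / 0.002),   # narrow spike
    "swipe": np.sin(np.pi * t),                  # broad hump
}
noisy_tap = templates["tap"] + 0.1 * np.random.default_rng(1).standard_normal(64)
print(classify_gesture(noisy_tap, templates))    # prints: tap
```

In practice the patent's "sensor fusion" would combine the IMU and voice accelerometers before any such comparison, presumably to reject motion and speech artifacts.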

 

Google Patent: Scalable Real-Time Hand Tracking

 

The second Google patent of interest this week was published in the U.S. under the title "Scalable Real-Time Hand Tracking." In 2019 Google won an in-air gesturing patent based on their Soli radar technology. In 2021 Google filed for a patent relating to hand gestures using a smart ring.

 

Google's latest patent filing describes real-time hand-tracking using either LiDAR or Radar in conjunction with Machine Learning. Google states in their filing that their invention covers computing systems and methods for hand tracking using a machine-learned system for palm detection and key-point localization of hand landmarks.

 

In particular, example aspects of the invention are directed to a multi-model hand tracking system that performs both palm detection and hand landmark detection.

 

Given a sequence of image frames, for example, the hand tracking system can detect one or more palms depicted in each image frame. For each palm detected within an image frame, the machine-learned system can determine a plurality of hand landmark positions of a hand associated with the palm. The system can perform key-point localization to determine precise three-dimensional coordinates for the hand landmark positions. In this manner, the machine-learned system can accurately track a hand depicted in the sequence of images using the precise three-dimensional coordinates for the hand landmark positions.
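The two-stage flow described above, a palm detector followed by a per-hand landmark model with tracking carried between frames, can be sketched as follows. The stub models, box logic, and the tactic of rerunning the heavy palm detector only when tracking is lost are illustrative assumptions, not Google's implementation:

```python
from dataclasses import dataclass
from typing import Callable, List, Optional, Tuple

Box = Tuple[float, float, float, float]        # (x_min, y_min, x_max, y_max)
Landmark = Tuple[float, float, float]          # (x, y, z) key-point

def bounding_box(landmarks: List[Landmark]) -> Box:
    xs = [p[0] for p in landmarks]
    ys = [p[1] for p in landmarks]
    return (min(xs), min(ys), max(xs), max(ys))

@dataclass
class HandTracker:
    """Two-stage pipeline: palm detection, then per-hand landmark localization.

    detect_palms     : frame -> list of palm boxes (the heavier model)
    locate_landmarks : (frame, box) -> 3-D landmark coordinates for one hand
    """
    detect_palms: Callable[[object], List[Box]]
    locate_landmarks: Callable[[object, Box], List[Landmark]]
    _prev_boxes: Optional[List[Box]] = None

    def process(self, frame) -> List[List[Landmark]]:
        # Reuse the regions implied by the previous frame's hands; run the
        # palm detector only when tracking has been lost.
        boxes = self._prev_boxes or self.detect_palms(frame)
        hands = [self.locate_landmarks(frame, box) for box in boxes]
        # Derive the next frame's search regions from this frame's landmarks.
        self._prev_boxes = [bounding_box(h) for h in hands] or None
        return hands

# Stub models standing in for the machine-learned components.
calls = {"detector": 0}
def stub_detector(frame):
    calls["detector"] += 1
    return [(0.0, 0.0, 1.0, 1.0)]
def stub_landmarks(frame, box):
    return [(0.1, 0.1, 0.0), (0.5, 0.4, 0.1), (0.9, 0.8, 0.0)]

tracker = HandTracker(stub_detector, stub_landmarks)
for _ in range(3):
    hands = tracker.process("frame")
print("detector runs:", calls["detector"])     # ran once; frames 2-3 tracked
```

Skipping detection on frames where the previous landmarks still locate the hand is what makes such pipelines "scalable" in real time: the expensive detector runs only occasionally.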

 

This three-dimensional hand-landmark tracking makes it possible to perceive both the shape and motion of hands, enabling viable solutions in a number of technological domains and platforms. By way of example, a machine-learned hand tracking system as described may be used to implement sign language applications and hand gesture controls, as well as to overlay digital content and information on top of the physical world in augmented reality applications. The machine-learned hand tracking system can provide accurate and robust real-time hand perception and tracking even in the presence of occlusions and the lack of high-contrast patterns that often characterize hands in imagery.

 

Google's patent FIG. 15 below depicts a flowchart illustrating an example method of training a hand tracking system; FIG. 1 depicts a block diagram of an example hand tracking system; and FIG. 13 depicts a block diagram of an example hand tracking system including a machine-learned palm detection model, a machine-learned hand landmark model, and a gesture recognition system.

 

(Google hand-tracking patent FIGS. 1, 13 & 15)

 

For more on this, review Google's patent application 20220076433.

 

Apple also has a number of in-air hand gesturing patents on record, covered in our archives here. The race is on to make hand gesturing a viable input method for future Macs and mixed reality headsets.

 
