
A New Apple Patent Covers Hands-Free Siri Experiences on AirPods Pro 2 and How They Could Apply to Apple Music in the Future


Today the U.S. Patent and Trademark Office officially published a patent application from Apple that relates to interacting with audio data via motion inputs, a capability that was just introduced in part with AirPods Pro 2. I say "in part" because the patent behind this feature describes how motion input could apply to Apple Music in the future.

Apple's invention covers techniques that provide audio output devices with faster and more efficient methods for interacting with audio data. Such methods optionally complement or replace other methods for interacting with audio data, reduce the cognitive burden on the user, and produce a more efficient human-machine interface.

In accordance with some embodiments, a method performed at one or more audio output devices is described. The method includes: outputting a first audio notification; subsequent to outputting the first audio notification, detecting a motion input based on one or more sensor measurements from one or more sensors in the one or more audio output devices; and, in response to the detected motion input and in accordance with a determination that a first set of criteria is met, causing performance of a first operation associated with the first audio notification, where the first set of criteria includes a first criterion that is met when the motion input is detected within a threshold time period of outputting the first audio notification.
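The heart of that claim is a timing check: a motion input only counts if it arrives within a threshold window of the notification being output. A minimal sketch of that criterion, where the class names, function name, and threshold value are my own illustrative assumptions (the patent does not specify them):

```python
from dataclasses import dataclass

# Illustrative threshold; the patent leaves the actual value unspecified.
RESPONSE_WINDOW_S = 5.0

@dataclass
class AudioNotification:
    name: str
    output_time: float  # seconds; when the notification was output

def should_perform_operation(notification: AudioNotification,
                             motion_time: float,
                             threshold: float = RESPONSE_WINDOW_S) -> bool:
    """Return True when the motion input falls inside the response window."""
    elapsed = motion_time - notification.output_time
    # A gesture before the notification, or too long after it, is ignored.
    return 0.0 <= elapsed <= threshold
```

For example, a head nod two seconds after an incoming-call announcement would satisfy the criterion, while one ten seconds later would not.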

Apple's patent FIG. 8 below is a block diagram of a method for providing audio feedback for detected motion gestures; FIGS. 9A/B/H illustrate example methods for detecting motion inputs in spatial audio arrangements in the context of a phone call.


Apple's patent FIGS. 9K/L/M below relate to head gestures in the context of Apple Music. More specifically, in FIG. 9K, while user #602 is oriented (e.g., facing the direction) toward spatial region #904b, AirPods Pro (device #600) detects head nod gesture #936b. In response to detecting motion gesture #936b while the user is in a right-facing orientation, AirPods initiates playback of “Playlist 1”.

In some embodiments, AirPods transmits a signal to an iPhone to initiate playback of “Playlist 1”, which is a playlist of media files (e.g., songs) stored on iPhone, and the associated audio from “Playlist 1” is output on AirPods.

Apple's patent FIG. 9L represents an alternative embodiment to FIG. 9K in which AirPods detects tilt gesture #938 (e.g., instead of head nod gesture #936b) while the user is facing spatial region #904b, in order to navigate to a menu of selectable options within selectable option #916b (e.g., a list of playable songs or “Tracks” within Playlist 1). In response to detecting tilt gesture #938 while the user is facing spatial region #904b, AirPods invokes (e.g., produces via a spatial audio experience) spatial audio arrangement #920 (e.g., a sub-menu of selectable songs).


Turning to FIG. 9M above, after invoking spatial audio arrangement #920, AirPods outputs simulated sound #948 at spatial region #942. In this example, the simulated sound is an announcement (e.g., by a virtual assistant associated with AirPods) of selectable option #946, which is an option to “Play Track 1” of “Playlist 1.” If AirPods does not detect a motion gesture from the user within a threshold time period of outputting the simulated sound (e.g., while AirPods is still announcing the selectable option and/or before the next selectable option is announced), AirPods proceeds to output the next simulated sound in the menu of options.

For the full details, review Apple's deeply detailed patent application 20240329922.
