Apple Invents Over-Ear Headphones with Spatial Audio-Driven Virtual Controls that Mimic Physical Control Sounds
Patently Apple posted a granted patent report in February titled "Apple Wins Patent for Future Over-Ear Headphones with Touch Gesture Audio Controls on each Ear Cup." Today the US Patent & Trademark Office published a patent application from Apple that takes that granted patent to the next level by adding control sounds that are spatially rendered to mimic or augment the physical sounds made by buttons on over-ear headphones that offer noise cancellation, isolating the wearer from the outside world and from sounds made on the outside of the ear cup.
Sounds created by controls can provide useful feedback to a user that the user's input has been properly administered. This feedback can improve the overall user experience of a device.
Controlling these sounds, however, can be difficult due to construction constraints of a control and the interaction of the control with other components. In some cases, when a physical sound is not audible (e.g., because of noise cancellation), a device can generate a control sound through its speakers to provide audible feedback to the user.
In some cases, where the control does make a physical sound on a head-worn device, that sound can be muffled because of the construction of the over-ear headphones.
Speakers of the over-ear headset can be used to play binaural audio cues to mask or augment the sound of the control to provide direct control sound feedback.
The control sound played by the headset can be more pleasant than the passive sound generated by the physical control, because the control sound will not sound occluded.
When a button is a silent touch-sensitive button, generating a control sound at a virtual location perceived to indicate a physical location of the touch-sensitive button can provide feedback to a user that the user's input through the control has been sensed and will be processed.
Control sounds can be spatially rendered to mimic or augment physical sounds made by buttons for a device (e.g., a head-worn device). The close interaction between a user pressing the button and the spatial audio cues matching that button's location can reinforce that a device-generated virtual control sound is associated with the control.
A virtual location of the control sound created by the spatial audio cues can vary based on the physical location of the corresponding physical control. Some controls, such as sliders and dials (rotary controls), can provide a continuum of input (e.g., up/down, louder/quieter, etc.).
Virtual control sounds can be generated to model such a continuum, for example by moving the virtual control sound from one location to another, closer or farther away, or in a perceived rotational direction.
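The filing describes the behavior rather than an implementation, but the idea is easy to sketch in code. The short Swift example below interpolates a virtual source position between two endpoints as a variable control moves through its range (a rotary control would interpolate an angle instead); all coordinates and values are our own illustrative assumptions, not Apple's.

```swift
import Foundation

// Illustrative sketch only: map a variable control's normalized value (0...1)
// to a virtual source position by interpolating between two endpoints,
// e.g. the "bottom" and "top" of a slider's travel on the ear cup.
// The coordinate convention and endpoint values are assumptions for the example.
struct SourcePosition {
    var x: Float, y: Float, z: Float   // meters, relative to the listener's head
}

func controlSoundPosition(normalizedValue: Float,
                          from start: SourcePosition,
                          to end: SourcePosition) -> SourcePosition {
    let t = max(0, min(1, normalizedValue))
    return SourcePosition(x: start.x + (end.x - start.x) * t,
                          y: start.y + (end.y - start.y) * t,
                          z: start.z + (end.z - start.z) * t)
}

// Example: a volume slider on the right ear cup whose cue moves upward as volume rises.
let bottom = SourcePosition(x: 0.09, y: -0.03, z: 0)
let top    = SourcePosition(x: 0.09, y:  0.03, z: 0)
let cue = controlSoundPosition(normalizedValue: 0.75, from: bottom, to: top)
print(cue)   // roughly (0.09, 0.015, 0)
```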
Apple's patent FIG. 4 below illustrates that audio control button #62 can be located on over-ear headphones, an iPhone/iPad (shown) and an HMD (not shown).
Audio controls can be integral to over-ear headphones (#60) and/or have a fixed location relative to the user. Thus, the spatialized control sounds can be predetermined based on where the control is located relative to the user (e.g., on the left or the right side of the user's head). In one aspect, the headphones can be a head-mounted display (HMD). Controls can be virtual controls in mixed reality, virtual reality, or augmented reality.
As shown in Apple's patent FIG. 3 above, the driver signals can be generated in real time, or dynamically, by a spatial renderer in response to the control input (#42). It should be understood that 'real time' here means that the driver signals are generated with minimal delays, which can include processing delays, buffering, and communication latency.
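Apple doesn't name any implementation API in the filing, but the described behavior maps loosely onto AVFoundation's existing spatial mixing. The sketch below is a rough prototype, not Apple's method: an AVAudioPlayerNode routed through an AVAudioEnvironmentNode plays a short control sound at a 3D position relative to the listener, standing in for the "spatial renderer" that generates driver signals in response to a control input. The file name, format, and positions are placeholder assumptions.

```swift
import AVFoundation

// Minimal sketch (not Apple's implementation): play a short control sound at a
// 3D position relative to the listener using AVFoundation's environment node.
// Control sound assets are assumed to be short mono files.
final class ControlSoundRenderer {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    init() throws {
        engine.attach(environment)
        engine.attach(player)
        // Mono sources routed through the environment node are spatialized.
        let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 48_000, channels: 1)
        engine.connect(player, to: environment, format: monoFormat)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)
        player.renderingAlgorithm = .HRTFHQ   // binaural cues over headphones
        try engine.start()
    }

    // Called in response to a control input, e.g. a touch on the right ear cup.
    func playControlSound(at position: AVAudio3DPoint, url: URL) throws {
        player.position = position            // azimuth/elevation/distance cue
        let file = try AVAudioFile(forReading: url)
        player.scheduleFile(file, at: nil, completionHandler: nil)
        player.play()
    }
}
```

A touch on the right ear cup might then call playControlSound(at: AVAudioMake3DPoint(0.09, 0, 0), url: clickURL), with clickURL pointing at some hypothetical click asset, so that the click appears to come from the control itself.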
Control audio assets (e.g., data files that contain encoded sound data relating to a control) and spatial information (e.g., azimuth, elevation, or distance) can be used by the spatial audio processor to generate driver signals having the control sound with the spatial auditory cues.
The spatial information and a mapping between the control, the spatial information, and/or the control audio asset can be stored in memory. When a control input is received, the processor can select the spatial information and control audio asset that corresponds to the control. The spatial auditory cues generated from the spatial information, along with the control sound, can provide a rich user experience.
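In code form, the mapping the application describes could look something like the following sketch, where the control identifiers, angles, distances, and file names are purely illustrative:

```swift
import Foundation

// Sketch of the described mapping: each control is associated with spatial
// information (azimuth/elevation/distance) and a control audio asset.
struct SpatialInfo {
    var azimuthDegrees: Float
    var elevationDegrees: Float
    var distanceMeters: Float
}

struct ControlAudioAsset {
    var fileName: String          // encoded sound data for the control
}

enum ControlID: Hashable {
    case rightCupButton, leftCupDial, volumeSlider
}

let controlSoundTable: [ControlID: (SpatialInfo, ControlAudioAsset)] = [
    .rightCupButton: (SpatialInfo(azimuthDegrees: 90, elevationDegrees: 0, distanceMeters: 0.09),
                      ControlAudioAsset(fileName: "button_click.caf")),
    .leftCupDial:    (SpatialInfo(azimuthDegrees: -90, elevationDegrees: 0, distanceMeters: 0.09),
                      ControlAudioAsset(fileName: "dial_tick.caf")),
]

// On a control input, look up the spatial info and asset for that control,
// then hand both to the spatial renderer to produce the driver signals.
func handleControlInput(_ id: ControlID) {
    guard let (spatial, asset) = controlSoundTable[id] else { return }
    // spatialRenderer.play(asset, at: spatial)   // hypothetical renderer call
    _ = (spatial, asset)
}
```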
Movements of the virtual control can indicate behavior of the control based on user inputs. For example, movements can be: from one side to another side, up and down, between near and far, or in a rotational arc, relative to the user. A control sound can be coordinated with changes in the virtual location to simulate a virtual rotation or movement of the control.
Apple's patent FIG. 5 above shows that, in one aspect, the virtual control can be a rotary-style control that is capable of rotation. The virtual location of a control sound can move along a line #107, for example sweeping back and forth. The rotation axis can be virtualized at or near the control. In one aspect, the control sound can sweep along a rotational path #109, indicating a movement and/or position of the rotary control.
Similarly, as shown in FIG. 6, a sliding control #120 can have a control sound #114 that moves (e.g., up and down, back and forth, or near and far) based on the manipulation of the sliding control.
Movements of the control sound, which are implemented through spatial auditory cues, can be synchronized with the manipulation of the respective controls.
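A minimal sketch of that synchronization, assuming an AVAudioPlayerNode that is already routed through an AVAudioEnvironmentNode as in the earlier example, could simply update the node's 3D position every time the slider reports a new value (the endpoint coordinates are again assumptions):

```swift
import AVFoundation

// Sketch: keep the virtual sound location in step with a slider by updating the
// spatialized player node's 3D position whenever the control reports a new value.
func syncControlSound(player: AVAudioPlayerNode, sliderValue: Float) {
    let t = max(0, min(1, sliderValue))
    // Move the cue vertically along the slider's travel on the right ear cup.
    player.position = AVAudioMake3DPoint(0.09, -0.03 + 0.06 * t, 0)
}
```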
In one aspect, the control sound can include interval sounds. Intervals (e.g., every 5 degrees of a rotary control, or every millimeter of travel of the slider) can be indicated with interval sounds such as ticks, clicks, beeps, or other sounds.
These interval sounds can be spatialized. A user can be provided with auditory feedback to gauge how much a value is being changed in response to the control input.
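The interval logic itself is simple. The sketch below plays a tick whenever a rotary control crosses an interval boundary, using the patent's "every 5 degrees" example; the playTick closure stands in for the spatialized playback shown earlier.

```swift
import Foundation

// Sketch of the "interval sound" idea: trigger a spatialized tick each time a
// rotary control crosses an interval boundary (every 5 degrees by default).
final class IntervalTicker {
    private let intervalDegrees: Float
    private var lastTickIndex: Int?

    init(intervalDegrees: Float = 5) {
        self.intervalDegrees = intervalDegrees
    }

    func update(angleDegrees: Float, playTick: (Float) -> Void) {
        let index = Int((angleDegrees / intervalDegrees).rounded(.down))
        defer { lastTickIndex = index }
        guard let last = lastTickIndex, last != index else { return }
        playTick(Float(index) * intervalDegrees)   // tick at the boundary just crossed
    }
}
```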
Controls can be variable controls such as rotary controls or sliders that can provide a range of states or positions, or controls can be buttons with a press state (e.g., a finger is currently on the button), a pressed state (the button has been pressed), and/or an un-pressed state (the button is not pressed). These examples are illustrative, as controls can take other forms.
Apple's patent application number 20200356341, which was published today by the U.S. Patent Office, was filed back in Q2 2020. A provisional patent associated with Apple's invention was filed in Q2 2019. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.