Apple has Won a Patent for a Directional Haptic Output System for Future AirPods, Glasses and Headbands
Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to wearable electronic devices, such as AirPods Pro and smartglasses, that could produce haptic outputs felt by the wearer. More importantly, the haptics could provide users with a distinct directional sensation that prompts them to look in a particular direction, in a virtual teleconference and beyond.
Apple's granted patent is generally directed to wearable electronic devices that include haptic actuators, and more particularly, to haptic outputs that are coordinated with a position of a virtual object (which may correspond to or represent a person, an audio source, an instrument, a graphical object, etc.) relative to the wearer of the electronic device.
The wearable electronic devices may include an array of haptic actuators (e.g., two or more haptic actuators) that can be actuated according to an actuation pattern in order to direct the wearer's attention in a particular direction.
For example, an array of haptic actuators in contact with various locations on a wearer's head may be actuated in a pattern that produces a sensation having a distinct directional component. More particularly, the user may feel the pattern moving left or right. The user may then be motivated to turn his or her head or body in the direction indicated by the haptic pattern.
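To make the actuation-pattern idea concrete, here is a minimal Swift sketch of a left-to-right sweep across a small head-worn actuator array. The `HapticActuator` protocol and `ConsoleActuator` stand-in are hypothetical placeholders invented for illustration; Apple has not published an API for driving an actuator array in AirPods, glasses or a headband.

```swift
import Foundation

// NOTE: `HapticActuator` and `ConsoleActuator` are hypothetical stand-ins for
// whatever driver a future head-worn actuator array might expose; no such
// public API exists today.
protocol HapticActuator {
    var label: String { get }      // e.g. "left temple", "right temple"
    func pulse(intensity: Double)  // fire a single haptic tap
}

struct ConsoleActuator: HapticActuator {
    let label: String
    func pulse(intensity: Double) {
        print("pulse \(label) at intensity \(intensity)")
    }
}

/// Fires the actuators one after another so the wearer feels a sensation
/// that "moves" across the head in the chosen direction.
func playDirectionalSweep(across actuators: [any HapticActuator],
                          leftToRight: Bool,
                          stepDelay: TimeInterval = 0.08) {
    let ordered = leftToRight ? actuators : Array(actuators.reversed())
    for actuator in ordered {
        actuator.pulse(intensity: 0.8)
        Thread.sleep(forTimeInterval: stepDelay)
    }
}

// Example: a three-actuator array swept left to right, suggesting a turn to the right.
let array: [any HapticActuator] = [
    ConsoleActuator(label: "left"),
    ConsoleActuator(label: "center"),
    ConsoleActuator(label: "right")
]
playDirectionalSweep(across: array, leftToRight: true)
```

The timing and intensity values above are arbitrary; the patent's point is simply that staggering the pulses across the array produces a sensation with a distinct directional component.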
Indicating a direction via directional haptic outputs may be used to enhance various types of interactions with audio and/or visual content, and in particular to enhance interaction with content that has a real or virtual position relative to the wearer, and/or content that has a visual or audible component.
For example, and as described in greater detail herein, directional haptic outputs may be used to direct a wearer's attention along a direction towards the virtual location of a participant in a multi-party telephone conference (see patent FIGS. 10A-B and 11 below).
As another example, a directional haptic output may be used to direct a user's attention towards the position of a graphical object in a virtual or augmented reality environment (see patent FIGS. 12A-B below).
Haptic outputs provided via a wearable electronic device may also be used to enhance an experience of consuming audio or video content. For example, haptic outputs may be synchronized with certain audio features in a musical work or with audio or visual features of video content.
In the context of music, the haptic outputs may be synchronized with notes from a certain instrument or notes having a certain prominence in the music. In some cases, the position of the wearer relative to a virtual position of an instrument may also affect the haptic output provided to the user. In the context of video, the haptic outputs may be synchronized with some visual and/or audio content of the video, such as by initiating a haptic output when an object appears to move towards or near the viewer.
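As a loose illustration of that synchronization idea, the short Swift sketch below fires a haptic tap for each note of a chosen instrument in a hard-coded timeline. The `NoteEvent` list and the `hapticTap` placeholder are assumptions for illustration only; in a real system the events would come from the media pipeline rather than a fixed array.

```swift
import Foundation

// NOTE: the note timeline and `hapticTap` are illustrative assumptions,
// not anything taken from Apple's patent or frameworks.
struct NoteEvent {
    let time: TimeInterval   // offset from the start of playback, in seconds
    let instrument: String
}

/// Placeholder for whatever call would actually drive the actuator(s).
func hapticTap(at time: TimeInterval) {
    print(String(format: "haptic tap at %.2fs", time))
}

/// Emits one haptic tap per note of the chosen instrument, in playback order.
func syncHaptics(to events: [NoteEvent], instrument: String) {
    let start = Date()
    let cues = events.filter { $0.instrument == instrument }
                     .sorted { $0.time < $1.time }
    for cue in cues {
        // Wait until the note's playback time, then fire the tap.
        let wait = cue.time - Date().timeIntervalSince(start)
        if wait > 0 { Thread.sleep(forTimeInterval: wait) }
        hapticTap(at: cue.time)
    }
}

// Example: tap along with the drum notes only.
let timeline = [
    NoteEvent(time: 0.5, instrument: "drums"),
    NoteEvent(time: 1.0, instrument: "guitar"),
    NoteEvent(time: 1.5, instrument: "drums")
]
syncHaptics(to: timeline, instrument: "drums")
```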
While the patent covers future AirPods Pro, it also points to a future pair of glasses that may include haptic actuators (e.g., on the temple pieces and/or nose bridge).
As yet another example, a headband, hat, or other head-worn object may include haptic actuators. In some cases, these wearable device(s) include an array of two or more haptic actuators, which may facilitate the production of directional haptic outputs by using different types of actuation patterns for the various actuators in the array.
Apple's patent FIGS. 2A-B, 3A-B and 4A-C illustrate three different head-mounted haptic accessory devices (AirPods, Glasses & Headband); FIG. 9 depicts an example chart showing differences between various head-mounted haptic accessories.
Apple's patent FIGS. 10A, 10B and 11 relate to a teleconferencing scenario.
More specifically, Apple's patent FIGS. 10A-10B illustrate an example use case in which a directional haptic output is used to direct a user's attention to a particular audio source in the context of a teleconference. For example, a user 1000 may be participating in a teleconference with multiple participants, 1002-1, 1002-2, and 1002-3 (collectively referred to as participants 1002).
The participants may each be assigned a respective virtual position relative to the user (e.g., a radial orientation relative to the user and/or the user's orientation and optionally a distance from the user), as represented by the arrangement of participants and the user in FIGS. 10A-10B.
When it is detected that one of the participants (#1002-3) is speaking, the earbuds (#1001) may produce a directional haptic output (#1006) that is configured to direct the user's attention to the virtual position of participant #1002-3, from which the audio is originating.
For example, a directional haptic output may be produced via the earbuds to produce a directional sensation that will suggest that the user reorient his or her head or body to face the participant #1002-3 (e.g., a left-to-right sensation, indicated by arrow #1004, or any other suitable haptic output that suggests a left-to-right reorientation). FIG. 10B illustrates the user after his or her orientation is aligned with the virtual position of the audio source (the participant #1002-3).
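The core decision in this scenario is which way the haptic sweep should run, given the wearer's current heading and the speaking participant's virtual bearing. The Swift sketch below shows one plausible way to compute that, under assumed angle conventions (degrees, 0° straight ahead, positive clockwise) and a hypothetical tolerance; it is not Apple's implementation.

```swift
import Foundation

// NOTE: the angle convention and tolerance are assumptions for illustration,
// not values taken from the patent.
enum SweepDirection {
    case leftToRight   // suggests turning right
    case rightToLeft   // suggests turning left
    case none          // already facing the speaker closely enough
}

func sweepDirection(userHeading: Double,
                    participantBearing: Double,
                    tolerance: Double = 10.0) -> SweepDirection {
    // Signed smallest-angle difference, normalized into (-180, 180].
    var delta = (participantBearing - userHeading).truncatingRemainder(dividingBy: 360)
    if delta > 180 { delta -= 360 }
    if delta <= -180 { delta += 360 }

    if abs(delta) <= tolerance { return .none }
    return delta > 0 ? .leftToRight : .rightToLeft
}

// Example: the speaker sits 90° clockwise from the wearer's heading,
// so a left-to-right sweep nudges the wearer to turn right toward them.
print(sweepDirection(userHeading: 0, participantBearing: 90))    // leftToRight
print(sweepDirection(userHeading: 350, participantBearing: 355)) // none (within tolerance)
```

In practice the heading would come from head-tracking sensors and the bearing from the conference's spatial layout of participants, as depicted in FIGS. 10A-10B.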
Apple's patent FIGS. 12A-12B below depict a user engaged in a virtual-reality environment; and FIGS. 14A-14B depict a spatial arrangement of a user and two audio sources.
For more details, review Apple's granted patent 10,966,007.