A new Apple Patent points to In-Air Gesturing going beyond Vision Pro to other devices such as the iPhone, iPad & MacBook
In-air gestures have been mentioned in Apple patents for over a decade, yet the feature was first put to use, in dramatic fashion, with Vision Pro. This week, in a new patent application, we see that Apple is now considering bringing this feature to devices such as the iPhone, iPad and MacBook.
Apple's patent application covers techniques for improved recognition of in-air gestures. An interaction between a body part and another object in a sensor scan may be classified, and gesture recognition on the scan may then be controlled based on that classification.
In an aspect, the likelihood that a user intends to communicate certain gestures may change depending on what type of interaction is occurring in the scan between a body part and another object. For example, a pinch gesture may include movement of a finger and thumb of the same hand toward each other (and/or touching each other). When a user is holding an object, such as a pen, the user may not intend to signal a pinch gesture, but the thumb and finger touching around the pen might confuse a hand gesture recognizer into detecting a pinch gesture.
In this example, if the interaction between the hand and pen is classified as “holding,” then disabling recognition of the pinch gesture may improve gesture detection reliability.
In one aspect, all gesture recognition may be disabled when a holding interaction is detected. In another aspect, when a hand is holding an Apple Pencil, recognition of the pinch gesture may be disabled for that hand while other gestures remain enabled.
For example, recognition of a wave gesture by a hand holding a pen may remain enabled, as may a pinch gesture performed by a separate hand that is not holding a pen, as illustrated in the sketch below.
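To make the per-hand, per-gesture gating concrete, here is a minimal Swift sketch of the idea. The type and function names (InteractionClass, GestureRecognitionController, and so on) are hypothetical, chosen for illustration; they do not come from the patent or from any Apple API.

```swift
// Hypothetical types illustrating classification-gated gesture recognition.
enum InteractionClass {
    case none
    case holding   // e.g. a hand gripping a pen or Apple Pencil
}

enum Gesture: Hashable {
    case pinch, release, wave, point
}

enum Hand: Hashable {
    case left, right
}

struct GestureRecognitionController {
    // Per-hand set of gestures currently suppressed.
    private var disabled: [Hand: Set<Gesture>] = [:]

    // Update the suppression set when the classifier reports an interaction.
    mutating func classify(_ interaction: InteractionClass, for hand: Hand) {
        switch interaction {
        case .holding:
            // Holding an object makes finger/thumb contact ambiguous,
            // so suppress pinch and release for this hand only.
            disabled[hand] = [.pinch, .release]
        case .none:
            disabled[hand] = []
        }
    }

    // The recognizer consults this gate before reporting a gesture.
    func isEnabled(_ gesture: Gesture, for hand: Hand) -> Bool {
        !(disabled[hand] ?? []).contains(gesture)
    }
}

var controller = GestureRecognitionController()
controller.classify(.holding, for: .right)       // right hand holds a pen

print(controller.isEnabled(.pinch, for: .right)) // false: suppressed
print(controller.isEnabled(.wave,  for: .right)) // true: wave still allowed
print(controller.isEnabled(.pinch, for: .left))  // true: other hand unaffected
```

Note how the suppression is scoped to a single hand and a single gesture class, matching the patent's example of a pen-holding hand that can still wave while a free hand can still pinch.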
Apple's patent FIG. 1 below illustrates an example scenario #100 for gesture recognition. In this scenario, a device #102 (an iPhone) includes a sensor #104 for scanning a proximate area #110 that includes a subject user's body part #120 and another object #122. In aspects, the object may be any object in proximate area #110 other than the body part. For example, the body part may be a hand, and the object may be a pen or a different hand. The body part and the other object may interact in a scan of the proximate area.
In an aspect, a detected body part may include one or more fingers, a hand, an arm, a face, or any other body part capable of gesturing. A gesture recognizer may recognize gestures performed by the detected body part. For example, a finger may perform a pointing gesture; a hand may perform a pinch gesture (finger and thumb moving toward each other) or a release gesture (finger and thumb moving away from each other); an arm may perform a wave gesture; and a face may perform a smile gesture.
A gesture recognition controller may disable detection of a gesture based on an interaction between the detected body part and another object. For example, if a finger is pushing a button, a pointing gesture may be disabled; if a hand is holding a pen, both a pinch gesture and a release gesture may be disabled; if an arm is shaking hands with another person's arm, a wave gesture may be disabled; and if a face is eating an apple, a smile gesture may be disabled.
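Those examples amount to a lookup from (body part, interaction) to a set of suppressed gestures. Here is a small self-contained Swift sketch of such a table; the enum names are hypothetical, and only the table contents mirror the patent's examples.

```swift
// Hypothetical enums; only the table contents mirror the patent's examples.
enum BodyPart: Hashable { case finger, hand, arm, face }
enum Interaction: Hashable { case pushingButton, holdingPen, shakingHands, eatingApple }
enum Gesture: Hashable { case point, pinch, release, wave, smile }

// Which gestures to suppress for a given body part and interaction.
let suppressionTable: [BodyPart: [Interaction: Set<Gesture>]] = [
    .finger: [.pushingButton: [.point]],
    .hand:   [.holdingPen:    [.pinch, .release]],
    .arm:    [.shakingHands:  [.wave]],
    .face:   [.eatingApple:   [.smile]],
]

func suppressedGestures(for part: BodyPart, during interaction: Interaction) -> Set<Gesture> {
    suppressionTable[part]?[interaction] ?? []
}

// A hand holding a pen suppresses both pinch and release.
print(suppressedGestures(for: .hand, during: .holdingPen))
```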
In an aspect, an iPhone may include a camera, a motion sensor, or another type of sensor such as a lidar sensor that may scan proximate area #110 by capturing data regarding location and distance from the device to an object and a body part.
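A rough Swift sketch of what a single depth sample from such a scan might carry is below. The struct names are toy placeholders; real depth data on an iPhone would come from frameworks such as ARKit or AVFoundation rather than these types.

```swift
// One depth sample: where the sensor saw a surface, and how far away it was.
// Hypothetical types for illustration only.
struct ScanSample {
    let position: SIMD3<Float>   // location in the device's coordinate space
    let distance: Float          // meters from the device to the surface
    let isBodyPart: Bool         // classifier's verdict: body part vs. other object
}

// A scan of the proximate area is a collection of such samples.
struct ProximateAreaScan {
    let samples: [ScanSample]

    // Separate body-part samples from other-object samples so an
    // interaction classifier can reason about how the two relate.
    var partitioned: (bodyPart: [ScanSample], objects: [ScanSample]) {
        var body: [ScanSample] = []
        var objects: [ScanSample] = []
        for sample in samples {
            if sample.isBodyPart { body.append(sample) } else { objects.append(sample) }
        }
        return (body, objects)
    }
}
```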
For full details, review Apple's patent application 20240094825. Eleven engineers were listed as inventors.