Apple Invents Finger Input Devices that use in-air and surface gestures to control a Mac's Display instead of a Mouse
Apple has been working on new finger input devices, mostly in the form of a ring, for some time now, and we've covered a number of their previous patents (01, 02 & 03). Today the US Patent & Trademark Office published a patent application from Apple revealing another dimension to this project: this time the finger devices could be used with Macs, manipulating what's on a Mac's display with in-air and surface gestures instead of a mouse.
Electronic devices such as computers are commonly controlled using computer mice and other input accessories, yet accessories like these may not always be convenient for a user and can be cumbersome or uncomfortable.
Apple's invention covers an electronic device such as an iMac that could be used with one or more finger devices instead of a mouse. The electronic device may have a display, and the user may provide finger input to the finger device to control the display. The finger input may include pinching, tapping, rotating, swiping, pressing, and/or other finger gestures that are detected using sensors in the finger device.
The finger device may be worn on a finger of a user while leaving a finger pad at the tip of the finger exposed. The finger device may include sensors that detect movement of the finger wearing the finger device and/or sensors that detect input from adjacent fingers.
For example, to detect movement of the finger wearing the finger device, the finger device may include a motion sensor, a force sensor that measures how forcefully the finger presses against one or both sides of the finger device as it contacts a surface (e.g., the surface of an object or of another finger such as the thumb), and/or a distance sensor such as an optical distance sensor that measures changes in distance between the finger and the sensor.
By detecting the small movements of the finger wearing the finger device, the finger device may be used to detect finger gestures such as pinching and pulling, pinching and rotating, swiping, and tapping.
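To make the idea concrete, here's a minimal sketch in Swift of how a tap might be picked out of a motion-sensor signal. The patent doesn't spell out an algorithm; the thresholds, sample values, and the TapDetector type below are our own illustrative assumptions.

```swift
import Foundation

// Minimal sketch (not Apple's implementation): classify a tap from a stream of
// accelerometer magnitude samples by looking for a short, sharp peak.
// The sample rate and thresholds below are illustrative assumptions.
struct TapDetector {
    var threshold: Double = 2.5      // assumed peak magnitude (in g) that counts as a tap
    var refractorySamples: Int = 20  // ignore samples for a short window after a detection
    private var cooldown = 0

    /// Feed one accelerometer magnitude sample; returns true when a tap is detected.
    mutating func process(sample: Double) -> Bool {
        if cooldown > 0 {
            cooldown -= 1
            return false
        }
        if sample > threshold {
            cooldown = refractorySamples
            return true
        }
        return false
    }
}

// Example usage with synthetic data: a quiet signal with one sharp spike.
var detector = TapDetector()
let samples = [0.1, 0.2, 0.1, 3.1, 0.4, 0.2]
let taps = samples.filter { detector.process(sample: $0) }.count
print("taps detected: \(taps)")   // prints 1
```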
A touch sensor on a finger device may include a one-dimensional or two-dimensional array of sensor elements that detect touch input on the outside of the finger device (e.g., from an adjacent finger and/or a finger on the opposing hand).
The sensor elements may be capacitive sensor electrodes or touch sensor elements based on optical sensing, ultrasonic sensing, or other types of sensing.
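As a rough illustration of how such an electrode array might be read, the following Swift sketch estimates swipe direction from the centroid of a one-dimensional capacitance frame. The frame values, thresholds, and function names are assumptions for illustration only, not anything specified in the patent.

```swift
import Foundation

// Minimal sketch: estimate swipe direction along a one-dimensional array of
// capacitive electrodes on the outside of a finger device. Each frame is an
// array of per-electrode readings; we track the touch centroid over time.

/// Returns the weighted centroid (electrode index) of one frame, or nil if no touch.
func centroid(of frame: [Double], activationThreshold: Double = 0.2) -> Double? {
    let total = frame.reduce(0, +)
    guard total > activationThreshold else { return nil }
    let weighted = frame.enumerated().reduce(0.0) { $0 + Double($1.offset) * $1.element }
    return weighted / total
}

/// Classifies a swipe from a sequence of frames by comparing first and last centroids.
func swipeDirection(frames: [[Double]]) -> String {
    let centroids = frames.compactMap { centroid(of: $0) }
    guard let first = centroids.first, let last = centroids.last else { return "no touch" }
    let delta = last - first
    if delta > 1.0 { return "swipe toward fingertip" }
    if delta < -1.0 { return "swipe toward knuckle" }
    return "tap / no swipe"
}

// Example: the touch moves from electrode 1 toward electrode 4 across three frames.
let frames: [[Double]] = [
    [0.1, 0.8, 0.2, 0.0, 0.0],
    [0.0, 0.2, 0.8, 0.2, 0.0],
    [0.0, 0.0, 0.2, 0.8, 0.1],
]
print(swipeDirection(frames: frames))   // "swipe toward fingertip"
```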
Sensor data related to finger gestures (gestures made in an in-air input region or on a surface, gestures made with one, two, three, or more fingers, and gestures associated with touch input to the touch sensor on the exterior of the finger device) may be combined with user gaze information to control items on the display (e.g., to navigate a menu on a display, scroll through a document, manipulate computer-aided designs, etc.).
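The gaze-plus-gesture combination described above could be modeled as a simple dispatch from (gaze target, gesture) pairs to display actions. The Swift sketch below is purely illustrative; the enums and mapping rules are our own assumptions, not Apple's design.

```swift
// Minimal sketch: combine the item the user is looking at with the gesture
// reported by the finger device to decide what the display should do.
enum Gesture { case pinch, swipeUp, swipeDown, tap }
enum GazeTarget { case menu, document, cadModel }

enum DisplayAction {
    case openMenuItem, scroll(lines: Int), rotateModel(degrees: Double), select
}

func action(for gesture: Gesture, lookingAt target: GazeTarget) -> DisplayAction {
    switch (target, gesture) {
    case (.menu, .tap):            return .openMenuItem
    case (.document, .swipeUp):    return .scroll(lines: -3)
    case (.document, .swipeDown):  return .scroll(lines: 3)
    case (.cadModel, .pinch):      return .rotateModel(degrees: 15)
    default:                       return .select
    }
}

// Example: the user gazes at a document and swipes down on the finger device.
print(action(for: .swipeDown, lookingAt: .document))   // scroll(lines: 3)
```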
Apple's patent FIG. 2 below is a top view of an illustrative finger of a user on which a finger device has been placed; FIG. 3 is a cross-sectional side view of an illustrative finger device on the finger; FIG. 6 is a perspective view of an illustrative finger device being used to detect input on the finger device; FIG. 7 is a perspective view of an illustrative finger device being used to detect an adjacent finger.
Apple's patent FIG. 8 below is a perspective view of an illustrative desktop (iMac) computer with which a user may interact using one or more finger devices; FIG. 9 is a perspective view of an illustrative finger device being used to detect finger input to a user's hand.
Apple's patent FIG. 10 above is a perspective view of an illustrative finger device being used to detect finger input to a surface; FIG. 11 is a perspective view of an illustrative finger device being used to detect finger input as the user holds an Apple Pencil or other device.
Apple's patent FIGS. 12, 13, and 14 below are perspective views of an illustrative electronic device having a display and a finger device being used to provide input to the display.
Apple further outlined that, if desired, user input may include air gestures (sometimes referred to as three-dimensional gestures or non-contact gestures) gathered with sensors 18 (e.g., proximity sensors, image sensors, ultrasonic sensors, radio-frequency sensors, etc.). Air gestures (e.g., non-contact gestures in which a user's fingers hover and/or move relative to the sensors of the device and/or in which the device hovers and/or moves relative to external surfaces) and/or touch and/or force-based input may include multi-finger gestures (e.g., pinch to zoom, etc.).
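For the pinch-to-zoom example, the zoom factor can be thought of as the ratio of the current distance between two tracked fingertips to their distance when the gesture began. The Swift sketch below illustrates that math only; the Point3D type and coordinate values are assumed for the example.

```swift
// Minimal sketch of multi-finger pinch-to-zoom: the zoom factor is the ratio of the
// current fingertip spacing to the spacing at the start of the gesture.
struct Point3D { var x, y, z: Double }

func distance(_ a: Point3D, _ b: Point3D) -> Double {
    let dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z
    return (dx * dx + dy * dy + dz * dz).squareRoot()
}

/// Zoom factor relative to the finger spacing at the start of the gesture.
func zoomFactor(start: (Point3D, Point3D), current: (Point3D, Point3D)) -> Double {
    let initial = distance(start.0, start.1)
    guard initial > 0 else { return 1.0 }
    return distance(current.0, current.1) / initial
}

// Example: thumb and index finger move from 4 cm apart to 8 cm apart -> 2x zoom.
let start = (Point3D(x: 0, y: 0, z: 0), Point3D(x: 0.04, y: 0, z: 0))
let now   = (Point3D(x: 0, y: 0, z: 0), Point3D(x: 0.08, y: 0, z: 0))
print(zoomFactor(start: start, current: now))   // 2.0
```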
In some arrangements, a user may wear one or more finger devices on both hands, allowing for two-hand tracking. For example, finger devices on one hand may be used for detecting click or tap input and finger devices on the other hand may be used for detecting more complex finger gestures.
In some embodiments, a user may wear multiple finger devices on one hand (e.g., on a thumb and index finger) and these devices may be used to gather finger pinch input such as pinch click gesture input, pinch-to-zoom input, and/or pinch force input. For example, a pinch click input may be detected when a tap (e.g., a peak in an accelerometer output signal) for a thumb device correlates with a tap for an index finger device and/or pinch force input may be gathered by measuring strain gauge output with strain gauges in devices as the devices press against each other.
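A rough way to picture the correlation Apple describes: treat a pinch click as a thumb-device tap and an index-device tap whose accelerometer peaks land within a short time window of each other. The Swift sketch below is an assumption-laden illustration of that idea, not Apple's algorithm; the 50 ms window and TapEvent type are invented for the example.

```swift
import Foundation

// Minimal sketch: a pinch click is reported when tap peaks from a thumb-worn device
// and an index-finger device occur close together in time.
struct TapEvent {
    let device: String      // "thumb" or "index"
    let timestamp: TimeInterval
}

/// Returns true if a thumb tap and an index tap occur within `window` seconds of each other.
func isPinchClick(_ events: [TapEvent], window: TimeInterval = 0.05) -> Bool {
    let thumbTaps = events.filter { $0.device == "thumb" }
    let indexTaps = events.filter { $0.device == "index" }
    for t in thumbTaps {
        for i in indexTaps where abs(t.timestamp - i.timestamp) <= window {
            return true
        }
    }
    return false
}

// Example: the two devices report peaks 20 ms apart, which correlates as a pinch click.
let events = [TapEvent(device: "thumb", timestamp: 1.000),
              TapEvent(device: "index", timestamp: 1.020)]
print(isPinchClick(events))   // true
```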
Pinch force can also be detected by measuring the size of the contact patch produced when a finger presses against a two-dimensional touch sensor (a larger contact area being associated with a larger applied force). In other arrangements, pinch click gesture input and pinch force input may be gathered using only a single finger device (e.g., by measuring motion or forces of the finger pad or finger pulp of the finger wearing the finger device as the user pinches, presses, or taps on the finger pad with the thumb or another finger).
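The contact-patch approach could look something like the following Swift sketch, which counts activated touch-sensor elements and maps the area to a rough force value. The activation threshold and newtons-per-element calibration constant are invented for illustration.

```swift
// Minimal sketch: approximate pinch force from the number of touch-sensor elements a
// finger press covers, on the assumption that a harder press flattens the finger pad
// over a larger area. Calibration constants here are invented.

/// Counts activated elements in a 2D touch frame and maps the area to a rough force value.
func estimatedPinchForce(frame: [[Double]],
                         activationThreshold: Double = 0.3,
                         newtonsPerElement: Double = 0.12) -> Double {
    let activeElements = frame.flatMap { $0 }.filter { $0 > activationThreshold }.count
    return Double(activeElements) * newtonsPerElement
}

// Example: a light press covers 3 elements, a firmer press covers 7.
let lightPress: [[Double]] = [[0.0, 0.4, 0.0], [0.4, 0.5, 0.0], [0.0, 0.0, 0.0]]
let firmPress: [[Double]]  = [[0.4, 0.6, 0.4], [0.5, 0.9, 0.5], [0.0, 0.6, 0.0]]
print(estimatedPinchForce(frame: lightPress))   // 0.36
print(estimatedPinchForce(frame: firmPress))    // 0.84
```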
Consider, as an example, the use of a pinch gesture to select a displayed object associated with a user's current point-of-gaze. Once the displayed object has been selected based on the direction of the user's point-of-gaze (or finger point direction input) and based on the pinch gesture input or other user input, further user input gathered with one or more devices may be used to rotate and/or otherwise manipulate the displayed object. For example, information on finger movement (e.g., rotational movement) may be gathered using an inertial measurement unit or other sensor in the device(s), and this rotational input may be used to rotate the selected object.
In some scenarios, an object may be selected based on point-of-gaze (e.g., when a user's point-of-gaze is detected as being directed toward the object) and, following selection, object attributes (e.g., virtual object attributes such as virtual object appearance and/or real-world object attributes such as the operating settings of a real-world device) can be adjusted using strain gauge or touch sensor contact patch pinch input (e.g., detected pinch force between finger devices 10 that are being pinched together on opposing fingers) and/or can be adjusted using finger device orientation input (e.g., to rotate a virtual object, etc.).
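Putting the select-then-manipulate flow together, a minimal Swift sketch might look like the following, where a pinch while gazing at an object selects it and a subsequent IMU rotation delta is applied to it. All type and method names here are hypothetical.

```swift
import Foundation

// Minimal sketch of the select-then-manipulate flow: an object is selected when the
// gaze direction falls on it, and a rotation reported by the finger device's inertial
// measurement unit is then applied to the selected object.
struct DisplayObject {
    let name: String
    var rotationDegrees: Double = 0
}

final class GazePinchController {
    private var selected: Int?          // index into `objects`, if any
    var objects: [DisplayObject]

    init(objects: [DisplayObject]) { self.objects = objects }

    /// Select the object the user is looking at when a pinch gesture arrives.
    func pinch(whileGazingAt index: Int) {
        guard objects.indices.contains(index) else { return }
        selected = index
    }

    /// Apply a rotation delta from the finger device's IMU to the selected object.
    func rotate(byDegrees delta: Double) {
        guard let index = selected else { return }
        objects[index].rotationDegrees += delta
    }
}

// Example: gaze at the CAD model, pinch to select it, then twist the finger 30 degrees.
let controller = GazePinchController(objects: [DisplayObject(name: "CAD model")])
controller.pinch(whileGazingAt: 0)
controller.rotate(byDegrees: 30)
print(controller.objects[0].rotationDegrees)   // 30.0
```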
The finger devices, according to Apple, may one day work with other devices such as a MacBook, smartglasses, an iPhone, an iPad, a television, a vehicle and more.
Apple's patent application number 20210089131, which was published today by the U.S. Patent Office, was originally filed in July 2020.
Considering that this is a patent application, the timing of such a product to market is unknown at this time.