A new Google patent describes the possible use of hand & finger gestures on a user's forearm to control the watch UI
Back in Q4 2022, the Apple Watch was the runaway leader worldwide with a 27.5% market share. Google watches came in a distant second with an 8% share, a position largely due to Google's acquisition of Fitbit. Google and Fitbit will be looking for ways to differentiate their watches from Apple's, and a patent filing published in Europe earlier this month indicates that they're exploring the use of finger and hand gestures on a user's forearm to better control their respective watch user interfaces.
Touch Controls for Google Watch & Fitbit
In Google's patent background, they note that wearable devices, such as smart watches, may have touch screen displays that enable a user to interact with the device using touch gestures, such as a click, a double click, or a scroll. Touch screens are often sized comparably to the user's wrist, and it is not uncommon for a touch screen to have a dimension of less than 2 inches (i.e., < 50 millimeters). This size can limit what is displayed for touch interaction. For example, the total number of icons simultaneously displayed on the touch screen of a smart watch may be limited so that each icon remains large enough for a user to conveniently touch. Increasing the touch area by physically expanding the touch screen, or limiting what is displayed on the screen at any given time, may not be desirable to a user. Accordingly, the functionality of the smart watch can be limited by the area provided for touch interaction.
Google's invention generally describes a method for controlling a wearable device that includes receiving light at a detector on the wearable device, where the received light includes a focused-light component and a stray-light component.
Identifying a gesture in response to detecting a touch may generally comprise analyzing the waterfall image for the presence of pixel values indicative of a gesture. In this context, the method may take into account that different types of gestures, and different individual gestures, result in different waterfall images, each characteristic of a particular type of gesture or a certain gesture. The gesture classifier may, for example, be configured to recognize a pattern in the waterfall image corresponding to the gesture. In a possible implementation, the gesture classifier may be configured to recognize different types of gestures on the basis of different types of stored (and previously learned) reference patterns corresponding to the different types of gestures.
Google's patent FIG. 2 below illustrates a bottom view of the smart watch, which includes a photoplethysmography (PPG) sensor that can be configured to measure a user's (i.e., wearer's) heart rate, like the Apple Watch. The PPG sensor includes one or more illuminators (e.g., LEDs #210) and one or more detectors (e.g., photodiodes #220). The LEDs can be configured to transmit focused light towards a user's wrist. FIG. 3 illustrates a cross-section side view of a smart watch configured for photoplethysmography.
Google's patent FIG. 5 above illustrates a touch gesture made on the wrist of a user wearing a smart watch, according to a possible implementation of the present disclosure. The gesture may produce reflected light (#501) that can be received at the detector of the watch's PPG sensor.
Google's patent FIG. 6 above illustrates a filtering block configured to isolate a stray-light component of back-reflected light.
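The filing doesn't specify how the filtering block of FIG. 6 separates the stray-light component from the focused-light component. One plausible reading is that the focused-light (heart-rate) signal varies slowly relative to the transient stray light produced by a touch, so a simple low-pass/high-pass split would do. The moving-average filter and window size below are assumptions for illustration only.

```python
import numpy as np

def split_components(samples: np.ndarray, window: int = 16):
    """Split a 1-D detector sample stream into an estimated
    focused-light component (slowly varying, PPG-dominated) and a
    stray-light residual (fast transients, e.g., from a touch).
    The filter design is an illustrative guess, not the patent's."""
    kernel = np.ones(window) / window
    # Low-pass (moving-average) estimate of the focused-light component.
    focused = np.convolve(samples, kernel, mode="same")
    # Whatever the low-pass filter removes is treated as the
    # stray-light residual that the gesture detector would analyze.
    stray = samples - focused
    return focused, stray
```

Stacking successive windows of the `stray` signal over time would yield something like the "waterfall image" the patent's gesture classifier operates on.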
Lastly, Google notes in their filing that in these implementations, the light from the light source may be directed to a body area (e.g., wrist, forearm) used for touch control rather than directed to the body area below (i.e., under, covered by, etc.) the wearable device.
Google's patent application 21795230 was published in Europe on May 10, 2023.
While the Google patent points to the gestures being made on the user's forearm, the patent figures suggest that this is accomplished by the PPG sensor picking up gesture signals on the user's skin. There was no clarification as to whether the gestures could be made through a user's shirt or blouse sleeve.