Today, the U.S. Patent and Trademark Office published an Apple patent application revealing that advanced motion-based gestures for Apple Watch may work their way to market over time. Some of the advanced gestures will eventually cover sign language, according to Apple. Today, when a user raises their arm to check the time or glance at an app, the Apple Watch goes from sleep mode to fully awake. Today's invention goes well beyond that basic motion command to gestures such as opening your car door or answering a call.
Apple's Patent Background
Many existing portable electronic devices use voice or touch input as a method for the user to communicate commands to the devices or to control the devices. One example is a voice command system, which can map specific verbal commands to operations, for example, to initiate dialing of a telephone number by speaking the person's name. Another example is a touch input system, where the user can choose a specific device setting, such as adjusting the volume of the speakers, by touching a series of virtual buttons or performing a touch gesture. While voice and touch input can be an effective way to control a device, there may be situations where the user's ability to speak the verbal command or perform the touch gesture may be limited.
Apple's invention relates to a device like Apple Watch that detects a user's motion and gesture input through the movement of one or more of the user's hand, arm, wrist, and fingers, for example, to provide commands to the device or to other devices.
The device can be attached to, resting on, or touching the user's wrist, ankle or other body part. One or more optical sensors, inertial sensors, mechanical contact sensors, and myoelectric sensors, to name just a few examples, can detect movements of the user's body. Based on the detected movements, a user gesture can be determined.
The device can interpret the gesture as an input command and perform an operation based on that command. By detecting movements of the user's body and associating them with input commands, the device can receive user input through a means in addition to, or instead of, voice and touch input. Apple's patent FIGS. 3A-3H noted below illustrate exemplary finger and wrist movements.
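To make the gesture-to-command idea concrete, here is a minimal sketch of how a detected gesture might be looked up and translated into a device command, along the lines of the gesture/command table in FIG. 9A. All type names, gestures and commands below are illustrative assumptions, not Apple's actual API or the patent's literal claims.

```swift
// Hypothetical gestures a wrist-worn device might recognize.
enum Gesture {
    case fist, openHand, wristFlexion, wristExtension
}

// Hypothetical device commands the gestures could map to.
enum Command {
    case answerCall, dismissCall, volumeUp, volumeDown
}

// A simple lookup table associating each gesture with a command,
// in the spirit of FIG. 9A's gesture-to-command mapping.
let commandMap: [Gesture: Command] = [
    .fist: .answerCall,
    .openHand: .dismissCall,
    .wristExtension: .volumeUp,
    .wristFlexion: .volumeDown
]

// Interpret a detected gesture as an input command, if one is defined.
func command(for gesture: Gesture) -> Command? {
    commandMap[gesture]
}
```

In a real system the mapping would likely be configurable per app or per user, but a lookup of this kind captures the patent's core idea of associating detected movements with input commands.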
Apple's patent FIG. 9A noted below illustrates exemplary gestures and corresponding commands; FIGS. 9D-9E illustrate exemplary hand and wrist movement.
Apple's patent FIG. 9B noted below illustrates an exemplary process flow for determining a command based on the user's movement; FIGS. 9F-9H illustrate exemplary finger movements associated with sign language.
Apple's patent FIG. 4 below illustrates an exemplary configuration of a wearable device attached to the wrist of a user. A user's arm includes fingers, a hand and wrist. Apple Watch (Device #400) can be attached to, resting on, or touching a user's skin at any body part, such as the user's wrist. Muscles #430 can be attached to the bones in fingers #402 through tendons #410.
When a user wants to perform any one of the movements illustrated in FIGS. 3A-3H noted in our first graphic, the brain sends electrical signals that stimulate the muscles, which contract in response. The tendons attached to those muscles contract or move in turn, causing the fingers, wrist and hand to move. As the tendons contract or move, the Apple Watch can detect the tendon movement, the electrical signal, or both, and based on either or both can determine the user's motion and gesture. The motion and gesture can then be interpreted as commands to the device or another device.
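The patent notes the device may rely on tendon movement, the electrical signal, or both. The following sketch shows one simple way such a decision could be structured, with a sample combining a tendon-displacement reading (e.g. from an optical sensor) and a muscle-activity reading (e.g. from a myoelectric sensor). The structure, field names and thresholds are all illustrative assumptions.

```swift
// A hypothetical combined reading from the device's sensors.
struct SensorSample {
    let tendonDisplacement: Double  // e.g. from an optical sensor
    let muscleSignal: Double        // e.g. from a myoelectric sensor
}

// Decide whether a gesture occurred: either channel crossing an
// assumed threshold counts, reflecting the patent's "either the
// tendon movement or electrical signal or both" language.
func gestureDetected(_ sample: SensorSample,
                     tendonThreshold: Double = 0.5,
                     signalThreshold: Double = 0.7) -> Bool {
    sample.tendonDisplacement > tendonThreshold ||
        sample.muscleSignal > signalThreshold
}
```

A shipping implementation would presumably use trained classifiers over time-series data rather than fixed thresholds, but the sketch illustrates how two independent sensing channels can back each other up.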
Apple's patent FIG. 5A noted above illustrates a cross-sectional view of a wrist and an exemplary device with motion and gesture sensing using optical sensors; FIG. 5B illustrates a top view of a wrist and an exemplary device with motion and gesture sensing using optical sensors.
Apple's patent FIG. 10 noted below illustrates an exemplary block diagram of a computing system comprising one or more motion and gesture sensors for determining a user's gesture or motion; FIG. 9B illustrates an exemplary process flow for determining a command based on the user's movement; and lastly patent FIG. 6 illustrates a plan view of an exemplary device with motion and gesture sensing using inertial sensors.
Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.
Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details.