Apple won patents for keyboards that operate in both text & gesture modes, plus Input Location Correction based on device motion
Yesterday the U.S. Patent and Trademark Office officially published two granted patents from Apple: one relating to future keyboards that may be able to operate in both text & gesture modes, and another covering Input Location Correction based on device motion. The latter patent could also apply to CarPlay systems, helping users hit the intended target icon should the vehicle hit a bump in the road.
System For Improving User Input Recognition On Touch Surfaces
Apple engineers have been exploring a project since at least 2018 that relates to keys on a keyboard doubling as a form of trackpad. A second patent surfaced in 2020 and a third in 2021. Yesterday, Apple was granted another such invention.
Apple notes in the patent that a physical keyboard can collect user input in a typing mode or in a tracking (gesture) mode. In the tracking mode, movement data is first detected for a user's hand in relation to a physical keyboard at a first location. The keyboard could be part of a MacBook, an iMac, a future version of the iPad's Magic Keyboard and more.
According to one or more embodiments, a tracking mode may be triggered based on a user gesture detected by visual means or by user input to the physical keyboard.
For example, a particular key or combination of keys on the keyboard may trigger the tracking mode. The key or keys may be physically pressed or determined to be selected based on visual means by tracking a location of the hand in relation to the key or combination of keys. Further, a particular gesture may trigger the tracking mode, such as a touch or swipe on a predetermined part of the physical keyboard.
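The trigger logic described above can be sketched in a few lines. This is a hypothetical illustration, not Apple's implementation: the key combination, gesture region, and swipe threshold are all assumptions made for the example.

```python
# Hypothetical sketch of the patent's mode-switching logic.
# The trigger combo, region name, and threshold below are
# illustrative assumptions, not values from Apple's patent.

TRIGGER_COMBO = {"fn", "space"}     # assumed trigger key combination
SWIPE_REGION = "spacebar"           # assumed gesture region
SWIPE_MIN_DISTANCE_MM = 15.0        # assumed minimum swipe distance

def select_mode(pressed_keys, gesture=None):
    """Return 'tracking' when the trigger key combo is held, or when a
    qualifying swipe on the predetermined region is detected; otherwise
    stay in 'typing' mode."""
    if TRIGGER_COMBO <= set(pressed_keys):
        return "tracking"
    if gesture is not None \
            and gesture["region"] == SWIPE_REGION \
            and gesture["distance_mm"] >= SWIPE_MIN_DISTANCE_MM:
        return "tracking"
    return "typing"
```

In practice the `pressed_keys` set could come from physical key events or, as the patent notes, from visually tracking the hand's position over the keys.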
Apple's patent FIG. 1 below shows an example system setup (#100) in which techniques for utilizing a tracking mode on a physical keyboard are employed. In some embodiments, gaze tracking data may be obtained to identify a portion of a user interface (#160) at which the user's gaze is directed. Further, FIG. 1 illustrates a swiping action/gesture #130 on the spacebar; FIG. 6 shows a flowchart of a technique for using a combination of visual and contact sensor data to determine user input.
Apple's patent FIG. 9 above is a simplified block diagram of a network of computing devices which may be utilized to provide vision-based tracking on a physical keyboard. In some embodiments, this tracking comes by way of one or more cameras #930 and/or other sensors #915, such as one or more depth sensors. For more, review Apple's granted patent 11755124.
Input Location Correction Based On Device Motion
In another granted patent published yesterday, Apple describes techniques for improving the operation of electronic devices with touch-sensitive displays (touchscreens), such as an iPhone or a CarPlay system, when the device and/or the user's hand are subjected to forces that produce unintended motion of either one.
In particular, a touchscreen may be used in environments where either or both of the display or the user are subjected to forces that may cause a user to miss an intended touch target (e.g., a location on a touchscreen that the user intends to touch, such as a virtual button).
For example, when attempting to touch a virtual button on a touch-sensitive display in a vehicle (e.g., in the dashboard), a bump in the road may cause the user's hand to move in an unexpected manner, causing the user to touch the display in an area away from the virtual button (e.g., the user misses their intended touch target).
Similar problems may occur in many other contexts and with other devices. For example, bumps, impacts, and other forces may reduce touch-input accuracy when using a mobile phone as a passenger in a vehicle, or while attempting to provide touch inputs to a smart watch during a jog.
Apple's patent covers systems and techniques that can mitigate the effects of dynamic motion environments on touchscreen operation.
Information about absolute and relative motion of a touchscreen device and/or an input member may be captured or otherwise determined in various ways and using various sensing systems, including but not limited to accelerometers, inertial sensing systems, gravitometers, global positioning systems (GPS), gyroscopes, spatial sensors (e.g., for capturing data characterizing the presence, shape, and/or motion of three-dimensional objects in space), optical sensors (e.g., cameras), near- and/or far-field capacitive sensors, or the like.
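One simple way to picture such a correction: estimate how far the device (or hand) was displaced by recent acceleration, shift the reported touch point accordingly, then snap it to the nearest plausible target. The sketch below is a minimal illustration under those assumptions; the function names, the constant-acceleration displacement model, and the snap radius are inventions for this example, not details from Apple's patent.

```python
# Minimal sketch of motion-compensated touch input, assuming a
# constant-acceleration displacement model (x = 1/2 * a * t^2).
# All names and parameters here are illustrative assumptions.

def correct_touch(touch_xy, accel_xy, dt, gain=1.0):
    """Shift the reported touch point opposite to the displacement
    implied by device acceleration over the last dt seconds."""
    dx = 0.5 * accel_xy[0] * dt * dt * gain
    dy = 0.5 * accel_xy[1] * dt * dt * gain
    return (touch_xy[0] - dx, touch_xy[1] - dy)

def snap_to_target(corrected_xy, targets, radius):
    """Snap the corrected point to the nearest target center within
    radius; return None if no target is close enough."""
    best, best_d = None, radius
    for name, (tx, ty) in targets.items():
        d = ((corrected_xy[0] - tx) ** 2 +
             (corrected_xy[1] - ty) ** 2) ** 0.5
        if d <= best_d:
            best, best_d = name, d
    return best
```

A real system would fuse several of the sensors listed above rather than rely on a single accelerometer reading, but the basic idea, i.e. adjusting the registered touch location using motion data so the input lands on the intended target, is the same.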
Apple's patent FIGS. 1A-1B below illustrate a user attempting to touch an icon on their iPhone when motion makes the user miss the intended target icon; FIG. 3 illustrates an iPhone that uses a spatial sensor tracking the user's finger; FIGS. 2A-2B illustrate a location of a touch input on an electronic device being affected by motion of the electronic device and/or an input member.
Apple's patent FIG. 2C above illustrates a vehicle's CarPlay touchscreen receiving a touch input that may be misaligned in the case of the vehicle hitting a bump in the road; FIG. 8 illustrates an example process for compensating for effects of motion on input accuracy. For more on this, review Apple's granted patent 11755150.