On December 1, 2011, the US Patent & Trademark Office published a patent application from Apple that reveals one of the next chapters for Apple's hardware: non-visual controls. Apple's new invention also describes using Siri-like voice control assistance in future devices. This is likely the first of many patents covering this trend of advancing to next generation interfaces that will one day be used in our home appliances, built into our kitchen countertops, used in our vehicle dashboards and far beyond.
Non-Physical Buttons or Controls: Apple's Patent Background
Devices are typically operated by controls. Controls may be physical or non-physical. The location of physical controls may be detected by touch alone. For example, even when in the dark, a user could locate a light switch by feeling the wall around the location in which light switches are typically positioned.
Non-physical controls, on the other hand, can't be detected by touch alone. For example, an icon on a computer screen can't be detected by simply running one's fingers across the computer screen. In this example, the icon can't be physically detected because the screen location of the icon feels no different to the touch than the rest of the screen.
Users typically rely on sight to locate non-physical controls. For example, to select an icon that is displayed on a touch screen, a user would typically look at the screen to locate the icon, and then use visual feedback to guide the user's finger to that location.
When users aren't able to easily see the non-physical controls of a device, it may become difficult or impossible for them to operate the device. A variety of circumstances may lead to situations in which users aren't able to see non-physical controls of a device. For example, some users may be visually impaired, or lighting may be insufficient. As yet another example, some devices may not be able to generate visual depictions of non-physical controls, either because the devices are broken, or because they are not designed with that functionality.
Even when conditions exist that would otherwise allow a user to see a non-physical control, the user may have reasons for not looking at the device. For example, a user that is watching a movie or driving a vehicle may want their vision to remain focused elsewhere, rather than looking for the location of a non-physical control of a device. Similarly, a user may want to operate a device while keeping the device in his or her pocket.
Based on the foregoing, it is clearly desirable to help users operate non-physical controls when the users either can't or don't want to use their vision to locate the non-physical controls.
The approaches described in Apple's patent application are approaches that could be pursued, but not necessarily approaches that have been previously conceived or pursued.
Apple's invention relates to providing non-visual feedback to a user of a device and, in particular, providing non-visual feedback to assist the user in use of non-physical controls of the device.
In one example, Apple points to audio feedback, and the characteristic is volume. In such an embodiment, the volume of a tone emitted by the device may increase the closer the user input gets to the location of the non-physical control. Conversely, the volume of the tone would decrease as the user input moves away from the location of the non-physical control.
According to one embodiment, when the user input is sufficiently close to the control for the user to operate the control, that fact is reflected in the non-visual feedback that is generated by the device. For example, the device may generate a "beep" and cease to emit the tone when a user's finger touches an icon. At that point, the user may operate the control as desired.
Audio feedback is merely one example of non-visual feedback that may be used by the device to assist the user in locating non-physical controls of the device. Instead of or in addition to audio feedback, the device may provide tactile feedback. For example, a device that has a vibrator may vibrate to help the user locate a non-physical control, where the intensity of the vibration is based on the distance between the current location of user input and the location of the non-physical control.
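The distance-to-intensity mapping described above can be sketched in a few lines. This is our own illustrative Python, not code from the patent; the function name, the linear ramp, and the 100-point feedback radius are all assumptions chosen for clarity.

```python
import math

FEEDBACK_RADIUS = 100.0  # assumed radius (in points) of the feedback circle

def feedback_intensity(touch, control):
    """Return a 0.0-1.0 intensity (tone volume or vibration strength)
    that grows as the touch point nears the control's location, and is
    zero outside the feedback circle."""
    distance = math.hypot(touch[0] - control[0], touch[1] - control[1])
    if distance > FEEDBACK_RADIUS:
        return 0.0  # outside the feedback circle: no feedback at all
    # Linear ramp: full intensity at the control, zero at the circle's edge.
    return 1.0 - distance / FEEDBACK_RADIUS
```

So a finger 50 points from the control would get half intensity, and one 100 or more points away would get none; a real implementation could just as easily use a logarithmic ramp or discrete steps.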
Devices with Non-Physical Controls
Apple states that there are many types of devices that have non-physical controls. For example, a device may have a large track pad with no screen, where the non-physical controls are "located" but not displayed at particular locations on the track pad.
A Device Configured to Generate Non-Visual Feedback
Apple's patent FIG. 1 is a block diagram of a device that is configured to generate non-visual feedback to assist a user in locating a non-physical control. In the example, the device is a computing device 102 with a screen 104 that is displaying a control 106. For the purpose of explanation, it shall be assumed that computing device 102 receives user input by a user touching the screen 104 with a finger.
Radius Feedback and the Activation Zone
Apple states that the feedback radius (108), shown in the patent graphic below, is the radius of a circle that is centered on control 106. If the current location of user input is within the circle defined by the feedback radius, then the computing device generates non-visual feedback to indicate that the current location of user input is near the control.
For the purpose of illustration, it shall be assumed that the non-visual feedback is a sound. Under these circumstances, the computing device would generate a sound when the current location of user input is at either of points 114 and 116, and wouldn't generate a sound when the current location of user input is at point 112.
While the current location of user input is within the circle defined by the feedback radius, the computing device varies a characteristic of the non-visual feedback based on the distance between (a) the current point of user input and (b) the location of the control.
For the purpose of explanation, it shall be assumed that the computing device is designed to vary the volume of the sound based on the distance between the current location of user input and the location of the control. In an embodiment that increases volume as the distance between the current location of user input and the location of the control decreases, the computing device will generate a relatively louder sound when the user input is at point 116 than when the user input is at point 114.
According to one embodiment, the non-visual feedback is altered when the current location of user input is sufficiently close to the control to allow user activation of the control. The region within which a user is able to activate a control is referred to herein as the "activation zone" for the control.
In one embodiment, the computing device may cease to generate a sound when in the activation zone, and instead emit a single "ding". Alternatively, the device may vibrate when the user input enters the activation zone. These are merely two examples of a virtually unlimited number of ways in which the computing device may communicate to a user using non-visual feedback.
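Putting the feedback radius and the activation zone together gives a simple three-state classification for any touch point. The sketch below is our own reading of the embodiment, with assumed radii and event names:

```python
import math

FEEDBACK_RADIUS = 100.0   # outer circle: graded non-visual feedback (assumed)
ACTIVATION_RADIUS = 20.0  # inner "activation zone" radius (assumed)

def feedback_event(touch, control):
    """Classify a touch point relative to one control:
    'ding'   - inside the activation zone; the tone ceases and a single
               chime (or vibration pulse) signals the control is operable
    'tone'   - inside the feedback circle; a continuous tone whose volume
               varies with distance to the control
    'silent' - outside the feedback circle entirely"""
    distance = math.hypot(touch[0] - control[0], touch[1] - control[1])
    if distance <= ACTIVATION_RADIUS:
        return "ding"
    if distance <= FEEDBACK_RADIUS:
        return "tone"
    return "silent"
```

In the patent's FIG. 1 terms, point 112 would classify as "silent" and points 114 and 116 as "tone", with "ding" reserved for a touch landing on the control itself.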
Overlap Zones and the use of Apple's Siri
Apple discusses what happens when the feedback circles of two controls overlap. In the example illustrated in FIG. 2, the feedback radius 108 of control 106 noted in the first patent graphic is the same length as the feedback radius 208 of control 206 shown below. However, in alternative embodiments, different controls may have different feedback radius lengths. In such an embodiment, the feedback radius lengths may vary, for example, based on the frequency at which the respective controls are used or expected to be used.
When the computing device is generating multiple controls that have their own feedback radius, it is possible for there to be zones in which the circles defined by those feedback radii overlap. For example, point 212 is within an overlap zone that is within the feedback radii of both control 106 and control 206.
According to one embodiment, the computing device provides non-visual feedback to indicate to a user that the current location of user input is within an overlap zone. For example, when the user's finger is touching point 212, the computing device may audibly inform the user that "Okay button to the left. Open button to the right."
In another example of using Siri, Apple states that rather than generate no non-visual feedback when the current location of user input is at point 112, the computing device may generate non-visual feedback to communicate which controls are currently being displayed on screen 104, and where those controls are generally located. For example, in response to detecting that the user is touching point 112, computing device 102 may audibly indicate "Okay button is at the left middle, and Open button is at the right middle".
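The overlap-zone behavior amounts to finding every control whose feedback circle contains the touch point and composing a spoken hint for each. Here's a minimal sketch under the same assumed 100-point radius; the control labels and left/right phrasing are illustrative, not Apple's:

```python
import math

FEEDBACK_RADIUS = 100.0  # assumed feedback radius shared by all controls

def announce_nearby(touch, controls):
    """controls: dict mapping a spoken label to an (x, y) location.
    Returns a spoken-style hint naming every control whose feedback
    circle contains the touch point, with a rough direction."""
    hints = []
    for label, (cx, cy) in controls.items():
        if math.hypot(touch[0] - cx, touch[1] - cy) <= FEEDBACK_RADIUS:
            direction = "to the left" if cx < touch[0] else "to the right"
            hints.append(f"{label} button {direction}.")
    return " ".join(hints)
```

With a touch at an overlap point between an "Okay" control on the left and an "Open" control on the right, this produces the patent's example utterance: "Okay button to the left. Open button to the right."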
To Infinity and Beyond! Okay, to the Kitchen and Far Beyond!
While the next great thing from Apple may very well be a true next generation HDTV, Apple is already onto projects that go well beyond that into the future. They may not be priorities just yet, but they're on Apple's roadmap nonetheless. How do I know that? Well, about a year ago, Apple acquired a series of around twenty patents from a Canadian by the name of Timothy Pryor. One of the patents was titled "Control of Appliances, Kitchen and Home." (If our link to that patent ever dies, the patent number for reference's sake is 20100231506.)
The middle graphic illustrated below is from that very patent. The image below that one is from Jackie Chia-Hsun Lee of MIT, who did research he titled "Spatial User Interfaces: Augmenting Human Sensibilities in a Domestic Kitchen." Both Tim Pryor's patent and Mr. Lee's research cover the future kitchen that will utilize specialty projectors and smart instructional countertops and appliances. Obviously there's a race to patent all things related to next generation kitchens, and Apple is now in that race.
Another patent by Tim Pryor that Apple acquired touches on advancing a car's dashboard in ways that could assist future Apple devices.
The use of smart countertops and kitchen appliances will in part use multi-touch technology, but also voice control assistance. Apple's patent application published today is very vague about where these new non-visual controls will be utilized. Of course in round one, the new controls will work with Apple's current products in one way or another. But in round two, the technology promises to go well beyond what we can currently understand. So when Apple states in their patent that "...the techniques described herein are not limited to any particular type of device" - they really mean it!
Apple's patent application was originally filed in Q3 2010 by inventor Amir Djavaherian.
Notice: Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. Revelations found in patent applications shouldn't be interpreted as rumor or fast-tracked according to rumor timetables. Apple's patent applications have provided the Mac community with a clear heads-up on some of Apple's greatest product trends including the iPod, iPhone, iPad, iOS cameras, LED displays, iCloud services for iTunes and more. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.
Here are a Few Sites Covering this Original Report: MacSurfer, Twitter, Facebook, Apple Investor News, Google Reader, CeArab, Macnews, iPhone World Canada, MarketWatch, MacDailyNews, and more