The US Patent and Trademark Office officially published a series of 41 newly granted patents for Apple Inc. today. In this particular report we cover a major win for an advanced sensor-based user interface that could apply to a future iPhone, desktop display/monitor or even a television. The invention, if implemented, would enable a user to interact with a display at quite a distance. In a desktop application, as a user moves physically away from their desktop, the system will automatically switch from mouse controls to hand-gesture controls if the user so chooses. In theory, if applied to a TV scenario, the user would be able to use hand gesturing to control TV functionality without a remote. With new 3D depth cameras coming to market in late 2014, Apple may be able to further fine-tune such a unique system.
Apple Granted Patent for a Sensor-Based User Interface
Apple has been granted a major patent today for their invention that generally relates to computer user interface systems and methods, and more specifically to computer user interface systems and methods employing distance, depth and/or proximity sensors.
Although the system may apply to a future iPhone or iPod touch, the example actually used in their patent filing is that of a desktop computer display. Back in 2008, a TV wasn't a priority with Apple, yet this technology could equally apply to such a device if Apple wanted it to in the future.
Apple states that in some embodiments, a user gesture context may be determined based on information or user parameters detected by one or more sensors. In such embodiments, one or more operations of the computer may be performed based on the determined user gesture context.
For example, such operations may include, but are not limited to, scrolling, selecting, zooming, or the like. In various embodiments, such operations may be applied to an active application of the computer.
In general, user gesture contexts may be employed to allow a user to operate the computer remotely via the one or more sensors, without a remote controller or other auxiliary user manipulated input device.
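The patent describes the idea only at this level of generality, but the core of it, mapping a detected gesture context to an operation on the active application, can be sketched roughly as follows. Every gesture and operation name here is hypothetical; the patent does not enumerate specific mappings.

```python
# Hypothetical sketch: dispatch a recognized gesture context to a UI
# operation on the active application. The gesture and operation names
# are invented for illustration and are not taken from Apple's patent.

OPERATIONS = {
    "swipe_up": "scroll_up",
    "swipe_down": "scroll_down",
    "pinch_in": "zoom_out",
    "pinch_out": "zoom_in",
    "point_and_hold": "select",
}

def dispatch(gesture: str) -> str:
    """Map a recognized gesture to an operation, or ignore it."""
    return OPERATIONS.get(gesture, "ignore")
```

In a real system the dispatch table would presumably vary with the active application, so the same gesture could scroll a document in one context and change channels in another.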
The Sensor-Based User Interface for the Desktop
In Apple's patent FIG. 1 noted below we see a schematic representation of a computer which includes a sensor-based user interface. Apple notes that the method of implementing the sensor-based UI may include sensors (#110) built into the bezel and/or frame of a computer monitor. Apple notes that a single sensor can be dedicated to detecting a user's presence/absence and/or distance from the display.
Two examples of presence and distance are noted below in patent figures 6A/6B and 7A/7B. When a user moves away from their computer, the media on the display may be enlarged to provide information that is usable/viewable from an increased distance. Mouse controls used in close proximity to the computer will automatically shift to gesture controls at a distance.
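The distance-based behavior described above can be sketched as a simple threshold: below a hand-off distance the system stays with mouse input at normal scale, and beyond it the system switches to gesture input while scaling up on-screen media so it remains readable. The threshold and scale limits below are invented for illustration; the patent names no specific values.

```python
# Hypothetical sketch of distance-based mode switching. The 1.0 m
# hand-off distance and 3x scale cap are assumptions, not patent details.

MOUSE_RANGE_M = 1.0  # assumed distance at which mouse control hands off

def interface_mode(distance_m: float) -> dict:
    """Pick the input method and UI scale for a measured user distance."""
    if distance_m <= MOUSE_RANGE_M:
        return {"input": "mouse", "ui_scale": 1.0}
    # Beyond mouse range: gesture control, media enlarged with distance.
    return {"input": "gesture", "ui_scale": min(3.0, distance_m / MOUSE_RANGE_M)}
```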
Presence Sensors will Automatically Control Privacy
Another interesting feature of this new system is that user content may be automatically hidden or revealed based on presence, as noted below. The system may also enable or disable applications based on a user's presence or absence.
According to Apple's patent, private or sensitive information may only be viewable when the user presence context is that a single user is present, defining a private user presence context. Such information may be automatically hidden, for example, when a change in the user presence context is determined by detecting the presence of another person in a vicinity of the computer system (or the monitor thereof, for example).
Closing or hiding windows based on such a change in user presence context to a public user context may prevent the private or sensitive information from being viewed by others, even if the user is unaware of the other person's presence or forgets that the information is viewable, private or sensitive.
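The privacy behavior amounts to filtering what is displayed by the current presence context: a single detected user defines a private context in which everything may be shown, while a second detected person switches to a public context in which flagged windows are hidden. The window model and the `private` flag below are hypothetical conveniences, not details from the patent.

```python
# Hypothetical sketch: hide windows flagged as private whenever the
# presence sensors report more than one person near the display.

def visible_windows(windows: list[dict], people_present: int) -> list[dict]:
    """Return the windows that may be shown in the current presence context."""
    if people_present <= 1:  # private context: single user (or none)
        return windows
    # Public context: suppress anything marked private.
    return [w for w in windows if not w.get("private", False)]
```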
The Types of Sensors Associated with this New UI
Apple notes that the sensors used in the systems and methods describing a sensor-based UI may be of any suitable type, either currently known or hereafter developed, that is configured to detect one or more of depth, distance, proximity, presence, or the like. For example, approaches such as near field radio frequency (RF), ultrasonic, infrared (IR), antenna diversity, or the like may be employed.
This list is not intended to be exhaustive, and it should be understood that other sensors may be employed as well, including, but again not limited to, visible light sensors, ambient light sensors, and mechanical vibration sensors.
Apple states that for gesture extraction, it should be understood that the sensor system and/or context engine may be calibrated and trained to learn a specific set of gestures, for example, from a specific user.
Such a process may be adaptive, for example, beginning with a relatively small set of gestures that are more readily detectable or recognizable/distinguishable, and building a user-dependent gesture database with more complex gestures through training and/or use.
Gestures may have time and/or sequence dependence, for example, such that a specific sequence of detected user movements may be interpreted as a gesture. Gestures may also be distance dependent, for example, such that gestures include a distance component. A user will also be able to tap into known multitouch gestures now used on iDevices.
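The adaptive, sequence-dependent recognition described above can be sketched as a gesture database keyed by ordered sequences of detected movements, which starts small and grows as the user trains new, user-specific gestures. The movement names and the dictionary representation are invented for illustration; the patent does not specify a data structure.

```python
# Hypothetical sketch: sequence-dependent gestures stored in a database
# that begins with a few readily distinguishable entries and grows
# through training. All movement names here are invented.

gesture_db = {
    ("hand_up", "hand_down"): "scroll",
    ("hand_left", "hand_right", "hand_left"): "dismiss",
}

def train(sequence: tuple, name: str) -> None:
    """Add a user-specific gesture learned during a training session."""
    gesture_db[sequence] = name

def recognize(sequence: tuple):
    """Look up a detected movement sequence; None if it isn't a gesture."""
    return gesture_db.get(sequence)
```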
In Apple's patent FIG. 3 noted below we see a schematic block diagram illustrating an example of a sensor-based user interface system for a computer system.
The user context engine may also be configured to "learn" or to be "trained" to recognize various user contexts and/or changes in user contexts, for example, by executing a training algorithm with a user providing various inputs via the sensor(s). In such a manner, the user context database may be populated or "educated."
As appropriate or desired, the user context database may be populated for specific users so that the sensor-based user interface is tailored, for example, to the characteristics and/or mannerisms of the particular user to better define the user parameters to be detected by the sensor(s).
An Overview of the Sensor-Based User Interface
In Apple's patent FIG. 4 below we see a block diagram illustrating a more detailed example of the sensor-based user interface system shown in FIG. 3.
Apple notes that in the sensor-based computer user interface systems, the control of and/or response by the computer system may be at an operating system level or at an application level.
For example, on the operating system level, the operating system may increase screen brightness if the user is determined to be far from the computer display. The operating system may transfer control of a displayed pointer from a mouse, for example, when the user is near, to gesture detection, for example, when the user is far.
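The brightness example can be sketched as a simple function of measured distance, clamped to the panel's range so a far-away user gets a brighter screen. The linear curve and constants below are purely illustrative; the patent describes only the general behavior, not a formula.

```python
# Hypothetical sketch: raise screen brightness as the user moves away,
# capped at full brightness. The base level and slope are assumptions.

def brightness_for_distance(distance_m: float, base: float = 0.5) -> float:
    """Return a brightness level in [base, 1.0] that grows with distance."""
    return min(1.0, base + 0.1 * distance_m)
```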
Apple credits David Falkenburg, Aleksandar Pance and Jason Medeiros as the inventors of this granted patent which was originally filed in Q3 2008 and published today by the US Patent and Trademark Office. To review today's granted patent claims and details, see Apple's patent.
Apple Granted Two Design Patents Today
On another note, Apple was granted two design patents today which cover the iBook Author logo and Apple's original iPad cover.
Patently Apple presents only a brief summary of granted patents with associated graphics for journalistic news purposes as each Granted Patent is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any Granted Patent should be read in its entirety for full details.