Apple Granted another Major 3D Sensing Patent that could be used with Future Macs, Apple TV and beyond
The U.S. Patent and Trademark Office officially published a series of 29 newly granted patents for Apple Inc. today. In this particular report we cover another fascinating PrimeSense patent that Apple has inherited, one covering a 3D sensing device that uses distinct gestures to control items on a display or TV. The patent points to this invention working with items like music, movies and games, which makes it a strong candidate for working with Apple's iTunes on a Mac and with Apple TV's menus. In-air gestures would allow a user to select a tune to play in iTunes, click an icon to open a channel like Netflix, or choose a movie or TV channel, all without the need for a remote control.
Granted Patent: Combining Explicit Select Gestures and a TimeClick Function in a Non-Tactile 3D UI
Apple's newly granted patent covers their invention relating to user interfaces for computerized systems, and specifically to user interfaces that are based on three-dimensional sensing.
Apple's granted patent comes by way of their PrimeSense acquisition. According to the patent, the invention covers a method that includes: presenting, by a computer, multiple interactive items on a display coupled to the computer; receiving, from a depth sensor, a sequence of three-dimensional (3D) maps containing at least a hand of a user of the computer; detecting, in the maps, an explicit select gesture performed by the user toward one of the interactive items; selecting that item in response to the explicit select gesture; and then actuating a TimeClick functionality for subsequent interactive item selections to be made by the user.
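The claimed two-mode flow can be sketched as a short loop: the first selection requires an explicit gesture, after which the system switches to TimeClick for subsequent selections. This is purely an illustrative reading of the claim; the event format, function names and hold-time value are our assumptions, not Apple's or PrimeSense's code.

```python
# Hypothetical sketch of the claimed two-mode selection flow. The event
# tuples, names and HOLD_TIME value are illustrative assumptions.

def process_frames(frames, items):
    """frames: list of events, each either ("push", item_index) for an
    explicit select gesture, or ("dwell", item_index, seconds) for a
    hand held steady over an item. Returns the selected items in order."""
    HOLD_TIME = 1.0              # assumed TimeClick hold-time parameter
    timeclick_engaged = False
    selected = []
    for event in frames:
        if not timeclick_engaged and event[0] == "push":
            selected.append(items[event[1]])
            timeclick_engaged = True   # subsequent selections via TimeClick
        elif timeclick_engaged and event[0] == "dwell" and event[2] >= HOLD_TIME:
            selected.append(items[event[1]])
    return selected
```

Note how a dwell before the first explicit select is ignored: in this reading, TimeClick is only actuated after the explicit gesture has been detected.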
There is also provided, in accordance with an embodiment of the present invention, an apparatus including a depth sensor and a computer executing a non-tactile three-dimensional (3D) user interface.
In Apple's patent FIG. 1 we're able to see a pictorial illustration of a non-tactile three dimensional (3D) user interface #20 for operation by a user #22 of a computer #26. The 3D user interface is based on a 3D sensing device #24 (also referred to herein as a depth sensor) coupled to the computer, which captures 3D scene information of a scene that includes the body (or at least a body part, such as one or more hands) of the user. The depth sensor device or a separate camera (not shown in the figures) may also capture video images of the scene. The information captured by the depth sensor device is processed by the computer, which drives a display so as to present and manipulate on-screen interactive items #38.
In patent FIG. 2 noted below we're able to see a schematic, pictorial illustration showing the visualization and interaction regions associated with the non-tactile 3D user interface.
Various methods may be used to determine when a body part such as a hand has crossed interaction surface 46 and where it is located. For simple tasks, static analysis of the 3D locations of points in the depth map of the body part may be sufficient.
Alternatively, dynamic, velocity-based detection may provide timelier, more reliable results, including prediction of and adaptation to user gestures as they occur. Thus, when a part of the user's body moves toward the interaction surface for a sufficiently long time, it is assumed to be located within the interaction region and may, in turn, result in objects being moved, resized, rotated or otherwise controlled, depending on the motion of the body part.
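The velocity-based rule described above, that a hand sustaining motion toward the interaction surface is treated as having crossed it, can be sketched in a few lines. The frame rate, speed threshold and duration below are assumed values for illustration, not figures from the patent.

```python
# Illustrative sketch of velocity-based crossing detection; frame rate,
# thresholds and the coordinate convention are assumptions.

def crossed_interaction_surface(z_samples, fps=30.0,
                                min_speed=0.1, min_duration=0.3):
    """z_samples: hand distance (meters) from the sensor per frame,
    decreasing as the hand approaches the interaction surface.
    Returns True if the hand moved toward the surface at >= min_speed
    (m/s) for at least min_duration seconds of consecutive frames."""
    dt = 1.0 / fps
    needed = int(round(min_duration * fps))  # consecutive frames required
    streak = 0
    for prev, cur in zip(z_samples, z_samples[1:]):
        speed_toward = (prev - cur) / dt     # positive when approaching
        streak = streak + 1 if speed_toward >= min_speed else 0
        if streak >= needed:
            return True
    return False
```

Requiring a sustained streak rather than a single fast frame is what makes the detection robust to depth-map noise, which is the advantage the patent attributes to the dynamic approach.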
Apple's patent FIG. 3 is a flow diagram that schematically illustrates a method of activating TimeClick functionality (also referred to herein as a TimeClick engaged mode). The TimeClick gesture comprises a user keeping a hand or hands relatively steady for a specific period of time while the computer is highlighting the subsequent interactive item. The specific period of time is also referred to herein as a hold-time or a hold-time parameter. Examples of virtual input devices include but are not limited to on-screen keyboards, keypads and game controls. For more on the on-screen Keyboard see our full report here.
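The TimeClick gesture described above is essentially dwell-based clicking: a click fires once the hand has stayed within a small jitter tolerance for the hold-time. A minimal sketch, with the jitter radius, hold-time and frame rate as assumed values:

```python
# A minimal dwell-detection sketch of the TimeClick idea; jitter radius,
# hold-time and frame rate are assumed values for illustration only.
import math

def timeclick(positions, fps=30.0, hold_time=1.0, jitter=0.02):
    """positions: (x, y) hand coordinates per frame (meters).
    Returns the frame index at which a TimeClick fires (hand stayed
    within `jitter` of its anchor point for `hold_time` seconds),
    or None if no click occurs."""
    needed = int(round(hold_time * fps))
    anchor, start = positions[0], 0
    for i, (x, y) in enumerate(positions):
        if math.hypot(x - anchor[0], y - anchor[1]) > jitter:
            anchor, start = (x, y), i      # hand moved: restart the timer
        elif i - start + 1 >= needed:
            return i                       # hand held steady long enough
    return None
```

The hold-time parameter here plays the role of the patent's hold-time: shortening it makes selection faster but more prone to accidental clicks, which is presumably why the patent treats it as a tunable parameter.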
Examples of explicit select gestures that can be used to select the highlighted interactive item include, but are not limited to, Grab, Pull and Push gestures. The invention could be designed to also work with Apple TV as the patent notes that "in further embodiments the given interactive item is associated with a media item (e.g., a music track or a movie), and selecting a given interactive item comprises playing a media file associated with the given interactive item."
In Apple's patent FIGS. 6A and 6B we're able to see schematic pictorial illustrations of a computer presenting a flashlight cursor #100. The inventors selected a flashlight metaphor for the cursor since positioning the flashlight cursor is similar to aiming a flashlight. Typically, when an individual wants to illuminate a given physical region (e.g., an area on a wall) with a flashlight, the individual first turns on the flashlight, repositions the flashlight so that the given region is illuminated, and then holds the flashlight steady in order to keep the given region illuminated.
In embodiments of the present invention, the user can select a given interactive item in a similar manner, by first pointing a hand toward the display in order to activate the flashlight cursor and then moving the hand in a transverse motion (i.e., along X-Y plane 40) in order to reposition the flashlight cursor over a given interactive item. While positioning the cursor the computer can animate the cursor with a visual effect such as changing the color and/or shading presented within the cursor.
In Apple's patent FIG. 7 noted below we're able to see a state diagram #110 that schematically illustrates the states and the transitions of the computer implementing the flashlight cursor function. Prior to the user performing a pointing gesture toward the display, the computer is in a flashlight cursor off state #112.
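A state diagram like the one in FIG. 7 can be read as a small transition table: pointing at the display turns the cursor on, holding steady selects, and withdrawing the hand returns to the off state. The state names and events below are our assumptions based on the text of the report, not the patent's exact labels.

```python
# Hypothetical reading of the FIG. 7 flashlight-cursor state diagram as a
# tiny state machine; state and event names are illustrative assumptions.

TRANSITIONS = {
    ("off", "point_at_display"): "on",   # pointing gesture activates cursor
    ("on", "hold_steady"): "selected",   # dwell over an item selects it
    ("on", "withdraw_hand"): "off",      # lowering the hand turns it off
    ("selected", "withdraw_hand"): "off",
}

def step(state, event):
    """Return the next flashlight-cursor state; unrecognized events
    leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)
```

Note that from the off state a hold-steady event does nothing, matching the report's point that the cursor must first be activated by a pointing gesture before it can be repositioned or used to select.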
Apple credits Micha Galor, Jonathan Pokrass, Amir Hoffnung and Ofir Or as the inventors of granted patent 9,030,498 which was originally filed in Q3 2012 and published today by the US Patent and Trademark Office. Also see our January report titled "Apple Inherits PrimeSense's IP Relating to a 3D Scanning Engine."
Patently Apple presents only a brief summary of granted patents with associated graphics for journalistic news purposes as each Granted Patent is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any Granted Patent should be read in its entirety for full details.