
Is this a Part of Apple's Revolutionary OS X Feature in Development?

1 - Cover - virtual input device - Apple, Sept. 2010
In July of this year a job posting by Apple stated that they were working on a new revolutionary Mac OS X feature. Today the U.S. Patent and Trademark Office published an Apple patent that could very well play a part in this revolutionary feature that Apple has in development. It's a cool virtual input device application that works in both 2D and 3D. The technology would allow you to project an input device onto the unit's display, which could then be used in place of the physical input device. For instance, a physical trackpad could be recreated virtually on your display and then used just as you would the physical trackpad. In some applications, that would require the display to be a touch display. The patent hints that it could also relate to gaming, which could mean adding virtual gamepad controls to the display when the device is used in tablet mode, or transferring them to a tablet like the iPad. This is one wild invention that will definitely take some time to fully understand and appreciate. But at the end of the day, we could all get a little buzzed thinking about what's in the works for us: Sweet!

 

Patent Background

 

Apple states that traditional user interfaces allow a user to navigate between one or more interface elements (e.g., application windows) through the use of physical input devices (e.g., a keyboard, mouse, trackpad or touchpad). For example, a user could use a mouse or trackpad to search for and activate (e.g., by clicking on) individual interface elements in the user interface.

 

In particular, input received at an initial position on a trackpad can be compared to subsequent positions of input received on the trackpad. The relative change from the initial position to the subsequent positions determines an amount and direction of movement of a cursor, for example, from the cursor's current position in the user interface. In other words, the cursor's movement is based on a relative change in positions of input received through the trackpad. Because the cursor's movement is based on the relative change in positions on the trackpad, a position on the trackpad does not correspond to a single position in the traditional user interface. In addition, interaction between the user and the interface depends on the initial position of the cursor in the interface.
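To make the distinction concrete, here is a minimal sketch of the two input models in Python. The function names, pad size, and screen size are all illustrative assumptions, not from the patent.

```python
# Hedged sketch contrasting relative (traditional trackpad) and absolute
# (one pad position = one screen position) input models. All names and
# sizes below are illustrative assumptions.

def move_cursor_relative(cursor, prev_touch, curr_touch):
    """Traditional trackpad: the cursor moves by the delta between touches,
    so a given pad position has no fixed meaning on screen."""
    dx = curr_touch[0] - prev_touch[0]
    dy = curr_touch[1] - prev_touch[1]
    return (cursor[0] + dx, cursor[1] + dy)

def move_cursor_absolute(touch, pad_size, screen_size):
    """Absolute mapping: every pad position corresponds to exactly one
    screen position, regardless of where the cursor was before."""
    sx = touch[0] / pad_size[0] * screen_size[0]
    sy = touch[1] / pad_size[1] * screen_size[1]
    return (sx, sy)

# The same touch at (50, 40) on a 100x80 pad:
print(move_cursor_relative((300, 300), (10, 10), (50, 40)))   # result depends on prior cursor state
print(move_cursor_absolute((50, 40), (100, 80), (1280, 800))) # always (640.0, 400.0)
```

The relative model has to track where the cursor already is; the absolute model needs only the touch itself, which is what lets each pad position map to a single interface position.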

 

Patent Summary

 

Apple's patent is about a virtual input device, e.g., a virtual representation of a physical input device. In one aspect, virtual coordinates of the virtual input device correlate to real coordinates on the physical input device. Dimensions of the physical input device are proportional to dimensions of the virtual input device, and interactive objects are presented in the virtual input device.

 

Particular embodiments in this patent could be implemented to realize one or more of the following advantages. Virtual representations of input devices, which could include interactive, virtual representations of objects (e.g., application windows, applications, directories), allow a user to navigate to the objects more efficiently, thereby making it easier to interact with the objects and improving the user's experience.

 

A virtual representation of an input device could be a two-dimensional area that increases an amount of data (e.g., virtual representations of objects) that can be presented at a particular time, thereby improving the user's experience.

 

Furthermore, Apple states that the virtual representations of the input devices could have dimensions proportional to those of the physical input devices. As a result, the user could interact with an interface more efficiently because input provided by the user through the input device corresponds visually with indications of that input in the virtual input device.

 

In particular, a user does not have to look at the input device when interacting with the virtual input device, as the user could expect that his/her input through the input device will correspond to similar input (or interaction) at the virtual input device. In addition, because each position on the virtual device corresponds to a single position on the physical device, a user could navigate through an entire virtual space of the virtual device (e.g., up to and including the borders of the virtual representation) using a single gesture (e.g., by keeping a user's finger down on a trackpad).

 

Revolutionary Mac OS Feature in Development

 

On July 30, 2010, an Apple job opening was listed that described Apple as working on a new revolutionary Mac OS X feature. Gizmodo captured that job posting as noted in part below. Could this new virtual input device application for OS X be a part of Apple's revolutionary feature? Could it be a part of Apple's future 3D version of Mac OS X? It very well might be, considering that yet another multi-dimensional patent surfaced this morning entitled "Interface Navigation Tools" (20100251170), which further supports Apple's 2008 patent on the very same subject matter. A 3D version of OS X is very much alive if Apple's patents are any indication.

 

2A - Revolutionary OS X feature in the making - job opening at Apple, July 30, 2010
 

Virtual Input Device Application

 

2B - Virtual input device application - FIG. 1, Apple patent
 

Apple's patent FIG. 1 is a block diagram showing an example virtual input device application 100. The virtual input device application includes an identification engine 110 for identifying input devices and input through the input devices (e.g., physical input devices); a render engine 120 for rendering content; a mapping engine 130 for mapping/correlating input for presentation; a preferences engine 140 for setting preferences associated with, for example, the display and configuration of a virtual input device; an interactivity engine 150 for processing interactions between a user and a physical input device and a corresponding virtual input device, for example; and a presentation engine 160 for presenting a virtual input device to a user.
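One hedged way to picture FIG. 1 is as a pipeline wiring the six engines together. Only the engine names and reference numbers come from the patent; the function signature and toy stand-ins below are invented purely for illustration.

```python
# Invented sketch of how the six engines of FIG. 1 might be composed
# into one touch-handling pipeline. Engine names/numbers are from the
# patent; everything else here is a hypothetical stand-in.

def make_app(identify, render, map_input, prefs, interact, present):
    """Wire the six engines into a single touch-handling pipeline."""
    def handle(raw_event):
        touch = identify(raw_event)        # identification engine (110)
        virtual = map_input(touch, prefs)  # mapping engine (130) + preferences (140)
        action = interact(virtual)         # interactivity engine (150)
        return present(render(action))     # render engine (120) -> presentation (160)
    return handle

# Toy stand-ins to show the flow end to end:
app = make_app(
    identify=lambda e: e["pos"],
    render=lambda a: f"render:{a}",
    map_input=lambda t, p: (t[0] * p["scale"], t[1] * p["scale"]),
    prefs={"scale": 2},
    interact=lambda v: f"select@{v}",
    present=lambda frame: frame,
)
print(app({"pos": (3, 4)}))  # -> render:select@(6, 8)
```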

 

In some implementations, the render engine could render interface windows (e.g., Finder and application windows in Mac OS X) such that they can be displayed in the virtual input device. For example, the render engine could interact with Exposé for Mac OS X (and presentation engine 160) to render and present the interface windows in the virtual device. In some implementations, only open, unhidden interface windows are scaled for display in the virtual input device. In some implementations, all open and unhidden windows for a currently active application could be scaled for display in the virtual input device. In some implementations, all open windows (hidden and unhidden) could be scaled for display in the virtual input device. Other implementations are possible.

 

The Mapping Engine

 

The mapping engine could be used to correlate or map virtual coordinates on the virtual input device to physical coordinates on the physical input device. For example, the mapping engine could generate a grid for each of the virtual input device and the physical input device that includes coordinates corresponding to the respective device. In some implementations, the mapping engine could use anamorphic scaling. This type of anamorphic scaling can be desirable because users may be more likely to lift their fingers off of a trackpad, for example, before reaching the edge of a trackpad, but still intend to provide input to the edge of the trackpad.
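A minimal sketch of what such anamorphic edge scaling might look like, assuming a simple piecewise-linear curve: the central band of the pad spans the full virtual axis, and touches in the outer margins clamp to the virtual border. The margin value and function names are illustrative, not from the patent.

```python
# Hypothetical piecewise-linear anamorphic mapping: a finger lifted just
# short of the pad's physical edge still registers at the edge of the
# virtual device. The 10% margin is an illustrative assumption.

def map_axis(x, pad_len, virtual_len, margin=0.1):
    """Map one physical coordinate to a virtual coordinate, stretching
    the central band of the pad over the whole virtual axis."""
    t = x / pad_len                        # normalized pad position, 0..1
    t = (t - margin) / (1 - 2 * margin)    # stretch the center band to 0..1
    t = min(max(t, 0.0), 1.0)              # outer margins clamp to the borders
    return t * virtual_len

def map_point(touch, pad_size, virtual_size):
    """Map a physical (x, y) touch to virtual coordinates."""
    return (map_axis(touch[0], pad_size[0], virtual_size[0]),
            map_axis(touch[1], pad_size[1], virtual_size[1]))

# On a 100-unit pad mapped to an 800-unit virtual axis, a touch at 95
# (inside the edge margin) already reaches the virtual border:
print(map_axis(95, 100, 800))  # -> 800.0
```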

 

The Interactivity Engine

 

The interactivity engine could process interactions between a user, a virtual input device, and a physical input device, for example, by storing information describing the various types of input provided by the user at the physical input device. The interactivity engine could use such stored information to determine what action is desired in response to a user's interaction with physical input device, and to perform the desired action.

 

For example, the interactivity engine could (1) receive an indication that a user has tapped an upper right quadrant of a trackpad, (2) determine that an interface window associated with the upper right quadrant of a trackpad should be activated, and (3) initiate and facilitate a request and display of the interface window.

 

As another example, the interactivity engine may (1) receive an indication, e.g., a gesture such as a slide of a finger across the upper right quadrant of the trackpad, that a user would like to preview the interface window, (2) determine that a visual representation of the interface window should be displayed on the virtual input device, (3) render a smaller representation of the interface window (e.g., using the render engine 120), and (4) present the smaller representation of the interface window in the upper right quadrant of the trackpad. As another example, the interactivity engine could be used to hide and display the virtual input device in response to a predetermined input, e.g., a gesture such as a four-finger touch on a multi-touch trackpad. The system recognizes multi-touch gestures as well.
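The dispatch logic of those examples could be sketched as follows. The event format, quadrant math, and action strings are assumptions for illustration; only the behaviors themselves (tap to activate, slide to preview, four-finger touch to toggle) come from the patent's examples.

```python
# Hypothetical interactivity-engine dispatch. Event dictionaries and
# action strings are invented; the three behaviors are the patent's.

def quadrant(pos, pad_size):
    """Name the pad quadrant a touch lands in (screen-style coordinates,
    so a small y value means the upper half of the pad)."""
    horiz = "right" if pos[0] >= pad_size[0] / 2 else "left"
    vert = "upper" if pos[1] < pad_size[1] / 2 else "lower"
    return f"{vert} {horiz}"

def interpret(event, pad_size):
    """Map a raw gesture event to a desired action."""
    if event["gesture"] == "tap" and event.get("fingers", 1) == 4:
        return "toggle virtual input device"       # show/hide gesture
    if event["gesture"] == "tap":
        return f"activate window in {quadrant(event['pos'], pad_size)} quadrant"
    if event["gesture"] == "slide":
        return f"preview window in {quadrant(event['pos'], pad_size)} quadrant"
    return "ignore"

pad = (100, 80)
print(interpret({"gesture": "tap", "pos": (80, 10)}, pad))
# -> activate window in upper right quadrant
```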

 

Example Virtual Input Devices

 

3 - Future MacBook displaying virtual input device - FIG. 2, Apple Inc.
 

Apple's patent FIG. 2 is a diagram of an example laptop computer 200 that is displaying a virtual input device 240. The computer's two physical input devices are an Apple Multi-Touch trackpad 220 and a keyboard 230. In some implementations, the computer could include or be connected to other types of input devices. The computer could be connected to additional pointing devices including, for example, mice, joysticks, pointing sticks, and digital graphics tablets. As another example, the computer could be connected to audio or video input devices including, for example, microphones, webcams, and scanners (e.g., image scanners).

 

A virtual input device application could generate a virtual input device (e.g., a virtual trackpad 240) based on a physical input device (e.g., trackpad 220). In some implementations, the virtual trackpad has dimensions that are proportionate to the trackpad. A user could interact with virtual objects (e.g., virtual objects 242, 244, and 246) displayed on the virtual trackpad 240 by providing input at corresponding positions (e.g., physical coordinates) on the trackpad 220.

 

In some implementations, a user could provide input (e.g., gestures) to switch among two or more data sources or documents, so that objects associated with a selected data source or document (e.g., hierarchies within a document) are presented in the virtual input device. Furthermore, in some implementations, a user could provide input to change the magnification of the virtual input device or the objects presented in the virtual input device to enlarge or shrink the respective visual representations.

 

Because the dimensions of the virtual trackpad 240 are proportionate to the trackpad 220, a user does not have to look at the trackpad 220 to select virtual objects, for example, displayed on the virtual trackpad 240.

 

Example User Interfaces that Include a Virtual Input Device

 

Now this is where it begins to get interesting. Apple's patent FIG. 3 noted below illustrates an example interface 300 that includes a virtual input device 310. The virtual input device (e.g., a virtual trackpad) includes virtual objects 311, 312, 313, 314, 315 and 316. In this example, the virtual objects 311-316 are visual representations (e.g., virtual representations) of corresponding interface windows in the interface 300. For example, virtual object 311 could be a smaller visual representation of interface window 321, and virtual object 312 could be a smaller visual representation of interface window 322. The cool part is that the virtual input device/trackpad holds these virtual objects at a slanted angle so that you can still see the display.

 

Other implementations are possible. The virtual objects displayed on a virtual input device could be different types of visual representations of different types of content. For example, the visual representations could be images, animations, or videos. In addition, the visual representations could be two-dimensional or three-dimensional representations. Furthermore, the visual representations could be representations of different types of content. For example, the visual representations could represent documents, interface elements (e.g., interface windows), directories, and other types of objects (e.g., controls for a video player) and content that could be displayed in the interface.

 

The virtual objects could also be interactive. For example, a virtual object could be initially presented as a two-dimensional representation. Upon receiving input selecting or activating the virtual object, the virtual input device application could animate the virtual object such that it becomes a three-dimensional representation, or the virtual object provides a preview, e.g., the virtual object becomes a video clip (QuickTime Movie Trailer) or plays an audio clip (including iTunes).

 

In some implementations, the virtual input device application could generate text related to the virtual objects. For example, the text could be used to identify the virtual object. The text could be initially presented with the virtual objects, or be presented only with selected or activated virtual objects.

 

The virtual input device 310 also includes a control object 318. In this example, the control object could be used to deactivate (e.g., hide, remove from display, or exit) the virtual input device 310. In some implementations, the control object allows a user to cancel an interaction or selection. For example, a user could interact with the virtual input device without changing any content as long as the user's fingers are in continuous contact with a physical trackpad. The user could cancel any interaction or selection by lifting the user's fingers off at the position of the control object 318.

 

In some implementations, the virtual input device could be controlled based on other types of predetermined inputs. For example, a single tap on a trackpad could result in the virtual input device application entering a first mode, in which a virtual trackpad is displayed in an interface. In a second mode, a double tap on the virtual trackpad could remove it from display in the interface.
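Those two modes amount to a tiny toggle. A minimal sketch, assuming hypothetical class and method names:

```python
# Hypothetical two-mode toggle for showing/hiding the virtual trackpad.
# The tap-counting convention mirrors the example above; the names are
# invented for illustration.

class VirtualTrackpadMode:
    def __init__(self):
        self.visible = False          # second mode: hidden

    def on_tap(self, taps):
        """Single tap shows the virtual trackpad; double tap hides it."""
        if taps == 1 and not self.visible:
            self.visible = True       # first mode: displayed in the interface
        elif taps == 2 and self.visible:
            self.visible = False      # double tap removes it from display
        return self.visible
```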

 

4 - Example virtual UI with virtual input devices - Apple Inc. patents, Sept. 2010
 

Apple's patent FIG. 5 illustrates the virtual input device 310 of FIG. 3, where another virtual object (e.g., virtual object 311) is selected. The virtual object 311 is indicated as the user's selection by the indicator 319. As shown in FIG. 5, the interface window 321 could be brought into user focus (e.g., the foreground of the interface 300) in response to the selection of virtual object 311.

 

Overview of System for Generating a Virtual Input Device

 

5 - Virtual input device system overview - Apple patent, Sept. 2010
 

Apple's patent FIG. 7 is a block diagram showing a system 700 for generating a virtual input device. Processing device 710 may include, for example, a computer, a gaming device, a messaging device, a cell phone, a personal/portable digital assistant ("PDA"), or an embedded device. Operating system 720 may include, for example, Mac OS X from Apple Inc. of Cupertino, Calif.

 

Stand-alone application 730 may include, for example, a browser, a word processing application, a database application, an image processing application, a video processing application, or another application.

 

Content source 740 and content sources 760 may each include, for example, documents in any of a variety of formats, files, pages, media, or other content, and content sources 740 and 760 may be compatible with stand-alone application 730.

 

Presentation device 780 may include, for example, a display, a computer monitor, a television screen, a speaker, or another output device. Input device 790 may include, for example, a keyboard, a mouse, a microphone, a touch-screen, a remote control device, a speech activation device, a speech recognition device, or other input devices.

 

Apple credits John Louch as the sole inventor of the patent application titled "Virtual Input Tools," originally filed in Q1 2009.

 

Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trade Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.

 

A hearty congratulations goes out to the New Calgary Chinook Centre Apple Store team for a great opening yesterday! I hope to be down again on Sunday. Cheers!  

 

 

Comments

osx 10.7 baby...no not even...os11 yes its about time they start focusing on bigger and better things than the iPhone. The iPhone 4 is a great device I have one, but my iMac needs some updates.
