Apple invents Finger Devices to work with its Spatial Computer, manipulating data on a virtual display with a Virtual Trackpad
Apple Vision Pro allows a user to control or interact with visionOS using simple hand gestures. Yet there may be times when additional "controllers" are required to control Apple TV content, video game play, a presentation, or work on documents or charts, and for that, finger devices could be used to manipulate data. Today the US Patent & Trademark Office published a patent application from Apple that relates to using a finger device on a virtual trackpad within a workspace in Vision Pro.
Extremity Tracker generating a Virtual Trackpad for Vision Pro
Apple's invention covers a method that includes obtaining extremity tracking data via an extremity tracker. The method includes displaying, on the display, a computer-generated representation of a trackpad that is spatially associated with a physical surface.
The physical surface is viewable within the display of Vision Pro along with a content manipulation region that is separate from the computer-generated representation of the trackpad. The method includes identifying a first location within the computer-generated representation of the trackpad based on the extremity tracking data.
The method includes mapping the first location to a corresponding location within the content manipulation region. The method includes displaying, on the display, an indicator indicative of the mapping. The indicator may overlap the corresponding location within the content manipulation region.
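The mapping step described above can be pictured as a simple normalize-and-reproject operation. The sketch below is an illustrative assumption, not Apple's actual code or API (the function name, rectangle layout, and coordinates are hypothetical): a location detected within the computer-generated trackpad is normalized to the range [0, 1] and re-projected into the content manipulation region, which is where the indicator would be displayed.

```python
# Hypothetical sketch of the patent's location mapping. Rectangles are
# (origin_x, origin_y, width, height); all names are illustrative.

def map_trackpad_to_region(point, trackpad, region):
    """Map an (x, y) location on the trackpad rect to the content-region rect."""
    tx, ty, tw, th = trackpad
    rx, ry, rw, rh = region
    # Normalize the fingertip location to [0, 1] within the trackpad.
    u = (point[0] - tx) / tw
    v = (point[1] - ty) / th
    # Re-project into the content manipulation region, where the
    # indicator is drawn.
    return (rx + u * rw, ry + v * rh)
```

Under this sketch, a touch at the center of the trackpad lands at the center of the content manipulation region, matching the patent's center-to-center example.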
The electronic device displays an indicator indicative of the mapping. For example, based on finger manipulation data, the electronic device determines that the finger-wearable device is hovering over or contacting the center of the computer-generated representation of the trackpad. Accordingly, the electronic device displays an indicator at the center of the content manipulation region. In some implementations, by displaying an indication of the mapping, the electronic device provides feedback to a user characterizing how the finger-wearable device engages with the content manipulation region. The feedback reduces the number of erroneous (e.g., undesired) inputs the electronic device receives from the finger-wearable device, thereby reducing resource utilization by the electronic device.
Accordingly, various implementations disclosed in Apple's patent enable a user to effectively engage with (e.g., manipulate) content that is within a content manipulation region. For example, when the finger manipulation data indicates that the finger-wearable device is drawing a circle on the computer-generated representation of the trackpad, the electronic device displays a corresponding representation of the circle within the content manipulation region. Accordingly, as compared with other devices, the electronic device provides more control and accuracy when engaging with the content manipulation region.
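The circle-drawing example can be sketched the same way, point by point. The code below is an illustrative assumption, not Apple's implementation: each sampled point of the stroke on the virtual trackpad is re-projected into the content manipulation region, so the drawn shape is reproduced there at the region's scale.

```python
# Hedged sketch: a stroke (e.g., a circle) traced on the virtual trackpad is
# mapped sample-by-sample into the content manipulation region. Rectangles
# are (origin_x, origin_y, width, height); all names are illustrative.

def map_stroke(points, trackpad, region):
    """Re-project every sampled stroke point from the trackpad into the region."""
    tx, ty, tw, th = trackpad
    rx, ry, rw, rh = region
    return [
        (rx + (x - tx) / tw * rw, ry + (y - ty) / th * rh)
        for x, y in points
    ]
```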
The finger-wearable device can be worn on a finger of a user. In some implementations, the electronic device tracks the finger with six degrees of freedom (6DOF) based on the finger manipulation data. Accordingly, even when a physical object occludes a portion of the finger-wearable device, the electronic device continues to receive finger manipulation data from the finger-wearable device.
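One way to picture why occlusion doesn't break tracking: the finger-wearable device reports its own pose data over a wireless link, so the system can fall back to that self-reported data when camera-based extremity tracking drops out. The function below is a loose, hypothetical sketch of that fallback (the names and the fusion strategy are assumptions, not the patent's method):

```python
# Illustrative assumption: prefer camera-derived tracking when available,
# otherwise fall back to the 6DOF pose the finger-wearable device reports
# about itself, so input continues through occlusion.

def select_pose(camera_pose, device_pose):
    """Return the camera pose if tracking succeeded, else the device's self-report."""
    return device_pose if camera_pose is None else camera_pose
```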
Below are a series of Apple patent figures showing examples of an electronic device mapping a computer-generated trackpad to a content manipulation region within Vision Pro.
The electronic device in the patent figures above corresponds to a head-mountable device (HMD) that includes an integrated display that displays a representation of the operating environment.
The display data may be characterized by an XR environment. For example, the image sensor obtains image data that represents the portion of the physical table #302 and the physical lamp #304, and the resulting display data presented on the display #312 includes respective representations of the portion of the physical table and the physical lamp.
In some implementations, the electronic device (HMD) includes a see-through display. The see-through display permits ambient light from the physical environment through the see-through display, and the representation of the physical environment is a function of the ambient light. For example, the see-through display is a translucent display, such as glasses with optical see-through. In some implementations, the see-through display is an additive display that enables optical see-through of the physical surface, such as an optical HMD (OHMD).
The physical surface (e.g., of the physical table #302) is viewable within the display #312 along with a content manipulation region #330 that is separate from the trackpad #324.
For example, the content manipulation region includes application content, such as web browser content, word processing content, drawing application content, etc. (as presented at the WWDC23 Keynote below). Based on the finger manipulation data from the finger-wearable device #320, the HMD determines a mapping between the trackpad and the content manipulation region.
As Apple presented above, Apple Vision Pro users could work with a virtual keyboard, and today's patent confirms that, with a finger device, users will be able to work with a virtual trackpad to manipulate content within Vision Pro.