Apple Reveals Next-Gen Eye- and Hand-Tracking to Help Users Work in 3D Environments on Macs, an XR Headset, an iPad & More
Today the US Patent & Trademark Office published a patent application from Apple that relates to methods of moving objects in 3D environments using both eye- and hand-tracking gestures on an XR headset, future Macs, and iDevices, specifically an iPad.
In Apple's patent background, they note that methods and interfaces for interacting with environments that include at least some virtual elements (e.g., applications, augmented reality environments, mixed reality environments, and virtual reality environments) are cumbersome, inefficient, and limited. For example, systems that provide insufficient feedback for performing actions associated with virtual objects, systems that require a series of inputs to achieve a desired outcome in an augmented reality environment, and systems in which manipulation of virtual objects is complex, tedious, and error-prone create a significant cognitive burden on a user and detract from the experience with the virtual/augmented reality environment. In addition, these methods take longer than necessary, thereby wasting energy. This latter consideration is particularly important in battery-operated devices.
Accordingly, there is a need for computer systems with improved methods and interfaces for providing computer generated experiences to users that make interaction with the computer systems more efficient and intuitive for a user. Such methods and interfaces optionally complement or replace conventional methods for providing computer generated reality experiences to users. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user by helping the user to understand the connection between provided inputs and device responses to the inputs, thereby creating a more efficient human-machine interface.
Although Apple's patent could apply to future Macs, the Apple Watch, and an XR headset, the patent filing focuses on users manipulating 3D environments with eye and hand tracking applied to an iPad.
Apple notes that in some embodiments, a computer system displays a virtual environment concurrently with a user interface of an application. In some embodiments, the user interface of the application is able to be moved into the virtual environment and treated as a virtual object that exists in the virtual environment.
In some embodiments, the user interface is automatically resized when moved into the virtual environment, based on its distance from the user at the point where it is placed.
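The filing doesn't spell out the resizing rule, but a common approach in XR is to scale an object in proportion to its distance so that its apparent (angular) size stays constant for the viewer. A minimal sketch of that idea follows; the function name and values are illustrative, not taken from Apple's patent:

```python
def rescale_for_distance(width: float, old_distance: float, new_distance: float) -> float:
    """Scale a window's width so its angular size stays constant
    when it is re-placed at a new distance from the viewer.

    Angular size is roughly width / distance for small angles, so
    keeping that ratio fixed means scaling width by the distance ratio.
    """
    if old_distance <= 0 or new_distance <= 0:
        raise ValueError("distances must be positive")
    return width * (new_distance / old_distance)

# A 0.5 m-wide window moved from 1 m away to 2 m away doubles in width,
# so it subtends roughly the same angle at the eye.
print(rescale_for_distance(0.5, 1.0, 2.0))  # 1.0
```

Under this scheme, a window pushed deeper into the scene grows in world-space size but looks unchanged to the user, which matches the patent's goal of keeping interaction intuitive.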
In some embodiments, while both the virtual environment and the user interface are displayed, a user is able to request that the user interface be displayed as an immersive environment.
In another aspect of the invention, Apple notes that examples of input devices include a touch screen, a mouse (e.g., external), a trackpad (optionally integrated or external), a touchpad (optionally integrated or external), a remote control device (e.g., external), another mobile device (e.g., separate from the electronic device), a handheld device (e.g., external), a controller (e.g., external), a camera, a depth sensor, an eye tracking device, and/or a motion sensor (e.g., a hand tracking device or hand motion sensor). In some embodiments, the electronic device is in communication with a hand tracking device (e.g., one or more cameras, depth sensors, proximity sensors, or touch sensors such as a touch screen or trackpad). In some embodiments, the hand tracking device is a wearable device, such as a smart glove. In some embodiments, the hand tracking device is a handheld input device, such as a remote control or stylus.
Apple's patent FIGS. 15A-15D below illustrate examples of an electronic device facilitating the movement and/or placement of multiple virtual objects in a three-dimensional environment using both eye and hand tracking technologies on an iPad.
Today's patent application #20230092282, titled "Methods for Moving Objects in a 3D Environment," is one of Apple's longest of the day at 169 pages. There's a ton of detail, with a total of 86 patent figures that you could review here.
The inventors listed on the patent application include:

- Ben Boesel: Human Interface Designer
- Jonathan Ravasz: Human Interface Designer (previously worked at Oculus VR)
- Stephen Lemay: UI Designer
- Chris McKenzie: Human Interface Designer
- Shih-Sang Chiu: No LinkedIn Profile was available