Apple Invents Devices, Methods and OS GUIs for Interacting with 3-D Environments in XR Headsets, iDevices & Macs
In 2012, Patently Apple posted a granted patent report about Apple's work on a next-gen 3D-OS interface that included a "Transition Engine." In 2013, Apple was granted yet another 3D-OS patent for Macs. One of the patent figures from Apple's patents is presented below.
There was a lot of excitement over this potential development. Back in 2009, a YouTube video covered "iDesktopVR - head tracking for iPhone / iPod Touch," as presented below.
While a 3D OS was definitely a little ahead of its time, Apple is hinting at a 3D interface once again for iDevices and its XR Headset, according to two patents that surfaced on Thursday. The first patent application report was titled "Apple Reveals Next-Gen Eye and Hand-Tracking to assist users work in 3D Environments on Macs, XR Headset, an iPad & more." That patent emphasized hand-tracking.
The second Apple patent, for some reason, triggered memories of the patents that I covered back in 2012 and 2013. The images in Apple's second patent on Thursday made it easier to understand the 3D interface concept and how it had been something Apple had been working on for well over a decade. The second patent placed more emphasis on using eye-tracking to control the 3D user interface.
Also remember that Apple had filed a pair of patents in 2018 (01 and 02) regarding the ability to edit 3D documents in an XR Headset environment.
Devices, Methods and GUIs for Interacting With 3-D Environments
Apple's second patent application regarding 3D operating environments published on Thursday states that there's a need for computer systems with improved methods and interfaces for providing computer generated experiences to users that make interaction with the computer systems more efficient and intuitive for a user.
New systems, methods and interfaces will complement or replace conventional systems, methods, and user interfaces for providing extended reality experiences to users. Such methods and interfaces reduce the number, extent, and/or nature of the inputs from a user by helping the user to understand the connection between provided inputs and device responses to the inputs, thereby creating a more efficient human-machine interface.
Apple's patent figure series FIGS. 7A-7J, presented further below, comprises block diagrams that illustrate displaying a user interface object at respective positions in a three-dimensional environment.
In some embodiments, one or more of the user interface objects are provided within a predefined zone in the three-dimensional environment, such that user interface objects placed in the predefined zone follow the user in the three-dimensional environment, whereas user interface objects placed outside of the predefined zone do not follow the user in the three-dimensional environment.
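The follow/no-follow distinction described above can be pictured with a minimal sketch. The names (`UIObject`, `update_positions`, the `in_follow_zone` flag) are illustrative assumptions, not terms from the patent:

```python
from dataclasses import dataclass

@dataclass
class UIObject:
    name: str
    position: tuple  # (x, y, z) in the three-dimensional environment
    in_follow_zone: bool  # True if placed in the predefined zone

def update_positions(objects, user_delta):
    """When the user moves by user_delta, objects in the predefined
    follow zone move with the user; objects outside it stay anchored
    to their original spot in the environment."""
    dx, dy, dz = user_delta
    for obj in objects:
        if obj.in_follow_zone:
            x, y, z = obj.position
            obj.position = (x + dx, y + dy, z + dz)
    return objects
```

Under this sketch, a control panel placed in the zone would remain within reach as the user walks around, while a world-anchored window would be left behind.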
In some embodiments, the first display generation component is a heads-up display. In some embodiments, the first display generation component is a head-mounted display (HMD). In some embodiments, the first display generation component is a standalone display, a touchscreen, a projector, or another type of display.
In some embodiments, the computer system is in communication with one or more input devices, including cameras or other sensors and input devices that detect movement of the user's hand(s), movement of the user's body as a whole, and/or movement of the user's head in the physical environment.
In some embodiments, the one or more input devices detect the movement and the current postures, orientations, and positions of the user's hand(s), face, and body as a whole. In some embodiments, the one or more input devices include an eye tracking component that detects the location and movement of the user's gaze.
In some embodiments, the first display generation component, and optionally, the one or more input devices and the computer system, are parts of a head-mounted device (e.g., an HMD, or a pair of goggles) that moves and rotates with the user's head in the physical environment, and changes the viewpoint of the user into the three-dimensional environment provided via the first display generation component.
In some embodiments, the first display generation component is a heads-up display that does not move or rotate with the user's head or the user's body as a whole, but, optionally, changes the viewpoint of the user into the three-dimensional environment in accordance with the movement of the user's head or body relative to the first display generation component.
In some embodiments, the first display generation component is optionally moved and rotated by the user's hand relative to the physical environment or relative to the user's head, and changes the viewpoint of the user into the three-dimensional environment in accordance with the movement of the first display generation component relative to the user's head or face or relative to the physical environment.
In some embodiments, the user interface object 7104-1 comprises a panel that includes a plurality of selectable user interface options (e.g., buttons) that are selectable, by the user, via the user's gaze and/or a gesture (e.g., an air gesture) with one or more of the user's hands. In some embodiments, the user controls (e.g., modifies) which selectable user interface options are included in the panel (e.g., user interface object 7104-1). For example, the user selects certain application icons, settings, controls, and/or other options to be displayed within the panel, such that the selected application icons, settings, controls, and/or other options that are included in the panel are easily accessible by the user (e.g., the user is enabled to interact with the panel even as the user moves in the physical environment because the panel follows the user as the user moves in the physical environment, as described in more detail below).
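The user-configurable panel (user interface object 7104-1) can be sketched as a small container of selectable options. The class and method names here are illustrative assumptions; the patent does not specify an API:

```python
class ControlPanel:
    """Illustrative sketch of the patent's configurable panel
    (user interface object 7104-1): the user decides which
    application icons, settings, and controls it holds."""

    def __init__(self):
        self.options = []

    def add_option(self, option):
        # User adds an application icon, setting, or control to the panel.
        if option not in self.options:
            self.options.append(option)

    def remove_option(self, option):
        # User removes an option the panel no longer needs to show.
        self.options.remove(option)

    def select(self, option):
        # In the patent, selection is driven by the user's gaze and/or
        # an air gesture; here it is simulated by a direct call.
        if option not in self.options:
            raise ValueError(f"{option} is not on the panel")
        return f"activated {option}"
```

Because the panel follows the user (per the predefined-zone behavior above), whatever options the user places on it stay accessible as the user moves through the physical environment.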
In some embodiments, as illustrated in FIGS. 7E-7G, while the user is not paying attention to the user interface object 7104 (e.g., user interface object 7104-3, user interface object 7104-4, and user interface object 7104-5), the user interface object 7104 continues to be displayed with the visual deemphasis (e.g., as indicated by the shaded fill in FIGS. 7E-7F).
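The visual deemphasis shown by the shaded fill in FIGS. 7E-7F could be modeled as an opacity choice driven by whether the user's gaze is on the object. The function name and the 0.4 opacity value are illustrative placeholders, not values from the patent:

```python
def render_opacity(is_attended: bool,
                   full_opacity: float = 1.0,
                   deemphasized_opacity: float = 0.4) -> float:
    """Return the opacity to render a user interface object with.
    While the user is not paying attention to the object (gaze is
    elsewhere), it is drawn with a visual deemphasis; while attended,
    it is drawn at full opacity."""
    return full_opacity if is_attended else deemphasized_opacity
```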
Lastly, Apple notes that the operating system includes instructions for handling various basic system services and for performing hardware dependent tasks.
In some embodiments, the XR experience module is configured to manage and coordinate one or more XR experiences for one or more users (e.g., a single XR experience for one or more users, or multiple XR experiences for respective groups of one or more users). To that end, in various embodiments, the XR experience module includes a data obtaining unit, a tracking unit, a coordination unit and a data transmitting unit.
This is a very detailed patent; you can review Apple's patent application number US 20230092874 A1 here.