
An Apple Vision Pro patent published in Europe Today dives deep into the headset's 3D 'Infinite Canvas' User Interface


On Monday, Apple introduced its revolutionary XR spatial computing headset, branded "Apple Vision Pro." That segment of the keynote blew me away; I was stunned by what I was seeing. Today, I was equally stunned to find an Apple Vision Pro patent published in Europe. Without Apple's presentation, it would have been difficult to describe something we've never seen before. Now that most of the Apple community has seen Apple's new headset, the patent and its figures will be easier to understand.

An extended reality (XR) environment refers to a wholly or partially simulated environment that people sense and/or interact with via an electronic device. For example, the XR environment may include augmented reality (AR) content, mixed reality (MR) content, virtual reality (VR) content, and/or the like. With an XR system, a subset of a person’s physical motions, or representations thereof, are tracked, and, in response, one or more characteristics of one or more virtual objects simulated in the XR environment are adjusted in a manner that comports with at least one law of physics.

As one example, the XR system may detect head movement and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. As another example, the XR system may detect movement of the electronic device presenting the XR environment (e.g., a mobile phone, a tablet, a laptop, or the like) and, in response, adjust graphical content and an acoustic field presented to the person in a manner similar to how such views and sounds would change in a physical environment. In some situations (e.g., for accessibility reasons), the XR system may adjust characteristic(s) of graphical content in the XR environment in response to representations of physical motions (e.g., vocal commands).
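The head-movement example above can be sketched as a simple update rule: world-anchored virtual content must be rendered at an angle that counters the head's rotation, so it appears fixed in the physical environment. This is a minimal illustration of the general idea, not code from the patent; the function name and angle convention are assumptions.

```python
import math

def world_anchored_yaw(object_yaw_world, head_yaw):
    """Yaw at which a virtual object must be rendered, relative to the
    headset, so it appears fixed in the world. Angles in radians.
    Illustrative only; not taken from Apple's patent."""
    # Turning the head right makes world-locked content drift left in view,
    # so the rendering yaw is the object's world yaw minus the head's yaw,
    # wrapped into the range (-pi, pi].
    return (object_yaw_world - head_yaw + math.pi) % (2 * math.pi) - math.pi
```

For instance, a UI window placed straight ahead (world yaw 0) while the user turns 30° to the right should be drawn 30° to the left in the headset's view.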

There are many different types of electronic systems that enable a person to sense and/or interact with various XR environments. Examples include head mountable systems, projection-based systems, heads-up displays (HUDs), vehicle windshields having integrated display capability, windows having integrated display capability, displays formed as lenses designed to be placed on a person’s eyes (e.g., similar to contact lenses) and more.

The head mountable system may incorporate one or more imaging sensors to capture images or video of the physical environment, and/or one or more microphones to capture audio of the physical environment. Rather than an opaque display, a head mountable system may have a transparent or translucent display. The transparent or translucent display may have a medium through which light representative of images is directed to a person’s eyes.

Apple's headset (#105) may include one or more cameras (e.g., visible light cameras, infrared cameras, etc.). It will also include various sensors, including but not limited to image sensors, touch sensors, microphones, inertial measurement units (IMUs), heart rate sensors, temperature sensors, Lidar sensors, radar sensors, sonar sensors, GPS sensors, Wi-Fi sensors, near-field communication sensors, and so on.

Moreover, the headset may include hardware elements that can receive user input, such as hardware buttons or switches. User input detected by such sensors and/or hardware elements corresponds to various input modalities for interacting with virtual content displayed within a given extended reality environment. For example, such input modalities may include, but are not limited to, facial tracking, eye tracking (e.g., gaze direction), hand tracking, gesture tracking, biometric readings (e.g., heart rate, pulse, pupil dilation, breath, temperature, electroencephalogram, olfactory), recognizing speech or audio (e.g., particular hot-words), and activating buttons or switches.
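The list of input modalities above implies some routing layer that dispatches each kind of input to the right handler. The sketch below assumes a simple `(modality, payload)` event shape that is purely hypothetical and not described in the patent.

```python
# Hypothetical sketch: routing input modality events to handlers.
# The event shape and handler behavior are assumptions, not Apple's design.
def route_input(event):
    modality, payload = event
    if modality == "gaze":
        # A gaze direction would typically be hit-tested against displayed UIs.
        return f"hit-test ray at {payload}"
    if modality == "gesture":
        return f"apply gesture '{payload}' to focused UI"
    if modality == "speech":
        return f"match hot-word in '{payload}'"
    return "unhandled modality"
```

A real system would of course fuse several modalities (e.g., gaze to select, a pinch gesture to activate), but the dispatch structure is the same.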

Apple's patent FIG. 2 below illustrates an example of an extended reality (XR) environment including multiple user interfaces displayed, by an XR Headset (Apple Vision Pro), to appear at multiple respective locations in a physical environment.

Apple Vision Pro patent FIG. 2

Apple's patent FIG. 3 below presents a different 3D view of patent FIG. 2 to illustrate the layering of real and virtual components that together form the complete AR view inside the Apple Vision Pro.

According to Apple, for each UI (e.g., and each underlying application), the Headset assigns a boundary that defines the portion of the physical environment (#200) that includes the location, remote from the Headset, at which the UI appears to be displayed (e.g., at which the display (#230) causes the UI to be perceived by a user of the Headset, even though no physical display may be occurring at the perceived/apparent location).
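The per-UI boundary described above can be pictured as a region of environment coordinates assigned to each application. Here is a minimal sketch using an axis-aligned box; the class name, coordinate convention, and containment use are all illustrative assumptions, not details from the patent.

```python
from dataclasses import dataclass

@dataclass
class UIBoundary:
    """Axis-aligned region of the physical environment assigned to one UI.
    A hypothetical sketch of the per-UI boundary the patent describes."""
    min_corner: tuple  # (x, y, z) in meters, environment coordinates
    max_corner: tuple

    def contains(self, point):
        # True if the point lies within the boundary on every axis.
        return all(lo <= p <= hi
                   for lo, p, hi in zip(self.min_corner, point, self.max_corner))

# A boundary like this could let the system decide whether a gaze or
# gesture target falls within a given application's apparent location.
window = UIBoundary((1.0, 0.5, 2.0), (2.0, 1.5, 2.2))
```

In practice a boundary need not be a box at all; the patent only requires that it delimit the portion of the environment containing the UI's apparent display location.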

Apple Vision Pro patent FIG. 3

Apple's patent FIG. 4 below illustrates a system process of the Apple Vision Pro headset that controls access, by various applications, to scene information and/or user information.
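A system process gating application access to scene and user information, as FIG. 4 suggests, boils down to a per-app grant table consulted on every request. The sketch below is a bare-bones illustration under that assumption; the app identifiers, grant names, and return shape are invented for the example.

```python
# Hypothetical grant table: which information types each app may read.
# App IDs and info-type names are illustrative, not from the patent.
GRANTS = {
    "com.example.stargazer": {"scene_geometry"},                 # no user data
    "com.example.fitness":   {"heart_rate", "scene_geometry"},
}

def request(app_id, info_type):
    """Return the requested info only if the app holds a grant for it;
    otherwise deny without revealing anything about the data."""
    if info_type in GRANTS.get(app_id, set()):
        return {"info": info_type, "granted": True}
    return {"granted": False}
```

Centralizing the check in one system process, rather than trusting each app, is what lets the headset expose rich sensor data (gaze, heart rate, room geometry) without every application seeing all of it.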

Apple patent FIG. 4

Apple's patent FIG. 7 below illustrates an example of an extended reality environment in which a displayed user interface is the only user interface displayed by the display, and in which the user interface includes a bounded two-dimensional UI window, a bounded partial three-dimensional UI window and a bounded three-dimensional portion in accordance with one or more implementations.

More specifically, in FIG. 7 Apple's Headset has determined that the content of the displayed UI relates to the planet Mars, and has modified a portion of the viewable area (#207) outside the boundaries of the two-dimensional UI window (#700), the partially three-dimensional UI window (#704), and the bounded three-dimensional UI window (#726) based on the content of the displayed UI (e.g., to display an enhancement (#708), such as a translucent red background corresponding to the red color of the planet Mars, over the viewable area #207).
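The content-driven enhancement in FIG. 7 amounts to mapping the UI's topic to a tint applied outside the UI's boundaries. A minimal sketch, assuming a lookup table and an RGBA color with a translucency alpha; both are illustrative choices, not details from the patent.

```python
# Illustrative topic-to-tint table for the area outside the UI's boundaries.
# RGBA tuples; the last value is the alpha (translucency). Values invented.
TOPIC_TINTS = {
    "mars":  (255, 60, 30, 0.25),   # translucent red, like FIG. 7's example
    "ocean": (20, 90, 200, 0.25),
}

def enhancement_for(topic, default=(0, 0, 0, 0.0)):
    """Pick a background tint based on the displayed UI's content topic;
    fall back to a fully transparent tint for unknown topics."""
    return TOPIC_TINTS.get(topic.lower(), default)
```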

Apple patent FIG. 7

Apple's patent FIG. 13 below illustrates a block diagram of an example architecture for operating a cross-platform virtual reality application.  

Apple Vision Pro patent FIGS. 13 & 14

Apple's patent FIG. 14 above illustrates a block diagram of an example architecture for operating a third party application.

Apple filed for this patent back in December 2022 and it was published today in Europe under #WO2023102139.

Apple Inventors

  • Olivier Gutknecht: Director, Technology Development Group, Software
  • Peter Hajas: Manager, AR Frameworks (His LinkedIn Profile states that he's hiring in Cupertino.)
  • Raffael Hannemann: R&D / Sr. Software Engineer
  • Mike Buerli: AR/VR Software Engineering Manager (His LinkedIn Profile states that he's hiring in Boulder, Colorado.)



