Apple wins a Vision Pro patent regarding Immersive Video Streaming using View-Adaptive Prefetching and Buffer Control

(Cover graphic: Apple's immersive video granted patent)

Today, the U.S. Patent and Trademark Office officially granted Apple a patent that relates to streaming immersive video content for presentation to a user wearing Apple Vision Pro.

Apple notes in their granted patent that in general, computer systems can generate immersive video content (sometimes referred to as omnidirectional video content). As an example, immersive video content can include visual data that can be presented according to a range of viewing directions and/or viewing locations. Portions of the immersive video content can be selectively presented to a user to give the user the impression that she is viewing the visual data according to a particular field of view and/or viewing perspective.

In some implementations, immersive video content can be presented to a user in three-dimensions using a wearable display device, such as a virtual reality headset or an augmented reality headset. Further, different portions of the immersive video content can be presented to a user, depending on the position and orientation of the user's body and/or the user's inputs.
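The patent's title points to view-adaptive prefetching, i.e., fetching the portions of the video the user is most likely to look at next. The Swift sketch below is a generic illustration of that idea under assumptions of our own (the post doesn't quote Apple's mechanism): the omnidirectional frame is split into spatial tiles, each tagged with the direction it covers, and download requests are ordered by how close each tile is to the user's current gaze.

```swift
// Illustrative sketch of view-adaptive prefetching; tile scheme and names are
// assumptions for the example, not taken from Apple's patent.
struct Vector3 {
    var x, y, z: Double
    func dot(_ other: Vector3) -> Double { x * other.x + y * other.y + z * other.z }
}

struct Tile {
    let id: Int
    let center: Vector3   // unit vector pointing toward the tile's center
}

// Tiles with the largest dot product lie closest to the gaze direction,
// so they are requested (and buffered) first.
func prefetchOrder(tiles: [Tile], gaze: Vector3) -> [Tile] {
    tiles.sorted { $0.center.dot(gaze) > $1.center.dot(gaze) }
}

// Example: with the user looking straight ahead (+z), the front tile comes first.
let order = prefetchOrder(tiles: [
    Tile(id: 0, center: Vector3(x: 0, y: 0, z: 1)),   // ahead
    Tile(id: 1, center: Vector3(x: 0, y: 0, z: -1)),  // behind
], gaze: Vector3(x: 0, y: 0, z: 1))
```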

Apple's patent FIG. 7 below is a diagram of an example system for streaming immersive video content for presentation to a user using multiple communications links; FIG. 2A is a diagram of an example viewport for presenting immersive video content; and FIG. 2B is a diagram of example degrees of freedom of movement of a user's body.

(Apple's granted patent figures: FIG. 7, FIG. 2A and FIG. 2B)

Apple Vision's Viewport

More specifically, in Apple's patent FIG. 2A above, immersive video content #200 can include visual data that can be presented according to a range of viewing directions and/or viewing locations with respect to a user #102 (represented in FIG. 2A as a sphere surrounding the user).

A viewport #202 can be selected to present a portion of the immersive video content to the user (e.g., based on the position and/or orientation of the user's head) to give the user the impression that she is viewing the visual data according to a particular field of view and/or viewing perspective. Further, the viewport can be continuously updated based on the user's movements to give the user the impression that she is shifting her gaze within a visual environment.
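As a rough illustration of how a viewport could be re-centered on each head-pose update, here is a minimal Swift sketch. The yaw/pitch representation, the 90° field of view, and the clamping at the poles are assumptions made for the example, not details from the patent.

```swift
// Minimal sketch: re-center the viewport on the latest head pose each frame.
// Angle representation and field of view are assumed for illustration.
struct Viewport {
    var centerYaw: Double     // horizontal viewing direction, radians
    var centerPitch: Double   // vertical viewing direction, radians
    var fieldOfView: Double   // angular width of the presented region, radians
}

func updatedViewport(headYaw: Double, headPitch: Double) -> Viewport {
    // Clamp pitch so the viewport never swings past the poles of the sphere.
    let pitch = max(-Double.pi / 2, min(Double.pi / 2, headPitch))
    return Viewport(centerYaw: headYaw, centerPitch: pitch, fieldOfView: Double.pi / 2)
}
```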

The sensors noted in FIG. 7 above can detect the user translating her head along one or more of these axes and/or rotating her head about one or more of these axes (e.g., according to six degrees of freedom, 6DoF).

For example, the sensors can detect when a user rotates her head about the x-axis, sometimes referred to as a "roll" motion; about the y-axis, sometimes referred to as a "pitch" motion; or about the z-axis, sometimes referred to as a "yaw" motion.
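The six degrees of freedom described above (translation along, and rotation about, the x, y and z axes) can be captured in a simple pose structure. The Swift sketch below is illustrative only; the field names and the per-frame update function are assumptions, not Apple's implementation.

```swift
// Minimal sketch of a six-degree-of-freedom head pose: translation along the
// x/y/z axes plus rotation about each axis (roll, pitch, yaw).
struct HeadPose {
    // Translation along each axis (arbitrary length units).
    var x = 0.0, y = 0.0, z = 0.0
    // Rotation about each axis, in radians.
    var roll = 0.0    // rotation about the x-axis
    var pitch = 0.0   // rotation about the y-axis
    var yaw = 0.0     // rotation about the z-axis
}

// Apply a (hypothetical) per-frame sensor delta to the running pose.
func apply(delta: HeadPose, to pose: inout HeadPose) {
    pose.x += delta.x;        pose.y += delta.y;          pose.z += delta.z
    pose.roll += delta.roll;  pose.pitch += delta.pitch;  pose.yaw += delta.yaw
}
```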

For more details, review Apple's granted patent 11924391.
