Today the US Patent & Trademark Office published a patent application from Apple that relates to streaming immersive video content for presentation to a user wearing their upcoming XR Headset.
Apple notes in the patent that, in general, computer systems can generate immersive video content (sometimes referred to as omnidirectional video content). As an example, immersive video content can include visual data that can be presented according to a range of viewing directions and/or viewing locations. Portions of the immersive video content can be selectively presented to a user to give the user the impression that they are viewing the visual data according to a particular field of view and/or viewing perspective.
In some implementations, immersive video content can be presented to a user in three-dimensions using a wearable display device, such as a virtual reality headset or an augmented reality headset. Further, different portions of the immersive video content can be presented to a user, depending on the position and orientation of the user's body and/or the user's inputs.
Apple's patent FIG. 7 below is a diagram of an example system for streaming immersive video content for presentation to a user using multiple communications links; FIG. 2A is a diagram of an example viewport for presenting immersive video content; and FIG. 2B is a diagram of example degrees of freedom of movement of a user's body.
Apple's XR Headset Viewport
More specifically, in Apple's patent FIG. 2A, immersive video content #200 can include visual data that can be presented according to a range of viewing directions and/or viewing locations with respect to a user #102 (represented in FIG. 2A as a sphere surrounding the user).
A viewport #202 can be selected to present a portion of the immersive video content to the user (e.g., based on the position and/or orientation of the user's head) to give the user the impression that they are viewing the visual data according to a particular field of view and/or viewing perspective. Further, the viewport can be continuously updated based on the user's movements to give the user the impression that they're shifting their gaze within a visual environment.
The sensors noted in FIG. 7 above can detect the user translating their head along one or more of these axes and/or rotating their head about one or more of these axes (e.g., according to six degrees of freedom, 6DoF).
For example, the sensors can detect when a user rotates their head about the x-axis (sometimes referred to as a "roll" motion), about the y-axis (a "pitch" motion), or about the z-axis (a "yaw" motion).
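To make the viewport idea concrete, here is a minimal sketch (not from the patent) of how sensed yaw and pitch angles could be turned into a viewing direction on the video sphere, and how one could test whether a point of the omnidirectional content falls inside the user's current field of view. The axis convention and the circular-viewport approximation are assumptions for illustration only.

```python
import math

def viewport_direction(yaw, pitch):
    """Convert head yaw/pitch (radians) into a unit view vector.

    Assumed convention (not specified in the patent): z is the
    vertical axis, yaw rotates about z, pitch rotates about y.
    """
    x = math.cos(pitch) * math.cos(yaw)
    y = math.cos(pitch) * math.sin(yaw)
    z = math.sin(pitch)
    return (x, y, z)

def in_viewport(point_dir, view_dir, half_fov):
    """Return True if a direction on the video sphere lies within a
    circular viewport of angular radius half_fov around view_dir."""
    dot = sum(a * b for a, b in zip(point_dir, view_dir))
    # Clamp to guard against floating-point drift before acos.
    angle = math.acos(max(-1.0, min(1.0, dot)))
    return angle <= half_fov

# Looking straight ahead (yaw = pitch = 0) gives the +x direction;
# content directly overhead falls outside a 90-degree-wide viewport.
view = viewport_direction(0.0, 0.0)
print(in_viewport((1.0, 0.0, 0.0), view, math.radians(45)))
print(in_viewport((0.0, 0.0, 1.0), view, math.radians(45)))
```

In a real headset pipeline this check would run per frame against the streamed tiles, so that only the portions of the immersive video inside (or near) the viewport need to be fetched at full quality.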
Developers and/or engineers can dive deeper into the details of this invention under Apple's patent application US 20230117742. Apple updated the invention by cancelling its initial 54 patent claims and replacing them with 30 new claims (numbered 55-84), which include adding a 5G cellular link to the headset and more; the new claims can be reviewed in full at the bottom of this document.
The Inventors

- Fanyi Duanmu: Video Coding and Processing Engineer
- Jun Xin: Engineering Manager, Video Coding and Processing
- Xiaosong Zhu: Senior Software QA Engineer (many years of experience in the broadcast, digital video encoding, and IPTV industries)
- Hsi-Jung Wu: No LinkedIn Profile was found.