Apple Has Won a Patent for Immersive Video Streaming for Its Future Mixed Reality Headset
Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to streaming immersive video content for presentation to a user wearing a head-mounted device.
Immersive Video Streaming
According to Apple, immersive video content can be presented to a user in three dimensions using a wearable display device, such as a virtual reality headset or an augmented reality headset. Further, different portions of the immersive video content can be presented to the user, depending on the position and orientation of the user's body and/or the user's inputs.
Apple's patent FIG. 1 below shows an example system #100 for presenting immersive video content to a user #102. The system #100 includes a video content source #104 communicatively coupled to a wearable display device #106 via a network #108.
The wearable display device can be any device that is configured to be worn by a user and to display visual data to that user. As an example, the wearable display device can be a wearable headset, such as a virtual reality headset, an augmented reality headset, a mixed reality headset, or a wearable holographic display.
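At a high level, the FIG. 1 arrangement is a content source feeding encoded immersive video over a network to a head-mounted client. The minimal Swift sketch below models those roles; the type and property names are purely illustrative assumptions, since the patent describes an architecture rather than an API.

```swift
import Foundation

// Hypothetical model of the FIG. 1 arrangement: a video content source (#104)
// streams immersive video over a network (#108) to a wearable display device (#106).
// All names below are illustrative; the patent does not define an API.
struct VideoContentSource {
    let streamURL: URL                      // where the immersive video is hosted
}

enum WearableDisplayKind {
    case virtualReality, augmentedReality, mixedReality, holographic
}

struct WearableDisplayDevice {
    let kind: WearableDisplayKind
}

struct ImmersiveStreamingSession {
    let source: VideoContentSource
    let device: WearableDisplayDevice

    // A real client would open a network connection here and begin requesting
    // the video segments that cover the device's current viewport.
    func start() {
        print("Streaming \(source.streamURL) to a \(device.kind) headset")
    }
}

let session = ImmersiveStreamingSession(
    source: VideoContentSource(streamURL: URL(string: "https://example.com/immersive.m3u8")!),
    device: WearableDisplayDevice(kind: .mixedReality)
)
session.start()
```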
Apple's patent FIG. 2A above is a diagram of an example viewport for presenting immersive video content; FIG. 2B is a diagram of example degrees of freedom of movement of a user's body.
Apple's HMD Viewport
Further to patent FIG. 2A, Apple notes that immersive video content #200 can include visual data that can be presented according to a range of viewing directions and/or viewing locations with respect to a user. A viewport #202 can be selected to present a portion of the immersive video content to the user (e.g., based on the position and/or orientation of the user's head) to give the user the impression that they're viewing the visual data according to a particular field of view and/or viewing perspective.
Further, the viewport can be continuously updated based on the user's movements to give the user the impression that they're shifting their gaze within a visual environment.
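To make the viewport idea concrete, here is a small Swift sketch that maps a head orientation (yaw and pitch) and a field of view onto the region of an equirectangular immersive frame to present. The math and names are assumptions for illustration; the patent describes the behavior, not an implementation.

```swift
import Foundation

// Hypothetical viewport over an equirectangular (360° x 180°) immersive frame.
// Angles are in radians; yaw/pitch would come from the headset's pose sensors.
struct Viewport {
    var centerYaw: Double      // look direction, left/right
    var centerPitch: Double    // look direction, up/down
    var horizontalFOV: Double  // horizontal field of view to present
    var verticalFOV: Double    // vertical field of view to present
}

// Returns the normalized (0...1) rectangle of the full frame covered by the
// viewport. A client could use this to crop the frame, or to decide which
// tiles of a tiled stream to fetch at high quality. Wrap-around at the
// frame edges is ignored to keep the sketch short.
func visibleRegion(of viewport: Viewport) -> (x: Double, y: Double, width: Double, height: Double) {
    let width = viewport.horizontalFOV / (2 * Double.pi)
    let height = viewport.verticalFOV / Double.pi
    let x = (viewport.centerYaw + Double.pi) / (2 * Double.pi) - width / 2
    let y = (Double.pi / 2 - viewport.centerPitch) / Double.pi - height / 2
    return (x, y, width, height)
}

// Example: the user looks slightly up and to the right with a 90° x 90° view.
let region = visibleRegion(of: Viewport(centerYaw: 0.3, centerPitch: 0.2,
                                        horizontalFOV: Double.pi / 2,
                                        verticalFOV: Double.pi / 2))
print(region)
```

Re-running a calculation like this every frame with fresh sensor data is what produces the impression of shifting one's gaze within the scene.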
The headset's sensors can also be configured to detect the position and/or orientation of a user's head in multiple dimensions. For example, referring to FIG. 2B, a Cartesian coordinate system can be defined such that the x-axis, y-axis, and z-axis are orthogonal to one another and intersect at an origin point O (e.g., corresponding to the position of the user's head).
The sensors (#120, FIG. 1) can detect the user translating their head along one or more of these axes and/or rotating their head about one or more of these axes (e.g., according to six degrees of freedom, 6DoF).
For example, the sensors can detect when a user translates their head in a forward or backwards direction (e.g., along the x-axis), sometimes referred to as a “surge” motion. As another example, the sensors can detect when a user translates their head in a left or right direction (e.g., along the y-axis), sometimes referred to as a “sway” motion. As another example, the sensors can detect when a user translates their head in an upward or downward direction (e.g., along the z-axis), sometimes referred to as a “heave” motion.
As another example, the sensors can detect when a user rotates their head about the x-axis, sometimes referred to as a “roll” motion. As another example, the sensors can detect when a user rotates their head about the y-axis, sometimes referred to as a “pitch” motion.
As another example, the sensors can detect when a user rotates their head about the z-axis, sometimes referred to as a “yaw” motion.
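Those six motions (surge, sway, heave, roll, pitch, yaw) are simply the three translational and three rotational components of a 6DoF head pose. The Swift sketch below shows one way a client might represent a sensor sample using the axis convention from FIG. 2B; the field names and units are my assumptions, not Apple's.

```swift
import Foundation

// Hypothetical 6DoF head-pose sample, following the axis convention in FIG. 2B:
// x = forward/back, y = left/right, z = up/down, with rotations about each axis.
// Units (meters, radians) are assumptions for illustration.
struct HeadPose {
    // Translation along each axis
    var surge: Double   // forward / backward, along the x-axis
    var sway: Double    // left / right, along the y-axis
    var heave: Double   // up / down, along the z-axis
    // Rotation about each axis
    var roll: Double    // about the x-axis
    var pitch: Double   // about the y-axis
    var yaw: Double     // about the z-axis
}

// Example sample: the user leans forward 5 cm and turns their head 30 degrees.
let sample = HeadPose(surge: 0.05, sway: 0, heave: 0,
                      roll: 0, pitch: 0, yaw: Double.pi / 6)
print("Yaw in degrees:", sample.yaw * 180 / Double.pi)
```

A streaming client would poll a sample like this every frame and feed the rotational components back into the viewport selection sketched earlier.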
Developers and/or engineers can dive deeper into the details of this invention in Apple's granted patent US 11570417 B2.
Apple Inventors
- Fanyi Duanmu: Video Coding and Processing Engineer
- Jun Xin: Engineering Manager, Video Coding and Processing
- Xiaosong Zhu: Senior Software QA Engineer (many years of experience in Broadcast, digital video encoding process, and IPTV industry)
- Hsi-Jung Wu: No LinkedIn profile was found.