
A Major Warner Bros. Patent Reveals a Coming AR/VR Movie Delivery System for Theaters, Home systems & Headsets



Years ago the big film studios began introducing 3D movie experiences and handed out 3D glasses as you entered the theater. This week Patently Apple discovered a patent filing from Warner Bros. Entertainment describing new systems in development that will bring moviegoers augmented reality, virtual reality and mixed reality films. And yes, users will be handed 3D/AR/VR headsets to watch these next-generation movies. The difference with these films is that they're being designed to work both in the theater and at home with iTunes or Netflix. So next-gen headsets will be needed to watch this kind of content. The Warner Bros. patent notes that these movies will work on headsets powered by either Intel or ARM processors.

 

The patent filing lists HoloLens and other types of headgear already on the market that will be compatible with this future system – but this is likely where Apple will step in with an ARM-based system. There's no way that Apple is going to miss out on this next wave of movies from Warner Bros. and other studios.

 

So even though Apple isn't mentioned in the patent by name, it's clear that this form of entertainment is going to provide some of the needed content to make new headsets popular with consumers.

 

Likewise, the system may work with live streaming of film and entertainment, and this is where 5G networks will come into play. The new 5G networks begin rolling out sometime in late 2018. I doubt that Apple will want to be late to the party, so this is something we can expect in the not-too-distant future. This isn't a project for a decade from now, but more like two or three years down the road at the latest.

 

After reading the Warner Bros. patent, it's clear as day that AR and VR enriched movies will play an important role in exciting the market to join this next wave of devices supporting AR and VR movies, games and live experiences.

 

Patent Background

 

"Virtual reality" is a term that has been used for various types of content that simulates immersion in a three-dimensional (3D) world, including, for example, various video game content and animated film content. In some types of virtual reality, a user can navigate through a simulation of a 3D environment generated based on a computer model, by controlling the position and orientation of a virtual camera that defines a viewpoint for a 2D scene that is displayed on a two-dimensional display screen. A variation of these technologies is sometimes called "augmented reality." In an augmented reality setup, the display technology shows the user's surroundings "augmented" by one or more digital objects or overlays. Augmented reality content may be as simple as textual "heads up" information about objects or people visible around the user, or as complex as transforming the entire appearance of the user's surroundings into a fantasy environment that corresponds to the user's real surroundings.

 

Virtual reality (VR) and augmented reality (AR) have been applied to various types of immersive video stereoscopic presentation techniques including, for example, stereoscopic virtual reality headsets. Headsets and other presentation methods immerse the user in a 3D scene. Lenses in the headset enable the user to focus on a lightweight split display screen mounted in the headset only inches from the user's eyes. Different sides of the split display show right and left stereoscopic views of video content, while the user's peripheral view is blocked. In another type of headset, two separate displays are used to show different images to the user's left eye and right eye respectively. In another type of headset, the field of view of the display encompasses the full field of view of the eye, including the peripheral view. In another type of headset, an image is projected on the user's retina using controllable small lasers, mirrors or lenses. In each case, the headset enables the user to experience the displayed virtual reality content more as if the viewer were immersed in a real scene. In the case of augmented reality (AR) content, the viewer may experience the augmented content as if it were a part of, or placed in, an augmented real scene.

 

These immersive effects may be provided or enhanced by motion sensors in the headset that detect motion of the user's head, and adjust the video display(s) accordingly. By turning his head to the side, the user can see the virtual reality scene off to the side; by turning his head up or down, the user can look up or down in the virtual reality scene. The headset may also include tracking sensors that detect position of the user's head and/or body, and adjust the video display(s) accordingly. By leaning or turning, the user can see the virtual reality scene from a different point of view. This responsiveness to head movement, head position and body position greatly enhances the immersive effect achievable by the headset. The user may be provided the impression of being placed inside or "immersed" in the virtual reality scene. As used herein, "immersive" generally encompasses both VR and AR.
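The head-tracking behavior described above can be illustrated with a minimal sketch. The function below is not from the patent; it simply shows, under the usual spherical-coordinates convention, how yaw/pitch angles reported by a headset's motion sensors might be converted into a virtual camera's view direction each frame:

```python
import math

def view_direction(yaw_deg: float, pitch_deg: float) -> tuple:
    """Convert head yaw/pitch (in degrees) into a unit view vector.

    Yaw turns the head left/right about the vertical axis; pitch tilts
    it up/down. A headset's motion sensors would supply these angles
    every frame, and the renderer would aim the virtual camera along
    the returned vector.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    x = math.cos(pitch) * math.sin(yaw)   # left/right component
    y = math.sin(pitch)                   # up/down component
    z = math.cos(pitch) * math.cos(yaw)   # forward component
    return (x, y, z)

# Looking straight ahead:
view_direction(0, 0)   # → (0.0, 0.0, 1.0)
```

Turning the head 90 degrees to the side (`view_direction(90, 0)`) yields a vector pointing along the x-axis, so the renderer would show the part of the scene off to that side, exactly the responsiveness the patent describes.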

 

Immersive headsets and other wearable immersive output devices are especially useful for various types of game play involving user exploration of a modelled environment generated by a rendering engine, as the user controls one or more virtual cameras using head movement; the position or orientation of the user's body, head, eyes, hands, fingers, or feet; and/or other inputs.

 

To provide an immersive experience, the user needs to perceive a freedom of movement that is in some way analogous to human visual perception when interacting with reality. Content produced for VR can provide this experience using techniques for real-time rendering that have been developed for various types of video games. The content may be designed as a three-dimensional computer model with defined boundaries and rules for rendering as video output. This content can be enhanced by stereoscopic techniques to provide stereoscopic output, sometimes referred to as "3D," and associated with a VR application that manages the rendering process in response to movement of the VR headset, to produce a resulting VR experience. The user experience is very much like being placed inside a rendered video game.

 

In other types of VR and AR, the simulated 3D environment may be used primarily to tell a story, more like traditional theater or cinema. In this type of VR or AR, the added visual effects may enhance the depth and richness of the story's narrative elements or special effects, without giving the user full control (or any control) over the narrative itself.

 

However, the technology for experiencing anything similar to cinematic content delivered using VR or AR equipment or methods is in a very early stage of development. Actual implementations of technology are quite limited, and users have thus far been largely or completely untouched by VR or AR in their experience of narrative content.

 

It would be desirable, therefore, to develop new methods and other new technologies for mastering cinematic content for VR and AR use, that overcome these and other limitations of the prior art and enhance the appeal and enjoyment of narrative content for new immersive technologies such as VR and AR.

 

Warner Bros. Entertainment Solution

 

The invention by Warner Bros. Entertainment relates to the production, configuration, and providing, by a computer, of digital data for virtual reality or augmented reality output.

 

In one aspect of the invention, a computer-implemented method includes communicating, by a cinematic data distribution server over a wireless network, with multiple immersive output devices each configured for providing one of an augmented reality (AR) output or a virtual reality (VR) output based on a data signal, wherein each of the multiple immersive output devices is present within eyesight of a display screen.

 

For example, the multiple immersive output devices may be worn by moviegoers or home theater users. The method may include configuring the data signal based on digital cinematic master data that includes at least one of VR data or AR data. The method may include transmitting the data signal to the multiple immersive output devices contemporaneously, such that each of the users receives and processes the data and shares a contemporaneous immersive video experience.
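The "contemporaneous transmission" idea above amounts to a fan-out: one master data signal broadcast to every headset in the room, with each device using the layer that matches its output mode. The class and attribute names below are illustrative, not from the patent; this is just a minimal sketch of that arrangement:

```python
class ImmersiveDevice:
    """Stand-in for one viewer's AR or VR headset (names are hypothetical)."""

    def __init__(self, device_id: str, mode: str):
        self.device_id = device_id
        self.mode = mode          # "AR" or "VR"
        self.received = []

    def receive(self, frame: dict):
        # Each device keeps only the layer matching its own output mode.
        self.received.append(frame[self.mode])


class CinematicDataServer:
    """Fans one master data signal out to every registered device at once."""

    def __init__(self):
        self.devices = []

    def register(self, device: ImmersiveDevice):
        self.devices.append(device)

    def broadcast(self, frame: dict):
        # The same frame goes to all devices contemporaneously, so every
        # viewer shares the same moment of the presentation.
        for device in self.devices:
            device.receive(frame)


server = CinematicDataServer()
seat_1 = ImmersiveDevice("seat-1", "AR")   # moviegoer wearing AR glasses
seat_2 = ImmersiveDevice("seat-2", "VR")   # moviegoer wearing a VR headset
server.register(seat_1)
server.register(seat_2)
server.broadcast({"AR": "overlay-frame-001", "VR": "full-scene-001"})
```

After the broadcast, the AR device holds only the overlay layer and the VR device only the full-scene layer, even though both came from the same master signal.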

 

In another aspect, the method may include outputting an image based on a video data portion of the digital cinematic master data on the display screen, contemporaneously with the transmitting. The users may thereby enjoy an AR experience in addition to the video on the screen, or if using fully immersive VR equipment that obscures the screen, may enjoy a cinematic presentation that both supplements and duplicates the presentation on the screen.

 

For serving AR immersive output devices, configuring the data signal may include encoding the AR data for augmenting video data for output on the display screen, and including the AR data with the video data in the data signal.

 

The AR data may be configured to provide various effects. In an aspect, the AR data, when received by the multiple immersive output devices, continuously extends images on the display screen to areas beyond an outer limit of the display screen, for each person viewing AR output on one of the multiple immersive output devices.

 

For example, a person wearing an AR immersive output device may see elements of the scene that extend upwards, downwards, or sideways beyond the frame.

 

In another, alternative aspect, the AR data, when received by the multiple immersive output devices, causes images that do not appear on the display screen to appear in a non-screen display volume to each person viewing AR output on one of the multiple immersive output devices.

 

For example, the non-screen object may be caused to appear in front of, above, or below the display screen, or even behind the viewer. These effects may similarly be provided by configuring VR data for a VR output device.

 

The data signal may be configured to provide each user with an "objective" experience, a "subjective" experience, or a mixture of objective and subjective experiences.

 

To provide a subjective experience, the cinematic data distribution server configures the data signal such that the images that do not appear on the display screen (i.e., the images that are visible only using an AR or VR output device) appear in a coordinate system defined relative to each person viewing AR or VR output on one of the multiple immersive output devices.

 

To provide an objective experience, the cinematic data distribution server configures the data signal such that the images that do not appear on the display screen appear in the same coordinate system defined relative to the display screen, or in other words, a coordinate system that is relative to the cinema or home theater and the same for all immersive output devices.

 

To provide a mixed experience, the cinematic data distribution server configures the data signal such that at least one visible object is defined individually relative to each person's subjective coordinate system, while at least one other object is defined in the common coordinate system and is the same for all viewers.
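The objective/subjective distinction above boils down to which coordinate frame an object's position is resolved in. The sketch below is not from the patent (the frame names and dictionary keys are hypothetical); it simply shows how the same object description can place a shared "objective" object identically for every viewer while a "subjective" object is offset from each viewer individually:

```python
def world_position(obj: dict, viewer_pos: tuple) -> tuple:
    """Resolve an object's world-space position for one viewer.

    'objective' objects live in the shared theater frame, so every
    viewer sees them in the same place; 'subjective' objects are
    offsets from each viewer's own position.
    """
    if obj["frame"] == "objective":
        return obj["pos"]
    # Subjective: position is an offset from this particular viewer.
    return tuple(v + o for v, o in zip(viewer_pos, obj["pos"]))


# A dragon hovering above the screen, shared by the whole audience:
dragon = {"frame": "objective", "pos": (0.0, 3.0, 10.0)}
# A fairy appearing just to the right of each individual viewer:
fairy = {"frame": "subjective", "pos": (0.5, 0.0, 1.0)}

viewer_a = (-4.0, 0.0, 20.0)   # seated on the left side of the theater
viewer_b = (4.0, 0.0, 20.0)    # seated on the right side
```

Here `world_position(dragon, viewer_a)` and `world_position(dragon, viewer_b)` are identical, while the fairy resolves to a different world position for each seat, the "mixed experience" the patent describes when both kinds of object appear in one scene.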

 

In another aspect, a user may be able to interact with objects depicted in AR or VR output. Accordingly, the method may include providing in the AR data or VR data code enabling each person viewing AR output on one of the multiple immersive output devices to interact with at least one of the images, causing the AR output or VR output to change.

 

In a related aspect, the method may include changing video output shown on the display screen based on each person's interaction with at least one of the images. For example, different versions of a scene may be provided in stored cinematic data, and the version selected at runtime may be selected based on an aggregate of feedback from the different users.
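The runtime selection described above, picking one stored scene version from the aggregate of viewer interactions, could be as simple as a tally. The function below is purely illustrative (the patent does not specify an aggregation rule); it treats each viewer's interaction as a vote and falls back to a default version on ties or no input:

```python
from collections import Counter

def select_scene_version(votes: list, default: str) -> str:
    """Pick the stored scene version backed by the most viewer interactions.

    `votes` holds one version choice per viewer. Ties and empty input
    fall back to `default`, so the presentation always has a scene to show.
    """
    if not votes:
        return default
    tally = Counter(votes)
    top_count = max(tally.values())
    leaders = [version for version, count in tally.items() if count == top_count]
    return leaders[0] if len(leaders) == 1 else default


# Three of four viewers interacted with the object leading to ending B:
votes = ["ending-b", "ending-a", "ending-b", "ending-b"]
chosen = select_scene_version(votes, default="ending-a")   # → "ending-b"
```

A real system would presumably weight or window interactions rather than count raw votes, but the shape is the same: aggregate feedback in, one scene version out.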

 

 

Warner Bros. Entertainment's patent FIG. 4 above is a concept diagram illustrating elements of a system for outputting immersive content to multiple users in a cinema or home theater setting; FIG. 6 is a schematic diagram illustrating components of a stereoscopic display device for providing an immersive VR experience.

 


Patent FIG. 7 is a diagram illustrating components of, and concepts concerning, a cinema or home theater space for multi-user VR or AR; FIG. 8A is a flow chart illustrating elements of serving VR or AR data to an AR or VR output device providing a cinema experience.

 

In another area of the patent filing, Warner Bros. explains that several viewers of a home theater or cinema presentation may be equipped with the content consumption devices. The apparatus may include, for example, a processor based on Intel's architecture, a system-on-a-chip designed by ARM, or any other suitable microprocessor.

 

The processor may be communicatively coupled to auxiliary devices or modules of the 3D environment apparatus using a bus or other coupling. Optionally, the processor and some or all of its coupled auxiliary devices or modules may be housed within or coupled to a housing having a form factor of a personal computer, gaming console, smartphone, tablet, laptop computer, set-top box, wearable goggles, glasses, or visors, or other form factors.

 

A user interface device may be coupled to the processor for providing user control input to an immersive content display process operated by a VR or AR immersive display engine executing on the processor.

 

User control input may include, for example, selections from a graphical user interface or other input (e.g., textual or directional commands) generated via a touch screen, keyboard, pointing device (e.g., game controller), microphone, motion sensor, camera, or some combination of these or other input devices.

 

Control input may also be provided via a sensor coupled to the processor. A sensor may comprise, for example, a motion sensor (e.g., an accelerometer), a position sensor, a temperature sensor, a location sensor (for example, a Global Positioning System (GPS) receiver and controller), an eye-tracking sensor, or a microphone. The sensor may detect a motion or other state of a user interface display, for example, motion of a virtual-reality headset, or the bodily state of the user, for example, skin temperature or pulse.

 

The device may optionally include an input/output port coupled to the processor, to enable communication between a VR/AR engine and a computer network, for example a cinema content server or home theater server.

 

Such communication may be used, for example, to enable multiplayer VR or AR experiences, including but not limited to shared immersive experiencing of cinematic content. The system may also be used for non-cinematic multi-user applications, for example social networking, group entertainment experiences, instructional environments, video gaming, and so forth.

 

While many forms of devices are covered, they note specifically: "In a fourth type of device, a laser projector or projector mounted to the user's head and directed to the user's eyes projects images directly on the user's eyes, making the retinas the only screen on which the augmented content is displayed."

 

Examples of devices for immersive and non-immersive AR output include Microsoft's HoloLens; Google Glass by Google; Digital Lightfield by Magic Leap; Space Glasses by Meta Company; and castAR glasses by castAR.

 


Patent FIG. 1 noted above is a schematic block diagram illustrating aspects of a system and apparatus for the production and configuration of digital data for virtual reality or augmented reality output coupled to a distribution system; FIG. 13 is one of many flowcharts associated with the AR/VR System for Cinemas and Home Theaters.

 


Warner Bros. Entertainment's patent application was filed back in Q4 2016 and published this week by the U.S. Patent Office. Considering that this is a patent application, the timing of such a product to market is unknown at this time. To review some of Apple's augmented reality patents, click here.

 


Patently Apple presents a detailed summary of patent applications with associated graphics for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for full and accurate details. About Making Comments on our Site: Patently Apple reserves the right to post, dismiss or edit any comments. Those using abusive language or behavior will be blacklisted on Disqus.

 
