
Apple invents a way of Creating Smoother CGI Scenes for VR Headsets that could eliminate Eye Strain and/or Nausea

(Cover image: Apple's technique for creating smoother 3D CGI scenes)

 

Today the US Patent & Trademark Office published a patent application from Apple that relates to devices, systems, and methods for stereoscopic 360° rendering of a three-dimensional (3D) object. The method can be carried out in a content creation application running on a device such as a mobile device, desktop, laptop, or server.

 

More importantly, the method can be performed on a device designed for viewing stereoscopic images such as a future virtual reality (VR) display (e.g., a head-mounted display (HMD)) or an augmented reality (AR) display from Apple.

 

One of the inventors behind Apple's invention worked at DreamWorks Animation.

 

In order to understand the aim of Apple's invention, the company first lays out the problem that it believes has now been solved.

 

Apple notes that 360° Virtual Reality (VR) video is typically formatted in a 2:1 aspect ratio rectangle using equirectangular projections and stored as a video file. The file contains one equirectangular projection for each frame of the video.
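As a rough illustration of the equirectangular format described above, the sketch below converts a 3D view direction into pixel coordinates in a 2:1 frame. The function name, axis conventions, and resolution are illustrative assumptions, not taken from the patent.

```python
import math

def direction_to_equirect(x, y, z, width=4096, height=2048):
    """Map a unit 3D view direction to pixel coordinates in a 2:1
    equirectangular frame (longitude -> u, latitude -> v).
    Axis conventions and resolution are illustrative, not the patent's."""
    lon = math.atan2(x, -z)                   # yaw about the vertical axis, (-pi, pi]
    lat = math.asin(max(-1.0, min(1.0, y)))   # pitch, [-pi/2, pi/2]
    u = (lon / (2 * math.pi) + 0.5) * (width - 1)
    v = (0.5 - lat / math.pi) * (height - 1)
    return u, v

# The straight-ahead direction (here -z) lands at the center of the frame.
u, v = direction_to_equirect(0.0, 0.0, -1.0)
```

For stereoscopic 360° video, two such frames are produced per video frame, one per eye.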

 

Stereoscopic 360° VR video contains two equirectangular projections for each frame. It includes one projection for each eye's perspective/viewpoint. Since stereopsis (stereoscopic depth perception) functions based on horizontal parallax shift of points in a scene, it is important that the left and right views offer slightly different points of view.

 

This is easy to achieve in stereoscopic rendering of rectilinear computer-generated imagery (CGI) scenes by simply separating the two virtual cameras from each other along the camera's local x axis. This separation is referred to as interaxial distance (the distance between the centers of the lenses of the two recording cameras); the greater the interaxial distance, the greater the apparent stereoscopic depth in the scene.
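The camera-separation step can be sketched as follows; the function and plain-tuple vector representation are illustrative, not Apple's implementation:

```python
def stereo_camera_positions(center, right_axis, interaxial):
    """Offset left/right virtual cameras from a shared center along the
    rig's local x axis by half the interaxial distance each.
    A plain-tuple sketch; a real engine would use its own vector types."""
    half = interaxial / 2.0
    left  = tuple(c - half * a for c, a in zip(center, right_axis))
    right = tuple(c + half * a for c, a in zip(center, right_axis))
    return left, right

# A 64 mm interaxial (roughly human interpupillary distance) about the origin:
left, right = stereo_camera_positions((0, 0, 0), (1, 0, 0), 0.064)
```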

 

If the two virtual cameras are angled slightly inward toward an intervening centerline, they are 'converged' at a certain distance, and the chosen angle can be used to perceptually position objects in the scene at different stereoscopic depths.
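The convergence geometry reduces to simple trigonometry: each camera toes in by the angle whose tangent is half the interaxial distance divided by the convergence distance. A hedged sketch (names are mine, not the patent's):

```python
import math

def toe_in_angle(interaxial, convergence_distance):
    """Angle (radians) each camera rotates toward the centerline so that
    the two optical axes cross at the given distance. Simple geometry,
    not the patent's own formulation."""
    return math.atan2(interaxial / 2.0, convergence_distance)

# A 64 mm interaxial converged at 2 m gives roughly 0.9 degrees per camera.
angle_deg = math.degrees(toe_in_angle(0.064, 2.0))
```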

 

Rendering a full 360° stereoscopic CGI scene is much more difficult because the virtual "camera" at the center of the scene cannot simply be duplicated and separated on a single axis.

 

Additionally, unlike live VR rendering that links two virtual cameras to the headset wearer's eye positions and renders a viewport of the current position/orientation, a 360° stereoscopic video file needs to contain a complete rendering of the whole CGI scene on all video frames.

 

In addition, the zenith and nadir should not contain stereoscopic parallax. If they do, when a user wearing a headset looks directly up or down and then rotates their head (Y axis rotation), the stereoscopic parallax will produce vertical disparity of the eyes and inevitably cause eye strain, pain, or potentially nausea.

 

One 360° stereoscopic rendering technique involves stitching six 90° × 90° views together, offsetting the four views on the equator (+X, +Z, -X, -Z) along their respective local axes to create artificial interaxial separation. The +Y and -Y views are the same for both the left and right eye renders. These disjointed views are stitched together using distortion near the stitch lines or superimposition blending. The results achieved by the technique are not seamless, and the six individual renders plus the stitch phase are also processor intensive.

 

Another approach is a 'slit scan' method, where a view 180° high by 1° (or smaller) wide is rendered and repeated for each Y-axis rotation of 1° (or smaller), with the left and right eye virtual cameras offset from the center of the scene along the camera's local x axis. The 360 × 2 slit renders are combined to produce two complete equirectangular views, one for each eye. This approach is very slow and memory intensive due to the repeated rendering required, and it also does not avoid stereoscopic parallax near the poles.
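The slit-scan loop might look like the sketch below, where `render_slit` stands in for a hypothetical renderer callback that produces one narrow strip per call. The 360 × 2 renders per frame are what make the technique slow:

```python
def render_slit_scan_frame(render_slit, slit_deg=1.0, interaxial=0.064):
    """Sketch of the slit-scan approach: for each 1-degree Y rotation,
    render a tall narrow strip from each eye's offset camera and append
    it to that eye's equirectangular image. `render_slit(yaw_deg, x_offset)`
    is a hypothetical renderer, not a real API."""
    left_cols, right_cols = [], []
    for step in range(int(360.0 / slit_deg)):
        yaw = step * slit_deg
        left_cols.append(render_slit(yaw, -interaxial / 2.0))
        right_cols.append(render_slit(yaw, +interaxial / 2.0))
    # 360 x 2 renders per video frame: the source of the cost noted above.
    return left_cols, right_cols
```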

 

Apple's Invention: Stereoscopic Rendering of Virtual 3D Objects

 

Apple's invention efficiently renders 3D objects stereoscopically for equirectangular video by, for example, creating equirectangular projections for each of the left-eye and right-eye viewpoints. It further enables fast rendering of a stereoscopic 'camera pair' for every possible view orientation in the scene, while smoothly transitioning the zenith and nadir to monoscopic (i.e., zero parallax).

 

Apple's patent FIG. 1 below is a block diagram of an example operating environment for the device; position #25 points to a content creation application running on the device, a VR headset. FIG. 9 illustrates a stereoscopic rendering of an object in 3D space.

 

(Patent figures: an application for creating stereoscopic images for a VR headset)

 

In patent FIG. 8 above, "View #99" shows a view of the object from a single camera direction (e.g., with the camera view looking forward). A user interface of a content creation application can display the equirectangular representation #90 and/or one or more of the camera views of an object, such as front camera view #99, simultaneously to allow the user to accurately visualize the placement of the object in 3D space. Moreover, the equirectangular representation and/or one or more of the camera views of an object can be updated in real time, for example, as the user repositions the object in 3D space using the user interface.

 

Apple's patent FIG. 10 below illustrates translating the positions of two vertices based on interaxial distance; FIG. 11 illustrates translating the position of a vertex based on an interaxial distance that is reduced as latitude increases from the equator toward either pole.
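One way to achieve the smooth transition to monoscopic poles that FIG. 11 depicts is to scale the interaxial distance down with latitude. The cosine falloff below is an assumption chosen for illustration; the patent application's exact curve is not given in this article:

```python
import math

def effective_interaxial(base_interaxial, latitude_rad):
    """Reduce the interaxial separation as latitude approaches the poles
    so the zenith and nadir render monoscopically (zero parallax).
    The cosine falloff is an illustrative choice, not the patent's curve."""
    return base_interaxial * math.cos(latitude_rad)

# Full separation at the equator, effectively none at the poles:
at_equator = effective_interaxial(0.064, 0.0)
at_pole = effective_interaxial(0.064, math.pi / 2)
```

With a falloff like this, a headset wearer who looks straight up or down and rotates their head sees no vertical disparity, addressing the eye-strain problem described earlier.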

 

(Patent FIGS. 10 & 11: creating smooth CGI scenes)

 

Apple's patent application 20190180504 that was published today by the U.S. Patent Office was originally filed back in Q4 2018. Considering that this is a patent application, the timing of such a product to market is unknown at this time.

Inventors

 

Stuart Pomerantz: a Software Engineer who came to Apple via DreamWorks Animation, where he worked on rendering and infrastructure for the Ptch iOS app produced by DreamWorks Animation Investments Inc.

 

Tim Dashwood: a Software Engineer. In his bio he notes that he was the founder of 11 Motion Pictures and its sister companies, Dashwood Cinema Solutions and the Toronto-based stereoscopic 3D production company Stereo3D Unlimited Inc. Among the products created was Stereo3D Toolbox.

 
