Apple wins a major Light Field Camera patent that will allow iPhone owners to use Gestures to capture scenes from different positions & more
Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to Apple possibly adopting a light field panorama system for iPhone that will allow a user to perform gestures to capture images of a scene from different positions. In the bigger picture, light field photography and videography coming to future iPhones would provide users with a major advancement. Both Apple and Google are working to bring this technology to market.
Apple's granted patent covers methods and apparatus for capturing, processing, and rendering light field panoramas.
In embodiments of a light field panorama system, a user holding an iPhone or iPad can perform specific gestures by moving the camera in front of a scene of interest to capture a set of digital images of the scene from different positions.
Additional information, for example white balance and exposure settings of the camera and position and orientation information from motion and position sensing technology of the device, may also be captured with the images.
The captured images and information may be processed to determine metadata including the relative camera positions of the images with respect to the scene and depth and geometry information for content of the scene captured in the images.
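To make the capture flow above concrete, here is a minimal sketch of how a captured frame and its derived metadata might be organized. All names here (`CapturedFrame`, `LightFieldPanorama`, `add_frame`) are hypothetical illustrations; the patent describes the information captured but does not specify any data format.

```python
from dataclasses import dataclass, field

@dataclass
class CapturedFrame:
    # Raw image data plus the camera settings and sensor readings
    # in effect at capture time, as described in the patent.
    image: bytes
    white_balance_k: float   # white balance, in kelvin
    exposure_ms: float       # exposure time, in milliseconds
    position: tuple          # (x, y, z) from the device's motion/position sensors
    orientation: tuple       # (pitch, yaw, roll), in degrees

@dataclass
class LightFieldPanorama:
    frames: list = field(default_factory=list)
    # Derived metadata: each frame's camera position relative to the scene.
    relative_poses: list = field(default_factory=list)

    def add_frame(self, frame: CapturedFrame, anchor: CapturedFrame):
        """Store a frame along with its position relative to the first (anchor) frame."""
        rel = tuple(p - a for p, a in zip(frame.position, anchor.position))
        self.frames.append(frame)
        self.relative_poses.append(rel)
```

In a real system the relative poses would come from multi-view geometry, not simple subtraction; the sketch only shows where the per-frame metadata would live.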
Apple's patent FIG. 1 below graphically illustrates a high-level flow of operations of a light field panorama system.
Apple's patent FIGS. 5A and 5B below graphically illustrate viewing a light field panorama using an iPhone or iPad.
More specifically to FIG. 5A, a viewer may move an iPhone to the left, right, up, or down (or diagonally) to view different parts of the scene. The viewer may instead or also rotate their iPhone to the left or the right, or up or down (referred to as "tilt") to view the scene at different angles.
The viewer may also move their iPhone forward and backward to zoom in or out on the scene. As the viewer moves their iPhone, a rendering engine may obtain or estimate the current position of the iPhone in relation to the scene represented by the light field panorama (#520), and dynamically render and display a view (#540) of the scene from the images and metadata in the light field panorama based on that current position.
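The viewing loop described above can be sketched in a few lines: given the device's current position, pick the captured image taken from the closest camera position. This nearest-neighbour selection is a deliberately simplified stand-in for the patent's rendering engine, which would interpolate between views using the depth and geometry metadata; the function name is an assumption.

```python
import math

def select_view(current_position, capture_positions):
    """Return the index of the captured image whose camera position is
    closest to the viewer's current device position. A real light field
    renderer would blend neighbouring views rather than pick one."""
    return min(
        range(len(capture_positions)),
        key=lambda i: math.dist(current_position, capture_positions[i]),
    )
```

For example, a viewer standing nearest to where the second frame was captured would be shown a view derived from that frame: `select_view((0.9, 0.1, 0.0), [(0, 0, 0), (1, 0, 0), (0, 1, 0)])` returns `1`.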
More specifically to patent FIG. 5B above, it shows example portions of the scene that are viewed at different positions and rotations. As the viewer changes their viewing position and/or angle by rotating their iPhone, the user will be able to see behind or over objects in the scene, zoom in or out on objects in the scene, or view objects in different parts of the scene.
Apple's patent FIGS. 4A through 4F below illustrate non-limiting, example gestures that may be used to capture frames for generating a light field panorama. FIG. 4A shows a circular gesture. FIG. 4B shows a spiral gesture. FIG. 4C shows a "figure eight" gesture. FIG. 4D shows a closed arc gesture. FIG. 4E shows a vertical zig-zag gesture. FIG. 4F shows a horizontal zig-zag gesture.
Apple further notes that captured scenes represented by the light field panorama data may be explored by a viewer using a rendering and viewing system on an HMD, iPhone, iPad, Mac, or TV system.
Apple's patent FIGS. 6A and 6B below graphically illustrate viewing a light field panorama using a head-mounted display (HMD).
Overall, the light field panorama data (images and metadata) for the scene may be processed by a rendering engine to render different 3D views of the scene to allow the viewer to explore the scene from different positions and angles with six degrees of freedom.
For example, using an HMD, the viewer may move to the left or right, move up or down, rotate their head left or right, or tilt their head up or down to view the scene from different positions and angles.
Alternatively, touch gestures may be used to explore the scene on a mobile device. Using a computer system such as a laptop or notebook computer, the user may use a cursor control device, touch screen, or keyboard to explore the scene from different positions and angles.
Using the rendering and viewing system, the viewer may change their viewing position and angle to see behind or over objects in the scene, zoom in or out on the scene, or view different parts of the scene.
Thus, the light field panorama allows a viewer to explore a scene with six degrees of freedom (6DOF), meaning the viewer can rotate with the content as well as translate in different directions.
By contrast, a typical 360 panorama (or photo sphere) only allows three degrees of freedom in the rendering, meaning that the viewer can only rotate their head but cannot translate through the content as they can when exploring the light field panorama.
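The 6DOF-versus-3DOF distinction can be shown with a toy pose model: a 360 panorama viewer honours only the rotational part of the viewer's pose, while a light field panorama honours translation as well. The `Pose` type and function names are illustrative, not from the patent.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    # Rotation (yaw/pitch/roll, degrees) and translation (x/y/z, metres).
    yaw: float = 0.0
    pitch: float = 0.0
    roll: float = 0.0
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

def photo_sphere_view(pose: Pose) -> Pose:
    """3DOF: a 360 panorama responds to rotation only; translation is discarded."""
    return Pose(pose.yaw, pose.pitch, pose.roll, 0.0, 0.0, 0.0)

def light_field_view(pose: Pose) -> Pose:
    """6DOF: a light field panorama responds to rotation and translation."""
    return pose
```

Stepping half a metre to the left (`x = -0.5`) changes the rendered view in the light field case but has no effect on the photo sphere, which is exactly the parallax difference the patent describes.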
For more details, review Apple's granted patent 11,044,398. Apple's very first granted patent for a light field camera was covered by Patently Apple back in 2017.
For interest's sake, the leading light field camera company Lytro was acquired by Google back in 2018. The video below presents an overview of Lytro's light field cameras.
Behind the scenes, Google is obviously working to bring this camera technology to market and we may see it coming to a future Pixel phone or other devices. Google has even hired two of Apple's light field engineers that were listed on Apple's granted patent.
One is Gary Vondran, an Imaging Scientist now with Google as a SW Engineering Manager. A second is Ricardo Motta. He was a "Distinguished Engineer – DEST" at Apple for 6 years and is now with Google as a "Distinguished Engineer."
Which of the two companies will be first to bring light field camera technology to smartphones? Only time will tell, but it's going to be a major development in camera technology that could be a killer feature.