Apple wins a Patent for a Scene Camera System for a Mixed Reality Headset that includes a 2-Dimensional Array of Cameras
Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to a new HMD camera system that includes a two-dimensional array of cameras that capture images of respective portions of a scene.
The cameras are positioned along a spherical surface so that they have adjacent fields of view. The entrance pupils of the cameras are positioned at or near the user's eye while the cameras also form optimized images at the sensor. Methods for reducing the number of cameras in an array, as well as methods for reducing the number of pixels read from the array and processed by the pipeline, are also described.
While today's patent report may interest all Apple fans, it will be particularly appreciated by optical engineers and techies who follow all things related to VR headsets.
The simulated environments of virtual reality systems and the mixed environments of mixed reality systems may be used to provide an interactive user experience for many applications, such as adding virtual content to a real-time view of the viewer's environment, generating 3D virtual worlds (the Metaverse), virtual training environments, gaming, remotely controlling drones or other mechanical systems, viewing digital media content, interacting with the Internet, exploring virtual landscapes or environments, and the like.
Apple's patent covers scene cameras for video see-through head-mounted displays (HMDs) that may be used in mixed reality (MR) or virtual reality (VR) systems.
In conventional HMDs, one or more scene cameras may be mounted at the front of the HMD. However, the point of view (POV) of the scene cameras is substantially offset from, and different from, the POV of the user's eyes. Apple's invention corrects the cameras' POV to match the user's POV by shifting the entrance pupils of the cameras toward the user's eyes.
In some embodiments, an HMD includes two-dimensional arrays of small form factor cameras (e.g., one array for the left eye, and a second array for the right eye) that capture images of respective portions of a real-world scene in front of the user. The cameras are positioned along a spherical curve or surface so that the cameras have non-overlapping, adjacent fields of view (FOVs).
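For readers who want to picture the geometry, here's a minimal sketch of how such a grid of cameras with abutting FOVs could be laid out on a spherical shell centered on the eye. This is our own illustration, not material from the patent; the 35 mm shell radius, 20° per-camera FOV and 100° x 80° coverage are assumed values.

```python
# Minimal geometric sketch (illustrative assumptions, not patent values):
# lay out a grid of small cameras on a spherical shell centered on the eye,
# each pointing radially outward, so adjacent fields of view abut.
import math

RADIUS_MM = 35.0     # assumed eye-to-camera-shell distance
CAM_FOV_DEG = 20.0   # assumed per-camera field of view (square)
TOTAL_H_DEG = 100.0  # assumed horizontal coverage per eye
TOTAL_V_DEG = 80.0   # assumed vertical coverage per eye

def camera_grid(total_h, total_v, cam_fov):
    """Yield (yaw, pitch) in degrees for cameras whose non-overlapping,
    adjacent FOV tiles cover the target field of view."""
    n_h = math.ceil(total_h / cam_fov)
    n_v = math.ceil(total_v / cam_fov)
    for i in range(n_h):
        for j in range(n_v):
            yaw = -total_h / 2 + (i + 0.5) * cam_fov    # center of tile i
            pitch = -total_v / 2 + (j + 0.5) * cam_fov  # center of tile j
            yield yaw, pitch

def camera_position(yaw_deg, pitch_deg, radius=RADIUS_MM):
    """Position on the spherical shell, along the camera's optical axis,
    with the eye at the origin (units: mm)."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (radius * math.cos(pitch) * math.sin(yaw),
            radius * math.sin(pitch),
            radius * math.cos(pitch) * math.cos(yaw))

cams = list(camera_grid(TOTAL_H_DEG, TOTAL_V_DEG, CAM_FOV_DEG))
print(f"{len(cams)} cameras per eye")   # 5 x 4 = 20 with these numbers
print(camera_position(*cams[0]))        # position of the first camera (mm)
```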
To achieve a more accurate representation of the user's perspective, the cameras' optics are configured so that the entrance pupils of the cameras in the array are positioned behind the image planes formed at the image sensors, at or near the user's eye, while the cameras still form optimized images at the sensors. In this way, each array of cameras captures views of the scene from substantially the same perspective as the user's respective eye.
In some embodiments, one sensor may be used to capture images for multiple (e.g., four) cameras in the array. In these embodiments, the optics of the cameras used with a sensor may be shifted or adjusted to align the image planes with the sensor surface.
In some embodiments, a curved sensor and a simpler lens system may be used to provide a wider FOV for each camera and thus reduce the number of cameras in the array.
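To give a rough sense of that trade-off, the camera count falls quickly as the per-camera FOV widens. The numbers below are our own assumptions for illustration, not figures from the patent:

```python
# Hypothetical back-of-the-envelope calculation: how the per-camera FOV
# affects the number of cameras needed to tile a fixed total FOV per eye.
# The 100 x 80 degree target and the per-camera FOVs are assumed values.
import math

TOTAL_H_DEG, TOTAL_V_DEG = 100.0, 80.0   # assumed coverage per eye

def cameras_needed(cam_fov_deg):
    """Cameras required to tile the target FOV with non-overlapping tiles."""
    return math.ceil(TOTAL_H_DEG / cam_fov_deg) * math.ceil(TOTAL_V_DEG / cam_fov_deg)

for fov in (20.0, 25.0, 30.0, 40.0):
    print(f"{fov:>4.0f} deg per camera -> {cameras_needed(fov)} cameras per eye")
# 20 deg -> 20, 25 deg -> 16, 30 deg -> 12, 40 deg -> 6
```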
Apple's patent FIG. 1A below illustrates a head-mounted display (HMD) that includes an array of cameras with entrance pupils at or near the user's eye; FIG. 2 illustrates a portion of a camera array.
Apple's patent FIG. 3 below illustrates an example camera with entrance pupil at or near the user's eye that may be used in an array as illustrated in FIGS. 1A and 2.
Apple's FIGS. 9A and 9B below illustrate reducing the number of cameras by reducing resolution and shifting the entrance pupil of the cameras in peripheral regions.
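To illustrate the pixel-budget side of that idea, reading the peripheral cameras at reduced resolution sharply cuts the data handed to the pipeline. The camera counts, sensor size and binning factor below are assumptions for illustration, not values from the patent:

```python
# Hypothetical sketch: estimate pixels read per frame when peripheral cameras
# are read at reduced resolution (e.g., 2x2 binned) while central cameras are
# read at full resolution. All quantities here are assumed values.
FULL_RES = 1000 * 1000   # assumed pixels per camera at full resolution
CENTRAL_CAMS = 6         # assumed cameras covering the central region
PERIPHERAL_CAMS = 14     # assumed cameras covering the periphery
BINNING = 2              # assumed 2x2 binning factor in the periphery

full_readout = (CENTRAL_CAMS + PERIPHERAL_CAMS) * FULL_RES
foveated_readout = (CENTRAL_CAMS * FULL_RES
                    + PERIPHERAL_CAMS * FULL_RES // (BINNING ** 2))

print(f"full readout:     {full_readout / 1e6:.1f} Mpix per eye per frame")
print(f"foveated readout: {foveated_readout / 1e6:.1f} Mpix per eye per frame")
# 20.0 Mpix vs 9.5 Mpix with these assumed numbers
```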
Apple's patent FIG. 16 above graphically illustrates adding a negative meniscus lens in front of a camera array to increase the field of view (FOV) of the cameras towards the periphery; FIG. 17 graphically illustrates adding a negative meniscus lens in front of a camera array to progressively offset the point of view (POV) of the cameras towards the periphery.
Apple's patent FIGS. 20A through 20C below graphically illustrate an example scene camera that includes a negative meniscus lens in front of the cameras in the camera array.
For more details, review Apple's granted patent 11,448,886.
A Few of Apple's Inventors
Dan Hennigan: Optical Mechanical Engineering Manager
Saito Kenichi: Optical Engineer (previously worked at Canon & Fuji Photo Film)
Brett Miller: Engineering Manager, Camera Incubation
Noah Bedard: Prototyping Engineer
Kathrin Berkner: Senior Engineering Manager - Camera Incubation