Facebook wins a Depth Camera System Patent for their Future Smartglasses Device aimed at Replacing Smartphones
In late December Patently Apple posted a report titled "Facebook has been working on Smartglasses to Replace Smartphones that includes a secret new OS to make it happen." A Facebook spokeswoman stated at the time that the company's planned operating system would focus on both current and future products, such as augmented-reality glasses.
With Apple's iOS and Google's Android having dominated the last decade, Andrew Bosworth, Facebook's head of hardware, told The Information news site that "We really want to make sure the next generation has space for us," referring to their future OS, and added that "We don't think we can trust the marketplace or competitors to ensure that's the case. And so we're going to do it ourselves."
With that as our context, Patently Apple will begin covering Facebook's patents on smartglasses that are aimed at challenging Apple's iPhone and Google's Pixel.
On the flip side, Apple intends to invade Facebook's Oculus space in the not-too-distant future, and so some of Facebook's HMD patents will be covered on Patently Apple while the majority of them will be posted on our Patently Mobile blog.
Yesterday, Facebook was granted patent 10,630,925 that relates to depth sensing, and specifically relates to depth determination using polarization of light and a camera assembly with augmented pixels each having multiple gates and local storage locations.
The Facebook patent notes that in order to achieve a compelling user experience in artificial reality systems (Mixed Reality, AR & VR), it is essential to rely on an accurate and efficient camera for sensing a three-dimensional (3D) surrounding environment.
However, it is challenging to design a depth camera that delivers high performance at low computational cost while also being robust to the environment, flexible to operate, and compact in form factor.
Moreover, conventional methods for depth sensing typically rely on either triangulation or time-of-flight based depth determination, and both have drawbacks. For example, triangulation-based methods generally incur a high computational cost to generate a depth map, since they involve rectifying a pair of stereo images and searching them for corresponding points.
The depth resolution achieved with triangulation-based methods also depends on a baseline (e.g., the distance between source and camera), and the required baseline grows with increasing depth. Time-of-flight methods, for their part, suffer from limited lateral resolution due to the limited number of pixels in conventional sensors.
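The trade-offs described above follow from the basic depth equations of the two conventional approaches. Below is a minimal illustrative sketch (the function names and example numbers are ours, not from the patent):

```python
# Illustrative depth formulas for the two conventional approaches the
# patent critiques. Names and sample values are hypothetical.

C = 299_792_458.0  # speed of light, m/s


def triangulation_depth(focal_px: float, baseline_m: float,
                        disparity_px: float) -> float:
    """Stereo triangulation: depth = f * B / d.
    Recovering the disparity d requires rectification and a costly
    correspondence search across the stereo pair, and depth precision
    degrades as the scene distance grows relative to the baseline B."""
    return focal_px * baseline_m / disparity_px


def time_of_flight_depth(round_trip_s: float) -> float:
    """Pulsed time of flight: depth = c * t / 2.
    Lateral resolution is bounded by the sensor's pixel count."""
    return C * round_trip_s / 2.0


# A 700 px focal length, 10 cm baseline, 20 px disparity -> 3.5 m
print(triangulation_depth(700, 0.10, 20))   # 3.5
# A 20 ns round trip -> roughly 3 m
print(time_of_flight_depth(20e-9))
```

Note how the triangulation depth is inversely proportional to disparity: at large depths the disparity shrinks toward the sub-pixel range unless the baseline is widened, which is exactly the constraint the patent points out.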
Facebook's invention aims to provide a better solution. Facebook states that its Depth Camera Assembly (DCA) will determine depth information associated with one or more objects in a local area.
The DCA comprises a light source assembly, a camera assembly, and a controller. The light source assembly is configured to project pulses of light into a local area, wherein each pulse of light has a respective polarization type of a plurality of polarization types. The camera assembly is configured to image a portion of the local area illuminated with the pulses of light.
The camera assembly includes a plurality of augmented pixels, wherein each augmented pixel has a plurality of gates and at least some of the gates have a respective local storage location.
An exposure interval of the camera assembly is divided into intervals and some of the intervals are synchronized to the projected pulses of light such that each respective local storage location stores image data associated with a different polarization type.
The controller is configured to determine depth information for the local area based in part on the polarization types associated with the image data stored in respective local storage locations.
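As a rough illustration of the capture scheme described above, here is a hedged Python sketch: pulses are tagged with a polarization type, each augmented pixel's gates accumulate image data for a different polarization during synchronized sub-intervals, and a controller can then derive polarization quantities from the stored data. All class and field names are invented for illustration, and the degree-of-linear-polarization calculation is a standard technique from shape-from-polarization, not a method the patent discloses:

```python
import math
from collections import defaultdict

# Hypothetical polarization types tagged onto each projected pulse.
POLARIZATIONS = ("0_deg", "45_deg", "90_deg")


class AugmentedPixel:
    """Toy model of an 'augmented pixel': one gate (with its own local
    storage location) per polarization type, so a single exposure
    captures every polarization channel. Names are invented."""

    def __init__(self):
        self.storage = defaultdict(float)  # gate -> accumulated charge

    def capture(self, polarization: str, intensity: float):
        # The exposure interval is divided into sub-intervals synchronized
        # to the pulses, so each pulse's reflected light lands in the gate
        # assigned to that pulse's polarization type.
        self.storage[polarization] += intensity


def degree_of_linear_polarization(px: AugmentedPixel) -> float:
    """One way a controller could use per-polarization image data:
    compute the linear Stokes parameters and the degree of linear
    polarization (DoLP), a standard input to polarization-based
    depth/shape estimation."""
    i0, i45, i90 = (px.storage[p] for p in POLARIZATIONS)
    s0 = i0 + i90           # total intensity
    s1 = i0 - i90           # 0/90 degree preference
    s2 = 2.0 * i45 - s0     # 45/135 degree preference
    return math.hypot(s1, s2) / s0


# Simulate synchronized pulses reflecting off a partially polarizing surface.
pixel = AugmentedPixel()
for pol, intensity in (("0_deg", 0.8), ("45_deg", 0.5), ("90_deg", 0.2)):
    pixel.capture(pol, intensity)

print(round(degree_of_linear_polarization(pixel), 3))  # 0.6
```

The key idea the sketch tries to convey is that because every gate has its own storage location, all polarization channels are gathered in a single exposure rather than across sequential frames, which is what makes the approach fast and robust to motion.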
The DCA can further be integrated into an eyeglass-type platform, like the smartglasses shown in patent FIG. 1 below, which represents a Near-Eye Display (NED) or some other type of headset.
The NED further includes a display and an optical assembly. The NED may be part of an artificial reality system. The display of the NED is configured to emit image light. The optical assembly of the NED is configured to direct the image light to an eye-box of the NED corresponding to a location of a user's eye. The image light may comprise the depth information for the local area determined by the DCA.
Facebook's patent FIG. 1 below is a diagram of a near-eye display (NED); FIG. 2 is a cross-section of the eyewear of the NED in FIG. 1, which may include a depth camera assembly (DCA).
Facebook's patent FIG. 3A below is an example sensor having a plurality of augmented pixels, which may be part of the Depth Camera Assembly (DCA) in FIG. 2; FIG. 3B is an example augmented pixel of the sensor in FIG. 3A.
Engineers and the technically savvy can check out the finer details of Facebook's invention in granted patent 10,630,925 here.
One of the listed inventors is Michael Anthony Hall, Technical Lead for 3D Cameras and Sensors, who was an Optical Scientist at Facebook Reality Labs. Hall previously worked at Microsoft as a Senior Optical Engineer on HoloLens, Xbox, and Kinect.