Apple was Granted Two Patents today relating to Time-of-Flight based LiDAR coming to 2021 Pro iPhones
The U.S. Patent and Trademark Office officially published two patents relating to next-gen Time-of-Flight (ToF) based LiDAR that is reportedly coming to Pro iPhones later this year. The first covers calibration of depth sensing; the second covers waveform design for a LiDAR system.
Time-of-flight (ToF) imaging techniques are used in many depth mapping systems (also referred to as 3D mapping or 3D imaging). In direct ToF techniques, a light source, such as a pulsed laser, directs pulses of optical radiation toward the scene that is to be mapped, and a high-speed detector senses the time of arrival of the radiation reflected from the scene. The depth value at each pixel in the depth map is derived from the difference between the emission time of the outgoing pulse and the arrival time of the reflected radiation from the corresponding point in the scene, which is referred to as the "time of flight" of the optical pulses. The radiation pulses that are reflected back and received by the detector are also referred to as "echoes."
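The depth calculation described above reduces to a simple relation: depth = c × ToF / 2, with the factor of two accounting for the round trip. A minimal sketch (illustrative only, not Apple's implementation):

```python
# Illustrative sketch: converting a measured round-trip time of flight
# into a depth value for one pixel of a depth map.
C = 299_792_458.0  # speed of light in m/s

def depth_from_tof(emit_time_s: float, arrival_time_s: float) -> float:
    """Depth = c * ToF / 2, since the pulse travels out to the scene and back."""
    tof = arrival_time_s - emit_time_s
    return C * tof / 2.0

# A 10 ns round trip corresponds to roughly 1.5 m of depth.
print(depth_from_tof(0.0, 10e-9))  # ≈ 1.499 m
```

This also shows why direct ToF demands very fast detectors: resolving depth to the centimeter means timing echoes to well under a nanosecond.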
In order to determine the ToF of each pulse unambiguously, the time interval between consecutive pulses is chosen to be longer than the ToF. This guarantees that only one pulse is "in the air" at a given time, and that the control and processing circuitry receiving the detected signals and determining the ToF knows when each received pulse was emitted by the light source.
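That constraint sets a ceiling on range for a given pulse rate: the farthest target whose echo returns before the next pulse is emitted sits at c × (pulse interval) / 2. A quick sketch of the trade-off (illustrative values, not from the patent):

```python
C = 299_792_458.0  # speed of light in m/s

def max_unambiguous_range(pulse_interval_s: float) -> float:
    """Farthest target whose echo arrives before the next pulse is emitted."""
    return C * pulse_interval_s / 2.0

# A 1 µs interval between pulses limits unambiguous ranging to ~150 m;
# emitting pulses faster than that shortens the unambiguous range.
print(max_unambiguous_range(1e-6))  # ≈ 149.9 m
```

This is exactly the trade-off the second patent below attacks: faster pulse rates mean higher throughput, but naively they shrink the unambiguous range.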
Calibration of Depth Sensing
When the iPhone 12 debuted, Apple made a big deal about LiDAR making Night Mode a reality. Apple's marketing states: "Night mode comes to both the Wide and Ultra-Wide cameras, and it’s better than ever at capturing incredible low-light shots. LiDAR makes Night mode portraits possible."
In early January Patently Apple posted a report titled "Sony has Reportedly signed a 3-Year Deal with Apple for next-gen 'Direct Time-of-Flight' LiDAR scanners for iPhone 13 & beyond." This is going to be the next step for LiDAR on an iPhone.
Today, the U.S. Patent Office granted Apple a patent relating to calibration of depth sensing. More specifically, it relates "to systems and methods for depth mapping, and particularly to beam sources and sensor arrays used in time-of-flight sensing."
Apple's patent FIG. 1 below is a schematic side view of a depth mapping system #20; FIG. 3A below is a schematic representation of a pattern of spots projected onto a target scene; FIG. 3B is a schematic frontal view of a ToF sensing array; and FIG. 3C is a schematic detail view of a part of the ToF sensing array of FIG. 3B, onto which images of the spots in a region of the target scene of FIG. 3A are cast.
Review granted patent 10,955,234 for the technical details.
Waveform design for a LiDAR system
Apple was granted a second patent relating to Time-of-Flight based LiDAR today. The patent is titled "Waveform design for a LiDAR system with closely-spaced pulses."
Apple's granted patent covers a LiDAR that is capable of avoiding confusion between pulses or pulse sequences emitted at time intervals shorter than the expected ToF. This, in turn, enables the LiDAR to operate at a higher throughput and to map the scene either at a higher spatial resolution or with a larger field of view (FoV) than otherwise possible.
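One common way to keep multiple pulses "in the air" without ambiguity is to dither the emission intervals with a pseudo-random code, so the inter-pulse pattern itself identifies which pulse each echo belongs to. The sketch below illustrates that general principle with made-up numbers and a brute-force matcher; it is not Apple's tuple/dither design, just a toy demonstration of why coded intervals resolve the ambiguity:

```python
import random

random.seed(7)

# Hypothetical coded emission schedule: a base interval plus pseudo-random
# dither, so the spacing pattern between pulses is unique (illustrative values).
BASE_INTERVAL = 100e-9  # 100 ns between pulses, far shorter than a long-range ToF
DITHER = 20e-9          # up to ±20 ns of per-pulse dither

emit_times = []
t = 0.0
for _ in range(8):
    emit_times.append(t)
    t += BASE_INTERVAL + random.uniform(-DITHER, DITHER)

def match_tof(arrival_times, emit_times):
    """Brute-force decoder: try each candidate emission pulse as the origin of
    the first echo, and pick the ToF that best aligns the whole arrival
    pattern with the dithered emission pattern."""
    best_tof, best_err = None, float("inf")
    for e0 in emit_times:
        tof = arrival_times[0] - e0
        if tof < 0:
            continue  # an echo cannot precede its pulse
        err = sum(min(abs(a - tof - e) for e in emit_times) for a in arrival_times)
        if err < best_err:
            best_tof, best_err = tof, err
    return best_tof

# A target with a 350 ns round trip: three to four pulses are "in the air"
# at once, yet the dither pattern still identifies the correct ToF.
true_tof = 350e-9
arrivals = [e + true_tof for e in emit_times]
print(match_tof(arrivals, emit_times))  # recovers 350 ns (3.5e-07 s)
```

Without the dither, every candidate alignment would fit equally well and the ToF would be ambiguous; with it, only the true offset lines the patterns up.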
Apple's patent FIG. 1 below is a schematic side view of a depth-sensing apparatus #20; FIG. 4 is a flowchart that schematically illustrates a method for optimizing and applying codes of tuple and dither structures.
Apple's granted patent 10,955,552 is a highly technical invention. Review the granted patent for the full details.