A new Apple patent application points to advancing the cameras in all Mac and iOS devices to include advanced strobe lights of both visible and invisible light. The strobe light provides the picture with important metadata to assist in deblur operations. I'm not the biggest camera buff, but this sounds like Apple is once again working on improving their iOS device cameras to produce better and better photos and videos as time goes on.
The Problem to Solve
In photography, a conventional camera flash is used to improve picture quality in low light situations, by illuminating the scene with a burst of visible light while a picture of the scene is taken. For portable devices, such as handheld dedicated digital cameras and multifunction devices referred to as smart phones, the practical choices for an integrated flash light source include light emitting diodes (LEDs) and gas discharge lamps. An LED flash can provide continuous illumination, which is good for capturing a rapid sequence of images, such as a video sequence. A gas discharge flash is typically operated in a pulsed manner to provide a very high intensity light but for a relatively short duration, no longer than the period of time the shutter is allowed to remain open to capture the scene for a single picture or frame. It is sometimes desirable to provide a less intense flash, e.g. during a redeye reduction process where the main flash is immediately preceded by one or more reduced intensity flashes.
Illumination by flash is provided during the image-framing period (also referred to as the single shutter cycle for taking a picture). A typical range for such a period is 200-300 milliseconds. Some LED flashes are not capable of providing their highest level of illumination for the entire image framing period, and thus have to be pulsed with, for example, one larger pulse and one smaller pulse during the entire shutter cycle. There may also be thermal reasons for pulsing an LED flash.
In other aspects of photography, it is known that a moving object in the scene, or movement of the camera relative to an object in the scene, causes motion blur. In other words, the object appears blurred in the picture. Shortening the exposure time for taking the picture may reduce such blur, provided the image sensor is sensitive enough to capture a sufficient amount of light from the scene during the shorter exposure time. In another technique known as deblurring, a signal processing operation known as deconvolution can be applied to the picture in order to recover the high frequency or edge details that have been lost in the blurred picture. It has been reported that for an improved deblur operation, rather than leaving the shutter open continuously for the entire exposure duration, the camera shutter is "fluttered", i.e. opened and closed rapidly during a single exposure period, in accordance with a binary pseudo-random sequence or code. This flutter effectively changes the inherent filtering effect of the exposure time, in a way that better preserves the high frequency spatial or edge details in the picture, such that the subsequent deconvolution (deblurring) operation can more easily recover the edge information in the picture. This so-called coded exposure photography or flutter shutter technique has been suggested as being extendable to strobe lighting flashes, where a coded flash sequence would provide greater control over motion sensing.
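The benefit of coded exposure can be sketched numerically: motion blur acts like convolution with the exposure pattern, and a plain "shutter open" box pattern has exact nulls in its frequency response, so deconvolution cannot recover the detail lost at those frequencies. A fluttered, coded pattern of the same total length keeps its frequency response away from zero. A minimal Python sketch (the 8-tap binary code here is illustrative, not from the patent):

```python
import numpy as np

N = 32  # length of the analysis window (FFT size)

# Conventional exposure: shutter open for 8 contiguous time slices (a "box").
box = np.zeros(N)
box[:8] = 1.0

# Coded exposure: the same 8 time slices, but the shutter is fluttered
# open/closed according to a binary pseudo-random code.
coded = np.zeros(N)
coded[:8] = [1, 0, 1, 1, 0, 0, 1, 1]

# Any near-zero in the exposure pattern's frequency response corresponds to
# image detail that deconvolution cannot bring back.
box_min = np.abs(np.fft.fft(box)).min()
coded_min = np.abs(np.fft.fft(coded)).min()

print(f"box min |FFT|:   {box_min:.6f}")   # essentially zero: exact nulls
print(f"coded min |FFT|: {coded_min:.6f}") # bounded away from zero
```

The box pattern's spectrum has exact nulls (at every fourth frequency bin in this example), while the coded pattern's spectrum does not, which is what makes the subsequent deconvolution well behaved.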
Apple's Solution – Visible and Invisible Light Pulses
Apple's patent points to an electronic device that is characterized as an Apple product with a camera in it. For the sake of simplicity and realism, the patent focuses on the iPhone.
The iOS device has a camera function for taking a picture, where a controller is to command a camera flash to produce two or more multi-value coded pulses of light during a single shutter cycle of the picture. As redefined here, the term "camera flash" is not limited to elements that produce only visible light pulses; the camera flash could also, or alternatively, produce non-visible light pulses that can be reflected from moving objects in the scene and then detected by an imaging sensor (as a picture of the scene with the moving object).
Multi-Value Coded Pulses
The pulses are said to be "multi-value coded" in that the amplitudes of at least two of the flash pulses are non-zero and different relative to each other. This variation in the flash pulses inherently embeds useful information into the picture about the motion of an object, which in turn provides an effective mechanism to subsequently deblur the picture (using stored knowledge of the timing and variable amplitude characteristics of the variable flash pulses).
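That definition is easy to state in code: a pulse train qualifies as "multi-value coded" only if at least two pulses are non-zero and at least two of those non-zero amplitudes differ. A small sketch (the example amplitude values are made up for illustration; the function name is my own, not Apple's):

```python
from typing import Sequence

def is_multi_value_coded(amplitudes: Sequence[float]) -> bool:
    """True if at least two pulses are non-zero AND differ in amplitude.

    This mirrors the patent's wording: the amplitudes of at least two of
    the flash pulses are non-zero and different relative to each other.
    """
    nonzero = [a for a in amplitudes if a != 0]
    return len(nonzero) >= 2 and len(set(nonzero)) >= 2

# Illustrative pulse trains (amplitude per time slice of one shutter cycle):
print(is_multi_value_coded([1.0, 0.0, 0.4, 1.0]))  # True: 1.0 and 0.4 differ
print(is_multi_value_coded([1.0, 0.0, 1.0, 0.0]))  # False: binary on/off only
print(is_multi_value_coded([0.0, 0.7, 0.0, 0.0]))  # False: a single pulse
```

The middle case shows the distinction from a plain flutter-shutter code: on/off pulses of equal amplitude are coded but not multi-value coded.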
The mechanism is also applicable in the case of video compression, to perform motion compensation across several frames of video. Having the amplitude of the flash pulses be variable yields an improved ability to subsequently discriminate the high frequency or edge components of the picture, during the subsequent deblurring or motion compensation operation.
Built-in Strobe Light
Apple's patent describes the optics associated with a flash in a camera that could produce a strobe of light or, in the case of video capture, continuous light for a longer duration, to illuminate the scene while the pictures are being taken.
The flash in Apple's patent is described in terms of having multiple LED lamps each of which could be driven by a different pulse sequence (during the single exposure time interval). The flash controlled in this manner thus allows the picture to be taken without decreasing the exposure time, thereby capturing a sharper picture even while there may be some relative movement between the device and the object in the scene being illuminated.
The concepts of timing and amplitude for purposes of defining the flash pulses are covered in the patent's discussion of FIG. 2. We've provided the graphic, but you'll have to go to the patent for the details of this feature.
DeBlur Operation Able to Access Metadata
In addition to the image content of a picture, information that describes the flash pulses that were used when taking the picture is also written to picture storage by the controller as part of the metadata of the respective picture. Thus, referring briefly to FIG. 3, where N pictures (which may be either still images or a sequence of frames of video) are depicted, each picture includes an image and associated metadata. In other words, the image has been tagged with metadata that defines the timing and amplitude of the flash pulses used when taking the picture.
Once the picture is available within picture storage, the deblur block could access the metadata of the picture and perform a deblur operation upon the picture. This is done using information in the accessed metadata which describes the flash pulses that occurred when the picture was first taken.
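A toy end-to-end sketch of that pipeline, in one dimension with a circular blur for simplicity (the metadata field names and pulse values are my own illustration, not Apple's): the pulse timing and amplitudes stored with the picture are used to rebuild the blur kernel, which is then inverted in the frequency domain.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Scene": a 1-D stand-in for one row of image pixels.
scene = rng.random(32)

# Flash-pulse metadata stored alongside the picture: which time slices of
# the shutter cycle were lit, and at what amplitude.
metadata = {"pulse_slices": [0, 2, 3, 6, 7],
            "pulse_amplitudes": [1.0, 0.6, 1.0, 0.8, 1.0]}

# The flash/exposure pattern acts as a blur kernel on the moving scene.
kernel = np.zeros(32)
kernel[metadata["pulse_slices"]] = metadata["pulse_amplitudes"]
blurred = np.fft.ifft(np.fft.fft(scene) * np.fft.fft(kernel)).real

# Deblur: the metadata tells us the exact kernel, so invert it in the
# frequency domain. (A real pipeline would use regularized, e.g. Wiener,
# deconvolution to tolerate sensor noise.)
restored = np.fft.ifft(np.fft.fft(blurred) / np.fft.fft(kernel)).real

print(np.allclose(restored, scene))  # the multi-value coded kernel is invertible
```

The point of the variable amplitudes is visible here: because the coded kernel's frequency response has no nulls, the division recovers the scene, which would fail for a plain box exposure.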
Apple credits Richard Tsai as the sole inventor of patent application 20110081142, originally filed in Q4 2009.
Other Noteworthy Patent Applications Published Today
A new Apple patent application has been published titled "Ejectable Component Assemblies in Electronic Devices." Patent application 20110080699 basically covers the same information as an earlier 2009 patent that we covered on this subject matter.
Notice: Patently Apple presents only a brief summary of patents with associated graphic(s) for journalistic news purposes as each such patent application is revealed by the U.S. Patent & Trademark Office. Readers are cautioned that the full text of any patent application should be read in its entirety for further details. Patents shouldn't be digested as rumors or fast-tracked according to rumor time tables. Apple patents represent true research that could lead to future products and should be understood in that light. About Comments: Patently Apple reserves the right to post, dismiss or edit comments.