
A Future iPhone Camera will Adopt Apple's Latest Advances in Red-Eye Removal



Today the US Patent & Trademark Office published a patent application from Apple that relates to advancing the iPhone camera's feature for removing red-eye. Given the central role that cameras play in smartphones today, Apple continues to add new camera features and, whenever possible, improve older ones as new technologies and processes permit.


Apple notes that red-eye artifacts are prevalent in consumer photography, mainly due to the miniaturization of digital cameras. Mobile devices equipped with a camera, with the flash and the lens in close proximity to each other, often cause a direct reflection of flash light from a subject's pupils back to the camera's lens. Due to this reflected light, the pupils captured by the camera appear unnatural, assuming various colors (from dark to brighter shades of red) as a function of the capturing conditions and the subject's intrinsic traits.


Correcting red-eye artifacts typically involves first detecting (segmenting) the eye region containing the artifacts and then correcting the color of the affected pixels. Segmentation of the image region distorted by the red-eye artifacts is commonly done by clustering the image pixels based on color, using a color space such as YCbCr or RGB, and/or by recognizing image patterns (e.g., the pupils' size and shape) by means of annular filters, for example.
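The patent doesn't include code, but the color-based clustering idea above can be sketched in a few lines: convert the image to YCbCr and flag pixels whose chroma is strongly red-shifted. The BT.601 conversion coefficients are standard; the thresholds here are illustrative assumptions, not values from the patent.

```python
import numpy as np

def rgb_to_ycbcr(rgb):
    """Convert an H x W x 3 float RGB image (values 0..1) to YCbCr (BT.601)."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y  =  0.299 * r + 0.587 * g + 0.114 * b
    cb = -0.168736 * r - 0.331264 * g + 0.5 * b
    cr =  0.5 * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def red_eye_candidates(rgb, cr_thresh=0.15, cb_thresh=0.0):
    """Boolean mask of pixels whose chroma suggests a red-eye artifact.

    High Cr (red-difference) and low Cb (blue-difference) mark red pixels.
    The thresholds are illustrative assumptions, not values from the patent.
    """
    ycc = rgb_to_ycbcr(rgb)
    return (ycc[..., 2] > cr_thresh) & (ycc[..., 1] < cb_thresh)
```

In a real pipeline this mask would be refined with the shape-based filtering the patent mentions (e.g., annular filters matching pupil size and shape), since skin and lips can also be red-shifted.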


Once the image regions affected by the red-eye artifacts are identified, the affected pixels are typically corrected by reducing their intensity (darkening). Many red-eye correction techniques operate on an already processed image in which, due to that processing, the original appearance of the artifacts is not preserved. Apple now has a new way of approaching the red-eye problem.


Apple's invention covers new systems and methods for correcting red-eye artifacts in a target image of a subject. In one aspect, one or more images captured by a camera may be received, including a raw image. The target image may be generated by processing the captured images. Then, an eye region of the target image may be modulated to correct for the red-eye artifacts, wherein correction may be carried out based on information extracted from at least one of the raw image and the target image.


In another aspect, modulation may comprise detecting landmarks associated with the eye region; estimating the spectral response of the red-eye artifacts; segmenting an image region of the eye, based on the estimated spectral response and the detected landmarks, to form a repair mask; and modifying the image region associated with the repair mask.


In yet another aspect, modulation may comprise detecting landmarks associated with the eye region; estimating the spectral response of a glint; segmenting an image region of the eye, based on the estimated spectral response of the glint and the detected landmarks, to form a glint mask; and rendering one or more glints in a region associated with the glint mask. By leveraging both a raw image (or a pseudo-raw image) and a processed image, the accuracy of detecting affected regions, rendering the natural appearance of a subject's eyes, and restoring glints can be improved.
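The final two steps of the masked repair described above can be sketched as follows: darken the pixels inside the repair mask so the pupil reads as naturally dark, then re-render a specular highlight inside the glint mask. This is a minimal numpy sketch of the general technique; the scale factors and the flat-white glint are illustrative assumptions, not values or methods from the patent.

```python
import numpy as np

def repair_red_eye(image, repair_mask, glint_mask, darken=0.25, glint_level=0.95):
    """Darken masked red-eye pixels, then restore glints in the glint mask.

    image: H x W x 3 float array with values in 0..1.
    repair_mask, glint_mask: H x W boolean arrays.
    The darken and glint_level factors are illustrative assumptions.
    """
    out = image.copy()
    # Reduce the intensity of affected pixels so the pupil appears dark.
    out[repair_mask] *= darken
    # Re-render the specular highlight (glint) lost when the pupil was darkened.
    out[glint_mask] = glint_level
    return out
```

A production implementation would feather the mask edges and shape the glint, but the order of operations (darken first, then restore glints) is the essential point.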


Red-eye artifacts are caused by light reflected from the pupil regions of a subject's eyes. Typically, red-eye artifacts are exacerbated when a subject is photographed in a dark environment with active camera flash. Light from the camera flash reaches the subject's pupils and is reflected back from the pupils to the camera's lenses.


These reflections are captured by the camera's sensors and create the undesired image artifacts. However, red-eye artifacts, despite their name, are not always red in color. The color of the light reflected from the subject's pupils and captured by the camera's sensors may vary based on the capturing conditions.


As illustrated in FIG. 1 below, capturing conditions may include: the distance between the camera and the subject, the angle between the eye surface and the optical axis, and the intensity of the light source (flash). For example, at a short distance between the camera and the subject, red-eye artifacts may cause an eye reflection to appear amber or red, while at a long distance the reflection may appear whiter. Thus, red-eye artifacts may materialize within a spectrum of colors, depending, inter alia, on the capturing conditions.


Apple's patent FIG. 1 below is a diagram illustrating a configuration including a camera, a light source, and two subjects, positioned at different distances from the camera; FIG. 2 is a diagram illustrating different red-eye artifacts.




Apple's patent FIG. 3 above is a block diagram showing a camera system for red-eye artifact correction according to an aspect of the present invention; FIG. 4 below is a diagram showing exemplary image-processing algorithms.




Apple's patent FIG. 5 below is a functional block diagram illustrating a technique for red-eye artifact correction; FIG. 6 is a diagram illustrating intermediate processing results of a technique for red-eye artifact correction according to an aspect of the present disclosure.




Apple's patent application 20190370942, published today by the U.S. Patent Office, was filed back in Q2 2019, with some of the work dating back a year earlier.


Because this is a patent application, the timing of Apple implementing this advanced red-eye removal system is unknown at this time. Given the final filing date of the application, it seems highly unlikely that it was implemented in the iPhone 11.


Some of Apple's Inventors


David Hayward: Graphics Engineering Manager. Hayward is a 25-year veteran at Apple who has managed the Core Image and RAW Camera teams.


Mark Zimmer: Author of Painter, Apple engineer, and inventor in image processing and RAW processing. Zimmer was founder, President and CEO of Fractal Design Corporation.


Emmanuel Piuze: Machine Learning Engineer



