Apple continues to work on Notchless iPhones and MacBooks with their 3rd Under-Display Camera Patent published last week
In February 2023, Apple won a patent for placing their Face ID camera system under a future iPhone display. Then, in September, a continuation patent was published illustrating their ongoing work on the project. This week the US Patent & Trademark Office published a patent application from Apple titled “Displays Having Transparent Openings” that clearly indicates their ongoing work on eliminating the irritating notch found on the iPhone, iPad and some MacBooks today.
Apple notes in their patent background that there’s a trend toward borderless electronic devices with a full-face display. These devices, however, may still need to include sensors such as cameras, ambient light sensors, and proximity sensors to provide other device capabilities. Since the display now covers the entire front face of the electronic device, the sensors will have to be placed under the display stack. In practice, however, the amount of light transmitted through the display stack is very low (e.g., the transmission might be less than 20% in the visible spectrum), which severely limits sensing performance under the display. It is within this context that the invention arises: Apple’s patent describes a solution to this low-transmission problem.
Displays Having Transparent Openings
An electronic device may include a display and an optical sensor formed underneath the display. The electronic device may include a plurality of non-pixel regions that overlap the optical sensor. Each non-pixel region may be devoid of thin-film transistors and other display components. The plurality of non-pixel regions is configured to increase the transmittance of light through the display to the sensor. The non-pixel regions may therefore be referred to as transparent windows in the display.
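As a rough back-of-the-envelope illustration (ours, not the patent’s), the benefit of those windows can be thought of as an area-weighted average: light reaches the sensor partly through the open windows and partly through the normal display stack. The 90% window and 20% stack transmittance figures in the short Python sketch below are purely assumed for illustration; the patent only notes that a full stack may transmit less than 20% of visible light.

```python
# Back-of-the-envelope sketch (illustrative assumptions, not from the patent):
# estimate how much light reaches an under-display sensor when a fraction of
# the area above it is opened up as transparent, component-free windows.

def effective_transmittance(window_fraction: float,
                            window_t: float = 0.90,   # assumed window transmittance
                            stack_t: float = 0.20     # assumed full-stack transmittance
                            ) -> float:
    """Area-weighted estimate of light transmission over the sensor."""
    return window_fraction * window_t + (1.0 - window_fraction) * stack_t


if __name__ == "__main__":
    for fraction in (0.0, 0.25, 0.50):
        t = effective_transmittance(fraction)
        print(f"{fraction:.0%} window area -> ~{t:.0%} effective transmittance")
```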
Light passing through the transparent windows may have associated diffraction artifacts based on the pattern of the transparent windows. To mitigate diffraction artifacts, a first sensor may sense light through a first pixel removal region having transparent windows arranged according to a first pattern. A second sensor may sense light through a second pixel removal region having transparent windows arranged according to a second pattern that is different than the first pattern. The first and second patterns of the transparent windows may result in the first and second sensors having different diffraction artifacts. Therefore, an image from the first sensor may be corrected for diffraction artifacts based on an image from the second sensor. There may be a gradual transition between a full pixel density region of the display and a pixel removal region in the display.
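The patent doesn’t spell out how that correction would be performed. Purely as an illustration, here’s a minimal Python sketch of one plausible approach: it assumes each capture is the true scene blurred by a known point spread function (PSF) set by that region’s window pattern, and that the two patterns place their diffraction nulls at different spatial frequencies, so the cleaner measurement can be trusted at each frequency. The function names and the regularization constant are hypothetical, not Apple’s.

```python
# Minimal, hypothetical sketch of combining two captures taken through
# differently patterned pixel-removal regions. Assumes each image is the true
# scene convolved with a known per-pattern point spread function (PSF).
import numpy as np


def psf_to_otf(psf: np.ndarray, shape: tuple) -> np.ndarray:
    """Zero-pad a small PSF to the image shape and center it at the origin."""
    padded = np.zeros(shape)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    return np.fft.fft2(padded)


def combine_captures(img_a, img_b, psf_a, psf_b, eps=1e-3):
    """Per spatial frequency, trust whichever window pattern has the stronger
    aperture response, then apply a regularized inverse filter."""
    otf_a = psf_to_otf(psf_a, img_a.shape)
    otf_b = psf_to_otf(psf_b, img_b.shape)
    spec_a, spec_b = np.fft.fft2(img_a), np.fft.fft2(img_b)

    use_a = np.abs(otf_a) >= np.abs(otf_b)       # where pattern A is "cleaner"
    otf = np.where(use_a, otf_a, otf_b)
    spec = np.where(use_a, spec_a, spec_b)

    estimate = spec * np.conj(otf) / (np.abs(otf) ** 2 + eps)
    return np.real(np.fft.ifft2(estimate))
```

Apple could just as well use a calibration-based or learned correction; the point is simply that two different diffraction signatures give the correction something to work with.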
In one arrangement, the thin-film transistor sub-pixels may be smaller than the pixel area allotted to each sub-pixel, leaving a transparent opening around the periphery of each thin-film transistor sub-pixel. To mitigate back emission that would otherwise be sensed by the sensor under the display, the display may include a black pixel definition layer. Additionally, light-absorbing layers may be coated on metal layers in the thin-film transistor layer of the display to mitigate back emission. Signal lines in the pixel removal region may be transparent.
In Apple’s patent FIG. 3 below, we see optical sensor / camera #13, which may be formed under the display stack of an electronic device (iPhone+).
Apple further notes that sensor #13 may be an optical sensor such as a camera, proximity sensor, ambient light sensor, fingerprint sensor, or other light-based sensor.
Apple’s patent FIG. 4 below is a cross-sectional side view of an illustrative pixel removal region of a display, showing how pixels may be removed to increase transmission through the display. As shown in FIG. 4, display #14 may include a pixel removal region (#332) containing some pixels and some areas where components have been removed for increased transmittance (e.g., opening #324).
Apple’s patent FIG. 18D above illustrates a pixel removal region #332 formed only in the center portion along the top edge of device #10 (i.e., the pixel removal region covers a recessed notch area in the display); FIG. 18F illustrates yet another suitable example in which the pixel removal region covers the entire display surface.
Different pixel removal regions in the display may have different designs to mitigate diffractive artifacts. This principle is illustrated in FIG. 19A. When the optical sensor/camera captures an image of a point light source through the display, the point light source should (ideally) appear as a circular area of light in the captured image.
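To see why the window layout matters, a small simulation helps. Under the far-field (Fraunhofer) approximation, the diffraction pattern a point source produces through a periodic array of openings is proportional to the squared magnitude of the Fourier transform of the aperture layout; that physics is standard optics, not something spelled out in the patent. The sketch below (our illustration, with made-up grid dimensions) compares a square grid of windows with a staggered grid and confirms that the two spread their diffraction spikes differently, which is exactly what lets one capture be used to clean up the other.

```python
# Illustrative-only simulation (assumed optics, not from the patent): compare
# the far-field diffraction of a point source seen through two window layouts.
import numpy as np


def window_mask(size: int = 256, pitch: int = 16, staggered: bool = False) -> np.ndarray:
    """Binary aperture mask: a periodic grid of small transparent windows."""
    mask = np.zeros((size, size))
    for row in range(0, size, pitch):
        shift = pitch // 2 if (staggered and (row // pitch) % 2) else 0
        for col in range(shift, size, pitch):
            mask[row:row + 4, col:col + 4] = 1.0   # one 4x4-sample window
    return mask


def diffraction_pattern(mask: np.ndarray) -> np.ndarray:
    """Fraunhofer intensity pattern of a point source behind the mask."""
    field = np.fft.fftshift(np.fft.fft2(mask))
    return np.abs(field) ** 2


square = diffraction_pattern(window_mask(staggered=False))
stagger = diffraction_pattern(window_mask(staggered=True))
print("Diffraction artifacts differ between layouts:", not np.allclose(square, stagger))
```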
Apple’s patent FIG. 25 above is a cross-sectional side view of an illustrative thin-film transistor layer having transparent conductive layers in a pixel removal region.
For the full details of this invention, see patent application 20230337467.