Today the US Patent & Trademark Office published a patent application from Apple that relates to another biometric method of user authentication: scanning the surface of a user's palm.
Today Apple offers Touch ID, based on fingerprint scanning, and Face ID, based on facial scanning, on modern iPhones and iPads. Apple has been researching biometrics that could scan veins, and today's patent application shows that scanning a user's palm may be yet another future biometric method. The patent places much of its emphasis on this palm method in the context of an Apple Watch, shown in our cover graphic.
Apple's invention covers a display layer including light transmissive portions and non-transmissive portions. The electronic device may also include a palm biometric image sensor layer beneath the display layer and configured to sense an image of a user's palm positioned above the display layer based upon light reflected from the user's palm passing through the light transmissive portions of the display layer.
The electronic device may further include a controller configured to capture image data from the user's palm in cooperation with the palm biometric image sensor layer and determine a surface distortion of the user's palm based upon the image data.
The controller may also be configured to perform a biometric authentication of the user's palm based upon the image data and the surface distortion.
The palm biometric image sensing layer may include a substrate, a photodiode layer on the substrate, and, above the photodiode layer, either a narrowing field-of-view layer or a focusing layer, for example.
The electronic device may further include an infrared light source, and the controller may be configured to determine palm vein data from the image data resulting from the infrared light source. The controller may be configured to perform the biometric authentication based upon comparing palm vein data to stored palm vein data, for example.
The electronic device may include a flood light source, and the controller may be configured to determine palm crease data from the image data resulting from the flood light source. The controller may be configured to determine the surface distortion based upon comparing the palm crease data to stored palm crease data, for example. The flood light source may be operable at a wavelength between 450 nm and 560 nm, for example.
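The two-channel scheme described above (IR-derived vein data plus visible-light crease data) can be sketched as a simple matching pipeline. The following is a minimal, hypothetical Python sketch: the function names, feature representation, similarity measure, and thresholds are all my illustration, not anything specified in Apple's patent.

```python
# Hypothetical sketch of a two-channel palm match: an IR image yields
# palm-vein features; a visible (450-560 nm) flood-lit image yields
# palm-crease features used for the surface-distortion check. Features
# are toy (row, col) tuples; similarity is a toy normalized overlap.

def similarity(features_a, features_b):
    """Toy similarity: fraction of features shared between two sets."""
    if not features_a or not features_b:
        return 0.0
    shared = len(set(features_a) & set(features_b))
    return shared / max(len(features_a), len(features_b))

def authenticate_palm(ir_vein_features, crease_features,
                      stored_vein_features, stored_crease_features,
                      vein_threshold=0.8, crease_threshold=0.7):
    """Authenticate only if both channels match their stored templates."""
    vein_score = similarity(ir_vein_features, stored_vein_features)
    # Surface-distortion check: compare live crease data to stored data.
    crease_score = similarity(crease_features, stored_crease_features)
    return vein_score >= vein_threshold and crease_score >= crease_threshold

# Example with fabricated feature sets standing in for real templates:
stored_veins = [(1, 2), (3, 4), (5, 6), (7, 8), (9, 10)]
stored_creases = [(0, 1), (2, 3), (4, 5), (6, 7)]
print(authenticate_palm(stored_veins, stored_creases,
                        stored_veins, stored_creases))  # True: both match
print(authenticate_palm([(9, 9)], stored_creases,
                        stored_veins, stored_creases))  # False: veins differ
```

Requiring both channels to pass reflects the patent's pairing of vein authentication with a crease-based surface-distortion determination, though the actual matching methods are not disclosed.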
Apple's patent FIGS. 2 and 3 below represent an electronic device (#20) that illustratively includes a portable housing and a controller carried by the portable housing. The electronic device could be an iPhone, Apple Watch, iPad, MacBook or other device. The devices could use a contactless palm biometric sensor as highlighted and detailed in patent FIG. 3.
Apple's patent FIG. 5 below illustrates the notch area of an iPhone with various cameras and sensors; FIG. 6 is an image diagram of a user's palm acquired using an electronic device; FIG. 7 is a flow diagram according to an embodiment.
The contactless palm biometric sensor #40 (of FIG. 3) also includes a flood light source (#45) carried by the portable housing (#21). The flood light source or flood illuminator may include a visible flood light illuminator, for example, and operate in a wavelength between 450 nm and 560 nm (i.e., visible blue-green).
In some embodiments, the flood light source may alternatively or additionally include an IR flood light source. Moreover, in some embodiments, the display (#23) may alternatively or additionally define the flood light source or an additional flood light source, and may selectively operate pixels. Where the flood light source includes an IR flood light source, the IR flood light source may cooperate with the dot projector (#41) and IR camera (#43) for further illumination during determination of the orientation offset of the user's palm (#44).
The IR camera cooperates with the flood light source to capture image data of the user's palm illuminated by the flood light source while the user's palm is positioned in spaced relation adjacent the contactless palm biometric sensor (Block #72).
The controller (#22), at Block #74, determines a surface distortion, for example, palm crease data, of the user's palm based upon the image data. More particularly, the controller determines the surface distortion based upon a comparison of the palm crease data to stored palm crease data, for example, stored in the memory (#27). Of course, one or more other or additional light sensors may be used to capture the image data of the user's palm illuminated by the flood light source.
As will be appreciated by those skilled in the art, skin surface cracks in the user's palm (#44) are the dominant features in images of the user's palm taken using visible wavelengths in the blue and green range.
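Because creases show up as sharp dark lines under blue-green illumination, even simple contrast thresholding can flag them in a visible-light frame. The toy Python sketch below is purely my illustration of that idea (the fabricated image patch, function, and threshold are assumptions, not the patent's method):

```python
# Toy illustration: palm creases appear as sharp dark lines in
# blue/green-wavelength images, so a simple horizontal-contrast
# threshold can flag crease pixels. The image is a fabricated 5x5
# grayscale patch with one dark vertical "crease" in column 2.

image = [
    [200, 200,  40, 200, 200],
    [200, 200,  35, 200, 200],
    [200, 200,  30, 200, 200],
    [200, 200,  35, 200, 200],
    [200, 200,  40, 200, 200],
]

def crease_pixels(img, threshold=100):
    """Return (row, col) positions where horizontal contrast is sharp."""
    hits = []
    for r, row in enumerate(img):
        for c in range(1, len(row)):
            if abs(row[c] - row[c - 1]) > threshold:
                hits.append((r, c))
    return hits

# Every flagged pixel borders the dark crease column.
print(sorted({c for _, c in crease_pixels(image)}))  # [2, 3]
```

A real system would use far more robust feature extraction, but the sketch shows why a high-contrast crease pattern makes a convenient landmark for the surface-distortion comparison the patent describes.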
Apple's patent FIG. 14 below is a schematic diagram of the display layer and the palm biometric image sensor layer of the electronic device of FIG. 12; FIG. 10 shows a user gaining access to their Apple Watch using the palm of their hand as a biometric measure. FIG. 11 shows the back of a user's hand hovering over the Apple Watch below.
Apple's patent application 20190278973 that was published today by the U.S. Patent Office was filed back in Q1 2019. Considering that this is a patent application, the timing of such a product to market is unknown at this time.
Dale Setlak: Engineering sensors and measurement systems. Co-founder and CTO of AuthenTec, he joined Apple after it acquired the company.
Giovanni Gozzini: Director of Engineering. He came to Apple via AuthenTec.
Moe (Mohammad) Yeke Yazdandoost: Tech Lead. He previously worked at Teledyne DALSA as Sr. R&D Design Specialist.