Google Wins a Patent for Face ID-like Technology that could be used in a Future Pixel Phone to Interpret Hand Gestures

(Cover image: dot projector, via Google)

 

Back in March, Reuters reported that "most Android phones will have to wait until 2019 to duplicate the 3D sensing feature behind Apple’s Face ID security according to three major parts producers." Part of the delay is due to a shortage of vertical-cavity surface-emitting lasers, or VCSELs. The supply issue aside, Google was granted a patent today by the U.S. Patent and Trademark Office covering the future use of VCSELs in products like a Pixel smartphone.

 

Google was granted patent 10,139,217 today, titled "Array based patterned illumination projector." When Apple introduced the TrueDepth camera that supports Face ID authentication, the company presented a slide outlining the make-up of such a camera, which included a "Flood illuminator" and a "Dot projector" that work together to illuminate a subject's face and build an accurate 3D model of it.

 

(Apple's TrueDepth camera components)

 

Google's patent covers the same type of "patterned illumination projector." Patent FIG. 5 below is a block diagram of an example system that includes a light emitter and a light detector; FIG. 6 is a flowchart of a method for operating a light emitter to produce patterns of illumination.

 

(Google patent FIGS. 5 & 6: a patterned illumination system using VCSELs in a future smartphone)
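To make the FIG. 6 flow a little more concrete, here is a minimal Python sketch of how groups of interconnected emitter elements on a die might be stepped through to produce different illumination patterns. The `PatternedIlluminationProjector` class, its grouping scheme and driver call are hypothetical illustrations, not Google's actual design or API.

```python
from dataclasses import dataclass, field
from typing import List, Set

# Hypothetical model: a die of light-emitting elements (e.g., VCSELs) wired
# into addressable groups, loosely following the patent's description of
# "interconnected light-emitting elements of the die".

@dataclass
class PatternedIlluminationProjector:
    groups: List[Set[int]]                     # each group = interconnected elements
    active: Set[int] = field(default_factory=set)

    def emit_pattern(self, group_indices: List[int]) -> Set[int]:
        # FIG. 6-style step: pick the groups for this pattern, drive them,
        # and return the set of elements currently emitting.
        self.active = set()
        for gi in group_indices:
            self.active |= self.groups[gi]
        self._drive_hardware(self.active)
        return self.active

    def _drive_hardware(self, elements: Set[int]) -> None:
        # Placeholder: a real system would set per-group drive currents here.
        pass

# Usage: a 4x4 die (element id = row*4 + col) wired into four column groups,
# stepped through one column at a time to sweep a stripe pattern.
column_groups = [set(range(c, 16, 4)) for c in range(4)]
projector = PatternedIlluminationProjector(column_groups)
for step in range(4):
    print(f"pattern {step}: elements {sorted(projector.emit_pattern([step]))}")
```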

 

Google notes in their granted patent that "The light emitter #560 producing a particular pattern of illumination includes generating light from a set of one or more interconnected light-emitting elements (e.g., LEDs, lasers, VCSELs) of the die #570.

 

The light emitter #560 can be part of a smart phone, digital assistant, head-mounted display, controller for a robot or other system, or some other portable computing device.

 

In such examples, the light emitted from the light emitter (e.g., as different patterns of illumination) could be used to determine the location of objects (e.g., of objects including light detectors) relative to such other objects (e.g., the location of a user's hand, on which is disposed a light detector, relative to a user's head, on which a head-mounted display including the light emitter 560 is disposed).

 

Alternatively, the light emitter can be part of a system that is mounted to a floor, wall, ceiling, or other object or building such that the location of the light emitter is relatively static relative to an environment of interest.

 

The object #500 of FIG. 5 could be part of or disposed on a system (e.g., a drone), a tag or other device attached to an object or person of interest (e.g., to a body segment of a person, to facilitate motion capture), or configured in some other way to facilitate determination of the location of the object #500 based on a time-varying intensity of light received from the light emitter."
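That location-from-light idea can be illustrated with a small, purely hypothetical sketch: if the emitter steps through a sequence of binary illumination patterns, a detector can recover which angular zone it sits in from nothing more than the bright/dark sequence it observes over time. The binary coding below is a standard structured-light trick used for illustration only; the patent does not prescribe this particular scheme.

```python
from typing import List

# Hypothetical sketch: localize a light detector by emitting a sequence of
# binary illumination patterns over N angular zones and decoding the
# bright/dark readings the detector reports for each pattern.

def binary_patterns(num_zones: int) -> List[List[bool]]:
    """pattern[k][zone] is True if that zone is illuminated during pattern k."""
    bits = max(1, (num_zones - 1).bit_length())
    return [[(zone >> k) & 1 == 1 for zone in range(num_zones)]
            for k in range(bits)]

def decode_zone(readings: List[bool]) -> int:
    """Recover the zone index from the detector's bright/dark readings."""
    zone = 0
    for k, bright in enumerate(readings):
        if bright:
            zone |= 1 << k
    return zone

# Usage: 8 zones need only 3 patterns; a detector sitting in zone 5 sees
# bright, dark, bright (bits 101) and decodes back to zone 5.
patterns = binary_patterns(8)
true_zone = 5
readings = [patterns[k][true_zone] for k in range(len(patterns))]
print(decode_zone(readings))  # 5
```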

 

Google further notes that "It could be beneficial in a variety of applications to detect and/or determine the location of an object in an environment. These applications could include tracking the location of a drone, a ball used in a game, a conductor's baton, a controller, a body part of a person (e.g., for motion capture or gesture recognition), or some object(s).

 

In an example application, the location of a plurality of markers or tags disposed on respective different locations on a person's body could be determined and used to detect the location and/or motions of the person and/or of particular parts of the person's body.

 

In another example application, the location of a control wand or other device, relative to a head-mounted device or other device worn by a person, could be detected and used as an input to the head-mounted device or other system.

 

In yet another example application, the location of a drone, robot, or other mobile system within an environment of interest (e.g., a room of a house, a warehouse, a factory) could be determined and used to control the motion of the drone, robot, or other mobile system within the environment."
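As a small illustration of the gesture-recognition angle, below is a sketch of how a short trail of tracked 3D hand positions (the kind of output a marker- or tag-based tracking system like the one described could produce) might be reduced to a simple swipe gesture. The function, thresholds and coordinates are all invented for illustration and are not from the patent.

```python
from typing import List, Optional, Tuple

Point3D = Tuple[float, float, float]  # (x, y, z) in meters

def classify_swipe(trail: List[Point3D], min_travel: float = 0.15) -> Optional[str]:
    """Classify a mostly-horizontal hand movement as a left or right swipe."""
    if len(trail) < 2:
        return None
    dx = trail[-1][0] - trail[0][0]
    dy = trail[-1][1] - trail[0][1]
    # Require enough horizontal travel, dominating any vertical motion.
    if abs(dx) >= min_travel and abs(dx) > 2 * abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return None

# Usage: the hand moves about 20 cm to the right at roughly constant height.
trail = [(0.00, 1.20, 0.50), (0.07, 1.21, 0.50), (0.20, 1.19, 0.50)]
print(classify_swipe(trail))  # swipe_right
```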

 

Back in March, Curtis Barratt, VP and General Manager of Apple supplier Finisar's Allen facility, pointed to gesture recognition as a very important application for VCSEL lasers used in 3D sensing.

 

Just today, Google and Microsoft were granted patents covering hand gesture recognition for future computers, smartphones and smartwatches, which I'll cover in upcoming patent reports.

 
