An Apple Patent Reveals the Mind-Boggling Array of Facial Sensors that Could End Up in Apple's Future Mixed Reality Headset Over Time
Today the US Patent & Trademark Office published a patent application from Apple that relates to their future mixed reality headset, which could integrate a mind-boggling array of facial sensors over time. A sampling of the sensors includes depth cameras, gaze tracking, Lidar, Touch ID and a list of health sensors starting with electromyography (EMG), blood oxygen, heart rate and blood flow sensors, plus sensors designed to detect in-air gestures that a user makes as instructions for the HMD to perform.
One aspect of Apple's invention relates to providing their future mixed reality headset with a super-snug fit on the user's face and nose for comfort and for keeping external light from interfering with viewing content.
Another aspect of the invention relates to facial sensors that may be provided in the light seal to measure facial expressions and gather other measurements. Information on a measured facial expression of a user can be transmitted to external devices so that the external devices can update corresponding facial expressions on an avatar to reflect the user's current facial expression.
Apple further notes that facial expression sensors may also include electromyography (EMG) sensors, resistive sensors, strain gauges, accelerometers and other motion sensors, magnetic sensors, potentiometers, and other sensors. Actuators in the light seal may be controlled based on facial sensor measurements and other measurements. Measurements from light-seal sensors can be used for authentication, actuator adjustments, avatar control, health monitoring, sensor calibration, and other activities.
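To make the avatar-mirroring idea concrete, here is a minimal sketch of how facial sensor readings might be turned into an expression update for transmission to an external device. The patent describes the concept but no data format or API, so every type, sensor name and blend-shape name below is an illustrative assumption, not Apple's implementation.

```swift
import Foundation

// Hypothetical reading from one facial sensor in the light seal
// (names and value range are illustrative; the patent defines no data format).
struct FacialSensorReading: Codable {
    let sensorID: String      // e.g. "emg_left_cheek"
    let value: Double         // normalized activation, 0.0 ... 1.0
}

// Hypothetical avatar expression update sent to an external device.
struct AvatarExpressionUpdate: Codable {
    let blendShapes: [String: Double]   // e.g. "emg_left_cheek": 0.7
    let timestamp: TimeInterval
}

// Map raw sensor activations to avatar blend-shape weights.
// A real system would add calibration and filtering; this simply clamps values.
func makeExpressionUpdate(from readings: [FacialSensorReading]) -> AvatarExpressionUpdate {
    var shapes: [String: Double] = [:]
    for reading in readings {
        shapes[reading.sensorID] = min(max(reading.value, 0.0), 1.0)
    }
    return AvatarExpressionUpdate(blendShapes: shapes,
                                  timestamp: Date().timeIntervalSince1970)
}

// Example: encode the update as JSON, ready to transmit to the external device.
let readings = [FacialSensorReading(sensorID: "emg_left_cheek", value: 0.72),
                FacialSensorReading(sensorID: "strain_brow", value: 0.31)]
let update = makeExpressionUpdate(from: readings)
if let payload = try? JSONEncoder().encode(update) {
    print(String(data: payload, encoding: .utf8) ?? "")
}
```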
The sensor options that Apple could use in their first-generation HMD and accessories are many, and in fact too many to be used at one time. Yet Apple lists the sensors that could end up in future versions of their headset and accessories as follows:
Sensors in input-output devices may include force sensors (e.g., strain gauges, capacitive force sensors, resistive force sensors, etc.), audio sensors such as microphones, touch and/or proximity sensors such as capacitive sensors (e.g., a two-dimensional capacitive touch sensor integrated into display (FIG. 1 below, #14), a two-dimensional capacitive touch sensor overlapping display, and/or a touch sensor that forms a button, trackpad, or other input device not associated with a display), and other sensors.
If desired, sensors may include optical sensors such as optical sensors that emit and detect light, ultrasonic sensors, optical touch sensors, optical proximity sensors, and/or other touch sensors and/or proximity sensors, monochromatic and color ambient light sensors, image sensors, fingerprint sensors, iris scanning sensors, retinal scanning sensors, and other biometric sensors, temperature sensors, sensors for measuring three-dimensional non-contact gestures ("air gestures"), pressure sensors, sensors for detecting position, orientation, and/or motion (e.g., accelerometers, magnetic sensors such as compass sensors, gyroscopes, and/or inertial measurement units that contain some or all of these sensors), health sensors such as blood oxygen sensors, heart rate sensors, blood flow sensors, and/or other health sensors, radio-frequency sensors, depth sensors (e.g., structured light sensors and/or depth sensors based on stereo imaging devices that capture three-dimensional images), optical sensors such as self-mixing sensors and light detection and ranging (lidar) sensors that gather time-of-flight measurements, humidity sensors, moisture sensors, gaze tracking sensors, electromyography sensors to sense muscle activation, facial sensors, and/or other sensors. In some arrangements, device 10 may use sensors 16 and/or other input-output devices to gather user input. For example, buttons may be used to gather button press input, touch sensors overlapping displays can be used for gathering user touch screen input, touch pads may be used in gathering touch input, microphones may be used for gathering audio input, accelerometers may be used in monitoring when a finger contacts an input surface and may therefore be used to gather finger press input, etc.
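That last point, using an accelerometer to register when a finger contacts an input surface, can be illustrated with a short sketch built on Apple's existing CoreMotion framework. The patent names the idea but no specific parameters, so the tap threshold and sample rate here are assumptions chosen only for illustration.

```swift
import CoreMotion

// Detect a possible finger press on the device housing from a short
// acceleration spike. The 0.15 g threshold and 100 Hz rate are assumptions.
let motionManager = CMMotionManager()
let tapThreshold = 0.15   // change in g relative to the previous sample

if motionManager.isAccelerometerAvailable {
    motionManager.accelerometerUpdateInterval = 1.0 / 100.0
    var previousZ = 0.0
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let z = data?.acceleration.z else { return }
        if abs(z - previousZ) > tapThreshold {
            print("Possible finger press detected")
        }
        previousZ = z
    }
}
```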
Apple's patent FIG. 9 above is a cross-sectional view of an illustrative light seal with an electromyography (EMG) sensor; FIG. 11 is a cross-sectional view of an illustrative light seal with an optical sensor; FIGS. 12 and 13 are cross-sectional views of an illustrative light seal with a motion sensor such as an inertial measurement unit or accelerometer. FIG. 13 also illustrates a camera.
Apple's patent FIG. 23 above is a flow chart of illustrative operations involved in using a head-mounted device with light seal sensing circuitry.
Under FIG. 23, Apple further notes that an array of facial sensors may, for example, extend along the seal. Different sensors measure different portions of the user's face and gather information such as facial pressure, facial movement, facial shape (from the measured thickness of the seal at locations along its length), and/or other facial information (skin color, muscle activity, temperature, heart rate, blood flow, blood oxygen level, etc.).
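As a rough illustration of how facial shape could be inferred from seal thickness along its length, the sketch below converts hypothetical thickness samples into a compression profile. The patent describes the idea but no algorithm, so the nominal thickness and sample values are made-up figures for demonstration only.

```swift
import Foundation

// Hypothetical thickness samples at normalized positions along the light seal
// (millimetres; both the nominal thickness and the sample values are assumptions).
let nominalThickness = 20.0
let samples: [(position: Double, thickness: Double)] = [
    (0.00, 19.2), (0.25, 17.8), (0.50, 16.5), (0.75, 18.1), (1.00, 19.4)
]

// Compression at each position: how far the face has pressed into the seal.
// Larger values suggest facial features that protrude toward the headset.
let compressionProfile = samples.map { sample in
    (position: sample.position,
     compression: nominalThickness - sample.thickness)
}

for point in compressionProfile {
    print(String(format: "position %.2f -> compression %.1f mm",
                 point.position, point.compression))
}
```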
For more details, see Apple's patent application 20210294104.
Considering that this is a patent application, the timing of such a product coming to market is unknown at this time. For newbies to patent reports, it has to be said that patent applications and/or granted patents aren't rumors and shouldn't be used as such in reports. Secondly, everything new that Apple introduces in the form of hardware and even key software is patented to protect the invention against claims from competitors and patent trolls that Apple is copying their inventions.
Apple Inventors
Samuel Resnick: Product Design Engineer
Farhan Hossain: Mechanical Product Designer