Apple's future Headset could sharpen the focus of objects using a Liquid-Filled Lens System and 3D LiDAR Sensors
Today the US Patent & Trademark Office published a patent application from Apple that relates to lenses in a future headset being able to adjust visuals using a liquid-filled lens system to keep images in sharp focus. Another key aspect of this invention is the use of 3D LiDAR sensors. Apple just released its new iPad Pro with LiDAR scanning on March 18th.
Electronic devices are sometimes configured to be worn by users. For example, head-mounted devices are provided with head-mounted structures that allow the devices to be worn on users' heads. The head-mounted devices may include optical systems with lenses. The lenses allow displays in the devices to present visual content to users.
Head-mounted devices typically include lenses with fixed shapes and properties. If care is not taken, it may be difficult to adjust these types of lenses to optimally present content to each user of the head-mounted device.
Apple's invention covers a head-mounted device that may have a display for presenting content to a user. Head-mounted support structures in the device support the display on the user's head.
The head-mounted device may have respective left and right lenses and respective left and right portions of a display. The left lens may direct images from the left portion of the display to a left eye box whereas the right lens may direct images from the right portion of the display to a right eye box.
A lens module in the head-mounted device may include first and second lens elements separated by a liquid-filled gap with an adjustable thickness.
A pump or other component may control how much liquid is forced from a liquid reservoir into the liquid-filled gap. The first and second lens elements may form a catadioptric lens having a thickness that depends upon the adjustable thickness of the liquid-filled gap.
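To get a feel for how a variable fluid gap could translate into focus adjustment, here is a rough Python sketch using the standard thin-lens lensmaker's relation for a plano-convex fluid lens. The spherical-cap geometry, aperture size, and refractive index are illustrative assumptions on our part, not details from Apple's filing.

```python
def membrane_radius(aperture_radius_m: float, sagitta_m: float) -> float:
    """Radius of curvature of a spherical-cap fluid bulge, from its
    aperture radius and sagitta (center height of the bulge)."""
    a, s = aperture_radius_m, sagitta_m
    return (a * a + s * s) / (2.0 * s)

def lens_power_diopters(aperture_radius_m: float, sagitta_m: float,
                        n_fluid: float = 1.5) -> float:
    """Thin-lens power of a plano-convex fluid lens: P = (n - 1) / R."""
    R = membrane_radius(aperture_radius_m, sagitta_m)
    return (n_fluid - 1.0) / R

# Pumping more liquid into the gap raises the sagitta and thus the power.
for s_mm in (0.5, 1.0, 2.0):
    P = lens_power_diopters(0.010, s_mm / 1000.0)
    print(f"sagitta {s_mm:.1f} mm -> {P:.2f} diopters")
```

The point of the sketch is simply that a small change in how much liquid sits in the gap produces a large, continuous change in optical power, which is what would let the headset refocus per user.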
A lens module in the head-mounted device may include first and second fluid-filled chambers and first and second flexible membranes. Control circuitry in the head-mounted device may control a first amount of fluid in the first fluid-filled chamber and a second amount of fluid in the second fluid-filled chamber to adjust the curvature of the first flexible membrane and the curvature of the second flexible membrane.
The first and second flexible membranes may have different varying stiffness profiles. The varying stiffness profiles may be a result of the flexible membranes having a varying thickness, having surface relief that varies the elastic modulus of the flexible membranes, or being formed from an anisotropic material.
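Apple's filing doesn't spell out a control algorithm, but a minimal sketch of control circuitry nudging two chamber volumes toward target membrane curvatures could look like the following. The linear volume-to-curvature model, the gain, and the response constants are purely illustrative assumptions.

```python
def step_controller(target_curvatures, current_volumes, gain=0.5,
                    response=(2.0, 2.0)):
    """One proportional-control step: nudge the fluid volume in each
    chamber toward the volume implied by its target membrane curvature.

    Assumes an (illustrative) linear model: curvature = response * volume.
    Returns the updated volumes for the two chambers.
    """
    new_volumes = []
    for target, vol, k in zip(target_curvatures, current_volumes, response):
        current_curvature = k * vol          # modeled membrane curvature
        error = target - current_curvature   # how far off we are
        new_volumes.append(vol + gain * error / k)
    return new_volumes

vols = [0.0, 0.0]
for _ in range(20):
    vols = step_controller((4.0, 1.0), vols)
print(vols)  # volumes converge toward 2.0 and 0.5
```

Because each chamber has its own target and its own stiffness response, the two membranes can be driven to different curvatures independently, which matches the patent's description of separately controlled first and second fluid amounts.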
A lens module in the head-mounted device may include a flexible lens element with a periphery and a plurality of actuators around the periphery of the flexible lens element. Control circuitry in the head-mounted device may control the plurality of actuators to dynamically adjust the flexible lens element. Each actuator may pull radially outward on the flexible lens element away from a center of the flexible lens element or may bend or compress the periphery of the flexible lens element. The actuators may be piezoelectric actuators or voice coil actuators.
In some cases, a lens module may include a fluid-filled chamber, a semi-rigid lens element that at least partially defines the fluid-filled chamber, and at least one actuator configured to selectively bend the semi-rigid lens element.
Six actuators that are evenly distributed around the periphery of the semi-rigid lens element may be used to control the curvature of the semi-rigid lens element. The semi-rigid lens element may initially be planar or non-planar. For example, the semi-rigid lens element may initially have a spherically convex surface and a spherically concave surface. A tunable spherical lens may be incorporated into the lens module to offset a parasitic spherical lens power from the semi-rigid lens element.
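As a back-of-the-envelope illustration of how six evenly spaced actuators could be commanded, the sketch below places them at 60-degree intervals and adds a cos(2*theta) term for a cylindrical (astigmatic) component on top of a uniform spherical term. This command model is our assumption for illustration, not something specified in Apple's filing.

```python
import math

def actuator_angles(n: int = 6) -> list:
    """Angular positions (radians) of n actuators evenly spaced
    around the lens periphery."""
    return [2.0 * math.pi * i / n for i in range(n)]

def bend_commands(spherical: float, cylinder: float, axis: float,
                  n: int = 6) -> list:
    """Illustrative per-actuator displacement: a uniform term for
    spherical power plus a cos(2*theta) term for a cylindrical
    component oriented along `axis` (radians)."""
    return [spherical + cylinder * math.cos(2.0 * (theta - axis))
            for theta in actuator_angles(n)]

cmds = bend_commands(spherical=1.0, cylinder=0.25, axis=0.0)
print([round(c, 3) for c in cmds])
```

Six actuators are the minimum that can cleanly resolve a cos(2*theta) pattern around the rim, which is one plausible reason for the even six-way distribution described above.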
Apple's new iPad Pro includes cameras that can now use LiDAR scanning for 3D objects. Interestingly, Apple's patent application published today makes note of using 3D LiDAR sensors in their future headset.
Apple notes that the input-output circuitry may include a wide range of sensors, for example:

- three-dimensional sensors (e.g., structured light sensors that emit beams of light and use two-dimensional digital image sensors to gather image data for three-dimensional images from the light spots produced when a target is illuminated by the beams, binocular three-dimensional image sensors that gather three-dimensional images using two or more cameras in a binocular imaging arrangement, three-dimensional LiDAR (light detection and ranging) sensors, three-dimensional radio-frequency sensors, or other sensors that gather three-dimensional image data);
- cameras (e.g., infrared and/or visible digital image sensors);
- gaze tracking sensors (e.g., a gaze tracking system based on an image sensor and, if desired, a light source that emits one or more beams of light that are tracked using the image sensor after reflecting from a user's eyes);
- touch sensors, buttons, and force sensors;
- contact sensors based on switches, gas sensors, pressure sensors, moisture sensors, and magnetic sensors;
- audio sensors (microphones), including microphones for gathering voice commands and other audio input;
- ambient light sensors;
- sensors configured to gather information on motion, position, and/or orientation (e.g., accelerometers, gyroscopes, compasses, and/or inertial measurement units that include all of these sensors or a subset of one or two of them);
- fingerprint sensors and other biometric sensors; and
- optical position sensors (optical encoders) and/or other position sensors such as linear position sensors.
The sensors may include proximity sensors (e.g., capacitive proximity sensors, light-based (optical) proximity sensors, ultrasonic proximity sensors, and/or other proximity sensors). Proximity sensors may, for example, be used to sense relative positions between a user's nose and lens modules in the HMD.
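As a hypothetical example of how differential proximity readings could yield a nose-to-lens position estimate, consider the sketch below. The normalized-differential model and the calibration constant are assumptions made for illustration; the patent only says proximity sensors may sense the relative position.

```python
def lateral_offset(left_reading: float, right_reading: float,
                   scale_mm: float = 5.0) -> float:
    """Estimate the lateral offset (mm) of the lens modules relative to
    the user's nose from two proximity readings. A positive result
    means the headset sits shifted toward the right sensor.

    Uses a normalized differential; `scale_mm` is an assumed
    calibration constant, not a value from the patent.
    """
    total = left_reading + right_reading
    if total == 0:
        return 0.0  # no signal from either sensor
    return scale_mm * (right_reading - left_reading) / total

print(lateral_offset(0.8, 0.8))  # centered -> 0.0
print(lateral_offset(0.6, 1.0))  # shifted toward the right sensor
```

An estimate like this could feed the lens-adjustment mechanisms described earlier, so the optics line up with each user's eyes rather than with an assumed average face.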
Apple's patent FIG. 2 below is a top view of a Head Mounted Display system. Each lens element of lens module #72 may be formed from any desired transparent material (e.g., glass, a polymer material such as polycarbonate or acrylic, a crystal such as sapphire, etc.).
Apple's patent FIG. 4A is a cross-sectional side view of an illustrative head-mounted device with a catadioptric lens that includes two lens elements separated by a fluid-filled gap having a variable thickness.
Apple's patent FIG. 10A below is a cross-sectional side view of an illustrative elastomeric membrane that is attached to actuators that are controlled for dynamic stiffness tuning; FIGS. 10B and 10C are top views of the elastomeric membrane of FIG. 10A showing how the actuators perform dynamic stiffness tuning.
Apple's patent FIG. 18A above is a perspective view of the tunable non-planar semi-rigid lens element in an unbent state; FIG. 18B is a perspective view of the tunable non-planar semi-rigid lens element of FIG. 17 in a bent state.
Apple's patent application 20200096770, which was published today by the U.S. Patent Office, was filed back in Q3 2019. Considering that this is a patent application, the timing of such a product coming to market is unknown at this time.
Inventors: James E. Pedder (Oxon, GB); Igor Stamenov (Cupertino, CA); Cheng Chen (San Jose, CA); Enkhamgalan Dorjgotov (Mountain View, CA); Graham B. Myhre (San Jose, CA); Victoria C. Chan (Sunnyvale, CA); Xiaonan Wen (San Jose, CA); Peng Lv (Cupertino, CA); Yuan Li (Cupertino, CA); Yu Horie (Pasadena, CA); Siddharth S. Hazra (Cupertino, CA)