Apple wins a patent for future Smartglasses that will use Tunable Lenses negating prescription lens costs
Today the U.S. Patent and Trademark Office officially granted Apple a patent that relates to optical systems and, more particularly, to Apple's future smartglasses with tunable lenses. With today's Apple Vision Pro, additional Zeiss prescription lenses could cost users approximately US$600 a pair, according to Bloomberg's Mark Gurman. Apple's future smartglasses will instead use an advanced adjustable lens system that, more than likely, will not carry an added cost.
In the patent's background, Apple notes that eyewear may include optical systems such as lenses. For example, a pair of glasses may include lenses that allow users to view the surrounding environment. It can be challenging to design lenses that function properly for users with different prescriptions: a user may not know or remember his or her lens prescription, or may provide a prescription that is inaccurate.
Smartglasses with Adjustable Lens Systems
Apple's granted patent relates to eyeglasses that may be worn by a user and may include one or more adjustable lenses each aligned with a respective one of a user's eyes. For example, a first adjustable lens may align with the user's left eye and a second adjustable lens may align with the user's right eye.
Each of the first and second adjustable lenses may include one or more liquid crystal cells or other voltage-modulated optical material. Each liquid crystal cell may include a layer of liquid crystal material interposed between transparent substrates. Control circuitry may adjust the optical power of the lens by applying control signals to an array of electrodes in the liquid crystal cell to adjust a phase profile of the liquid crystal material.
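The phase-profile adjustment described here follows the general physics of liquid crystal lenses: to act as a lens of power P, the cell must impose a roughly parabolic phase delay across its aperture, typically wrapped modulo 2π in Fresnel fashion so a thin liquid crystal layer suffices. The sketch below computes such a target profile; the electrode radii, design wavelength, and wrapping scheme are illustrative assumptions, not details from the patent:

```python
import math

def target_phase_profile(power_diopters, radii_mm, wavelength_nm=550.0):
    """Parabolic thin-lens phase phi(r) = -pi * r^2 * P / wavelength,
    wrapped to [0, 2*pi) Fresnel-style, evaluated at each electrode
    ring radius (in mm). Purely illustrative of LC lens control."""
    lam_m = wavelength_nm * 1e-9
    phases = []
    for r_mm in radii_mm:
        r_m = r_mm * 1e-3
        phi = -math.pi * r_m ** 2 * power_diopters / lam_m  # thin-lens phase
        phases.append(phi % (2 * math.pi))                  # 2*pi wrap
    return phases
```

In a real device, control circuitry would map each wrapped phase value to an electrode voltage via the liquid crystal material's measured phase-versus-voltage response.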
The control circuitry may be configured to determine a user's prescription and accommodation range during a vision characterization process. The vision characterization process may include adjusting the optical power of the lens until the user indicates that an object viewed through the lens is in focus.
A distance sensor may measure the distance to the in-focus object. The control circuitry may calculate the user's prescription based on the optical power of the lens and the distance to the in-focus object. During vision characterization operations, control circuitry may adjust the optical power automatically or in response to user input.
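The patent's FIG. 3 equation is not reproduced in the text, but a simple thin-lens vergence model captures the stated relationship: if an object at distance d appears in focus through a tunable lens of power P, the user's spherical correction is roughly P minus the object's vergence 1/d (all in diopters). A minimal sketch under that assumption:

```python
def estimate_prescription(lens_power_d, object_distance_m):
    """Estimate the user's spherical correction (diopters) from the
    tunable lens power at which an object was reported in focus and the
    measured distance to that object. Illustrative vergence model only;
    the patent's FIG. 3 equation may take a different form."""
    return lens_power_d - 1.0 / object_distance_m

# e.g. lens at -1.5 D with an object in focus at 2 m suggests a -2.0 D user
```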
The object viewed through the lens may be an Apple device. The user may control the optical power of the lens and/or indicate when objects are in focus by providing input to that device. For example, the device may be an Apple Watch, and the user may control the optical power of the lens and/or indicate whether objects are in focus by rotating the watch crown.
In another illustrative example, the Apple device, such as a future MacBook, iPhone or iPad, may have a touch sensor and a display that presents user interface elements; the user may then control the optical power of the lens and/or indicate whether objects are in focus by providing touch input to those elements.
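As a rough illustration of how crown or touch input might drive the lens during characterization, the sketch below maps discrete input steps to diopter changes. The 0.25 D step (a standard prescription increment) and the ±6 D range are assumptions for illustration, not figures from the patent:

```python
def adjust_power(current_power_d, input_steps, step_d=0.25,
                 min_d=-6.0, max_d=6.0):
    """Map discrete user input (e.g. crown detents or UI taps) to
    tunable-lens power changes, clamped to an assumed adjustment range.
    Step size and range are hypothetical, not from the patent."""
    new_power = current_power_d + input_steps * step_d
    return max(min_d, min(max_d, new_power))
```

Control circuitry would repeat this adjustment until the user signals that the viewed object is in focus, then record the resulting lens power.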
Apple's patent FIG. 1 below is a diagram of an illustrative system that includes eyeglasses with adjustable lenses; FIG. 3 is an equation describing the relationship between the optical power of a lens, the distance to an in-focus object, and a user's prescription; FIG. 4 is a diagram of an illustrative system showing how a user may view an object through a lens during a vision characterization process.
Apple's patent FIGS. 5-7 above represent Apple devices that may be used in a vision characterization process.
Apple's patent FIG. 8 is a diagram illustrating how a vision characterization process may be used to determine a user's accommodation range; FIG. 9 is a diagram illustrating how eyeglasses may bring an object out of focus to check whether a user's prescription and accommodation range have been accurately determined.
Apple's patent FIG. 10 above is a flow chart of illustrative steps involved in determining a user's prescription and accommodation range using optical power adjustment; FIG. 11 is a flow chart of illustrative steps involved in determining a user's prescription and accommodation range using distance adjustment.
Apple further notes that sensors in smartglasses may include one or more digital image sensors such as cameras (FIG. 1 #24), which may include an inward-facing camera that captures images of the user's eyes and/or an outward-facing camera that captures images of the user's environment.
As an example, the camera may be used by control circuitry (#26) to gather images of the pupils and other portions of the eyes of the viewer. The locations of the viewer's pupils and the locations of the viewer's pupils relative to the rest of the viewer's eyes may be used to determine the locations of the centers of the viewer's eyes (i.e., the centers of the user's pupils) and the direction of view (gaze direction) of the viewer's eyes.
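A crude way to see how pupil position can yield gaze direction: treat the pupil's offset from the eye's center in the camera image as the projection of a unit gaze vector onto the image plane of a spherical eyeball. This toy model (the pixel radius is an arbitrary assumption) only illustrates the geometry and is not Apple's eye-tracking method:

```python
import math

def gaze_direction(eye_center_px, pupil_center_px, eyeball_radius_px=12.0):
    """Toy gaze estimate: normalize the pupil's x/y offset from the eye
    center by an assumed eyeball radius in pixels, then complete a unit
    vector. Illustrative geometry only, not Apple's actual method."""
    dx = (pupil_center_px[0] - eye_center_px[0]) / eyeball_radius_px
    dy = (pupil_center_px[1] - eye_center_px[1]) / eyeball_radius_px
    dz_sq = max(0.0, 1.0 - dx * dx - dy * dy)
    return (dx, dy, math.sqrt(dz_sq))  # unit vector; z points at the camera
```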
Smartglasses (FIG. 1 #14) may include sensors such as depth sensor #20 for measuring the distance d to external objects such as external object #18. A depth sensor may be a light-based proximity sensor, a time-of-flight camera sensor, a camera-based depth sensor using parallax, a structured light depth sensor (e.g., having an emitter that emits beams of light in a grid, a random dot array, or other pattern, and having an image sensor that generates depth maps based on the resulting spots of light produced on target objects), a sensor that gathers three-dimensional depth information using a pair of stereoscopic image sensors, a lidar (light detection and ranging) sensor, a radar sensor, or other suitable sensor.
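For the time-of-flight case above, the distance arithmetic itself is simple: the emitted light travels to the object and back, so d = c·t/2. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance_m(round_trip_seconds):
    """Time-of-flight ranging: light covers the path to the object and
    back, so the one-way distance is c * t / 2."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0
```

Real time-of-flight sensors measure phase shift or pulse timing in picoseconds-to-nanoseconds and correct for systematic offsets, but the underlying conversion is this one.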
For more details, review Apple's granted patent 11703698.