A New Round of Patents from Meta and Google Has Surfaced Relating to Future Smartglasses
In Meta's patent background they note that "With recent advances in technology, prevalence and proliferation of content creation and delivery has increased greatly in recent years. In particular, interactive content such as virtual reality (VR) content, augmented reality (AR) content, mixed reality (MR) content, and content within and associated with a real and/or virtual environment (e.g., a 'metaverse') has become appealing to consumers.
To facilitate delivery of this and other related content, service providers have endeavored to provide various forms of wearable display systems. One such example may be a head-mounted display (HMD) device, such as wearable eyewear, a wearable headset, or eyeglasses.
In some examples, the head-mounted display (HMD) device may project or direct light to display virtual objects or combine images of real objects with virtual objects, as in virtual reality (VR), augmented reality (AR), or mixed reality (MR) applications. For example, in an AR system, a user may view both images of virtual objects (e.g., computer-generated images (CGIs)) and the surrounding environment. Head-mounted display (HMD) devices may also present interactive content, where a user's (wearer's) gaze may be used as input for the interactive content."
Mixed Reality Interaction with Eye-Tracking Techniques & More
Meta's patent application covers tracking the position and orientation of the eye, as well as gaze direction, in head-mounted display (HMD) devices, which may unlock display and rendering architectures that substantially reduce the power and computational requirements of rendering 3D environments. Furthermore, eye-tracking-enabled gaze prediction and intent inference can enable intuitive, immersive user experiences that adapt to the user's needs as they interact with the virtual environment.
Eye tracking may be achieved via a number of techniques. One is fringe projection, which projects a periodic pattern onto the eye and uses the reflected pattern to determine three-dimensional (3D) features. Another technique uses time-of-flight analysis of light projected onto the eye. These and similar techniques involve projecting light (for example, laser light) onto the eye and capturing the reflection at close range.
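The time-of-flight principle mentioned above is straightforward: depth is recovered from the round-trip travel time of the projected light. The sketch below is purely illustrative (the function name and the example timing value are assumptions, not figures from the patent):

```python
# Illustrative sketch of the time-of-flight depth relation: d = c * t / 2,
# where t is the measured round-trip time of the projected light pulse.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_depth_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface from the measured round-trip time."""
    return C * round_trip_time_s / 2.0

# A reflection returning after ~0.2 ns corresponds to roughly 3 cm,
# on the order of an eye-to-sensor distance in a near-eye display.
print(tof_depth_m(0.2e-9))
```

At such short distances the required timing resolution is in the picosecond range, which is why near-eye time-of-flight sensing is a nontrivial engineering problem.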
In some examples of the present disclosure, mixed reality (MR) user interactions may be enabled through a combination of eye tracking (user gaze determination) and secondary inputs such as finger gestures, hand gestures, eye gestures, body movements, wrist-band device input, handheld controller input, and similar inputs. A near-eye display device with eye tracking capability may display content generated or stored at the device, or streamed to it, to a user. A location of interest in the displayed content may be identified through the user's gaze and/or fixations/saccades.
The gaze based location of interest identification may be considered a primary input. Actions such as zoom, rotate, pan, move, open actionable menus, select from presented options, etc. may be performed on the location of interest based on the secondary inputs captured through an image sensor or other sensors on the near-eye display device or by other devices and communicated to a controller of the near-eye display device.
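The primary/secondary input scheme described above can be sketched as a simple dispatch: the gaze fixation supplies the location of interest, and a secondary input selects the action performed there. All names and mappings below are hypothetical illustrations, not details from Meta's filing:

```python
# Hypothetical sketch: gaze fixation = primary input (where),
# gesture/controller event = secondary input (what to do there).
from dataclasses import dataclass

@dataclass
class GazeSample:
    x: float           # normalized display coordinates
    y: float
    is_fixation: bool  # True when the eye tracker reports a stable fixation

# Assumed mapping of secondary inputs (gestures, wrist-band events,
# controller buttons) to actions performed at the gazed-at location.
ACTIONS = {
    "pinch": "select",
    "two_finger_spread": "zoom",
    "wrist_rotate": "rotate",
    "palm_drag": "pan",
    "long_press": "open_menu",
}

def handle_interaction(gaze: GazeSample, secondary_input: str):
    """Return an (action, location) pair, or None without a fixation/known input."""
    if not gaze.is_fixation or secondary_input not in ACTIONS:
        return None
    return ACTIONS[secondary_input], (gaze.x, gaze.y)

print(handle_interaction(GazeSample(0.4, 0.6, True), "two_finger_spread"))
# → ('zoom', (0.4, 0.6))
```

Separating "where" (gaze) from "what" (gesture) is what lets the gaze act as a lightweight pointer while the secondary channel disambiguates intent.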
Meta's patent FIG. 1 below illustrates a block diagram of an artificial reality system environment including a near-eye display; FIGS. 3A and 3B illustrate a perspective view and a top view of a near-eye display in the form of a pair of smartglasses.
Meta's patent FIG. 5A above illustrates control of interaction with displayed content based on eye tracking in conjunction with finger gestures, hand gestures / wrist band gestures and even gaming controller input; FIG. 6 illustrates control of interaction with displayed content such as a 3D interactive map based on gaze focus in conjunction with secondary inputs.
For full details review Meta's U.S. patent application 20240248527.
Other Patents Relating to Smartglasses from Google This Past Week
20240249477: Fit Prediction based on Feature Detection in Image Data
WO2024155811: Determining a Change to a Frame for Smartglasses
20240241374: Control of Diffraction Artifacts in Augmented Reality Display using View Control Layer