Orion Watch: Meta Patents on Technologies Being Considered for Its Advanced Smartglasses – Part 1
Meta has achieved early success in the initial stages of the smart glasses market through a partnership with Ray-Ban. Meta is now ginning up excitement for the prototype of a much more advanced pair of glasses called Orion, a project nearly a decade in the making.
Zuckerberg’s reveal of Orion late last month has triggered an avalanche of enthusiasm in techland, and Patently Mobile will now cover Meta's smartglasses patents in a continuing series.
With smartglasses possibly being able to challenge the dominance of smartphones globally, the battle between Apple and Meta will intensify. Die-hard techies will be able to drill down into Meta's patents to explore many of the technologies that its engineers are working on. Below is the video review from The Verge.
On Friday, Meta’s Orion smartglasses prototype received another positive review, this one from Engadget’s Karissa Bell, who noted: “Meta isn’t just trying to create a more convenient form factor for mixed reality hobbyists and gamers. It’s offering a glimpse into how it views the future, and what our lives might look like when we’re no longer tethered to our phones. Meta still has a lot of work to do before that AR-enabled future can be a reality, but the prototype shows that much of that vision is closer than we think.” And lastly, you could also check out Marques Brownlee’s Friday review – another positive take.
Patently Mobile’s / Patently Apple’s new “Orion Watch” series follows Meta’s work on Orion via its patents, giving techies a glimpse of the technologies and projects that Meta’s engineering teams are working on. Of course, like any major project, some technologies and patents will make it into the final product, some will be saved for future versions of Orion, and some will simply die and be replaced by new breakthroughs over time.
This new series will obviously depend on the flow of Meta’s patents from the U.S. and European Patent Offices, and so we’ll post updates as they’re made available.
Some of the patents covered in this series will also cover features that will first appear in the Ray-Ban Meta glasses and then work their way into Orion over time. The first patent below is one such patent, covering ‘Live Translation,’ a feature first revealed at Meta’s Connect 2024 conference.
Meta Patent: Translation with Audio Spatialization
Meta’s patent generally relates to near real-time translation of a voice signal from a first language to a second language, and more specifically to spatializing the original and translated voices.
When people are traveling or communicating with someone who speaks a different language, they can employ a human translator. Alternatively, they may use a translation application (e.g., a mobile app) to try to capture what the other party is saying. An existing translation application may generate text and display it to the user, but it is often cumbersome to listen to people while reading text on a display.
The examples described in the patent allow users to hear multiple voices in different languages sequentially or at the same time, while spatializing them so that the voices do not interfere with each other. Users can adjust the spatialization based on their preferences. For example, a user who is more fluent in the foreign language may want to spatialize the original voice in the foreign language to be closer or louder, while a user who is less fluent in the foreign language may want to spatialize the translated voice to be closer or louder. The audio system may help users navigate a foreign country, communicate with a foreigner, listen to foreign radio, watch foreign movies, and/or learn a foreign language.
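To make that flow concrete, here is a minimal, self-contained Python sketch of the pipeline the patent describes: transcribe the incoming speech, translate it, synthesize the translated voice, and place the original and translated voices at different virtual positions and loudness levels based on the listener's fluency. The transcribe, translate, and synthesize helpers and the Placement class are placeholder stubs invented for this illustration, not anything from Meta's filing.

```python
# Sketch of the spatialized-translation flow described above; all helpers are stubs.
from dataclasses import dataclass


def transcribe(audio: bytes, language: str) -> str:
    return "stub transcript"              # stands in for a real speech-recognition engine


def translate(text: str, src: str, dst: str) -> str:
    return f"[{dst}] {text}"              # stands in for a real machine-translation engine


def synthesize(text: str, language: str) -> bytes:
    return text.encode()                  # stands in for a real text-to-speech engine


@dataclass
class Placement:
    azimuth_deg: float   # virtual direction relative to the listener
    gain: float          # relative loudness (1.0 = foreground voice)


def spatialized_translation(audio: bytes, src: str, dst: str, fluent_in_src: bool):
    """Return the original and translated voices paired with spatial placements."""
    translated_audio = synthesize(translate(transcribe(audio, src), src, dst), dst)

    # Per the patent's example: a listener fluent in the foreign language gets the
    # original voice closer/louder; a less fluent listener gets the translation
    # closer/louder.
    near = Placement(azimuth_deg=-20.0, gain=1.0)
    far = Placement(azimuth_deg=40.0, gain=0.5)
    if fluent_in_src:
        return (audio, near), (translated_audio, far)
    return (audio, far), (translated_audio, near)


print(spatialized_translation(b"hola, como estas", "es", "en", fluent_in_src=False))
```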
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, which may include, e.g., virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivatives thereof. Artificial reality content may include completely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (such as stereo video that produces a three-dimensional effect for the viewer).
Meta’s patent FIG. 1A below is a perspective view of a headset implemented as an eyewear device; FIG. 4 illustrates an example environment in which the audio system operates.
Meta’s patent FIG. 7 above is a flowchart of a method for generating and spatializing a first voice signal in a first language and a translated voice signal in a second language based on text. Meta’s patent was published in Europe on October 16, 2024 under number EP4447045.
Meta Patent: Input Methods for Smartglasses
Meta’s Orion-related patent covers systems and methods that allow a user to control one or more electronic devices communicatively coupled with a wrist-wearable device using one or more touchless inputs (e.g., in-air hand gestures).
By detecting touchless inputs, the wrist-wearable device allows a user to provide inputs in an easy-to-use, frictionless, and socially accepted manner. Further, the systems and methods disclosed herein allow for the coordination of multiple input methods from distinct sources. In particular, they coordinate touchless inputs with voice-based commands to accelerate commands performed at a wearable device and increase the overall efficiency of the devices. Through the use of the systems and methods disclosed herein, the user can reduce the number of inputs required to perform an action and accelerate the performance of one or more actions.
One example of a method of detecting touchless inputs is disclosed. The method includes detecting, by a wrist-wearable device worn by a user, an in-air hand gesture performed by the user. The wrist-wearable device is communicatively coupled with one or more electronic devices. The method includes, in response to a determination that the in-air gesture is associated with a control command, (i) determining an electronic device of the one or more electronic devices to perform the control command and (ii) providing instructions to the electronic device selected to perform the control command. The instructions cause the electronic device to perform the control command. The method further includes providing an indication via the wrist-wearable device and/or the one or more electronic devices that the control command was performed.
In some embodiments, the method is performed at a wrist-wearable device. In some embodiments, the method is performed by a system including the head-wearable device and the wrist-wearable device. In some embodiments, a non-transitory, computer-readable storage medium includes instructions that, when executed by a wrist-wearable device, cause the wrist-wearable device to perform or cause performance of the method.
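As a rough illustration of that dispatch flow, the Python sketch below maps a detected in-air gesture to a control command, determines which coupled device should perform it, provides the instruction, and returns an indication that the command was carried out. The gesture names, commands, and Device class are invented for this example and are not drawn from the patent.

```python
# Illustrative sketch of the gesture-to-command dispatch described above.
from dataclasses import dataclass, field


@dataclass
class Device:
    name: str
    supported_commands: set
    performed: list = field(default_factory=list)

    def perform(self, command: str) -> None:
        self.performed.append(command)    # stand-in for real device control


GESTURE_TO_COMMAND = {                     # invented mapping for this sketch
    "pinch": "play_pause",
    "wrist_flick": "next_track",
    "double_tap": "capture_photo",
}


def handle_in_air_gesture(gesture: str, coupled_devices: list) -> str:
    command = GESTURE_TO_COMMAND.get(gesture)
    if command is None:
        return "gesture not associated with a control command"

    # (i) determine which coupled electronic device should perform the command
    target = next((d for d in coupled_devices if command in d.supported_commands), None)
    if target is None:
        return f"no coupled device can perform '{command}'"

    # (ii) provide instructions to the selected device, which performs the command
    target.perform(command)

    # provide an indication that the control command was performed
    return f"{target.name} performed '{command}'"


glasses = Device("smartglasses", {"capture_photo"})
phone = Device("phone", {"play_pause", "next_track"})
print(handle_in_air_gesture("double_tap", [glasses, phone]))
```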
One example of a method includes detecting, by a first wearable device worn by a user, a first touchless input. The first wearable device is communicatively coupled with at least a second wearable device worn by the user. The method includes, in response to a determination that the first touchless input gesture is associated with a first control command to be performed at the first wearable device and/or the second wearable device, causing the first wearable device and/or the second wearable device to perform the first control command.
The method includes detecting, by the second wearable device, a second touchless input and, in response to a determination that the second touchless input gesture is associated with a second control command to be performed at a respective wearable device performing the first control command, causing the respective wearable device to perform the second control command.
In some embodiments, the method is performed at a wrist-wearable device. In some embodiments, the method is performed at a head-wearable device. In some embodiments, the method is performed by a system including the head-wearable device and the wrist-wearable device. In some embodiments, a non-transitory, computer-readable storage medium includes instructions that, when executed by a wrist-wearable device, cause the wrist-wearable device to perform or cause performance of the method.
In some embodiments, a non-transitory, computer-readable storage medium includes instructions that, when executed by a head-wearable device, cause the head-wearable device to perform or cause performance of the method.
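The two-device coordination described above can be pictured with a similarly hypothetical sketch: the first touchless input starts a command on one wearable, and a follow-up input detected by the other wearable is routed to whichever device is performing that first command. Device roles and command names are again purely illustrative.

```python
# Hypothetical sketch of two-wearable coordination: the follow-up command goes
# to whichever device is performing the first command, not to the detector.
class Wearable:
    def __init__(self, name: str):
        self.name = name
        self.active_command = None

    def perform(self, command: str) -> None:
        self.active_command = command
        print(f"{self.name}: performing '{command}'")


def route_followup(devices: list, command: str) -> None:
    # Route the second control command to the wearable currently performing
    # the first control command.
    performer = next(d for d in devices if d.active_command is not None)
    performer.perform(command)


wrist = Wearable("wrist-wearable")
head = Wearable("head-wearable")

# First touchless input, detected at the wrist, starts playback on the wrist device.
wrist.perform("start_playback")

# Second touchless input, detected at the head-wearable, adjusts volume on the
# device performing the first command (the wrist-wearable here).
route_followup([wrist, head], "volume_up")
```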
The features and advantages described in the specification are not necessarily all inclusive and, in particular, certain additional features and advantages will be apparent to one of ordinary skill in the art in view of the drawings, specification, and claims. Moreover, it should be noted that the language used in the specification has been principally selected for readability and instructional purposes.
Meta’s patent FIGS. 1A-1E below illustrate example inputs provided by a user via a hand gesture detected by a worn wrist-wearable device.
Meta’s patent FIGS. 3A-B below illustrate different in-air hand gestures performed by a wearer of a wrist-wearable device; FIG. 10A below presents examples of wrist-wearable devices.
Below is Meta’s Orion band that was illustrated in Brownlee’s review presented earlier. In 2021, Patently Apple covered one of Meta’s first patents on this band here.
Meta’s patent was published in the U.S. and Europe on October 10, 2024 under number US20240338171.