Meta introduced new features for their Ray-Ban Meta smart glasses today while providing developers with a glimpse of their future AR glasses.
A report this morning from Reuters, prior to the Meta Connect event, reminded us that the head of Meta's metaverse-oriented Reality Labs division acknowledged in 2023 that an AR glasses device that they could viably bring to market was “still a few years away – a few, to put it lightly.”
Reuters further noted that Meta "is planning for the first generation of the AR glasses this year to be distributed only internally and to a select group of developers, with each device costing tens of thousands of dollars to produce, according to a source familiar with the project.
"Meta aims to ship its first commercial AR glasses to consumers in 2027, by which point technical breakthroughs should bring down the cost of production, the source said." Of course, "aiming" is one thing and delivering is another. After seeing the products in the Meta Connect keynote, Orion is a nice prototype that will take much more time to come to market in glasses that don't make you look like a techno-nerd or 'dork.'
Introducing Orion, Meta's First True Augmented Reality Glasses
Towards the end of the Meta Connect keynote, CEO Mark Zuckerberg rolled out Meta's deep research prototype and pumped it up as the world's first AR glasses. No. They'll be the first AR glasses when they actually ship to market. For now, it's all smoke and mirrors.
Meta notes that there are three primary reasons why AR glasses are key to unlocking the next great leap in human-oriented computing.
- They enable digital experiences that are unconstrained by the limits of a smartphone screen. With large holographic displays, you can use the physical world as your canvas, placing 2D and 3D content and experiences anywhere you want.
- They seamlessly integrate contextual AI that can sense and understand the world around you in order to anticipate and proactively address your needs.
- They’re lightweight and great for both indoor and outdoor use, and they let people see each other’s faces, eyes and expressions.
Meta calls that the north star the industry has been building towards: a product combining the convenience and immediacy of wearables with a large display, high-bandwidth input and contextualized AI in a form that people feel comfortable wearing in their daily lives.
The Evolution of Smart Glasses
Ray-Ban Meta glasses have demonstrated the power of giving people hands-free access to key parts of their digital lives from their physical ones. You can talk to a smart AI assistant, connect with friends and capture the moments that matter – all without ever having to pull out a phone.
Yet while Ray-Ban Meta opened up an entirely new category of display-less glasses super-charged by AI, the XR industry has long dreamt of true AR glasses – a product that combines the benefits of a large holographic display and personalized AI assistance in a comfortable, all-day wearable form factor. Orion rises to the challenge.
Groundbreaking AR Display in an Unparalleled Form
Meta has been hard at work for years to take the incredible spatial experiences afforded by VR and MR headsets and miniaturize the technology necessary to deliver those experiences in a pair of lightweight, stylish glasses.
Nailing the form factor, delivering holographic displays, developing compelling AR experiences, creating new human-computer interaction (HCI) paradigms – and doing it all in one cohesive product – is one of the most difficult challenges the industry has ever faced. It was so challenging that Meta thought they had less than a 10% chance of pulling it off successfully. Until now.
Orion is a feat of miniaturization – the components are packed down to a fraction of a millimeter. Dozens of innovations were required to get the design down to a contemporary form that you’d be comfortable wearing every day.
Orion has the largest field of view in the smallest AR glasses form to date. That field of view unlocks truly immersive use cases for Orion, from multitasking windows and big-screen entertainment to life-size holograms of people – all digital content that can seamlessly blend with your view of the physical world.
But what makes Orion unique is that it is unmistakably a pair of glasses in both look and feel – complete with transparent lenses. Unlike MR headsets or other AR glasses today, you can still see other people’s eyes and expressions, so you can be present and share the experience with the people around you.
Meta has their smart assistant, Meta AI, running on Orion. It understands what you’re looking at in the physical world and can help you with useful visualizations. So you can open up your refrigerator and ask for a recipe based on what’s inside. Or video call a friend while adjusting a digital family calendar as you wash the dishes.
Back to Earth: Real World Ray-Ban Meta Glasses
Of course, before Meta's Orion ever comes to market, the company will continue to evolve their Ray-Ban Meta glasses.
Meta noted that, first, they’re making it easier for you to have a conversation with Meta AI. Kick off your conversation with “Hey Meta” to ask your initial question, and then you can ask follow-up questions without saying “Hey Meta” again. And you no longer need to say “look and” to ask Meta AI questions about what you’re looking at.
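To picture how that interaction model might work under the hood, here is a minimal, hypothetical sketch – it is not Meta's code or any published API. It assumes a wake phrase that opens a short session window during which follow-up questions are accepted without repeating “Hey Meta”; the class name, wake phrase constant and window length are illustrative assumptions only.

```python
# Hypothetical sketch only - not Meta's implementation or API.
# Illustrates a wake-word session where follow-up questions are accepted
# without repeating the wake phrase until a short window expires.
import time

WAKE_WORD = "hey meta"        # assumed wake phrase
FOLLOW_UP_WINDOW_S = 8.0      # assumed follow-up window, in seconds


class VoiceSession:
    def __init__(self) -> None:
        self.last_interaction = float("-inf")

    def _session_active(self) -> bool:
        return (time.monotonic() - self.last_interaction) < FOLLOW_UP_WINDOW_S

    def handle_utterance(self, text: str) -> str | None:
        """Return a query to send to the assistant, or None to ignore the speech."""
        cleaned = text.strip()
        if cleaned.lower().startswith(WAKE_WORD):
            # Wake phrase heard: open (or refresh) the session and strip the phrase.
            self.last_interaction = time.monotonic()
            query = cleaned[len(WAKE_WORD):].lstrip(" ,")
            return query or None
        if self._session_active():
            # Follow-up: no wake phrase needed while the window is open.
            self.last_interaction = time.monotonic()
            return cleaned
        return None  # No open session and no wake phrase: ignore.


session = VoiceSession()
print(session.handle_utterance("Hey Meta, what bridge is this?"))  # opens the session
print(session.handle_utterance("When was it built?"))              # follow-up, no wake phrase
```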
Meta is adding the ability for your glasses to help you remember things. Next time you fly somewhere, you don’t have to sweat forgetting where you parked at the airport — your glasses can remember your spot in long-term parking for you. And you can use your voice to set a reminder to text your mom in three hours when you land safely.
You can now ask Meta AI to record and send voice messages on WhatsApp and Messenger while staying present. This comes in especially handy when your hands are full or when you can’t get to your phone easily to write out a text.
Meta is adding video to Meta AI, so you can get continuous real-time help. If you’re exploring a new city, you can ask Meta AI to tag along, and then ask it about landmarks you see as you walk or get ideas for what to see next — creating your own walking tour hands-free. Or, if you’re at the grocery store and trying to plan a meal, you can ask Meta AI to help you figure out what to make based on what you’re seeing as you walk down the aisles, and if that sauce you’re holding will pair well with that recipe it just suggested.
Soon, your glasses will be able to translate speech in real time. When you’re talking to someone speaking Spanish, French or Italian, you’ll hear what they say in English through the glasses’ open-ear speakers. Meta plans to add support for more languages in the future to make this feature even more useful.
Although it was a demo, "Live Translation" was impressive. Of course, some smartphones offer that today, but the demo made live translation via the glasses feel like a more natural way to use it. Time will tell if the feature pans out as presented.
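For readers curious about the basic plumbing, here is a rough, hypothetical sketch of a live-translation loop – again, not Meta's implementation. It shows only the general shape: identify a supported language, translate the segment to English, and route the result to the speakers. The stubbed dictionary stands in for real speech-recognition and machine-translation models.

```python
# Hypothetical sketch, not Meta's pipeline: the general shape of a
# live-translation loop that turns another speaker's Spanish, French or
# Italian into English audio.

SUPPORTED_LANGUAGES = {"es", "fr", "it"}  # Spanish, French, Italian per the announcement

# Stub translations keyed by phrase; a real system would run speech-recognition
# and machine-translation models here instead of a lookup table.
STUB_TRANSLATIONS = {
    "Hola, ¿cómo estás?": ("es", "Hello, how are you?"),
    "El museo abre a las nueve.": ("es", "The museum opens at nine."),
}


def detect_and_translate(segment: str) -> tuple[str, str] | None:
    """Return (language code, English text) for a recognized segment, else None."""
    return STUB_TRANSLATIONS.get(segment)


def speak(text: str) -> None:
    # Stand-in for routing audio to the glasses' open-ear speakers.
    print(f"[speaker] {text}")


def live_translate(segments: list[str]) -> None:
    for segment in segments:
        result = detect_and_translate(segment)
        if result and result[0] in SUPPORTED_LANGUAGES:
            speak(result[1])  # the listener hears the translation in English


live_translate(["Hola, ¿cómo estás?", "El museo abre a las nueve."])
```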
In addition, Meta is advancing their integrations with Spotify and Amazon Music, and adding new partnerships with Audible and iHeart. You can use your voice to search, discover and play content on the go. Ask to play by song, artist, album, or audiobook. And you can get more information about the content your glasses are playing (“Hey Meta, what album is this from?”).
Meta is also bringing EssilorLuxottica’s new range of Transitions GEN S lenses to the Ray-Ban Meta collection, giving you even more options to adapt quickly to all light conditions.