Meta announced that Ray-Ban Meta Glasses now have Live Translation, first introduced back in September, along with Shazam support
During Meta's Connect developer conference in September, the company revealed its next-gen 'Orion' AR glasses and announced that new features, such as Live Translation, were coming to its Ray-Ban Meta AI glasses, as noted in our cover graphic above.
Yesterday, Meta announced that members of its Early Access Program are about to tap into two new superpowers that the company first announced at Connect 2024.
The first is live AI, which adds video to Meta AI on your glasses. During a live AI session, Meta AI can see what you see continuously and converse with you more naturally than ever before. Get real-time, hands-free help and inspiration with everyday activities like meal prep, gardening, or exploring a new neighborhood. You can ask questions without saying “Hey Meta,” reference things you discussed earlier in the session, and interrupt anytime to ask follow-up questions or change topics. Eventually live AI will, at the right moment, give useful suggestions even before you ask.
The second is live translation, which Meta Founder & CEO Mark Zuckerberg demoed live onstage at Connect this year. Through this new feature, your glasses will be able to translate speech in real time between English and either Spanish, French, or Italian. When you’re talking to someone speaking one of those three languages, you’ll hear what they say in English through the glasses’ open-ear speakers or view it as a transcript on your phone, and vice versa. Not only is this great for traveling, it should help break down language barriers and bring people closer together.
The third addition is Shazam support, which will be available to customers in the U.S. and Canada.