The Ray-Ban Meta Smart Glasses already worked well as a head-mounted camera and a pair of open-ear headphones, but now Meta is adding access to live AI without the need for a wake word, live translation between several different languages, and Shazam integration to identify music.
Meta first previewed most of these features in September. Live AI lets you start a "live session" with Meta AI, which gives the assistant access to everything you're seeing and allows you to ask questions without saying "Hey Meta." If you need your hands free to cook or fix something, Live AI should keep your smart glasses useful even while you concentrate on what you're doing.
Live translation allows your smart glasses to translate between English and French, Italian, or Spanish. If live translation is enabled and someone is speaking to you in one of the selected languages, you'll hear what they're saying in English through the smart glasses' speakers, or see it as a transcript in the Meta View app. You'll need to download specific models to translate between each language pair, and live translation must be enabled before it can actually act as a translator, but it feels more natural than holding out your phone to translate something.
With Shazam integration, your Meta smart glasses will also be able to identify the songs playing around you. A simple "Hey Meta, what's this song?" will have the smart glasses' microphones identify whatever you're listening to, just like using Shazam on your smartphone.
All three updates move toward Meta's ultimate goal of a pair of augmented reality glasses that can replace your smartphone. Combining AI with VR and AR seems to be an idea that many tech giants are circling. Google's newest XR platform, Android XR, is built around the idea that a generative AI like Gemini could be the glue that makes VR or AR compelling. We're still years away from any company convincingly replacing your field of vision with holographic images, but in the meantime, smart glasses seem like a moderately useful stopgap.
All Ray-Ban Meta Smart Glasses owners will be able to enjoy Shazam integration as part of Meta's v11 update. For live translation and live AI, you must be part of Meta's Early Access Program, which you can join now.