Meta’s Ray-Ban glasses are getting smarter thanks to an update that introduces three features, including new AI functionality. In an announcement this week, Meta said the features are coming soon to its smart glasses. The first two are limited to Early Access program members (sign-up required), but the third is available to all US and Canadian users.
Live AI
First up is live AI. Meta’s glasses already have AI capabilities, but this update adds video support. Meta explains that during a live AI session, your glasses can see what you see and carry on a conversation more naturally than ever.
If you’re cooking, for example, you might ask, “How much salt did you say I needed for that stew recipe?” or “Do you see a substitute for butter, and how much would I need?” If you’re in a new neighborhood, you might ask, “What’s the story behind this landmark?” or “Was this landmark built at the same time as the last one?” Meta also offered the example of wearing the glasses in your garden and asking, “How often do these plants need to be watered?” or “Where’s a good place to put this plant?”
You can ask questions without using the “Hey Meta” introduction each time, reference something you discussed earlier, ask a follow-up question, or even change the topic. Meta says live AI will eventually be able to offer suggestions before you even ask.
Live translation
After teasing the feature earlier this year at Connect, Meta is rolling out live translation. Your glasses can translate speech in real time between English and Spanish, French, or Italian. If you’re listening to someone speak in one of those languages, you’ll hear an English translation through your glasses’ speakers or see it as a transcript on your phone.
According to an official support page, the first time you use this feature you’ll need to download the languages you want to translate and select a target language; unlike Google Translate, it doesn’t detect languages automatically.
Shazam integration
Finally, your Meta Ray-Ban glasses can now use Shazam to identify a song that’s playing nearby. Just ask, “Hey Meta, what is this song?” and you’ll get your answer hands-free.
Source: https://www.zdnet.com/article/your-meta-ray-ban-smart-glasses-just-got-a-massive-ai-upgrade/