Ray-Ban Meta Smart Glasses Upgraded With Live AI and Live Translation Features

Ray-Ban Meta smart glasses received two new artificial intelligence (AI) features on Monday. The first is Live AI, which adds real-time video processing capability to Meta AI and lets the chatbot continuously see the user's surroundings and answer questions about them. The second is live translation, which lets the AI translate speech in real time between supported languages. The latter was also demonstrated by CEO Mark Zuckerberg during Connect 2024. Both features are first being rolled out to members of Meta's Early Access Programme in Canada and the US.

Ray-Ban Meta Smart Glasses Get Two New AI Features

The tech giant says the two new AI features are part of the v11 software update for the Ray-Ban Meta smart glasses, which is now rolling out to eligible devices.

Live AI will let Meta AI access the smart glasses' cameras and continuously process the video feed in real time. This is similar to ChatGPT's Advanced Voice with Vision feature recently released by OpenAI. The company highlighted that, during a session, the AI can continuously see what the user sees and converse about it more naturally.

Users will be able to talk to Meta AI without saying the "Hey Meta" activation phrase. Additionally, users can ask follow-up questions and reference things discussed earlier in the session, according to the company.

They can also change the topic and return to previous ones fluidly. "Eventually Live AI will, at the right moment, give useful suggestions even before you ask," the post added.

Live translation offers real-time speech translation between English and Spanish, French, or Italian. So, if a user is talking to someone who speaks one of those three languages, Meta AI can translate their speech in real time and play the translated audio through the glasses' open-ear speakers. Users can also view the translation as a transcript on their smartphone.

The tech giant cautions that these new features may not always get things right, and that it will continue to take user feedback and improve its AI features. Currently, there is no word on when these features will be released to all users globally. Meta has yet to release any of these AI features in India.