Meta recently announced that its Ray-Ban smart glasses will gain three new features: real-time AI, live translation, and Shazam music recognition. For now, the real-time AI and translation features are available exclusively to members of Meta's early access program, while the Shazam feature is available to all users in the United States and Canada.
Real-time AI and live translation made their debut earlier this year at the Meta Connect 2024 conference. Real-time AI lets users converse naturally with Meta's AI assistant while it continuously views their surroundings through the glasses' camera. In the produce section of a supermarket, for instance, users can ask the assistant to suggest recipes based on the ingredients in view. Meta says the feature can run for roughly 30 minutes on a full charge.
The live translation feature supports real-time voice translation between English and Spanish, French, or Italian. Users can choose to hear the translations through the glasses or view the translated text on their smartphones. Before use, the required language pairs must be downloaded, and users must specify their own language and that of their conversation partner.
The Shazam feature is straightforward to use—users simply prompt Meta AI when they hear a song, and it will identify and provide information about the track.
If users do not yet see these features, they should check that their glasses are running software version v11 and that the Meta View app is updated to version v196. Users who are not yet part of the early access program can apply to join through Meta's official website.
This update comes at a time when major technology companies are positioning AI assistants as the core functionality of smart glasses. Last week, Google announced the Android XR operating system, highlighting its Gemini AI assistant as a key application for smart glasses. Meanwhile, Meta's Chief Technology Officer, Andrew Bosworth, wrote in a blog post that 2024 is a pivotal year for significant advances in AI glasses. He also suggested that smart glasses could become the best form of "genuinely AI-centric devices" and the first hardware category "fully defined by AI from the ground up."