Meta Ray-Ban Smart Glasses Begin Multimodal AI Feature Testing

2023-12-13

Put on the Ray-Ban smart glasses and say "Hey Meta" to summon a virtual assistant that can see and hear your surroundings.

Meta is finally letting people try the highly anticipated AI features of its Meta Ray-Ban smart glasses, albeit as an early-access test. Today, the company announced the rollout of multimodal AI capabilities that let its AI assistant describe what it sees and hears through the glasses' camera and microphones.

Mark Zuckerberg showcased this update in a short video on Instagram, where he had the glasses suggest some pants that would go well with the shirt he was holding.

The glasses suggested a few pairs of pants based on its description of the shirt. Zuckerberg also had the glasses' AI assistant translate text and describe images.

Zuckerberg first previewed these multimodal AI features for the Ray-Ban glasses in a September interview, saying people would "ask different questions to the Meta AI assistant at different times of the day," a nod to the assistant's ability to answer questions about what the wearer is looking at or where they are.

In a video featuring Chief Technology Officer Andrew Bosworth, the AI assistant accurately described an illuminated, California-shaped wall sculpture. Bosworth also walked through other features, such as asking the assistant to help tag the photos you take or to translate and summarize text, capabilities already common in AI products from Microsoft and Google.