At Meta Connect 2024 on Wednesday, Meta CEO Mark Zuckerberg announced updates to the company’s Ray-Ban Meta smart glasses, reinforcing Meta’s vision of smart glasses as the next essential consumer device.
The upcoming enhancements, set to roll out later this year, include advanced AI capabilities and features familiar to smartphone users.
Among the new offerings is real-time AI video processing, allowing users to interact with their surroundings more intuitively.
Users will soon be able to ask the glasses questions about what they see, with Meta AI providing verbal responses in real time. This upgrade promises a more natural experience compared to the current functionality, which only allows for image capture and description.
During a live demonstration, participants asked the glasses questions about various scenes, such as meals they were preparing or sights in the city, showcasing the potential of real-time interaction.
While similar capabilities have been showcased by competitors like Google and OpenAI, Meta aims to be the first to implement them in a consumer product.
Zuckerberg also highlighted a new live language translation feature.
He said the feature would enable English-speaking users to converse with individuals speaking French, Italian, or Spanish, with the glasses translating the dialogue on the fly. Additional languages are expected to be added in the future.
Other features announced include reminders that can be set by simply looking at an object, such as a jacket, enabling users to save images for later sharing. The glasses will also integrate with popular streaming services like Amazon Music, Audible, and iHeartRadio, making it easier for users to enjoy music through the built-in speakers.

Additionally, the Ray-Ban Meta glasses will include QR code scanning, allowing users to capture codes and open them on their phones without any extra steps. New Transitions lenses, which adjust to changes in UV light, will also be available, adapting to environmental lighting.

These updates reflect Meta’s ongoing commitment to pushing the boundaries of augmented reality technology and enhancing the everyday lives of its users.