Meta’s Ray-Ban smart glasses are receiving several impressive AI-powered features. On the agenda: a real-time assistant, instant translation, and Shazam integration.
After smartphones, tablets, wireless earbuds, watches, and connected rings, the next mobile product ready to invade our daily lives could well be connected eyewear, also known as smart glasses. Many digital giants have entered the market, including Google, Lenovo, Meta, and Oppo. Meta’s Ray-Ban Meta glasses (successors to the Ray-Ban Stories), released in October 2023, have been a great success worldwide, including in France.
The two companies are gradually improving their product, and notably began integrating artificial intelligence last April. Good timing: the latest update (v11) adds several very interesting functions, as Meta announced in a blog post. On the agenda: an AI capable of continuously analyzing the user’s environment, real-time translation, and Shazam integration.
Ray-Ban Meta: Live AI, instant translation and Shazam
The first function is called Live AI. It lets Meta’s artificial intelligence “see what you see all the time and converse with you more naturally than ever,” without having to repeat the “Hey Meta” command. For example, you can ask for a recipe based on the ingredients in front of you, get gardening advice, or request information about a neighborhood. For the moment, the feature is limited to 30-minute sessions. “Eventually, live AI will give you helpful suggestions at the right time, before you even ask the question,” Meta promises.
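Meta has not published an API for Live AI, but conceptually the feature behaves like a session-bounded multimodal assistant loop. Here is a minimal Python sketch of that idea; every function in it (capture_frame, listen_for_utterance, multimodal_answer) is a hypothetical placeholder rather than Meta’s actual code, and the only detail taken from the announcement is the 30-minute session cap.

```python
import time

SESSION_LIMIT_SECONDS = 30 * 60  # Live AI sessions are currently capped at 30 minutes


def capture_frame() -> bytes:
    """Hypothetical placeholder: grab the current frame from the glasses' camera."""
    return b"<jpeg bytes>"


def listen_for_utterance() -> str | None:
    """Hypothetical placeholder: return the user's speech as text, or None.

    Inside a session there is no wake word: any speech is treated as a query.
    """
    return None


def multimodal_answer(frame: bytes, utterance: str, history: list) -> str:
    """Hypothetical placeholder: ask a vision-language model about the scene."""
    return f"(answer about: {utterance})"


def live_ai_session() -> None:
    start = time.monotonic()
    history: list[tuple[str, str]] = []
    while time.monotonic() - start < SESSION_LIMIT_SECONDS:
        frame = capture_frame()             # the assistant "sees what you see"
        utterance = listen_for_utterance()
        if utterance is None:
            time.sleep(0.5)                 # nothing was said; keep watching
            continue
        reply = multimodal_answer(frame, utterance, history)
        history.append((utterance, reply))  # context makes follow-ups natural
        print(reply)
```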
The second addition is quite remarkable: real-time translation of conversations. Currently, only English, Spanish, French, and Italian are supported, though Meta will no doubt add more languages later. “When you talk to someone who speaks one of these three languages, you hear what they say in English through the speakers in the glasses or as a transcription on your phone, and vice versa,” the company explains. Note that, to function, the system requires downloading language packs in advance and configuring both the user’s language and the interlocutor’s.
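To make that flow concrete, here is a minimal Python sketch of such a speak-hear-translate loop. It is an assumption about the architecture, not Meta’s implementation: ensure_language_pack, transcribe, translate, and speak_or_display are all hypothetical stand-ins, and only the four launch languages and the mandatory pack download come from the announcement.

```python
SUPPORTED = {"en", "es", "fr", "it"}  # the four languages supported at launch


def ensure_language_pack(lang: str) -> None:
    """Stand-in for the mandatory prior download of a language pack."""
    if lang not in SUPPORTED:
        raise ValueError(f"Language {lang!r} is not supported yet")


def transcribe(audio: bytes, lang: str) -> str:
    """Hypothetical placeholder: speech-to-text in the speaker's language."""
    return "Bonjour, comment allez-vous ?"


def translate(text: str, src: str, dst: str) -> str:
    """Hypothetical placeholder: machine translation between the two languages."""
    return "Hello, how are you?"


def speak_or_display(text: str) -> None:
    """Hypothetical placeholder: glasses' speakers, or a transcript on the phone."""
    print(text)


def live_translate(audio: bytes, my_lang: str = "en", their_lang: str = "fr") -> None:
    # Both languages must be configured and their packs downloaded beforehand.
    for lang in (my_lang, their_lang):
        ensure_language_pack(lang)
    heard = transcribe(audio, their_lang)
    speak_or_display(translate(heard, their_lang, my_lang))


live_translate(b"<microphone audio>", my_lang="en", their_lang="fr")
```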
Finally, this update marks Shazam’s arrival on the glasses: a simple voice command is enough to identify a song playing around you. For now, the Live AI and real-time translation features are only available to members of Meta’s Early Access program, which is itself only accessible in the United States and Canada; they should arrive in a stable release in 2025. The Shazam function is already available to everyone, but again only in those two countries.
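As a rough illustration of what happens behind such a command, here is a toy Python sketch. The fingerprint function is a deliberately naive stand-in (real systems like Shazam hash pairs of spectrogram peaks, not raw bytes), and record_ambient_audio and lookup are hypothetical placeholders rather than any real API.

```python
import hashlib


def record_ambient_audio(seconds: int = 5) -> bytes:
    """Hypothetical placeholder: capture a short clip via the glasses' mics."""
    return b"<pcm samples>"


def fingerprint(clip: bytes) -> str:
    """Toy stand-in: real fingerprinting hashes spectrogram peak pairs so a
    noisy partial recording still matches; a raw byte hash would not."""
    return hashlib.sha256(clip).hexdigest()[:16]


def lookup(fp: str) -> dict:
    """Hypothetical placeholder: query a song-recognition database."""
    return {"title": "<unknown title>", "artist": "<unknown artist>"}


def on_voice_command(command: str) -> None:
    # The announced trigger is simply asking the assistant what is playing.
    if "what is this song" in command.lower():
        match = lookup(fingerprint(record_ambient_audio()))
        print(f"{match['title']} by {match['artist']}")


on_voice_command("Hey Meta, what is this song?")
```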
One can only hope these features will make their way to Europe, but nothing is certain. These new functions require collecting an enormous amount of data, and the European authorities are unlikely to look on that very favorably…