Meta has unveiled a suite of new features for its Ray-Ban smart glasses, expanding their functionality with live AI assistance, real-time translation, and Shazam integration. These additions aim to redefine smart glasses as not just a fashion statement but a practical tool for everyday life.
Live AI interaction
The live AI feature, exclusive to members of Meta’s Early Access program, allows for a seamless conversation with the AI assistant by continuously streaming what the wearer sees through the glasses. Imagine you’re in a supermarket contemplating dinner options; with this update, you could simply say, “Hey Meta, what can I make with these?” while looking at ingredients, and the AI could suggest recipes on the spot. This feature, which Meta claims can operate for about 30 minutes on a full charge, embodies the vision of AI as an omnipresent helper in our daily routines.
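Meta hasn’t published a developer API for this pipeline, but the basic loop is easy to picture: the glasses keep capturing what you see and hear, and the assistant answers questions against that running context. The Python sketch below is purely illustrative; every class and method name in it is hypothetical, not Meta’s actual software.

```python
# Hypothetical sketch of a "live AI" session: the glasses continuously
# stream audio and video, and a multimodal assistant answers in context.
# All names here are made up for illustration only.
import time

class Glasses:
    """Stand-in for the glasses' camera and microphone (hypothetical)."""
    def capture_frame(self) -> bytes:
        return b"<jpeg bytes>"  # what the wearer is currently looking at
    def listen(self) -> str:
        return "Hey Meta, what can I make with these?"  # wake word + query

class Assistant:
    """Stand-in for a multimodal model endpoint (hypothetical)."""
    def answer(self, frame: bytes, query: str, history: list) -> str:
        # A real system would send the frame and query to a vision-language
        # model, keeping the running conversation for follow-up questions.
        return f"Suggesting recipes based on what I can see ({len(frame)} bytes)."

def live_session(glasses, assistant, duration_s=30 * 60):
    """Run a continuous session, capped at roughly the ~30 minutes of
    battery life Meta claims for this feature."""
    history = []
    start = time.time()
    while time.time() - start < duration_s:
        query = glasses.listen()
        frame = glasses.capture_frame()
        reply = assistant.answer(frame, query, history)
        history.append((query, reply))
        print(reply)
        break  # demo: one exchange; a real loop would keep listening

live_session(Glasses(), Assistant())
```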
Real-time translation
Another notable addition is real-time language translation. At launch, the feature translates speech between English and three other languages: Spanish, French, and Italian. You can hear translations directly through the glasses or view them as text on a connected smartphone. To use it, you must download the specific language pairs in advance and set both your own language and that of your conversation partner. This could be particularly useful for travelers or anyone needing to bridge a language barrier without reaching for a phone or translation app.
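The requirement to download language pairs ahead of time suggests the translation models run on the device itself. Here is a minimal, purely hypothetical sketch of that setup; none of these names reflect Meta’s real software, only the flow the article describes.

```python
# Hypothetical sketch of offline translation with downloadable language
# packs: fetch a pair once, then translate between the two languages.

SUPPORTED = {"en", "es", "fr", "it"}  # English plus the three launch languages

class Translator:
    def __init__(self):
        self.packs = set()

    def download_pair(self, lang_a: str, lang_b: str):
        """Fetch an offline model for one language pair (done in advance)."""
        if not {lang_a, lang_b} <= SUPPORTED:
            raise ValueError(f"Unsupported pair: {lang_a}-{lang_b}")
        self.packs.add(frozenset((lang_a, lang_b)))

    def translate(self, text: str, source: str, target: str) -> str:
        if frozenset((source, target)) not in self.packs:
            raise RuntimeError("Download this language pair first.")
        # Placeholder: a real device would run an on-device speech model.
        return f"[{source}->{target}] {text}"

t = Translator()
t.download_pair("en", "es")  # done once, before the conversation
print(t.translate("Where is the train station?", "en", "es"))
```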
Shazam integration
Perhaps the most broadly available update is the integration of Shazam, the well-known song identification service, now rolling out to all Ray-Ban Meta glasses users in the US and Canada. With a simple voice command, “Hey Meta, what’s this song?”, the glasses can identify music playing nearby, much as Shazam does on a smartphone. Meta CEO Mark Zuckerberg demonstrated the feature in an Instagram reel, showing how seamlessly it fits into the glasses’ everyday use.
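Song identification services of this kind generally work by fingerprinting a short snippet of microphone audio and matching it against a catalog of known tracks. The toy sketch below illustrates only that general idea; it is not Shazam’s actual algorithm (which matches constellations of spectrogram peaks, making it robust to noise) nor any real Meta interface.

```python
# Toy illustration of audio fingerprinting and lookup, the general idea
# behind "Hey Meta, what's this song?". Purely hypothetical code.
import hashlib

CATALOG = {}  # fingerprint -> (title, artist)

def fingerprint(samples: bytes) -> str:
    """Toy stand-in: hash the raw audio bytes. Real systems hash
    spectrogram peak patterns so noisy snippets still match."""
    return hashlib.sha256(samples).hexdigest()[:16]

def register(samples: bytes, title: str, artist: str):
    CATALOG[fingerprint(samples)] = (title, artist)

def identify(mic_snippet: bytes) -> str:
    match = CATALOG.get(fingerprint(mic_snippet))
    return f"{match[0]} by {match[1]}" if match else "No match found."

register(b"<song audio>", "Example Song", "Example Artist")
print(identify(b"<song audio>"))  # -> "Example Song by Example Artist"
```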
Software and access
To access these new features, your Ray-Ban Meta glasses need to be running software v11, and the Meta View app must be updated to v196. The live AI and translation features additionally require joining the Early Access program, which you can do through Meta’s dedicated website.
This update arrives at a time when tech giants are intensifying their focus on AI as the cornerstone of smart wearables. Google, for instance, recently unveiled Android XR, a new operating system for smart glasses, highlighting its Gemini AI as a pivotal element. This trend underscores a broader industry push toward making AI the central feature of wearable tech.
In a recent blog post, Meta CTO Andrew Bosworth reflected on these developments, stating, “2024 was the year AI glasses hit their stride.” He posits that smart glasses could be the ultimate form factor for an “AI-native device,” suggesting that this category might be the first to be genuinely shaped by AI from inception.
The rollout of these features not only enhances the functionality of Meta’s Ray-Ban smart glasses but also signals a significant step forward in how we interact with technology in our immediate environment.