Ray-Ban Meta Smart Glasses are getting even more AI features, like live language translations and Meta AI for live video

While there are no new Ray-Ban Meta Smart Glasses, the existing specs are getting new features, including live language translations and Meta AI for video, via free updates.

The Ray-Ban Meta Smart Glasses might be the most popular smart glasses around, and that’s likely thanks to their feature set and familiar design. Meta partnered with Ray-Ban to bring the iconic Wayfarers into the technological age with slightly thicker frames, two cameras, speakers, microphones, and connectivity.

These smart glasses started out as a unique way to capture photos or videos – at times even more ‘in the moment’, since you don’t need to take your phone out and launch the camera. In recent months, Meta has infused them with Meta AI, letting you look at something, say, “Hey Meta, what’s this?” and have the glasses look at it, analyze it, and provide an answer. It’s pretty neat.

Now, though, at Meta Connect 2024, the team working on the Smart Glasses wants to make these even smarter – and if you guessed that they’re doing that with AI, you’d be correct. 

Language translations and visual reminders

Kicking things off with what might be the most helpful new feature, the Ray-Ban Meta Smart Glasses will get live language translation later this year. Similar to what Samsung accomplished with the Galaxy Buds 2 Pro or Google with the Pixel Buds Pro 2, the Ray-Ban Metas will be able to translate languages near-live, initially between English and Spanish, Italian, and French.

This could prove pretty helpful, and feel more natural than attempting the same with earbuds in, since it’s baked into the smart glasses – which you might already wear daily if you’ve opted for prescription lenses.

Furthermore, beyond asking Meta to set a reminder verbally, you can now set reminders based on things you see – and that the glasses therefore see too. That could be the moment you’re getting milk out of the fridge and realize you’re almost out, or a package left near your front door that you need to remember to take with you. This feature should roll out sooner rather than later.

Similarly, you’ll now be able to scan QR codes for events, phone numbers, and even full contact information. If a QR code is visible, you’ll be able to ask Meta via the Ray-Ban Meta Smart Glasses to scan it – we imagine the information will then appear in the Android or iOS companion app.

An ambitious video step

Likely the most ambitious forthcoming feature, also set to arrive later this year, is Meta AI for video: Meta AI will be able to view what you’re looking at in real time, not just a single snapshot, and provide clarity or answer questions. This could be helpful for navigating a city, cooking a meal, working through a math problem, or finishing a Lego set.

This is a big step, and it may raise some privacy concerns, since a live view from your glasses is being processed as you go. You’ll also need the Ray-Ban Meta Smart Glasses connected to the internet via an iPhone or Android phone, as the footage has to be processed in real time.

Still, it gives us an idea of where Meta is headed with the smart glasses category, and it’s great to see the company continuing to roll out new features. That’s the good news here – these updates don’t, so far as we know, require new hardware. They’re set to arrive as over-the-air updates for folks who already own the Ray-Ban Meta Smart Glasses, or who pick up a pair.

Another update on the way is integration with Amazon Music, Audible, iHeart, and Spotify, giving you easier hands-free access to your favorite songs, artists, podcasts, and audiobooks. You’ll also see new Transitions lens options arriving from EssilorLuxottica, the eyewear giant behind brands ranging from Dolce & Gabbana to Oakley.

So, if you haven’t loved the available looks enough to get a pair yet, or want to freshen yours up, it’ll be a good time to consider them again once those hit the scene. We’ll be going hands-on with these new features, from language translation to Meta AI for video, as soon as we can, so stay tuned to TechRadar.
