Meta Ray-Bans can now ‘remember’ things and translate speech in real time for you

Real-world test awaits!

Meta unveiled a range of updates to the Ray-Ban Meta glasses, introducing new AI features to help you with everyday tasks. These include remembering where you parked, translating speech in real time, and answering questions about what you're seeing.

Interacting with Meta AI is also made easier. Simply start with “Hey Meta” for your initial question, and you can follow up without needing to say it again. Plus, you no longer have to say “look and” when asking Meta AI questions about what you're looking at.

The glasses are getting a memory boost too. No more stressing about where you parked at the airport—your glasses will remember for you. You can even set a voice reminder to text your mom in three hours when you land. Plus, you can now ask Meta AI to record and send voice messages on WhatsApp and Messenger, keeping you connected while staying hands-free and present.

A cool new update is the addition of video to Meta AI, providing continuous real-time assistance. Exploring a new city? You can have Meta AI tag along, ask it about landmarks, and get suggestions for what to see next—creating your own hands-free walking tour. At the grocery store, Meta AI can help plan meals based on what you're looking at, even letting you know if that sauce you're holding will pair well with the recipe it just suggested.

Soon, your glasses will be able to translate speech in real time. If you're talking to someone speaking Spanish, French, or Italian, you'll hear their words in English through the glasses' open-ear speakers. Perfect for travelling, Meta says this feature helps break down language barriers and brings people closer.

In addition to these software updates, Meta is launching new Transitions lenses with Ray-Ban's parent company, EssilorLuxottica. There are also new limited-edition clear frames that reveal all the tech inside.