CEO Mark Zuckerberg unveiled a number of new AI-powered capabilities for the company’s Ray-Ban smart glasses on Wednesday at the Meta Connect event.

The addition of real-time translation through the glasses’ speakers is the most intriguing feature of the lot.

Meta explains:

You’ll soon be able to translate speech in real time with your glasses. Through the glasses’ open-ear speakers, you can hear what others are saying in English when they speak Spanish, French, or Italian. This is not only useful for travel; it should also help break down language barriers and foster interpersonal connections. To make the feature even more helpful, we plan to add support for additional languages in the future.


The companies have not yet disclosed a release date for this particular AI feature, but depending on how well it is executed, it could prove an immensely useful addition to the livestreaming glasses.


Live translation has long been considered a holy grail of consumer electronics for startups and well-established companies alike. Notably, Google unveiled a concept pair of glasses with a heads-up display that could translate text in real time, but it never progressed past the prototype phase.

Although Meta has not yet confirmed which languages will be available at launch, the statement above suggests that support will initially be limited to translating the Romance languages Spanish, French, and Italian into English.
