
Meta Ray-Ban Glasses Now Help You Remember Parking Spots and Send Voice Messages



Following last week's Meta Connect event, Meta is rolling out a series of new AI-powered features for its Meta Ray-Ban glasses. The updates, delivered through the Meta View app, start going live today, with more arriving in the weeks ahead. Mark Zuckerberg introduced the improvements as a way to make the glasses' AI more capable and more appealing to wearers. The focus is on more natural interaction, from scanning QR codes to sending voice messages and even remembering helpful details like where you parked.


The most notable feature rolling out today is the glasses' ability to "remember" things for the user. In a recent Instagram post, Zuckerberg demonstrated this by asking his glasses where he parked, and the glasses replied with the exact spot. This "photographic memory" feature could be useful for everyday tasks like recalling grocery lists or remembering event details. The update also improves how the AI is invoked. Now, users need to say "Hey Meta" only once to start a conversation, and the AI will remain active for follow-up commands without needing to be reactivated.


While the initial updates are available today, some highly anticipated features, such as live translation and the "Be My Eyes" partnership for visually impaired users, will arrive later. Live translation will allow users to hold real-time conversations in languages like Spanish, French, and Italian, with the glasses translating on the go. The "Be My Eyes" feature will let users stream video to a volunteer who can assist with tasks like reading signs or navigating unfamiliar areas.


Meta's move to integrate AI more deeply into its Ray-Ban glasses is part of a broader trend of enhancing wearable technology. With the new updates, users can expect a more natural, multimodal AI experience, where the glasses are always on and capable of recognizing objects, scanning QR codes, and offering help with daily tasks. These glasses, starting at $300, are set to compete with similar offerings from companies like Google and Apple.
