Meta’s Ray-Ban smart glasses are getting an exciting update in April 2024, as reported by The New York Times. They’ll soon have AI features that let you find out about things just by looking at them and asking a question.
Imagine you’re curious about the sugar content in a pack of gummies. You could simply say, “Hey Meta, look and tell me how much sugar is in this pack of gummies,” and in a few seconds, you’ll have your answer. The glasses will take a picture (you’ll hear a click), and then the built-in AI will analyze the image and tell you what you want to know.
This feature is super handy because you won’t need to pull out your phone to search for information or physically pick up items to read their labels. And it’s not just for food labels. You can ask about animals, plants, or even get recipes based on ingredients you’re looking at.
Think of it like having ChatGPT, but with eyes. To get information about what you’re seeing, you need to include the word “look” in your request, since that’s what triggers the camera. If you just say “Hey Meta” followed by a question, the glasses will answer without taking a photo.
Mark Zuckerberg shared a post on Instagram showing how it works.
All the photos taken and answers from the AI are saved in the Meta View app on your phone. This turns your smart glasses into a sort of visual note-taker, keeping a record of your questions and the AI’s answers so you can revisit them later.
Since this AI feature is still in early testing, it isn’t perfect yet. If you’re interested, you’ll need to sign up and wait for your turn to try it out.