META is upgrading its AI smart glasses to let them "see" what's going on around you – and then tell you about it.
The Ray-Ban Meta smart glasses already have an artificial intelligence assistant built-in.
But currently, they're only able to answer generic questions that anyone could ask from anywhere – such as what the weather is like in New York, or how tall the Empire State Building is.
The smart specs were announced back in September with a promise that they would one day be able to talk about what you're seeing.
Now Meta is making good on that promise with an Early Access trial for some users to check out the new feature.
Mark Zuckerberg even showed off the feature in an Instagram video.
The billionaire held up a shirt from his closet and asked the Meta AI glasses to recommend a pair of pants – based entirely on what the glasses could see.
He said: "Hey Meta, look and tell me what pants to wear with the shirt."
The Meta AI replied: "Based on the image, it appears to be a striped shirt.
"A pair of dark washed jeans or solid-color trousers would complement this shirt well."
The feature is a major upgrade to smart glasses, and sounds like something out of science fiction.
But Meta says it's ready to release the product to the public – albeit only partially.
Meta explained: "Today we’re introducing new updates to make the glasses smarter and more helpful than ever before.
"We’re launching an early access program for Ray-Ban Meta smart glasses customers to try out and provide feedback on upcoming features ahead of their release.
"Starting today, customers who have opted in to the program will have access to a new test of multimodal AI-powered capabilities.
"You won’t just be able to speak with your glasses — the glasses will be able to understand what you’re seeing using the built-in camera.
"You can ask Meta AI for help writing a caption for a photo taken during a hike or you can ask Meta AI to describe an object you’re holding."
How to get access to Meta AI's new vision trick
First of all, you'll need a pair of Ray-Ban Meta smart glasses, which currently start at $299.
You'll also need to be in the US, as the test is strictly limited to that country.
Then you can join the Early Access program.
Go into the Meta View app and tap the cogwheel to enter Settings.
Now swipe down and tap Early Access, then tap Join Early Access – or toggle the setting on.
Next you'll need to turn the feature on.
Head back to the same Early Access menu in Settings, then tap Look And Ask With Meta AI to switch it on.
You might be prompted to enable a few other settings that are needed for it to work.
How does the feature work?
To use the feature, you'll want to say "Hey Meta, look and..." then ask a question about what you're seeing.
Meta suggested trying out some of the following options:
- "Describe this landscape in an inspiring way."
- “Write a funny Instagram caption about this dog.”
- "What can I make with these ingredients?”
- “What can I pair with this wine?”
- "What is this plant?”
- “How much water do these flowers need?”
- "Translate this text into English.”
- “What does this sign say?”
Meta explained: "When you ask Meta AI questions about what you’re looking at, your glasses will send a photo to Meta’s cloud to be processed with AI.
"After processing, Meta AI will deliver an audio response on your glasses, and you will be able to review your request, the image, and the response in Your requests."
Importantly, you won't be able to access personal info about people you're seeing.
So thankfully, you can't look at another person and get their name using Meta AI.