Meta Glasses, what do people think?

By Moopie Curran, 23 October, 2025


Hi,
I met someone with the Meta Ray-Ban smart glasses the other day. She really likes them; she is low vision. I really want a pair of these for Christmas, but I want to know if anyone who is totally blind has them, and if so, what do you all think? Are they worth it? I know Envision has smart glasses, and Solos has glasses coming out, but they all cost more, a lot more, out of budget more... :) Are the Meta Ray-Ban glasses any good or not?
Thanks


Comments

By Brian on Thursday, November 27, 2025 - 17:05

When you are getting that error message, is it coming audibly from the Meta glasses, or showing up as a message on your iPhone via VoiceOver? Have you plugged the case in to let it charge for a bit, let's say 10-15 minutes?
Also, have you tried cycling the power off and then on from the glasses? If the 2nd Gen are like the 1st Gen, the power slide switch will be on the inside of the left arm, just before the hinge.

HTH.

By Dan Cook on Saturday, November 29, 2025 - 13:28

I love these! Are they perfect? Of course not; nothing will be. However, I’ve had so many wonderful experiences already just using them to identify items in my house, take hands-free phone calls, and do silly things like looking in a mirror. I can see these being invaluable, especially as I grow more proficient in using them and asking the right questions in terms of the AI features. The platform updates are nearly always adding significant new features as well, so I’m really excited to see where this goes in the coming months.

By Brian on Saturday, November 29, 2025 - 16:19

They are really good for hands-free control with audiobooks. Doing things like navigating to the previous or next chapter, or skipping back and forth, is very easy with just your voice while using the Meta smart glasses. And you don't even have to put your Audible app into hands-free mode to do this. :-)
It's a small thing, but I am an audiobook junkie, so I absolutely love the Audible integration with the Meta glasses.

By Dan Cook on Saturday, November 29, 2025 - 16:50

I like that they give you the option to control them with either your voice or the touchpad when you want to be a bit more discreet. I’m so glad I finally bought these. If Apple or other companies decide to go down this road in future, they’ve already got a tough job ahead to get me to switch to their version if it isn’t reasonably priced. When you think of what these can already do, the fact they have kept the same price from the last generation is wonderful.

By Dan Cook on Saturday, November 29, 2025 - 16:56

Having never had vision before apart from light perception, I’m finding it absolutely fascinating taking pictures and figuring out where my head has to be in order for them to notice certain items and features in the room. Plus, as others have said, we’re just at the start in terms of realising the potential this technology has; it’s wonderful to feel included, even if it was accidental at first. This is the most excited I’ve been about a new purchase since my first iPhone back in 2011.

By Brian on Saturday, November 29, 2025 - 17:23

Just wait until you record video with your Meta glasses, then upload it to something like PiccyBot, to have audio description added to it by AI.

Talk about having your mind blown! 🤯

I did this while playing Mortal Kombat 11 on my laptop. I recorded a match, including performing a finishing move, added audio description with PiccyBot, then sent it to a sighted friend to get their opinion. They were pretty impressed. Probably more so that a blind person was able to do that than by the video itself. Ha ha.

By Kushal Solanki on Saturday, November 29, 2025 - 20:28

Hi mate.
The error message is coming audibly from the Meta glasses.
I have tried all the troubleshooting steps but nothing seems to have fixed this issue.

By Brian on Sunday, November 30, 2025 - 00:41

1. Make sure the glasses are in their case, and the case is closed.
2. Make sure you have the Meta AI application running on your smartphone. Double tap Devices in the bottom right area of the screen, and there should be an Add Device button.
3. Double tap on that.
4. Next, choose your Metas from the list and follow the instructions on the next screen. You have likely already done this, but make sure to remove any tabs from the glasses and case. They are little plastic tabs, if my memory is not failing me, and will prevent your glasses from pairing.
5. With the tabs removed and the glasses in the case, hold the button on the back of the case (not the glasses) for about 5 seconds, and your glasses/case should pair successfully.
6. Do not try any of this while wearing the glasses, by the way...

That should do it. If not, return them. 🤷

By Stephen on Tuesday, December 2, 2025 - 13:38

Have you done the setup for the AI features? Also, sometimes they have trouble on WiFi connections. If you have a data plan, try using them on that and see if it makes a difference :)

By Alexandre Toco on Sunday, December 21, 2025 - 13:29

I had two good gadgets that were developed specially for the blind, and they are discontinued now because they weren't economically viable.
So a great point with the iPhone and Meta glasses is that they were developed for a big community, which helps ensure these products will continue to improve.
In my experience, at this moment, we have better AI resources that can be used on the iPhone. But all of them require a good picture to work well. And, for a person blind from birth, taking good pictures is not a simple task. The glasses make this task easier and more accurate. Let's continue to ask Microsoft to integrate Seeing AI into the Meta glasses! It will open up wide possibilities!

By Jesse Anderson on Monday, December 22, 2025 - 15:24

I have used both the Gen 1 and Gen 2 glasses, and have a pair of both at the moment. Functionally, the Gen 2 glasses are pretty much the same as the Gen 1 at this point. There don't seem to be any features or functionality exclusive to the Gen 2 glasses yet.

There are some nice but subtle improvements, though. The camera is supposed to be a little better, and I think a little wider angle, although I haven't noticed this impacting things much as of yet. The audio quality seems to be a little better too, with a fuller sound. The biggest improvement, though, is the battery life. I just got them a couple of weeks ago and haven't yet had a chance to use them for a full, heavy day of use, but for the times I have used them, the battery life seems to be much better. After a few hours of use one day, I was at 85%, where I'd probably be well under 50% with the Gen 1 glasses. I don't know if they will hold up to Meta's 8-hour claim, but the battery is better.

If people don't yet own a pair of Metas but are considering them, I'd spend the little extra money and get the Gen 2s, especially because the AI will keep improving, and this is what will likely drain the battery the most. The Gen 1 glasses aren't bad by any means, but you might as well future-proof your purchase a little too.

By Missy Hoppe on Monday, December 22, 2025 - 18:56

This almost definitely has to be an issue unique to me. I have the Gen 2 Meta Ray-Bans, and also the Ally Solos glasses. With both, it seems all but impossible to get meaningful results. A perfect example happened just a few minutes ago. I was notified by Alexa that a package had been delivered. I found a package outside my building, but wanted to make sure it was mine before I opened it. I tried using the Meta glasses, and no matter how I positioned the box, they couldn't find a label. The glasses seemed to be seeing my kitchen rather than the package, so I guess I just haven't figured out how to position my head so that the camera on the glasses will "see" what I want it to. After a few attempts, I gave up and used the live recognition feature on my iPhone. That was able to give me the info I needed in less than 10 seconds. It's a bit depressing when I think about how much money I've spent on vision assistant apps and smart glasses, only to find myself, more often than not, getting the assistance I need from something my iPhone can do natively. I love that glasses can make text and object recognition hands-free, but it's only beneficial if I can figure out how to help the glasses work properly.

By Jonathan Candler on Monday, December 22, 2025 - 20:03

I'm sorry you're having this issue. How are you positioning the glasses? I'm not sure if this is accurate for the Gen 2 glasses, but I have the first-gen glasses, and the camera is on the left side on mine, if that helps any. Also, light helps too. Note: try looking down at the box a bit rather than just straight ahead, because otherwise it's gonna tell you what's in your kitchen, most likely. And if you're too close to an object you're wanting to look at, it's gonna look blurry, from what sighted people explain to me. HTH, for what it's worth.

By Brian on Monday, December 22, 2025 - 21:07

Hi,

Here is a little advice that might make life a little easier with the Meta smart glasses.
1. Go to your local pharmacy and pick up a box of alcohol pads. These are sometimes called alcohol swabs, but they are amazing for cleaning off the camera lens of your smart glasses. I would suggest using an alcohol pad to wipe down your camera lens maybe twice a week. Like Jonathan said above, the Gen 1 camera is on the left edge of the frames. I believe the Gen 2 is in the middle? Since I have yet to check out a pair, I am really not sure. Ask a friend, or look it up online, whatever's clever. Ultimately, my advice is to wipe down the lens at least twice a week; it will help greatly.
2. Try scanning documents, such as package labels, mail, a love letter you got from your neighbor, whatever, the same way you would use Face ID on your iPhone to unlock your device. If you keep that principle in mind, you may have better results with reading documents.

HTH.

By Missy Hoppe on Monday, December 22, 2025 - 22:51

Perhaps I am totally misremembering, but I thought the Meta glasses had the camera on the right. I remember thinking when I got the Ally Solos that it was the opposite. If the camera is on the left for both, knowing that might help a little. As for the address label I was trying to get the glasses to read, I tried 3 or 4 different positions. I held the box in my hands and looked down at where I felt the label. I set the box on my stovetop and tried to look at the label from there. However, if the camera is actually on the left, that would definitely explain the issues I've been having, at least when it comes to the Metas. It's just going to take a lot of patience and practice to figure things out. For situations where hands-free isn't entirely necessary, though, I'll probably just keep resorting to live recognition. In any case, I really appreciate all of the advice offered here. Thank you.

By Jonathan Candler on Monday, December 22, 2025 - 23:26

Again, I don't know about the Gen 2 glasses; I'd look that up. For me, though, the cam is on the left side of the glasses. The lens on the right is the recording light. As for cleaning the lenses: yeah, I should probs clean mine at some point as well, lol.

By Brian on Tuesday, December 23, 2025 - 00:09

As far as I understand, the Gen 1 camera was always located on the left side of the frame. I am unsure about the Gen 2 models, but I thought I read somewhere that theirs was in the middle, like the bridge-of-the-nose area?
As for cleaning the camera lens, as I mentioned before, I do mine typically twice a week. You would be surprised how much better they perform with a clean lens. Lol

By Seanoevil on Tuesday, December 23, 2025 - 05:47

Hi All,
I find my Meta Glasses a very helpful device.
With regards to camera locations, the Meta Ray-Ban Gen 1 and Gen 2 both have the camera situated on the left-hand side of the frame. That is, to the left of, and slightly above, the wearer's left eye.
The Oakley Vanguards, the sport-styled frames, have their camera in the centre of the frame, above the bridge of the nose.
All three models have their action button and control pad on the right-side arm. That is, the wearer's right side.
Hope this helps with your photography.
@SeaNoEvil00

By Missy Hoppe on Tuesday, December 23, 2025 - 13:15

I just want to thank everyone out here who corrected my misconception regarding the location of the camera on my Meta glasses. Hopefully, knowing that the camera is actually on the left will improve things. Perhaps, it might even make it easier to practice using both the Metas and my Ally Solos glasses. It just goes to show that you truly can learn something new every day.

By soni on Tuesday, December 23, 2025 - 18:18

Funny enough, I can't make video calls with WhatsApp with the glasses, but Be My Eyes works! I've done all the things: factory reset, connected through communications, updates all good, all of it! I can't even switch the video call over from the phone using the capture button! Amazon, looks like I'm swapping them out! I even called support, and that's what they suggested! So frustrating! Anyone have any other suggestions to try?

By Dan Cook on Thursday, December 25, 2025 - 13:17

Used them this morning to identify my Christmas presents, which was a lovely feeling.
One thing I’m really surprised by, especially given that it is Meta we are dealing with, is the complete lack of battery drain from the app. I was worried I’d have to carry a charging case around, especially when using Bluetooth all the time, but so far those worries are unfounded.
Also loving how many charges the case gives you: I’ve only had to charge mine up once or twice so far in the month I’ve owned them.

By Justin Harris on Tuesday, December 30, 2025 - 20:51

Hello everyone,
I have a few questions. I'm finally going to have my glasses in the next few days and I'm so excited.
1. Is there some way to tell whether or not the glasses are charging? Can you tell how much battery is left, both in the glasses and in the case? This would be good to know, so you know when to charge the case.
2. I know most people use the glasses specifically with their phones. Is it possible to keep the glasses paired with the phone, but also pair them with a computer to get computer audio through the glasses? It's fine if not, just curious.
3. This is just a matter of preference, but I'm curious what voice everyone is using for the AI.
4. How do you feel Meta AI compares to Gemini or ChatGPT?

By Brian on Tuesday, December 30, 2025 - 21:03

Hey,

Congrats on getting a pair of Meta glasses. You will love them I think.
Now for your questions.
1. If you open the Meta AI application on your iPhone and double tap on the Devices tab at the bottom right corner, you can see the battery percentage of your glasses when they are in use, in other words, when you're wearing them. Also, when they are folded up and in the case, charging, you can see the battery percentage of both the glasses and the case. Only when the case is charging the glasses, however, can you see the case battery percentage.
2. As far as I know, there is no way to pair these with a computer.
3. I am using the John Cena American English voice. I like it because it is loud and clear.
4. Personally I would say it is right up there with Gemini. I don't really have a whole lot of experience with ChatGPT, but I use Gemini often on my iPhone.

By Justin Harris on Tuesday, December 30, 2025 - 21:31

Thanks for the answers. I'm not on iPhone, but I assume the same information is still available on Android, though I may have to hunt for it a bit more.
Just out of curiosity, and so I can temper my expectations a bit, are there any things that the glasses should, in theory, do incredibly well with, but in practice just don't? I know that a lot of users have vastly different experiences with these, and even from person to person, what may work for one won't for another, be it the way they phrase things, accent, or any number of variables. While I know your experience has been quite positive, I still want the other side of it too. Any time they have totally left you hanging?

By Chris Hill on Tuesday, December 30, 2025 - 21:41

Just ask them: "Hey Meta, how's the battery?" It won't tell you the case, but it is quite simple to find the glasses' battery that way. You can also see an estimate on the status bar of your phone, like with Bluetooth headsets, but it isn't very accurate.

By Brian on Tuesday, December 30, 2025 - 23:25

Labels on rounded containers. That's where Meta seems to struggle. For example, certain soup cans, over-the-counter medicine bottles, even bottled drinks. Sometimes rounded labels just don't translate very well with the AI.
Also, I don't know why I forgot you're no longer on iOS. I blame my old man memory. Lol

By Justin Harris on Tuesday, December 30, 2025 - 23:33

If my old man memory serves me correctly, we are either the same age or very close, close enough to both be in the same sinking boat. lol
Thanks for the info.
Another question: have you ever been able to use the glasses to get an idea of what is on a screen, say for a program that just isn't playing well with NVDA, or a boot screen? I'm wondering how well they might do with that, or with providing updates when screens change.

By PaulMartz on Wednesday, December 31, 2025 - 00:53

Put on a logo t-shirt, stand in front of a mirror, and ask Meta to tell you what t-shirt you're wearing. It gets it wrong every time, as if it can't grasp the idea that it's looking in a mirror.

I've been using the GPT+ subscription service lately. The AI quality is outstanding. There's simply no comparison. I'd expect that, as I'm paying for GPT. But I paid for the Meta glasses too, up front.

I love my Meta glasses, but Meta has a ton of catching up to do. And, sadly, recent releases of their Meta AI app seem to indicate they're more focused on social interaction than improving their lackluster AI.

By Justin Harris on Wednesday, December 31, 2025 - 01:23

Hmmm, sounds like I would be better off asking about a shirt before I put it on? Just a guess? Good to know this is an issue, though.

By Brian on Wednesday, December 31, 2025 - 02:06

For T-shirts, I just leave the shirt on a coat hanger and press it up against a wall. Then I say, "Hey Meta, describe this," and Meta usually describes it pretty darn well, nine times out of ten.
Granted, I have not tried the mirror thing. I might sometime, just to see what results I get.
Oh, and to Justin: yes, I've used them to describe my laptop screen when something is being wonky. You just have to remember not to be using NVDA's screen curtain mode (control plus NVDA plus escape by laptop default). It's helpful, even if just to tell me that either there's something NVDA is not seeing, or the app itself is just frozen and I therefore need to close and reopen it. Or that an add-on is not working properly.

By Gokul on Wednesday, December 31, 2025 - 02:41

Meta AI isn't the best out there; it's hard to compare it to GPT-5 or Gemini 3. As was said above, they haven't updated their model in a while either. But it is reasonably good at doing what you would expect to be done with smart glasses on a daily basis. Just don't expect it to help you vibe code or tackle complex maths, the mirror test being a case in point. If I'm not wrong, this was one of the classic tests used to evaluate AI models back in the day (I mean, some 6-7 months ago). I think Gemini 2.5 was the first one to effectively tackle that one (I could be wrong). So yes, Meta's foundation models haven't got an update since those days, but if what I hear is correct, that will change soon...

By Justin Harris on Monday, January 5, 2026 - 13:41

So, after a few days of using the glasses, I have a few thoughts. I got the matte black frames and replaced the clear lenses with mirrored silver. Got several compliments; I was told they look sharp. As for Meta AI, I find it to be a bit hit and miss. Live AI was able to scan for changes on a computer screen and tell me about some items on a display table at a grocery store, but there have been other things, like reading off a card in a Bible trivia game, where it totally failed. Also, the general unwillingness to describe people is a bit frustrating. I would also say the audio quality is good, though if I want loud, they aren't it. My Galaxy Buds are much, much better in that situation. What I do like, though, is that Bluetooth will switch back and forth between glasses and buds. Say, for example, I'm out somewhere loud: if the glasses' audio isn't doing it for me, I can throw in one of my buds, and it's good. No need to take the glasses off. Definitely a fan overall, and excited to see what they do with future updates.

By Jonathan Candler on Monday, January 5, 2026 - 16:38

I'm not sure if anyone knows how to do this, but say, for example, you're scrolling through something and you want Live AI to read what is being scrolled through automatically; can you not do that? Currently I have to ask it what's on the screen with each scroll, and that gets annoying.

By Seanoevil on Tuesday, January 6, 2026 - 05:58

Hi Jonathan,
I do not believe that the Live AI feature can automatically detect or respond to changes in the environment. It would be nice if it could automatically detect and alert you to such a change, such as a screen refreshing or a person entering a room, but I don't think this is possible yet.
Having said that, I live in a region where Live AI is not available, so I may be (in fact, hope to be) completely and utterly wrong.
Cheers,
@SeaNoEvil00

By Justin Harris on Tuesday, January 6, 2026 - 11:46

So tomorrow I get back in the gym after about a month off, between helping my girlfriend with a move and, over Christmas break, watching the kiddo. No reason he should have to go to day care when I am fully willing and able to take care of him, and it gives us good quality time. Anyway, getting back to the gym will be great, but this gym has an interesting layout. It is fairly open, but has a big curving staircase in the middle of the building, and I've managed to find the overhang of those stairs with my fivehead several times. The way sound bounces around in there makes locating, and thus avoiding, it by sound quite the challenge. Not to mention, since it has been about a month, you never know what they may have changed around. If Live AI isn't gonna do the trick, it might be a good excuse to test out the Be My Eyes integration, but I would love it if Live AI could give me the info I need.

By Diego on Thursday, January 8, 2026 - 19:24

As it is under development, it is not available in the App Store. You need to use third-party stores or tools that install apps externally on iPhone/Android for it to work.
The name of the app is turbometa-rayban-ai.
Unfortunately, I don't have the glasses, so I can't test it.
If anyone manages to get it, share your impressions!

By Alexandre Toco on Saturday, January 10, 2026 - 02:19

I did some tests with the functionality called Quick Vision.
The goal is to take a picture and describe it using the LLM you choose. In my case, I chose Gemini.
The descriptions are much more complete than the ones returned from Meta AI, and you can customize the prompt, so you can make the LLM pay attention to the aspects you consider most important.
The read-me says that we can create Siri Shortcuts to use this function, but I didn't figure out how. It looks like this option appears only in Chinese.
I hope the developer improves multilanguage support soon!

By Justin Harris on Saturday, January 10, 2026 - 15:12

While I find this incredibly interesting, I do not really want to do that much hacking, and also I worry about where else my data might go, other than the hands of Meta, by using an app like this. While Gemini on the glasses does sound incredibly cool, I'll pass. If the app ever gets approved and published in the Play Store, then maybe I'd consider it.

By AbleTec on Saturday, January 10, 2026 - 19:07

My daughter & grandson got some for Christmas. Gen 1, I think. Just for the halibut, I asked my daughter if I could borrow hers for a sec, put them on, & said, "Hey, Meta, what am I looking at?" It said, "There are 2 people standing in front of a Christmas tree wearing matching Christmas pajamas." That's just not something a blind person could know w/o being told. I thought it was very cool. I'm pretty sure there are other things it could help w/ like that. It's an awful lot of money to spend, though. If I had a bunch of spare change jingling around, of course I'd buy some. But that's not the case, so I think I'll wait for a while. Sadly, I could probably get more productive use out of them than either of my sighted family members, but they like them for taking videos & stuff, which I can totally understand. But to me it felt, if only briefly, like getting a little synthetic eyesight. My heart rate sped up just a little. I think it's best to buy them w/ a good window of return to make sure they're what you want or need. Obviously each individual has to assess whether the functionality they provide--& they do provide real functionality--is what they want, need, & will be reasonably content with. I'd like to cop 1 of theirs for a day & try walking around w/ it, etc., but I'm afraid if I do I might want them more, so I think I'd better not, lol. Obviously it helped me locate the Christmas tree, so I now knew precisely where it was in order to avoid it, so that was a good thing.

By Justin Harris on Saturday, January 10, 2026 - 19:19

Yeah, I definitely think they are quite a tool, but they can also be off at times. See my comment above about the card I tried to have them read while playing a Bible trivia game. It said something about keeping this card if I had one of these plants in my garden. Nothing to do at all with the game in question. Lol, go figure.

By Justin Harris on Sunday, January 11, 2026 - 01:42

The above comment about Live AI not being all that live is rather accurate, as I tried to use it to gauge the screen on my treadmill to determine the distance, and while it would sometimes read off numbers, it had a hard time telling which was for distance, which for calories, etc. The display cycles between multiple items, and it never would let me know when the display updated. And even when prompted, it would often give me old info. I think I'd be better off using Be My Eyes for it, but I would rather not wait on a volunteer to be available. It doesn't help, though, that the display on my treadmill is down at the bottom by your feet, not up top where it is on most of them, so it makes getting a good picture a bit of a challenge. I think the treadmills at the gym should be easier for this.

By Gokul on Monday, January 12, 2026 - 18:35

To any of you who tried it: does it support English in Live AI? And in the other options too?

By Seanoevil on Tuesday, January 13, 2026 - 00:47

Hi All,
For those interested in using Gemini with their Meta Ray-Bans, this YouTube channel offers a workaround. I have not tried it myself, but the link is below:
https://www.youtube.com/watch?v=RvgjoQjROPA
The creator is a chap named Stephen Sullivan, and he is all about smart glasses. It might be worth a look for anyone considering dipping their toes into the smart glasses market.
Hope this helps.
@SeaNoEvil00

By Justin Harris on Tuesday, January 13, 2026 - 02:05

So, from what I understand, this doesn't let Gemini actually see from the camera on the glasses, but rather just pipes your audio in and out of the glasses, same as it would with any other Bluetooth device. Nothing more.

By Brian on Tuesday, January 13, 2026 - 02:19

Do you mean like if I were to use the Be My Eyes app on my iPhone while wearing my glasses?