Meta Connect - new glasses, a screen reader and an open platform - Seeing AI anyone??

By mr grieves, 28 September, 2025


I know this happened a week and a half ago, but I am amazed no one on here is talking about the Meta Connect event, which feels a lot more exciting to me than the recent Apple events. I'm sure most on here have been following it anyway, but I'll give a very quick rundown.

There are two parts to this. Firstly, the new hardware.

There's a 2nd generation of the Meta Ray-Bans. It now has 8 hours of battery life and a 3K camera.

Then there are the Oakley Meta Vanguard, which do the same sort of thing but are wrap-around, with 9 hours of battery, and the camera is in the middle, not on the left, like Garmin.

And finally, the Meta Ray-Ban Display. These are similar to the normal glasses but have a tiny screen in the lower part of the right lens. I don't know how useful this will be to the folks on here. If anything, the more interesting part of this announcement is the separate controller called the Neural Band. I'm not sure I fully understand this, but it is something that fits on the wrist and can be controlled with subtle gestures. Apparently Meta are considering allowing it to be used as a standalone controller for other things in the future.

The other intriguing thing about the Meta Display glasses is that they will have a screen reader from day one. This is pretty surprising and great. Whether it makes the Meta Display worth having for us over the other glasses I don’t know.

But without a doubt the most exciting thing is the Meta Wearables Device Toolkit. We’ve been talking on here for a long time about the restrictions of the Meta glasses - you can only use the functionality that Meta provides. Well, it seems, no longer.

And amazingly, this has a big focus on accessibility. It's already been announced that Seeing AI is going to make use of it, and companies like Be My Eyes (of course) and, surprisingly, HumanWare are coming on board.

According to Double Tap we probably won’t see the fruits of this until sometime next year.

But have a look at this: https://developers.meta.com/blog/introducing-meta-wearables-device-access-toolkit/

I think initially they are going to be providing other apps with access to the camera, microphone and touch pad, but they have also said that they are looking into integrations with Hey Meta at some point.

So hopefully this means the end of WhatsApp hacks for things like Aira and PiccyBot, and maybe we can start having the full hands-free integration we’ve all been dreaming of.

It also seems that they are open to the use of other AI models, so I don't believe everyone will be forced to use Llama.

So my main question is - why on earth is no one talking about this on here?


Comments

By mr grieves on Sunday, September 28, 2025 - 13:55

Can't find an edit option on my main post, but I almost forgot: the new toolkit is supposedly going to be available for all the Meta glasses, which I believe will include the gen 1 Meta Ray-Bans that many of us have already.

By Brian on Sunday, September 28, 2025 - 14:06

I have been behind the times when it comes to news like this. I was aware of the Meta Oakleys, but did not know about the other technology. I'm thinking my next pair of Metas will be Oakley, as I would love to have Meta smart glasses in a wraparound form factor.

By Gar on Sunday, September 28, 2025 - 14:11

While I'd heard about the new glasses, I hadn't heard about the screen reader or any of the other stuff. But honestly, it's a Meta product, and I really don't have any trust in them or desire to use anything they put out. But I recognize that's just my opinion, and others may vary.
With that said, I think the main reason it's not being discussed here is that it's not an Apple product, and I think that part of the website, the core Apple categories, are visited more frequently than this forum category. So, it very well could be popping up here or there, but not in plain sight.
I'm not throwing any shade here, I don't think the AV Mastodon account posts topics from some categories at all, either. It's highly possible it might be getting some traction on Mastodon or in other places though.

By mr grieves on Sunday, September 28, 2025 - 14:46

They have been talking about this on Double Tap a bit, and they have also been unable to quite understand why this isn't bigger news all round.

I can't quite see the point in the Meta Display glasses for us, but it really took me aback when I heard they were going to have a screen reader up front. It was almost in the small print. Quite why they weren't making a big deal about this, I don't know.

I've been dreaming of Apple glasses that would integrate with all my favourite apps on the iPhone. Now it looks like I already have all I need and just need to wait until next year. What we need is something like Voice Vista that can then use the camera to provide door detection and things like that.

I was almost excited by the Envision Ally glasses, but the Ally app felt pretty hopeless to me. Whereas I really can't wait for this.

I hope the Toolkit thing won't be restricted to the US only to begin with.

It does feel like all the pieces of the puzzle are fitting into place for us.

I understand why people don't like Meta. They aren't a company I have been particularly fond of. But I really can't wait to see where this all goes, and it is pretty thrilling that we are right at the front of the queue for once.

By Oliver on Sunday, September 28, 2025 - 15:22

I nearly wrote about this but, as it's so far off, I didn't really think there was much to say yet. Yes, it's all coming, but we've heard that before.

I've actually applied for the dev kit, so we'll see where that goes.

Call me jaded, but a lot has been promised. I'm still cautiously optimistic that this will be the turning point for us, where Meta glasses go from something that's kinda cool but not quite filling our needs, to something that is really powerful. My dream, as I've mentioned elsewhere, is to be able to kick back and read a book, any book. This would be the app I'd look to develop: purely book reading. I know Seeing AI will have document reading, but book reading, at least across many pages, might be a little beyond its remit.

I'd go for the Vanguards. I find the Meta Ray-Bans uncomfortable. I've got a low nose bridge, so as soon as I get a bit sweaty (gross), they slip down my snozz. I know the second gen have a better fit and should stay put, but I'd rather go a little more expensive and get wrap-arounds. Not sure about others, but if I've got a bit of a flare-up, or a stye, I'd like to have that full hidey coverage. The centre camera is also cool. Plus, I can pretend to be like totally sporty and stuff... WEEEE

By jim pickens on Sunday, September 28, 2025 - 15:56

I have no idea how I never knew about this. The screen reader and Neural Band I knew about, but the integrations I decidedly did not. My only excuse is… I don't watch events... or something?

By Tara on Sunday, September 28, 2025 - 16:20

Oh dear, well it's a bit of a sore point I'm afraid. I've gone and ordered the Ally Solos glasses now, but if I'd known Seeing AI was going to be on a pair of glasses coming soon, I probably wouldn't have ordered the Ally Solos, particularly as Seeing AI is better at identifying products than the Ally app is. I'm just hoping that when I wear the Ally glasses, they'll be better at identifying stuff, since they'll be sat on my face so to speak, and I won't have to position the camera. But the Ally app really did hallucinate when I asked it what I had. It got totally the wrong type of beef burgers. It said they contained jalapeno, and they definitely didn't! It just made the whole thing up. They were only Birds Eye after all! I could always send them back if I really don't like them (the glasses, not the burgers; those have already been eaten and I liked them, no question). I always wanted Seeing AI on a pair of glasses though. Maybe I could have both? The Meta glasses and the Ally Solos? We'll see.

By SeasonKing on Sunday, September 28, 2025 - 18:48

I know Meta has delivered some amazing things in recent years, and my finger is itching to hit that buy now button, but I am just holding out for Google's version to launch so I can make an informed decision.
Also, I need to know whether Seeing AI and other assistive apps make good use of Meta's SDK, because the ones from Google are bound to have better integration with all of its existing Android apps. Yeah, Google TalkBack combined with Google's TTS is kind of frustrating, but if they tackle that in the glasses version somehow, and if we get some nice capabilities in terms of accessible navigation and recognition, it might be a much better deal.

By Missy Hoppe on Sunday, September 28, 2025 - 19:12

I'm just speaking for myself personally, but I honestly don't want anything to do with Meta as a company. Yes, I have Facebook, but I consider it more of a necessary evil than something I genuinely want. For everyone who is excited about all of these new announcements, that's great, and I'm sure I'll read as much as I can just for my own knowledge, but I just have a lot more confidence in Envision. Can't wait to get my Ally Solos glasses; they'll probably meet my needs better than the Metas did and will almost certainly have fewer privacy concerns.