In this AppleVis Extra episode, David Nason and Thomas Domville (AnonyMouse) interview Sarah Herrlinger, senior director of Global Accessibility Policy and Initiatives at Apple. They explore Apple’s ongoing dedication to accessibility, spotlighting exciting new features designed to better support users with disabilities. The conversation covers several highlights, including Accessibility Nutrition Labels, Braille Access Mode, Magnifier for Mac, and the role of AI in accessibility enhancements.
Key Highlights:
Accessibility Nutrition Label
- A new initiative that provides standardized accessibility info for apps.
- Developers will indicate whether their apps support features like VoiceOver and captions.
- Designed to increase awareness and help users easily find accessibility details.
Braille Access Mode
- Available on iPhone, iPad, Mac, and Apple Vision Pro.
- Enables quick note-taking, calculations, and BRF file access with Braille displays.
- Supports live captioning for DeafBlind users to improve communication.
Magnifier for Mac
- Turns your iPhone into a magnifier for Mac users.
- Uses a secondary camera to enlarge physical objects.
- Includes zoom, color filters, brightness controls, and OCR with text-to-speech via Accessibility Reader.
AI and Accessibility
- AI remains a vital tool in accessibility advancements.
- Enhances image recognition and descriptive capabilities.
- Continues to be integrated to improve experiences for visually impaired users.
User Engagement and Feedback
- Highlights the value of user feedback in shaping accessibility features.
- Encourages users to send suggestions to accessibility@apple.com.
Share Accessibility Settings
- A new feature lets users temporarily transfer their accessibility settings to another device.
- Makes it easier for family members to help with troubleshooting and tech support.
Listeners are invited to share their thoughts on these features and suggest any other accessibility needs they’d like Apple to consider.
Transcript
Disclaimer: This transcript was generated by AI Note Taker – VoicePen, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Dave: Hello there, and welcome to another episode of the AppleVis Extra. My name is David Nason, and I am delighted to be joined once again by Thomas Domville, also known as AnonyMouse, of course. And this is an exciting episode that we, I want to say, annually or semi-annually do, and that is an interview with Apple's global head of accessibility, Sarah Herrlinger. So, looking forward to this one, Thomas.
Thomas: Right. I mean, you're right. That is a mouthful. What is it? I had to look that up: Director of Global Accessibility Policy and Initiatives. I'm like, wow. I wonder if that actually fits on her business card in one line. There's no way; they can only print so small. But no, you're right. This has been long overdue. It's been, gosh, almost three years since we've talked here. So I can't wait to ask her some of the things that are coming up this year, plus some other questions we have always been wanting to ask her.
Dave: Yeah, so without further ado, Sarah, you're back. Welcome back to the AppleVis podcast. How are you?
Sarah: Thank you so much. It's such a pleasure to be here. I love getting the chance to hang out with you both. I'm doing well. It's been a very, very busy spring. I feel like every year things tend to speed up, not slow down. So lots to do.
Dave: Well, you get two WWDCs, don't you? Because you get GAAD and WWDC.
Sarah: We do. It's a really fun thing that we get to use Global Accessibility Awareness Day to make our announcements and kind of pre-announce for the community, in a way that really nothing else comes out about our stuff until WWDC. So we love being able to have our day to shine. And then we also have a lot to talk about at WWDC this year. So super excited about that, too.
Thomas: Well, first of all, thank you for coming on. My gosh, it's been, you know, three years. I was like, oh my, how time flies when you're having fun. But, you know, this is kind of an interesting trend that you guys are starting to set for the week of Global Accessibility Awareness Day, and that is to kind of introduce what's coming up in accessibility for us to enjoy later in the year. So I was like, this is great.
Thomas: So let's go ahead and dive into that, because a lot of people are raving about a few of the new things that are coming. And I thought maybe we could just kind of highlight some of the big ones that we think might be useful for our users to know about. And let's start with the first one you guys announced, and that is the App Store one, the Accessibility Nutrition Labels.
Thomas: And I thought, you know, at first I was going to say, how many calories is that and how much saturated fat does that include? And we got a lot of comments saying, well, I didn't know that that was a thing. So why don't you tell us what exactly this new label is and how it's going to help us?
Sarah: All righty, I gotcha. And one thing I will say up front: that term nutrition label is actually something that we've used in other places. So if you look at privacy in the App Store, it's also, and has been for a while, referred to as nutrition labels. So that's kind of where that came from. But this is something we're really excited about. We think it's going to have a huge impact. Now, we've noticed that developers today find a variety of ways to share information about the accessibility of their apps with users.
Sarah: But we really wanted to create a consistent way for developers to highlight accessibility features. And we want it to be a way that's super easy for users to find and understand. Accessibility Nutrition Labels are an extension of kind of the longstanding work that Apple has done to provide developers with tools, documentation, and training to create great accessible experiences.
Sarah: We're really excited for these labels to come to the Apple ecosystem, and we expect it'll bring a whole new level of accessibility awareness, both for users and developers alike. So really, this is kind of the codification of being able to talk about different elements of accessibility in an app, to users in a way that's really clear and laid out in grid format, essentially.
Dave: That's great. Did this come about through user feedback? I wonder. I think we even had a conversation on this topic on this very podcast a few years ago, and I'm sure it's popped up in our report card and things as an idea people would like, so...
Sarah: Yeah, well, I think it's one of those things that we've talked about for a while. We've always tried to make sure that, you know, we are giving people the most accessible experience we can. And I think for us, it's really just an extension of how we try and do it within our own apps. And we see the app store as an extension of the larger ecosystem. So we just want to make sure that other developers...
Sarah: hopefully follow in our footsteps of trying to up-level accessibility as something that they not just talk about, but really prioritize in their work.
Thomas: So explain to us what we will see when something like that is up. Does it just say this has been flagged by the developer as supporting VoiceOver, or how detailed does it get?
Sarah: So the way that it will manifest or show up is, a developer, when they're putting their app into the App Store, will now go through a process where it will say, does your app support a number of accessibility elements? And those are VoiceOver and Voice Control, which we know are essential tools for many users when they interact with their devices. There are also some vision features such as larger text, sufficient contrast,
Sarah: dark interface, differentiate without color alone, and reduce motion. And then for any app that has media elements in it, calling out whether that media, those videos, include captions or audio descriptions. So they're going to be presented with that list of whether their app does support these things, and within that there will also be information for them about what it means to say supported.
Sarah: So it's not just a checkbox; it's, here's what we mean when we say, does it support a dark interface? And then if they still don't understand exactly what that means, we'll also be connecting them to resources so they can understand, here's how you use our APIs and code for this. So we want to hold their hand in the process.
Sarah: But if they say, yes, I do support this feature, then as a user, when you go to that app's page, you are presented with which elements they support. And again, with that, you can drill in on each one to see, in a general way, what that means.
Sarah: Like, this developer has said that they support a dark interface, which allows somebody to, you know, I'm sorry, I don't have the exact words, but, you know, deliver this where there's a dark background with lighter text, but not lose the color of the photos. It won't invert them or things like that. So it will give you a little bit more of that information about what we're telling you that supported means.
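As a rough illustration of the developer-side work Sarah describes, here is a minimal, hypothetical SwiftUI sketch of the kinds of accessibility APIs an app might already adopt before claiming support: VoiceOver labeling, Dynamic Type, Differentiate Without Color, and Reduce Motion. This is not Apple's nutrition-label declaration mechanism itself, and the view and its properties are invented for the example.

```swift
import SwiftUI

// Hypothetical status view touching several of the criteria behind the labels.
struct StatusBadge: View {
    // System accessibility preferences exposed to SwiftUI.
    @Environment(\.accessibilityDifferentiateWithoutColor) private var differentiateWithoutColor
    @Environment(\.accessibilityReduceMotion) private var reduceMotion

    let isOnline: Bool
    @State private var pulsing = false

    var body: some View {
        HStack {
            // When Differentiate Without Color is on, convey state with distinct
            // symbols rather than with color alone.
            Image(systemName: differentiateWithoutColor
                  ? (isOnline ? "checkmark.circle.fill" : "xmark.octagon.fill")
                  : "circle.fill")
                .foregroundStyle(isOnline ? .green : .red)

            // Built-in text styles scale automatically with the user's
            // Dynamic Type (larger text) setting.
            Text(isOnline ? "Online" : "Offline")
                .font(.body)
        }
        // Skip the pulsing animation entirely when Reduce Motion is enabled.
        .scaleEffect(pulsing && !reduceMotion ? 1.05 : 1.0)
        .animation(reduceMotion ? nil : Animation.easeInOut(duration: 1).repeatForever(),
                   value: pulsing)
        .onAppear { pulsing = true }
        // Expose one combined element with a clear VoiceOver label.
        .accessibilityElement(children: .combine)
        .accessibilityLabel(isOnline ? "Status: online" : "Status: offline")
    }
}
```

Truthfully checking boxes like VoiceOver or dark interface obviously involves far more than one view, but this is the general flavor of the API-level work the labels point developers toward.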
Dave: Absolutely. And I think it means the user is getting that information, but also it's raising awareness among developers, I guess, because they're being asked upfront these questions. And if their answer is no, now this might give them a push to look into it.
Sarah: Yeah. I mean, our goal is for everyone to support everything. So wherever we can, we want to make sure that we are giving them that info and then, again, helping hold their hand in the process so that they want to learn more. And then as they do, over time they will be able to change the nos to yeses.
Thomas: Is that something that is optional or is that a requirement for them to use? And will that be based on an honor system or is there some sort of certificate or certification that goes through that?
Sarah: So at the start, since Accessibility Nutrition Labels are brand new, we want to give developers time to compare and evaluate their apps. So at the start, it will be voluntary to... do it. But if they do not, if they were to not click on any of them, it will say on their page, this developer has not yet provided any information about the accessibility of their app.
Sarah: So in a sense, going through that process of the yes, no is a part of their process regardless. Whether they choose to say, yes, I support dark mode will be voluntary at the start.
Sarah: Over time, they'll be required to share their accessibility support, but we do want to give them ample time before it's required, so that those who may not have really thought about this in the past as much as we would have liked understand what it means.
Dave: Will there be, either now or at a later date, any kind of user validation? If a developer says yes to VoiceOver but we find it's not, will there be a mechanism for us to kind of feed that back in?
Sarah: Well, as with all elements of the App Store right now, if a person finds that there is something that a developer is reporting that is not correct, the first step is to report it to the developer directly and tell them this isn't working.
Sarah: You can also do things, again, as you would do right now, and give them a low star rating, or write a review that says, this doesn't do what this developer says it does. But you can also then report to Apple. And if we find that a developer is saying they're doing something
Sarah: but they're not, that is the point where we step in and investigate and make sure that they understand, you know, or take action. I would phrase it that way.
Thomas: I think this is very promising. I absolutely love this, Sarah. I think this is a great start. And over time, I hope this just raises more awareness, because now it's up front rather than just something you've heard of or seen. Now, when you actually have to put things up onto the App Store, you're going to see this. And so this is great news.
Thomas: Now, on to the next feature. I've got to say, this is probably going to be the biggest one that I think a lot of people have been discussing and raving about, and that is Braille Access. Tell us a little bit more about this feature and what we can expect from it.
Sarah: Yeah. So Braille Access mode is another thing we are super excited about. It's designed to be used with a connected Braille display on iPhone, iPad, Mac, and Apple Vision Pro. It's a new way for Braille users with displays to interact with their Apple devices. And with it, when you launch Braille Access, you're then presented with the Braille Access menu, where you can do things like quickly take notes in Braille format.
Sarah: You can use the Nemeth Braille calculator to perform calculations. You can open BRF files directly from Braille Access, so hopefully unlocking a wide range of books and files that were previously created on Braille note-taking devices or through other means; it unlocks a lot more books and files in BRF.
Sarah: So lots of things you can do there. And for DeafBlind users, it integrates a form of Live Captions, so you can have real, live conversations transcribed directly onto Braille displays. And yeah, so it's just lots of different things that we're now doing within Braille Access to support users.
Dave: Sounds great. I'm not a Braille user myself, so I may not fully grasp it yet, but do you have to have a Braille display connected, or is there some kind of mode that is just on your Mac, for example? There were a few questions on the forum about that.
Sarah: Yeah, no, this is definitely built to require a connected Braille display. So its intention is for someone who is a Braille display user. And our idea here is just to kind of create a solution that's deeply integrated with the Apple ecosystem.
Sarah: So that ability to kind of do note-taking and have access to the calculator and all of that is meant to be done directly on your Braille display, but really giving you the advantage of being able to use any refreshable Braille display and use the one that works best for you.
Thomas: Wow. That is really, that is huge. You know, kudos to you, Sarah, for learning about Braille users and displays. You probably had no idea what the world was like on these things. And a lot of those Braille note-takers do have those applications, but to make it integrated at the OS level on Apple's platforms is just tremendous. And I think it has a lot of potential.
Thomas: So I imagine over time that you guys are probably going to expand on that and introduce some more things within the Braille Access area. Have you thought about the future, about implementing a lot of the AI features within Braille Access? Meaning, I don't know if the rewriting tools are part of that, and other options like that?
Sarah: Yeah, I think, you know, we're going to continue to implement AI anywhere that we think there is opportunity that works well for our users. I think, you know, we've been huge proponents of AI, artificial intelligence and machine learning, within our accessibility team for a long time now.
Sarah: And so I think as we see more ways that it makes sense, we're always investigating and trying to figure it out. And yeah, I mean, even with that last element, you know, Braille in general, I have to laugh. I've had a Braille display for years because I have always tested, you know, I kind of try and make sure things work. I am not by any means a proficient Braille user.
Sarah: I can proofread my own business cards, I'd put it that way. But it's been important for us as a company to support Braille, and we have done so really since we first kicked off VoiceOver. Gosh, we're at the 20th anniversary of VoiceOver on the Mac this year. So I don't know if we had it in our very first year there, but certainly Braille has always been
Sarah: very important to us, and we do believe that, particularly for the DeafBlind community, this access should be something that, as much as we can, we build into the products.
Thomas: Look at you. That's awesome. I love hearing that, that you've been dabbling with a Braille display so you know what it feels like and how difficult it can be to learn. But that's awesome.
Dave: Yeah, and I think Braille has been a big winner two years in a row, really, now. Was it last year we had the BSI improvements and the Braille command mode as well on the iOS side? So, yeah, it's brilliant.
Thomas: Yeah, I'm actually curious. What about BSI? I mean, what about the love for Braille Screen Input users? Are they not going to be able to access the Braille notes? Is this just for display users only?
Sarah: In this case, yes, it is built specifically for external Braille displays. But duly noted, you know, we always take as much feedback as we can from people and try and see if there are ways to keep building out. I think one of our cardinal sort of rules on things is none of our features are ever, you know, quote unquote finished. We like to always iterate and see where there's room to do more.
Sarah: So I love getting that kind of feedback and I'll make sure that the other folks on the team get it as well.
Thomas: Awesome. And let's not forget our low vision folks out there. Now, this is a pretty slick little tool. I would love to hear about this from you, where we are able to use our iPhone as a magnifier for our Mac. So explain that feature.
Sarah: Yeah. Oh, gosh, this one I'm super excited about as well. So, as you know, we've had Magnifier on iOS since, I think it was 2016. And it is, you know, kind of a jack-of-all-trades, able to do many things to support the community. But at its heart, it's about being able to take things out in the physical world and magnify them.
Sarah: So, you know, tiny letters on pill bottles and things like that. For Magnifier for Mac, we really tried to design it in a way that fits how people use the Mac and make it a different, you know, a different app that leverages the power of the Mac.
Sarah: So the idea behind it is that you can take a secondary camera, whether that be your iPhone using Continuity Camera or another type of camera, and mount it to your Mac so that it becomes your camera to see out into the world. And with the solution on the Mac,
Sarah: we are really trying to make the most use of the Mac's canvas and enable you to organize and store materials. So unlike iOS, which is really sort of more a feature for on the go, this is really a document-based app. So what you do is, when you have that connected, you're able to launch Magnifier and then see whatever it is you are pointing the camera at. So imagine if you are a college student in a lecture hall
Sarah: that you don't have to sit in the very front row. You can be wherever you are in the classroom. And then, same as with Magnifier on iOS, you can zoom in on things, up to like 10 times. But with it, you're able to change brightness, contrast,
Sarah: color filters, and also even perspective. So you don't have to have your camera exactly focused perfectly on the whiteboard in front of you. It kind of figures out how to get you a clean image of it. And then again, you can change color filters or whatever you might need to be able to see it. And then you're able to capture that image and kind of make a slide of it.
Sarah: So imagine if it's your science class, you can, you know, take a picture of whatever is on the whiteboard and then keep all of those files in a folder. So over time, you have all of the things that you captured from your science class or your English class or, you know, philosophy or whatever it might be.
Sarah: And use those throughout the school year, just as anybody else might have been capturing, you know, drawing whatever it was that was on the screen or in some other way capturing that information.
Dave: I wish we'd had this 25 years ago, Sarah, when I was in school. Because I was that kid in the front row kind of still struggling to see the board. That kind of thing, it's brilliant. And I know a lot of the stuff...
Sarah: you're not the first person to say that to me. And in fact, I've actually had one or two people who have said, I want to go back to school now.
Dave: And there's people spending a lot of money on specialist equipment to do this now. So again, it's being able to do that with the laptop and the phone that you have. Yeah.
Sarah: I agree. I think it's, you know, it's another example of kind of how we're trying to do something that is really low lift, but with a lot of value to members of the low vision community who, you know, can have these devices and want to be able to use them in a way that works best for them. Hmm.
Thomas: Now for the million-dollar questions. There are two things here. This is a possible new revenue stream for you guys. I mean, you just said that you can mount the device onto your computer. Hey, that's a perfect thing, a little holder or stand that you could sell, right? And so now people are going to be scrambling trying to figure out a good stand for that. But the million-dollar question is, you know, a lot of the devices that are similar to this technology that you are introducing here
Thomas: will allow you not only to snap a picture of, say, the whiteboard, as you were mentioning, or maybe the PowerPoints and stuff, but to have it read to you. So, to OCR the screen, so you can get that jotted down into notes, because a lot of written things are probably going to be easier if it could OCR the screen for you. Is that a possibility or something you guys have thought about?
Sarah: Yeah, I'll hit both of those. There are a lot of Continuity Camera mounts and iPhone-compatible mounts and stands that are out there in the world. So definitely some great ones for people to find and use. But yes, one of the really cool things about Magnifier is we've also integrated it with a new feature we're adding this year as well, called Accessibility Reader.
Sarah: And Accessibility Reader is a system-wide reading mode that's designed to make text easier to read for users with many types of disabilities. So that could be low vision, it could be using Accessibility Reader as a member of the blind community, or also things like dyslexia. So really built to be something that supports a lot of different types of disability. But it gives you new ways to customize text and focus on content that you want to read.
Sarah: When you think about what we've had in the past with Safari Reader, this is kind of a bigger, better version. It's not just in Safari; it's in any app out there that has text in it. You can launch Accessibility Reader from Magnifier as well, so you can snap a photo of whatever it is in front of you.
Sarah: And as well, another thing, sorry, with Magnifier, you can use it to read, you know, physical pages. So if you are in a bookstore and you just want to read a page in a book, or again, as a student, if it's your textbook, you could set up your mount over the page of your book and have it read off of that. But it is integrated with all of our support for spoken content.
Sarah: So as it's taking that picture, you can change the font and the color and the spacing of things on that page, but then you can also use spoken content to read it out to you.
Dave: Is it fair to say then that Accessibility Reader is kind of that Speak Screen feature, now supercharged?
Sarah: Yes, I would say so. In many ways, yes. It's what we've done in the past with spoken content, being able to read, you know, obviously in things like the Books app or in Safari or, you know, any of those types of things. But this is really trying to make it something that's available everywhere there are chunks of text.
Thomas: That's awesome. You know, there are so many more features coming out, and I think we've pretty much touched on the highlights, the really big ones that I think are going to be very impactful for our community. Kind of wrapping this up a little bit, I am curious. I've always wanted to ask you this question, and that is: how do you guys as a team figure out what new accessibility features will be introduced each year?
Thomas: Is that something where the team comes together and kind of votes on whether you're going to do these and if you can fit them on a roadmap? Or do you get ideas from outside? I'm really curious.
Sarah: I suppose the shortest answer I can give to that is yes. By that I mean, you know, first and foremost, we always... believe in the disability mantra of nothing about us without us. And our goal is that we build with, not for. That starts with employing people with disabilities inside our own teams, and not just within the accessibility team, but within teams across Apple.
Sarah: So we get a lot of feedback from people all over within our own buildings, in the Apple world, who are saying, gosh, I just wish my device did this. I mean, when you think about something like People Detection, that came first from one of our engineers on the accessibility team who, as a member of the blind community, just wanted to know when a line moved as he was out and about in the world.
Sarah: And it, you know, grew to be so much more. So, you know, we get some from our own ranks. We get some from the accessibility@apple.com email address. And, you know, it'd be wrong of me not to make a plug for that at some point in this conversation, as always. So when people write to us there, even it can be that they're saying, gosh, based on my specific disability or disabilities,
Sarah: I can do this and this, but then I hit up against a roadblock on something, and so I just wish my device would do this. And so we think about that and try and take in that feedback. I mean, anything that gets written to us there gets processed and sent to the team. So we are constantly mining within that for feedback, whether it be on new ideas or bugs or whatever it might be.
Sarah: But we do that. We go to conferences. We talk to, you know, do sessions at conferences and talk to people at those to try and make sure we're understanding what the needs are. And also, we're just looking at whatever is the latest and greatest within Apple in general. So if a team says, hey, we're building a faster processor into this device, or we're improving the camera based on X, Y, and Z, our teams are
Sarah: constantly in conversations with those teams to say, okay, well, what does that mean and how might that benefit and allow us to do something we've wanted to do but we couldn't do until now because of, you know, whatever might be the issue. So, yeah, we're just, you know, constantly looking at what's the next thing that makes sense.
Dave: Speaking of that, of cameras and evolving technology, I suppose you guys were pretty ahead of the game, I think it's fair to say, with things like screen recognition, you know, which is utilizing machine learning, I guess, what we now call AI technology. Now what's become huge for us in our community over the last couple of years is image recognition through LLMs, through large language models.
Dave: And obviously, with Apple, you have built in that if you tap on an image, it'll give you a very basic image description, which again is from machine learning. Now with things like Be My AI, there's been huge development. What do you see as the future in terms of Apple's development of that technology and how it can help the blind and visually impaired community?
Sarah: Yeah, again, we're always looking at what's the right thing at the right time. And when we feel like we have the solutions that we can implement, we do it. So I think AI, it's funny. Someone was telling me recently that this is actually the fifth boom of AI in the world, and the first time it happened was in the 1950s.
Sarah: It's funny that it's now a big thing in this latest iteration, but we're always trying to look at how we can do more, and I think we'll see. Certainly, one of the things we didn't talk about is
Sarah: what we're doing with Live Recognition on Apple Vision Pro and bringing it there, giving you more information about your surroundings for things like people detection or furniture detection or text detection, so you can read your mail with Vision Pro on. So there are certainly ways that we're already looking to implement this more and more. But this element of AI, I would say, is in its infancy. And so lots of room to see where we can go with it.
Dave: I know you won't be able to comment on this, but if you could give us a pair of glasses with the cameras.
Thomas: Put her on the spot. Put her on the spot.
Sarah: I'm sorry, you can take that as my answer to that one. As you know, we always like to keep our surprises.
Thomas: There you go. That's a good answer. That's good PR. The marketing people will also appreciate that answer, right?
Dave: Exactly so.
Thomas: You know, that is a hot topic. I mean, you have got to be hearing about it left and right. I know you are. And you probably are getting tons of emails about this. Yeah, right now it's a big boom for AI, and we have had a lot of success and a lot of great stories coming out of these AIs being able to have things described, more so than just objects.
Thomas: So, you know, this is like an explosion of overwhelming detail that we're getting back, and not only that, but we're also getting video descriptions and things like that. So we can't wait to see if Apple will be thinking of something like that for us down the line as well. The one thing you mentioned, Sarah, and I really love, is that you do take input from our blind community.
Thomas: So do we continue to send emails to the accessibility team with features and thoughts that you could include in next year's iOS? Is that what would typically work?
Sarah: Yeah, for really any of our operating systems or devices, that is, you know, the easiest direct way to get feedback to our developers and engineers. So whether it be giving that feedback of, I wish it did this, or, here's a roadblock I'm currently hitting, or whatever it might be, or even just to ask us questions. You know, a lot of
Sarah: what we get into that account is also, I'm brand new to needing an accessibility feature, can you help me figure it out, can you provide me resources, or things like that. And we get tons of feedback that comes into that every day. But there is a team that solely works on supporting that account and responding back to people. So we do get back on everything we get in there.
Dave: That's great. And I wonder, you know, when we look at our own website and the report card that we do at the beginning of the year, that kind of thing, it feels like the operating system that people are maybe struggling with the most is still the Mac versus iOS when it comes to VoiceOver, certainly.
Dave: Is that something that you sense as well, that that's where, not that it's in a bad place, but it's maybe the place where there's the most to do in terms of bringing it up to where customers actually want it from a VoiceOver perspective?
Sarah: Well, I think, you know, we do get feedback and we appreciate the feedback. And I think also with it, there's a lot of different people who configure their devices in different ways. So I think, giving us as much feedback as you can on where people find problems is super important.
Sarah: I know sometimes it feels like you may be writing in and someone says, oh, I need your log files or can you record a video of this or can you give us more information on, you know, which version of the operating system you're using, which model of device it is. Is it an M1 device? Is it an M3 device?
Sarah: All those different things that feel like we're asking a lot, but it really is based on our trying to figure out exactly what is happening for the user, so that we can then figure out what's going on and really pinpoint things. So I think we just want to make sure we're getting as much feedback as we can so we can help make the products better.
Thomas: Well, thank you, Sarah, for taking time out of your day to do this interview with us. This has been wonderful, educational, and insightful. Is there, like, one last thing or one more thing that you would like to add? Is there anything that was not announced at GAAD that we possibly could see in the upcoming WWDC when it comes to accessibility?
Sarah: You know, one thing that I don't think we touched on in much depth, but that I think is rather cool, is being able to share accessibility settings. Yeah. The idea behind this is being able to temporarily transfer your settings from your iPhone or iPad to someone else's. And what I really love about this is, for years I've
Sarah: always heard people talk about how they appreciate the fact that their family IT department may be someone who's a member of the blind community or somebody who is even a quadriplegic or whatever it might be, but how that is the person who is the most knowledgeable about tech. And in some cases, if some family member is coming to them and saying,
Sarah: gosh, such and such isn't working, I can't get this to work on my device, that you can say, all right, here, give it to me. You accept it the same way you kind of do AirDrop or sharing contacts, by moving them close to each other. And then you can get on there and go, oh, I figured out what you did, you flipped this toggle or whatever, and make those changes for them. And then when the session ends, it reverts back to their setup, and yours stays on yours.
Sarah: But it's just a great option to be able to use someone else's device in the way that you need to do it.
Dave: As tech support for my family, I'm very appreciative of this one.
Sarah: Yeah. So I think that's going to be fun for people as well.
Thomas: Exactly. And I love how that works. So you just put the devices close to each other. They'll detect it so there's no sign in or anything. That's going to make it very easy. I love that. Yeah. Well, awesome. Any other questions, Dave, that you have for Ms. Sarah before we let her go?
Dave: Okay. I just want to say, yeah, huge thanks. You've been very generous with your time. And, yeah, we really always appreciate it when you come and join us on the podcast.
Sarah: Absolutely. Well, thank you very much for the invitation. As I said, it's always a pleasure for me to get to spend time with you guys and chat again.
Comments
very good interview
Thank you, that was a very good interview. Looking forward to iOS 26.
iOS 26 still feels weird
iOS 26 still feels weird to say. Ah well, we have at least 3 months of betas to get used to it. Can't wait to install the beta on my iPad.
Will listen
Hope it is a real interview and not an Apple ad about what a wonderful job they are doing and how much they care about accessibility. Do not want to get diabetes.
Same
I will listen in a bit.
sharing settings
OK. The accessibility settings sharing is good. They need to evolve it into letting you keep the settings. If someone has a better way of setting up VO than me, I would love to get their settings for myself. Promising.
Awesome interview!
Awesome interview!
Very nice interview.
Hi.
The interview was very nice.
Great to see Apple is listening to feedback when it comes to accessibility, so I think rather than complaining we need to give them as much feedback as possible in terms of bugs and all.
I do think that with AI, Apple seems to be taking more of a cautious approach. However, one benefit you do get with Apple is that if you pick up an Apple device, you know it's going to be accessible and accessibility is baked into the device. But I do think that Apple does need to catch up with the AI side of things.
Looking forward to trying Braille Access
Also, I am looking forward to trying Braille Access on the new iOS betas.
You're using it wrong
Such an Apple reply on the issues with VoiceOver and Mac.
Sarah: "Well, I think, you know, we do get feedback and we appreciate the feedback. And I think also with it, there's a lot of different people who configure their devices in different ways. So I think, giving us as much feedback as you can on where people find problems is super important."
So tone deaf.
Thanks for doing the interview guys. I'm not super impressed with the response. It's an issue in such cases that there is this sense of virtue by such big companies. It's patronising.
The mac comment triggered me.
But to be clear, I'm not grateful for the work they do. I expect it. I'm a paying customer and, if they want me to remain so, they must correct bugs and innovate because there are other people doing better work.
We're a customer base, not a charity case.
Sorry to get aggy. It just sounded like someone who seems pretty vague about the whole experience whitewashing on Apple's behalf. Not knowing the name for colour invert whilst being the mouthpiece for Apple accessibility, the biggest company in the world, isn't what I want to hear.
Move slow, don't fix things... That's Apple's Silicon Valley mantra.
Again, sorry! Annoyed.
My Thoughts
Agreeing with @Oliver above, quoting:
"We're a customer base, not a charity case."
Look:
That interview I heard (and I listened to all of it), it's almost as bad as trying to get Samsung customer service to fix their interface these days while "hoping" they update their own in-house version of TalkBack in a timely manner (without skipping out on features!).
I'm ending my remarks here, just to be safe.
Oliver
Very soft interview. I would not even call it one. Apple doing their press thing about how much they care and how wonderful they are for doing accessibility. Nothing about asking about bugs with VO, especially on the Mac.
Looking Forward to This!
I haven't listened yet but will do so soon. I'm curious though why iOS 26 and not 19? I guess I'll find out soon enough though. Thanks once again to Apple for keeping accessibility at the forefront, and never missing a beat.
Sarah Herrlinger is absolutely correct
Whether you like it or not, she is absolutely correct. Some people probably configure their Mac differently than others. This could cause some issues. Am I saying that is the case? No, but if you are going to properly test, you need to know these things. So she was 1 million percent correct in asking you to correctly state your settings.
Another thing: remember there are always things they didn't cover at Global Accessibility Awareness Day that get found when the betas come out. So let's see how the betas are first.
If you're happy with a buggy…
If you're happy with a buggy screen reader, that's fine, you do you. There are too many common issues for it to be just user error and, as the designers of the system, it's their responsibility to make sure there are no problems with different configurations. Apple can't run such a locked-down ship as it does, even on Mac, and then blame us for using it weird. That's having your apple and eating it.
As with Siri, they need to start again with VoiceOver on Mac. It's a mess. I can't even spell check this post in Safari on Mac, which I've reported.
Truth is, not enough blind folk use the Mac because they know it's a mess. Professionals use Windows. Apple aren't willing to invest in a low-subscription area, and gone are the days of build it and they will come.
We all know the issues, we all know just how convoluted using VoiceOver is, even in Apple's own apps.
This is no comment about Sarah specifically, more Apple as a whole, with its plastic smile and lack of authenticity. What I'd love is for someone to say, yeah, VoiceOver on Mac needs work... But no one will. Instead, they gaslight us with these silly comments. My configurations are not that odd: Alex at speed 70, and that's about it.
Windows inside a Mac body is the ideal. iPadOS isn't mature enough. It's a huge frustration.
Oliver
You will have those who worship at the altar of Apple and speak against those who point out bugs. I will keep using Apple, but I know Apple does not do accessibility out of the goodness of their heart. It is all about the money. The US government purchases Apple products, and there is a law about accessibility that needs to be followed. We'll see how iOS 26 is!!!
Yes, and for the most part,…
Yes, and for the most part, we do benefit from it, but I agree with you.
Apple isn't here to make accessibility solutions. It's not even there to make cool tech. It's there to make money... Which is fine, creating cool tech, creating accessibility solutions and making money are not mutually exclusive, they just don't always line up so well.
I think the biggest frustration with Mac is the lack of choice in screen reader, and I'm not entirely sure why that is. Variation and competition would be beneficial to us, though. As Apple has shown more and more often these days, instead of innovating and renovating, they would rather exclude competition. I for one hope this hurts them. They could do with a big slice of humility, which it does seem they are getting.
the more feedback we provide, the better
I understand that this interview discussed what's going to come in the upcoming versions of all of Apple's operating systems, but if we don't take the time to send feedback, the issues are going to get worse as the beta cycle progresses. And even after the beta cycle, we should still continue to send feedback, regardless of whether we are beta testing or not. The more feedback we provide to Apple, the better and faster the issues will be resolved. I do applaud Apple for taking the time to listen to our feedback, and maybe this year, once they release all of the tools and resources, they will do something right for a change. As I have said in a previous post, bugs do come and go, but the more feedback we send to Apple regarding these bugs, the quicker they'll get resolved, both in and out of the beta cycle.
Some Thoughts
Let me start by saying that in the past, I’ve been somewhat of an apple apologist, mainly because many of the issues people were describing just hadn’t come up for me. Sarah is correct that in some cases configuration can cause problems, but as Oliver pointed out above, that is absolutely on Apple to test, not the users. They should be trying their software with multiple configurations to ensure that it works correctly before release.
As for macOS, I agree that VoiceOver is a mess at this point. I really started experiencing problems over the last two or three major versions, and voiceover is in desperate need of a redesign. I have a suspicion this may be coming with the next release of macOS, what with it being the 20th anniversary of VoiceOver and an alleged redesign of the interface, but of course I’m not sure and have no insider info. It’s clear that resources are mostly allocated to iOS/iPadOS, as the accessibility experience on those platforms is far superior to current macOS. I’m sure much of this has to do with the age of VoiceOver, I have a feeling some of the code dates back to Tiger or Leopard and has not been touched in years. I’m really hoping they’re working towards ripping it out and starting over, but they may also be replacing components gradually as they go.
Another thing to consider is Apple's own apps. Logic Pro and Final Cut are good examples; accessibility is spotty at best, and they don't feel like they were designed by the same company who makes VoiceOver. Again, iOS is a far better experience. At least for apps like GarageBand and Logic on iPad, everything feels much more integrated and fluid. It may not be the most efficient to use, but you can tell accessibility was considered from the very beginning rather than just being added as an afterthought.
All this to say that we absolutely should be criticizing companies when they don't perform to what we expect as paying customers, and this doesn't just go for Apple, but anybody. People who come on here and get upset that we are respectfully pointing out where things have fallen short don't really add much to the conversation. Sending feedback is very important, we should all be doing that, but companies should also do their due diligence and work on fixing bugs internally. If they need to hire more testers, they should hire more testers; it's really not that complicated. Apple is in quite a predicament at the moment, what with the Apple Intelligence rollout and their anti-competitive behavior on the App Store, and this cavalier attitude from executives where they essentially say we can do no wrong is not helping their case. If they don't clean up their act soon, people will stop buying their products, that's just how it goes. I really love Apple, I want them to do better, but I will consider switching if they're not able to deliver on promised features or fix show-stopping bugs in a timely manner.