Comments
full keyboard parity with macOS
Keyboard parity with macOS would be amazing! There are several Mac keyboard commands that aren't on iOS.
More Apple Intelligence features
I have an iPhone 15 Pro Max, which runs Apple Intelligence. I would like to see better Siri integration with other services like Perplexity.
Checking beta
VoiceOver shouldn't be crashing; Apple should check before it releases. I get that betas tend to crash, but when VO no longer works at all, that's not good. Do they not test iOS on all their devices? I thought they had an accessibility person who tests the betas.
I agree
I agree with you.
I thought Apple had a person who tests the software before releasing it.
No software is bug free
If you don't want software that could have bugs, don't install it, plain and simple. I don't care how well you test things internally; you can't catch everything. Again, if you don't want the possibility of bugs, don't run betas!
Dennis Long
No one disagrees with that. I installed the beta and I am responsible, but Apple is responsible too. I do not kneel and worship Apple and kiss their feet.
VoiceOver
AI image descriptions, of course. TalkBack has them. The company that's been putting neural chips in its phones for years and years should have them by now. But they won't be in iOS 26, and probably won't be in iOS 27 either. So, meh. As usual.
Re: Perplexity
I would like to see Apple Intelligence integrate with Perplexity, since there are currently rumors that Tim Cook is planning to acquire Perplexity for Apple.
ChatGPT 5
That is coming in iOS 26. Also, Apple has a secret team working on AI; it's a new team. I want stability: make VO better and stronger.
agree with devin
I agree with Devin. TalkBack's image descriptions are awesome and I use them all the time; I really wish VoiceOver had them. Oh, and fix all this VO jumping around and getting stuck. No other screen reader I use does this, so Apple really needs to fix it.
I would like to see a complete and utter
A complete and utter lack of bugs.
Bruce Harrell
I would like Apple to update VoiceOver and fix the bugs that some are having. Don't add any new features; just check for bugs and make sure it works well.
Amen Oliver!
You the man!
Local Agendic AI
I would like to see what Apple promised and hinted at last year, i.e., AI and Siri that work across apps to take actions on the user's behalf based on what is on the screen and the capabilities of apps.
We are now seeing AI agents starting to roll out and be useful. What Apple has that is unique is the ability to make these services run locally. There are many things that I don't like sharing with AI tools that run on the cloud. Apple can have a real edge here if they do this and implement it correctly.
--Pete
VoiceOver stability and consistency
That's it! That's truly all I care about. I can't even list anything specific because most quirks I've just come to accept and work around, but somehow it seems that, in a lot of ways, iOS 7 on my old iPhone 5S was the least quirky iOS I can remember.
The one thing I can think of that is always frustrating is how, on some websites or apps, we can't double tap on edit fields in order to fill them out. Sometimes using a different web browser helps, but other times, especially when it happens in an app, that's not an option. So, at least as far as I'm concerned, Apple shouldn't even be thinking about new bells and whistles until the underlying operating system is at least a bit less buggy.
Universal clipboard, automatic translation, and other thoughts
One feature I wish existed—either on the iPhone or as some kind of universal keyboard—is the ability to copy text from anywhere, without restriction, and paste it wherever I want. I understand that some apps disable copy/paste for security reasons, like when you’re entering a one-time code into a secure field, but there are plenty of other platforms that block copy/paste for no real reason. This makes navigation harder, especially for me when I’m using a screen reader.
Sometimes I come across text in an app or on a webpage that isn’t accessible. In those cases, I’d love to be able to copy the information into my Notes app so I can read it at my own pace. It would be even better if there was a universal “copy to notes” feature that worked anywhere—automatically saving whatever I copy into a special section of Notes that keeps my clipboard history. That way, if I accidentally copy something else (like tapping on my home screen and wiping out my clipboard, which happens more often than I’d like), I wouldn’t lose the original text forever.
This would also be useful when I’m reading inaccessible webpages. Sometimes I want to copy the information in chronological order, skipping ads and irrelevant content, so I can paste it into Notes and read it smoothly. The “Reader Mode” in browsers does a decent job of isolating the main text from ads, but I’d love to combine that with a universal copy feature that captures only the content I want. Or the static text feature on the rotor is often pretty good at letting me skip over the advertisements so I can read the text a little better. I don't want to bypass security or copy sensitive information. I'm not sure if it would be a security risk to allow the copy/paste options, if they are disabled, to become available with the detection of a screen reader.

The same goes for books or documents that aren’t in true text format. A lot of PDFs and eBooks are essentially just images of text. It would be amazing if the iPhone could natively read that text without me having to open a separate OCR app. This could also help blind users who want to enjoy comic books, manga, or graphic novels. Instead of screenshotting every single page and running it through a separate app, the phone could simply recognize the text in each panel and read it aloud in sequence. Ideally, this feature would be smart enough to handle both simple paragraphs and more complex layouts, automatically translating the visual text into spoken or digital text. That way, whether it’s a webpage, a PDF, a picture, or a comic book, the iPhone could simply read it out without me having to jump through hoops. If Apple integrated something like this into the system itself—combining universal copy/paste history with built-in OCR—it would be a massive accessibility upgrade.

There are books, studies, or research papers I’d love to read—info I know I would enjoy or benefit from—but they’re written in languages I don’t speak.
Spanish, French, Chinese… it doesn’t matter what it is, if it’s not in English, I have to rely entirely on translation tools. And while those tools exist, the process is far from smooth. There’s this one website I’ve used from time to time that can translate entire documents. It’s an all-in-one package, which is great in theory, but in practice it’s slow, clunky, and loaded with advertisements that make it frustrating to navigate—especially with a screen reader. On top of that, it has a strict limit of around 150 pages per document. That means if I have something that’s 300 pages long, I have to split it into two separate files, upload each half individually, and then try to keep track of which one is the English version, which one is the original, and which one is the translated half of the original. By the time I’m done, my phone looks like a digital jigsaw puzzle of mismatched file names and duplicate PDFs.

What I’d love is for the iPhone to have a native way to handle this. I’m not even talking about real-time translation while reading (although that would be amazing). An app where I could upload an entire PDF—regardless of size—choose my target language, and then have the phone or an app convert it into a clean, fully translated PDF. I could then open that file in Books or VoiceOver and read it like it was originally written in English. I know Apple has its own Translate app, but as far as I know, it doesn’t support uploading full documents for translation—it’s mainly for short text and conversation mode. And, OK, I would also love for Chrome extensions to be supported natively on the iPhone.
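For what it's worth, the native OCR wished for above is closer than it might seem: Apple's Vision framework already provides the on-device text recognition that many third-party OCR apps build on, so a system-wide reader would largely be a matter of Apple surfacing it. A minimal sketch of recognizing the text in a single image (the function name and image source are just illustrative, not any real Apple feature):

```swift
import UIKit
import Vision

// Recognize text in an image using Apple's on-device Vision OCR.
// Works on any bitmap: a scanned book page, a screenshot, a comic panel.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        // Keep the single most confident reading of each detected line.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate    // slower, but better for dense book pages
    request.usesLanguageCorrection = true

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        try? handler.perform([request])
    }
}
```

The recognized lines could then be handed to VoiceOver or saved to Notes; what's missing today is exactly the system-level glue the comment describes, not the recognition itself.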
I don’t want to complain, because I’m genuinely grateful for the products we have today. But does anyone know why dictation on the iPhone has gotten so bad? I remember when I had my iPhone 7 Plus, and even way back with the 5C, I never had these issues. I don’t use dictation as much anymore, but what happened to it? Did they change the way dictation is processed or something? I swear I read somewhere that at one point it was processed either on the cloud or directly on the phone, and one method was better than the other. I’m not a genius here, so bear with me, but I’m pretty sure whatever setup they were using before was way more stable. Back then, dictation would get 90% of what I said correct. Now? Not even close. Honestly, I think the decline started around iOS 7 or so, and it’s never felt the same since. The point is that dictation has gone downhill badly, and it’s frustrating.
One bug I wish they would fix is this strange thing that happens when I’m doing a task—usually on my camera, home screen, or browsing a website. Out of nowhere, the device starts picking up random bits of text from anywhere on the phone and reading them back to me at the end of whatever sentence it’s speaking. I could be scrolling and suddenly hear “screenshot for you… maybe dog… maybe cat” or just random letters and numbers. And no, it’s not related to image or text recognition—that’s all turned off. It’s pulling random scraps of text from who-knows-where and throwing them at me like it’s trying to make alphabet soup. The only way to get it to stop is to turn the device off and back on again, which is ridiculous.

Then there’s the Speak Screen issue. Sometimes, when I use the Speak Screen feature, it’ll get stuck. I can minimize the window, but there’s no way to dismiss it completely. It sits there, frozen on my display, refusing to leave like an unwanted guest. Once again, the only fix? Turn the device off and on again.

I promise I'm not trying to be that person who complains for the sake of complaining. But these bugs are difficult to describe and to try to replicate. If I remember correctly, didn’t Apple delay the release of iOS 12 at one point because they wanted to focus on making devices more stable instead of rushing new features? Feels like now we’ve got more bugs than ever. Don’t get me wrong—overall, the software is pretty stable, and VoiceOver works for the most part. But there are these little quirks here and there that I wish we could get rid of. I understand that the majority rules, and if most people aren’t experiencing an issue, it’s probably not going to be a top priority. But some of us are dealing with very niche problems that are incredibly hard to describe, let alone capture in a screen recording. It’s even harder to get them into the hands of a technician or the accessibility team in a way that truly shows what’s going on.
It’s like—yeah, sure, we’re in 2025, we’ve got all these amazing tools, and the iPhones are better in many ways. But these little quirks still sneak in and make the devices awkward to use. Not unusable, but definitely not as smooth as they could be. And I know fixes take time, resources, and careful testing… but it would be nice to see some of these long-standing issues finally ironed out.
Same as last year
As per others: no new bugs, and some existing ones fixed, please. But ignoring that...
I would like Apple to open up access so I can use my Meta Ray-Bans with more things. For example, I would like to be able to summon the S lady and tell her to open Seeing AI and OCR the document I'm looking at, without having to take my phone out of my pocket. That could then pave the way for integration with navigation apps, etc., or even the Magnifier features in iOS, like door detection, but powered by my face.
Built-in image descriptions using Apple Intelligence would be nice.
A slightly less stupid Siri that can cope consistently and work within apps to achieve tasks.
Voices in VoiceOver that are on par with those lovely Microsoft Natural/Neural voices (whatever they are called).
If age verification is going to suddenly become widespread, then I would like Apple to add something to iOS so I can verify my age there and then pass assurances to sites and apps to say "yes I am old enough, Apple says so" in a universal and accessible way.