What features would you like to see in an update of VoiceOver?

iOS and iPadOS

Myself, I would like to see the triple-tap-and-hold gesture, which was introduced to describe images, become more accurate, as it is often inconsistent in behaviour. Maybe it could also describe .jpg and .png files?
Another thing I’d like to see is better support on webpages with embedded video content or images, as these can currently cause VoiceOver to freeze up.
One other thing is more accurate dictation. I’m not sure if it’s just my accent, but dictation doesn’t seem to get most of my words right, or misses some words entirely.
What other features would you like to see if VoiceOver were to be updated?



Submitted by Brian on Monday, April 9, 2018

I will agree with you on the dictation front. There are times when it replaces a word with a similar word, but there are also times when it replaces the word with a totally unrelated word that sounds nothing like what I said. It also occasionally speaks back the wrong word but actually inserts the correct word. I don’t think it already exists, but I would like to see the ability to repeat the last phrase spoken by VoiceOver.

Submitted by Dawn 👩🏻‍🦯 on Monday, April 9, 2018

I agree with you. I would also like to see VoiceOver describe GIFs. And I would like VoiceOver to speak words with apostrophes correctly, as it only speaks the letters after the apostrophe when I’m typing. But when I leave the keyboard and look at what I’ve typed, it says the word just fine.

Submitted by Mohammed Alwahhabi on Monday, April 9, 2018

I want to see real multi-language support, not the useless implementation we have right now.
I mean in Mac OS X.

Submitted by Ben Vercellone on Monday, April 9, 2018

I greatly appreciate iOS’s Braille Screen Input, and use it on a daily basis with practically 0 complaints. However, I do not enjoy the same level of success when typing into text fields in iOS using the Braille keyboard of a connected Braille Display. I have been using the VFO Focus 40 5th Generation, as well as the Actilino (now sold by HIMS-Inc. in the U.S.). I am not very geeky beyond the experiential level. But I am indeed quite experienced with iOS and Braille. Here are my difficulties, which I admit may not be issues faced by others.
Even in iOS 11.3, when I type at my natural pace (which I would roughly place in the faster half of Braille typists), iOS simply does not keep up with what I am typing. After typing only a small amount of text, some letters and whole words that I type do not appear. Sometimes, before this begins to happen, I notice that pressing the spacebar does not cause the word just typed to appear; after I type one or more additional words, the last two or more then appear in the text field without much trouble. But as I keep typing, the more serious problem I mentioned earlier begins to occur. My Focus 40 display recently disconnected from iOS after I typed a message that was probably not much more than a paragraph. I need to keep this somewhat brief, so I will just summarize this point by saying that I absolutely cannot use either of my Braille displays, paired with my iPhone, as a note-taker system, as much as I desire to do so.
I have not experimented with the hardware Braille input and insertion point issues since iOS 11.3, so I won’t comment.
I also haven’t checked if the following 2 issues still occur for me in iOS 11.3. I will check later, and I encourage others to check. But prior to iOS 11.3, I experienced the following behavior with my Braille displays and Voiceover on iOS.
1. When reading BRF files in the BARD Mobile app with Braille output properly set to eight-dot Braille, the character consisting of dots 4, 5 and 6 would not show up on my display. In other words, the word “world” appeared as “will”, and “had” as “have”.
2. When I had set Braille input to eight-dot Braille, and entered text into a text field, pressing dots 4, 5 and 6 together would actually delete the prior character.
So with the above 2 glitches involving dots 4, 5 and 6, perhaps there is an easy fix.

Submitted by peter on Monday, April 9, 2018

I would like to see VoiceOver have the ability to easily edit and review text using voice commands. I'm thinking of the capabilities now built into Microsoft's voice recognition system, which provides many of the capabilities of more sophisticated programs like Dragon. There is no reason a voice assistant can't do these things these days.

Unless you're using a Bluetooth keyboard, editing and reviewing text, fixing spelling mistakes, etc., using just VoiceOver and gestures is inefficient and not easy. Having voice commands to manage all of this would really improve the experience. After all, if Microsoft can do it, so can Apple!


Submitted by TheBlindKind.com on Monday, April 9, 2018

I’d love it if the three-finger single tap would speak the current item in addition to saying the hint and the position on the screen. Sometimes I forget where my focus is, and I have to flick away from it and then back again to hear it. I would also like it if we could specify that VoiceOver speak the position of the insertion point before any text in the field instead of after it. When I focus on an edit field with a lot of text in it, I don’t want to hear all that text just to find out where my insertion point is, or whether I’m even in an editable text field or in edit mode.

Submitted by cybe on Tuesday, April 10, 2018

On iPhone, I would like to customize gestures by assigning my preferred VoiceOver commands to them.

Submitted by Justin on Tuesday, April 10, 2018

In iOS, the one thing I've wanted for years now is the ability to suppress ellipses in all apps. It's annoying to be reading and then suddenly hear "ellipsis", and I'm like, "I don't care about this." Especially when reading weather discussions issued by the NOAA, which put ellipses everywhere; when I read them, I do a Speak Screen gesture and it works.

Submitted by Deborah Armstrong on Tuesday, April 10, 2018

I'm tired of having numbers read as whole numbers, especially when I have to read an address, phone number, or credit card number out loud while talking to someone and looking this info up.

Submitted by TheBlindKind.com on Tuesday, April 10, 2018

Sometimes, for instance, I would like slower speech in one particular app and would rather not have to slow the speech down after finding it in the rotor.

Submitted by Ann Marie B on Tuesday, April 10, 2018

I have to agree about VoiceOver reading text within images. I am currently using an iPhone 6S, and there are many times when VoiceOver will read the text within an image and it sounds jumbled. I would also like to see VoiceOver read text within videos. I also agree with you on the dictation front.

Submitted by TheBlindKind.com on Tuesday, April 10, 2018

I'd like there to be a way to hear text attributes, something we can enable or disable on the fly, including bold, italic, etc., and including capitalization. I'd like this to be possible without having to explore one character at a time.

Submitted by TheBlindKind.com on Tuesday, April 10, 2018

Instead of reading the character, word, line, etc., that the cursor just passed, I'd like to see a feature we can toggle that tells VoiceOver to read the current one of these elements. This is already possible in VoiceOver for the Mac.

Submitted by Kerby on Tuesday, April 10, 2018

I would also like to be able to customize what VO sounds I want turned on and off.

The thing is, I have friends who don't use periods and use ellipses all the time. If it doesn't speak the ellipses, the message is going to be just one big run-on sentence. lol

Submitted by Devin Prater on Wednesday, April 11, 2018

Okay y'all, I've been grumbling about all this on Twitter, so it's time to dump it all here. Pay attention, Apple employees.
On the Mac, I'd love VoiceOver, and the whole system for that matter, to use 3D sound. I think iOS could do this too. On Windows, there is Windows Sonic for Headphones, which is free, takes audio from 5.1 or 7.1 surround sound, and makes any pair of headphones sound just amazing, with audio in front, behind, and all around the user. There is also headphone virtualization, which takes stereo audio and makes it sound better as well. This is why I usually use my Windows PC for entertainment instead of my Mac. Apple, I know you can do better.
With VoiceOver, I don't want to hear that something is bold, plain. Italics, plain. No. I want to know that it is bolded or italicized through changes in pitch, volume, rate, intonation, head size, formants: all kinds of speech qualities, especially with the neural-net Siri voices, can be changed to show formatting, not just tell about it. This makes reading text, especially Bibles and other study material, beautiful and engaging. This can, and should, be done on iOS as well. Apple, you can do better.
On iOS, Braille Screen Input is great. I use it almost exclusively. There's only one big problem: it's not a full keyboard. It does not interface with Siri to tell it how I type, it does not allow for entering predicted text, and it doesn't allow for spell-checking the huge email you just wrote from within Braille Screen Input. Apple could fix this by allowing swipes from different fingers to do different actions. Swipe down with your ring finger, for example, and VoiceOver might read the text area back to you. Swipe down with your middle finger and VoiceOver could cycle between different ways of acting upon the text, like bolding the next word or previous word, or clearing all text, with a double tap to activate that function. Also, with Bluetooth keyboards, you can do all these keyboard commands; you can even send a message just by pressing Return (Enter). Why not allow that with Braille Screen Input? Better yet, make Braille Screen Input its own interaction mode: swipe left or right to move between objects, or type to search for them just like you can on the home screen, and swipe right with two fingers to activate. There is so much possible with Braille Screen Input, and Apple doesn't seem to care about it. Apple, you can do better.
Sounds are a big part of interaction, and Voiceover was pretty much the first screen reader to implement sounds. Now, why not make them 3D? Show us, on the Mac and even on iOS, where objects are in 3D. Show us progress bars by playing their position in 3D. Show us those beautiful animations by playing a sound equivalent. App launches, bookshelves moving, waveforms as Siri listens and talks, all these are unknown to us. All we hear is the "new screen" sound. All we hear is the results. I, for one, want to know just how beautiful macOS and iOS are! I for one want to enjoy using my devices. Apple, you can do better.
Apple now runs on 100% renewable energy. Imagine, AppleVis, if Apple took accessibility as seriously as it takes the environment. Imagine, if you will, that Apple figured out that people are so much more important than trees and metal. Every person is unique; there can be only one. Trees can be regrown. Plants can be brought back from extinction. But when people die, when people leave, that's it. This isn't to say that the environment isn't important; there cannot be people without a planet to live on. But accessibility for everyone is something they can do also. Imagine, AppleVis, 100% accessibility for everything at Apple. Not just good enough. 100% accessibility is possible for Apple and all of their teams, and they could even go farther, requiring apps that are not visual in nature to be 100% accessible. So come on, Apple. Let's do better.

Submitted by TheBlindKind.com on Wednesday, April 11, 2018

Voice pitch and alias changes in combination with certain attributes being spoken are how I use JAWS. Without that ability, I would not be able to do my job, which involves massive amounts of writing and proofreading. All screen readers would do well to implement this feature set. I bet plenty of users would wonder how they ever did without it.

Submitted by JDubz on Wednesday, April 11, 2018

In iOS:
1. I want the Podcasts app to tell me which podcast each episode belongs to! When I check for new podcast episodes, VoiceOver does not announce the name of the podcast, only the name of the episode. So I have to go through them one by one, trying to guess which of my many subscribed podcasts each belongs to, and then start playing episodes when I inevitably can't figure it out. The only way to know which podcast you're exploring is to navigate by show and check each show one by one for new episodes.
2. I'd like Apple to fix whatever is causing my iPhone 7+ to be unresponsive to text input from my Apple bluetooth keyboard. It works in some apps but not others. Additionally/alternatively, I'd like to have a way to temporarily access the on screen keyboard while my bluetooth keyboard is still connected.
3. I also fully agree with #6 and #9 above.

In MacOS:
1. I'd like to see the same text editing behavior across different apps (e.g. TextEdit, Pages, Mail, Notes). Each behaves slightly differently, and my primary issue is that I can never figure out where my text cursor is jumping to!
2. When flagging and unflagging emails, VoiceOver should announce "flagged" or "unflagged" like it does in iOS. Instead it just says "red" both times, so you have no way of knowing if you just flagged or unflagged an important email (without hopping the VO cursor around to confirm).

And I think this MacOS one deserves its own category:
In the latest High Sierra update to 10.13.4, the AppleScript handler option was completely removed from the Messages preferences. I rely on messages being automatically read aloud to me as they are received. Does anyone know of another way I can make this happen? It currently doesn't even play a sound for incoming messages in the active window, so I don't even know when it would make sense to navigate back to the conversation area to look for and read new messages.

My iOS 11.3 iPhone and Actilino act the same way, though disconnects are not as frequent. For some reason, turning off not just speech, but also sounds and hints, seems to improve input slightly. I have only experienced the first issue with the BARD app; it began even in beta releases of iOS 11 and was reported at least twice by me. This happened with the Braille Edge I had at the time, and still happens on the Actilino. I think that in trying to make it less necessary to use dots 7 and 8 while typing, they did something seriously wrong to the rendering of characters which use these dots. Recall that BRF files use dot 7 along with this character; BARD Mobile just hides it from the user. I haven't experienced the second issue, though I admit that once I found dots 4-5-6 missing, I gave up playing with the app at that point.

Submitted by peter on Wednesday, April 11, 2018

When I listen to a podcast with my Downcast app, the length of the podcast is displayed as hours:minutes:seconds. If the minutes field is "00", VoiceOver skips it and speaks only the hours and seconds, so that a time like 1:00:14 is spoken as "one fourteen".

This gives me incorrect voice feedback for the time since it could mean one hour and fourteen minutes.

I understand I could choose to have VoiceOver speak punctuation, but when reading other text I really don't want to hear all the punctuation. In fact, even for this time example, I don't want to hear the punctuation, but I do want to obtain a correct speech indication of the time.

Speaking a time such as 1:00:14 as
one zero zero fourteen
would be much more helpful. VoiceOver shouldn't skip the zeros!
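The fix being requested here is simple in principle. Below is a minimal sketch in Python, purely illustrative (VoiceOver's duration handling is not user-scriptable, and the function names are hypothetical), of speaking every colon-separated field of an H:MM:SS duration, including zero fields:

```python
# Number words for 0-59, enough for minute and second fields.
ONES = ["zero", "one", "two", "three", "four", "five", "six", "seven",
        "eight", "nine", "ten", "eleven", "twelve", "thirteen", "fourteen",
        "fifteen", "sixteen", "seventeen", "eighteen", "nineteen"]
TENS = ["", "", "twenty", "thirty", "forty", "fifty"]

def number_words(n: int) -> str:
    """Spell out an integer in the range 0-59."""
    if n < 20:
        return ONES[n]
    tens, ones = divmod(n, 10)
    return TENS[tens] if ones == 0 else f"{TENS[tens]}-{ONES[ones]}"

def speak_duration(hms: str) -> str:
    """Speak every colon-separated field, never skipping a zero field.

    Dropping a "00" field makes 1:00:14 indistinguishable from 1:14;
    speaking "zero zero" for it removes the ambiguity.
    """
    spoken = []
    for i, field in enumerate(hms.split(":")):
        if i > 0 and field == "00":
            spoken.append("zero zero")  # keep the zero field audible
        else:
            spoken.append(number_words(int(field)))
    return " ".join(spoken)

print(speak_duration("1:00:14"))  # one zero zero fourteen
print(speak_duration("1:14"))     # one fourteen
```

With a rule like this, one hour and fourteen seconds can never sound the same as one minute and fourteen seconds.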


Submitted by cybe on Thursday, April 12, 2018

Of course, the best feature I would like to have is a VoiceOver option where you could send all your suggestions and requests directly to the VoiceOver developers, and they would listen and grant your wish.

Submitted by Vsevolod Popov on Thursday, April 12, 2018

I’d like dictation to be as accurate as it is on Android! I hate that it skips words while pasting! And I’d like an option to switch languages in Braille Screen Input, and an open TTS API.

Submitted by Endarion on Thursday, April 12, 2018

An object, such as a button, can technically be Voiceover-reachable but actually not visible on the screen. That is, you can swipe to it, but it may be covered by something else, or simply not in view. When the VO cursor is on an invisible object, double-tapping has unpredictable consequences as the object either has no screen position, or the screen position is taken up by something other than that object. Ever wondered why double-tapping sometimes doesn't even make the activation sound? Object has no screen position. Or wondered why it activates something else? Object was covered by another object. Sometimes apps even crash when this happens as they are unprepared for activation of an invisible item.
So please make VO more aware of these situations. For something that needs to be scrolled into view, please make VO automatically do so, or create a setting to toggle this behavior. For objects which are covered, please prevent them from being swiped to.
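The failure mode described above can be captured in a toy model. This is illustrative Python, not actual UIKit code, and every name in it is made up for the sketch; it shows why activating an element with no screen position fails silently, and why activating a covered element hits the covering element instead:

```python
class Element:
    """A screen object with an optional frame (x, y, width, height)."""
    def __init__(self, name, frame=None):
        self.name = name
        self.frame = frame  # None models "not laid out / no screen position"

    def contains(self, point):
        x, y, w, h = self.frame
        px, py = point
        return x <= px < x + w and y <= py < y + h

def activate(elements, target):
    """Simulate a VO double-tap: tap at the target's center and dispatch
    the tap to the topmost element at that point, as the comment describes."""
    if target.frame is None:
        return None  # no screen position: no activation sound, nothing happens
    x, y, w, h = target.frame
    center = (x + w / 2, y + h / 2)
    # elements are ordered bottom-to-top; the topmost hit wins
    hit = [e for e in elements if e.frame and e.contains(center)]
    return hit[-1].name if hit else None

screen = [Element("Play", (0, 0, 100, 50)),
          Element("Overlay", (0, 0, 320, 480))]  # covers Play entirely
assert activate(screen, screen[0]) == "Overlay"        # wrong thing activated
assert activate(screen, Element("Offscreen")) is None  # silent failure
```

The two assertions correspond exactly to the two symptoms in the comment: double-tapping a covered object activates whatever is on top of it, and double-tapping an object with no position does nothing at all.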

Submitted by Dennis Long on Friday, April 13, 2018

I would like to see third-party TTS allowed system-wide on Apple platforms. I would also like to be able to read numbers as digits. Give the users more flexibility and options.

Submitted by MikeFont on Friday, April 13, 2018

Time and time again, I have gone into the verbosity settings for VoiceOver and told it not to read or speak any table elements at all, and yet I still have to wade through table heading announcements while reading from the top down. The other HTML elements also need their own sets of customizations, such as landmarks, images, and the like. I only use the native iOS browser, so I'm not sure if this applies to other browsers.

Also, let me totally agree with the previous post, #23: having VO land on invisible objects that can't actually be clicked is not only confusing but downright maddening! The first few times I encountered this, I nearly wore out my screen protector. :)
Please help our poor VoiceOver become more aware of these off-screen or invisible clickable objects.

Submitted by Ben Vercellone on Friday, April 13, 2018

Thank you Shawn, regarding #19. From my perspective, the iPhone and a Braille display could make a very powerful combination if all of the Bluetooth and/or Braille issues were eliminated. I personally cannot think of a single option on the market right now that is as good as the stated possibility, unless I were to give up the portability factor. Oh, how I wish that I could have a completely unhindered version of Windows 10 as the OS on a smartphone, in addition to whatever other features would be added. To be fair, I think that having Mac OS and iOS merged would also be pretty cool from a Braille perspective. But regarding this last statement, it would take Apple ironing out a growing number of glitches. I am not much of a perpetual complainer regarding Apple, and have sometimes defended Apple in light of certain complaining against them. Of course I'm not saying that I've always been right in defending Apple when I did. I'm just briefly describing the history of my thought process regarding Apple. But at this point, I am almost utterly dismayed with the state of hardware Braille support from Apple. Judging by my experiences, I think we're going to need a very distinct and intentional change in Apple if we want Braille support to move forward in a way that empowers all. It is not empowering to all if inputting text through a Braille keyboard is unreliable. When has this kind of text input been reliable for practically all people, other than those experiencing eccentric glitches? Educate me, please. As far as I can remember right now, text input through a Braille keyboard, obviously when using VoiceOver, has been unreliable for more years and months than it has been reliable. It's hard to tell emotions via text. Let's just say I do everything I can to be cordial, and desire positive and professional communication at all times. However, part of positive communication involves realism.
My view is that aside from a few very nice Braille enhancements in iOS 11, which I know of and greatly appreciate, Braille support is still quite rusty for VoiceOver users. It does not need to be this way. Does anyone else get the Wizard of Oz feeling with Apple? I don’t think it mattered as much several years ago, when Apple's forward progress seemed more realistic, and people drooled by the thousands over almost all Apple developments and announcements. I was one of the droolers, so I'm not judging. I wish Apple the best. It is possible for things to get better. But one thing that must change is the communication paradigm. There must be an open dialogue, whether or not we’re talking about accessibility. If this does not happen, and if Android meets or exceeds the accessibility of Apple hardware and software, it is pretty certain that many of us will no longer have Apple devices. I hope it does not come to this. The only reason I'm still in the iPhone camp is because I think the accessibility is still a bit better than with Android for my personal needs. I will say that in my experience, this advantage continues to shrink. This regression is neither necessary nor intelligent.

Submitted by Vsevolod Popov on Friday, April 13, 2018

Hi! I agree with poster #26! The accessibility of Apple devices, in my opinion, has started regressing. Even Braille Screen Input doesn't work as needed; there are bugs with entering some combinations of Russian letters, and Apple still hasn't fixed them! They also haven't fixed the issue with reading capital letters with the Russian enhanced voices; it's really awful! It seems like Apple will never fix it!

Submitted by Vsevolod Popov on Monday, April 16, 2018

Hello! Russian voices don't read capital Russian and English letters correctly. It seems like they read phonemes or something like that, but it's impossible to read English or Russian text that's written in capital letters! It seems like Apple will never fix it! Apple doesn't care about the problems of Russian users; we don't matter to them! For example, there was a big issue with the Music app and the Russian interface: VoiceOver would get stuck in the track list in albums and playlists, and on the player screen. It appeared in iOS 10.3 and they fixed it only in iOS 11.3! That's horrible!

Submitted by Claus on Saturday, April 21, 2018

When using a hardware braille keyboard: find a reliable way to move to the top of an edit field, instead of moving out of the edit field, when you review what you have typed.

Submitted by Claus on Saturday, April 21, 2018

Like others have suggested, it would be nice to set VO settings not only globally, but to have an option to set them for a specific app. Talks, and I think the other Symbian screen reader, could do this, so it should not be impossible in iOS.

Submitted by Claus on Saturday, April 21, 2018

In reply to by Vsevolod Popov

Here I will defend Apple. The makers of the current voices used by screen readers do not care about this niche market. For years we have been trying to get in contact with Nuance to fix several problems in the Danish voices, without luck. It is impossible to find people inside Nuance who know anything about speech synthesis, and they could not care less. Wrong pronunciations of words; speaking something other than what is on the screen (yes, it does happen); and other problems. The only thing Apple has been able to solve is a bug that they introduce from time to time in Danish: wrong pronunciation of the 3 national characters we use. This seems easy for them to solve, since it is fixed as soon as we report it as a bug.

Submitted by Claus on Saturday, April 21, 2018


VoiceOver actually does change the braille table in use when you switch to another language. That is much better than the braille support in Android, even though the latter has been improved lately.

Submitted by Claus on Saturday, April 21, 2018

Extend the option to turn off the emoji message so that reading of emojis can be turned off entirely. On Facebook, people misuse emojis a lot; it is a terrible waste of time listening to meaningless emojis, and they take up much too much space on a braille display.

Submitted by SoundSchemer on Sunday, April 22, 2018

I also agree with Devin's post. I really would like iOS 12.0 to include the novelty TTS voices, such as Bells, Bad News, Cellos, Deranged, Good News, and the others. Also, a TTS engine selection list would be good, as demonstrated below.

Speech, heading.
Edit, button.
Text to speech engine in use: Vocalizer Expressive tts engine, button.
Voice: Bells, button.

That's the layout I think the speech panel should have.

Submitted by Vsevolod Popov on Sunday, April 22, 2018

Hi! I agree with the previous poster. It would also be really cool to see the updated Nuance V3 voices that we now have on Android devices. These new voices have more inflection than the voices we currently use.

Submitted by Erick on Tuesday, April 24, 2018

More language support would be great, like full support for every type of language.

Submitted by nohansa nuh on Tuesday, May 1, 2018

I'd like to see audio routing in iOS, like on the Mac: when music is playing on a Bluetooth device, VO output can stay on the Mac's internal speaker.
