Hello all, previous topics I have seen on here talk at length about new voices and new features for VoiceOver on the Mac. Here, however, let's discuss what you have found in VoiceOver that's new in iOS 16. Before I start, a small disclaimer: this is beta software, meaning that any of this could be changed, modified, or even entirely removed in the next update if it is found to be too unstable. I will do my best to update the post if that ends up being the case, but for users not on the beta, don't necessarily expect things to work exactly the same in September as they do now.
For my part, this seems to be a release mainly focused on bug fixing, which is actually very welcome. I did, however, notice a few more things:
- Under Verbosity, there is an option called Web Rotor Summary. It can be set to speak, braille, or do nothing. For a long time, I wondered just what this web rotor thing was, mainly because I long ago stopped using the rotor to navigate by elements on the web and use gestures instead. Now, however, I know what it's all about. When you are on a web page, and assuming the element type is one you have in the rotor, you will hear its count when you focus on it. For example, say you set your rotor to headings; after a brief pause, you will hear something like "5 headings", assuming the page has 5 headings on it. The verbosity option mentioned above allows you to turn this announcement off, if you so choose.
- Modernized sounds. The sounds for dictation, as well as for starting and stopping Siri via the side button, have been modernized. Not much to say about this one; sounds are quite subjective, but I personally think these are quite nice. They are subtle, which might not be ideal for people who are hearing impaired, though I personally can't comment on that. In addition, a new sound has been added: a small click you will hear while an item is focused using Screen Recognition. This is quite useful if you have enabled the feature by accident, since in apps that are already accessible it can cause a worse experience. This sound lets you know you are currently focusing elements using Screen Recognition.
- Announcing directional changes. This one is actually a little specific, so it might be a bit hard to explain, but it helps you avoid one very annoying bug, or at least know when you are encountering it. As you may or may not know, there are two text directions: left to right (used in English and many other languages) and right to left (used in languages like Arabic). Depending on which one is used, VoiceOver behaves differently. For example, in a right-to-left context, all gestures are inverted: swiping right behaves as if you were swiping left, and so on. You might be wondering why this matters if you don't speak any of these languages. Well, it actually does. In certain cases, developers use a method of displaying text right to left that confuses VoiceOver. Have you ever noticed a situation where, in an application, as you are swiping to the right, VoiceOver gets stuck and navigates back and forth between 2 different elements over and over in a loop? If so, you have encountered the bug I am talking about. Note that this never happens in Apple's native apps, so if you encounter it there, that's definitely a different bug; this one only occurs in 3rd party apps. An example is Google Authenticator. If you try to add an account and swipe to the right after entering a code, VoiceOver will get stuck between the control that chooses whether the account is time based and the text field for entering a key; as you swipe right, you will only go back and forth between these two controls. In iOS 16, before VoiceOver reads the time-based control, it will first say, "directionality changed to right to left". This means you should swipe left instead to proceed past this control. In fact, if you do so, you will discover the add button, which was previously not reachable by swiping, and VoiceOver will announce that the directionality is back to left to right, meaning you can resume the standard right swipe.
Note that this example is quite specific. It's entirely possible Google will address it in a future Authenticator update, and in fact I hope they do, but you have probably encountered this scenario in other apps and wondered why VoiceOver got stuck like that. In iOS 16, it should be a little clearer when this happens.
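For any developers reading along, here is a minimal sketch of one way an app can end up creating the kind of forced right-to-left context described above. To be clear, this is pure speculation on my part about what apps like this might be doing; the screen and view names are hypothetical, though the UIKit API itself (`semanticContentAttribute`) is real.

```swift
import UIKit

// Hypothetical "add account" screen, loosely modeled on the scenario above.
final class AddAccountViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        let keyField = UITextField()
        keyField.placeholder = "Key"

        let timeBasedSwitch = UISwitch()
        // If a developer forces a right-to-left layout on part of the screen,
        // perhaps to render some text or a code a certain way, that subtree
        // now has the opposite directionality from the rest of the page.
        // This mismatch is (I assume) the kind of situation iOS 16's
        // "directionality changed to right to left" announcement surfaces.
        timeBasedSwitch.semanticContentAttribute = .forceRightToLeft

        let stack = UIStackView(arrangedSubviews: [keyField, timeBasedSwitch])
        stack.axis = .vertical
        view.addSubview(stack)
    }
}
```

The fix on the developer side would presumably be to leave `semanticContentAttribute` at its default (`.unspecified`) and let the system pick the direction from the user's language, rather than forcing it per view.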
- Time announcement. Time will tell whether this is a bug or a feature, but it seems that when waking the device, VoiceOver now announces the date rather than the time. I am personally leaning towards this being a bug, but since sighted users got many visual options to customize the Lock Screen, I am also really hoping that in the future Apple makes this customizable and lets us choose exactly what we want spoken on the Lock Screen. Options could include the time, the date, the notification count, or nothing at all, so VoiceOver would stay silent when you wake the device if that is what you prefer. Of course, each of these could be individually enabled or disabled, just like other verbosity options. I think that would be a welcome change, but of course it is more of a suggestion than anything else. For now, if we can't get any customization, I certainly prefer the time being spoken over the date.
And that's all for now. Found anything else I missed? Feel free to respond. Let's focus on new features here rather than bugs, though, as with my last point, if you aren't sure whether something is a new feature or a bug (which can absolutely happen), feel free to mention it anyway.