Hi AppleVis community, I'm giving a talk in a few weeks to iOS and macOS developers at the /dev/world conference in Melbourne, Australia: Practical tips to confidently support VoiceOver and Voice Control. The basic thrust of the talk is: "As a developer with relatively intact vision, it's easy to feel uncertain about what 'good' sounds like, and hard to tell whether you've missed better practices that could make a big difference to users who are blind or have low vision. This talk presents examples of what a great VoiceOver experience sounds like, and practical tips for how to implement it."
With 25 minutes for the talk, I'll have time to make about five to eight points. Since I also want to briefly cover the new Voice Control features in iOS 13, that leaves room for about five VoiceOver examples.
Clearly, making things actually accessible to VoiceOver is the minimum baseline. What I'm interested in focussing on is some "near miss" examples—examples where an app is accessible via VoiceOver, but where there is something specific the developer could have done to make the VoiceOver experience better: more streamlined, closer to best practices, or in some way more of a pleasure to use.
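To give a flavour of what I mean by a "near miss" (this is just a hypothetical sketch I might use in the talk, not taken from any particular app): an icon-only button that VoiceOver can reach and activate, but which is announced with no useful name, and the one-line fix that makes it speak properly.

```swift
import UIKit

// Hypothetical "near miss": this button is focusable and tappable with
// VoiceOver, but with no label it may be announced as just "button"
// (or by its image name), which tells the user nothing.
let favouriteButton = UIButton(type: .system)
favouriteButton.setImage(UIImage(systemName: "heart"), for: .normal)

// The fix: give it a concise, meaningful label and an optional hint.
// Note the label doesn't include the word "button"—VoiceOver already
// announces the trait.
favouriteButton.accessibilityLabel = "Add to favourites"
favouriteButton.accessibilityHint = "Adds this article to your favourites list"
```

That's the sort of small, concrete before-and-after I'd like each example in the talk to have.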
I would love suggestions from the AppleVis community: any examples that you think would be good to cover, or any other 'pet peeves' that you'd like to see put in front of a roomful of iOS and macOS developers in a few weeks' time. I will also be mining the AppleVis forums for examples that may already be here. :)