iOS App VoiceOver Accessibility Test Plan
I am proposing that, as part of accessibility advocacy work with iOS app developers, we provide an easy-to-follow test plan that will, at least in the vast majority of cases, result in an accessible app for people who are blind or have low vision.
I am asking for constructive feedback from AppleVis users on how I may improve this testing plan.
Please keep in mind that we want such a plan to be a simple and effective way for us to test the nonvisual accessibility of apps and to encourage app developers to perform accessibility testing and correct the accessibility bugs they find.
The purpose of this plan is to effectively test the accessibility of an iOS app with Apple's built-in VoiceOver for blind and low-vision users.
1. Make sure VoiceOver is turned on for all testing. VoiceOver may be enabled in Settings > General > Accessibility > VoiceOver on any iOS device. It is also strongly recommended to triple tap the screen with three fingers to enable the Screen Curtain. This feature blanks out the screen, resulting in a more realistic environment for nonvisual accessibility testing.
2. Open the app.
3. Tap the top of the screen with four fingers.
4. Flick to the right through all elements on the app's home screen.
A. Are all controls labeled in a way that makes sense when you listen to VoiceOver without looking at the screen? Make detailed notes of anything that does not make sense. This applies to maps and any other elements that may require special consideration in order to provide equal accessibility.
B. Are you able to choose all buttons and other appropriate controls by double tapping them as you hear them read by VoiceOver? In other words, if a sighted person would choose an element by tapping, does it operate correctly if double tapped by a VoiceOver user? Make detailed notes of any situations where a control does not function as expected when double tapped.
C. Are all elements available to VoiceOver? Pay special attention to anything that is skipped (not heard at all) while flicking. Make detailed notes of any skipped elements.
5. Flick to the left through the same home screen. Make detailed notes of anything that does not seem to function as expected with VoiceOver enabled.
6. Tap the top of the screen with four fingers.
7. Flick to the right, one element at a time, and double tap the first item where choosing it should lead to another screen.
8. Repeat steps 3 through 5 on every screen the app contains, testing and noting any issues found with all elements.
9. Using the notes obtained from testing, make all bug fixes necessary to deliver a fully accessible experience for users who rely on VoiceOver.
10. Check your work using blind alpha testers, followed by a select group of beta testers from the blind community.
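For developers working through steps 4A through 4C, most of the fixes come down to a line or two of the UIAccessibility API. Here is a minimal Swift sketch, assuming a hypothetical settings screen; the class, control names, and label wording are all made up for illustration:

```swift
import UIKit

// Hypothetical screen illustrating the fixes most often needed
// after steps 4A-4C of the plan above.
final class SettingsViewController: UIViewController {
    private let saveButton = UIButton(type: .system)
    private let volumeSlider = UISlider()

    override func viewDidLoad() {
        super.viewDidLoad()

        // 4A: an icon-only button needs an explicit label that makes
        // sense when heard without looking at the screen.
        saveButton.accessibilityLabel = "Save settings"

        // Non-text controls benefit from a label plus a spoken value.
        volumeSlider.accessibilityLabel = "Playback volume"
        volumeSlider.accessibilityValue = "50 percent"

        // 4C: a custom view is invisible to VoiceOver (skipped while
        // flicking) unless it is marked as an accessibility element.
        let unreadBadge = UIView()
        unreadBadge.isAccessibilityElement = true
        unreadBadge.accessibilityLabel = "3 unread messages"
    }
}
```

This is a sketch, not a complete screen: layout code is omitted so the accessibility calls stand out.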
Apple already has resources explaining to developers how to test their apps for accessibility, so why are we reinventing the wheel? https://developer.apple.com/library/ios/technotes/TestingAccessibilityO…
While Apple's accessibility testing resource is quite good, I do not believe it really provides blind accessibility advocates, or developers, a straightforward, step-by-step way to test their apps for accessibility.
I would, of course, certainly wish to avoid re-inventing the wheel, so, if anyone knows of such an accessibility testing plan, I would love to see it. :-)
In theory, your plan seems like a good idea. However, here are a couple of questions to consider. How many developers out there know of VoiceOver? As a follow-up, how many developers would you say know how to use VoiceOver? And, finally, on a slightly different topic, are you suggesting this test be done for every app out there? Just a couple of questions that came to mind after reading this post.
They should also make sure that, while swiping, no elements are skipped, even if those elements are otherwise accessible via touch navigation. See Spotify for an example.
What is the four-finger tap at the top of the screen? I'm not aware of this gesture. I'm on an iPhone.
The four finger tap (on the top of the screen) brings the cursor to the very top of a document or app. On the bottom of the screen, it takes it to the very end of the document or app.
I don't think that applies to the iPhone; as far as I am aware, it doesn't have any four-finger gestures. If that's true, it highlights another problem: the testing becomes device dependent.
Yes, the four-finger gesture works on both iPhone and iPad.
Go into the VoiceOver gesture practice area and try it.
Now there's a thing. Shame they've not implemented app switching with the four finger gesture as on the iPad... *Hopes to be corrected*
Thanks for the recommendation to make sure elements in an app are not skipped over for VoiceOver users. I have added it as 4C to the plan.
It is assistance like this I am looking for in this thread.
In addition to recommendations for editing this accessibility test plan, I would also appreciate any additional resources, such as the Apple resource already provided by Steve.
This site also has a number of resources that devs can use, and of course they can ask questions on forums like this one if they so choose. The best resource, in my opinion, is the people. There's never an excuse not to use people as a resource if you need to ask questions, throw an idea out there, or just get feedback. I've not seen a lot of devs do this, especially where accessibility is concerned, and I think that if they did, things would be better for all parties.
Another crucial thing is that if an app has a surface for drawing on, or controls that need to be manipulated by moving them as in a joystick, or has a panel that can accept multiple touches or holds, direct touch needs to be enabled. Furthermore, if there are such screens that use direct touch, they need to completely disappear if another screen, such as a settings screen, is activated. In the case of apps employing direct touch, swiping through all the various elements will not work well, since a blind person can accidentally swipe through the direct touch panel, so each button or element should be as large as possible so as to be easier to find while exploring by touch. This should always be the case, since many of us use this method much more often than swiping.
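For developers, the direct touch behavior described above is exposed through the `allowsDirectInteraction` accessibility trait. A minimal sketch, with a hypothetical canvas view; the class name and label are assumptions for illustration:

```swift
import UIKit

// Hypothetical drawing canvas that VoiceOver users interact
// with directly, as in GarageBand's drum pads.
final class DrawingCanvasView: UIView {
    override func didMoveToWindow() {
        super.didMoveToWindow()
        isAccessibilityElement = true
        accessibilityLabel = "Drawing area"
        // Send touches straight to the view rather than having
        // VoiceOver interpret them as navigation gestures.
        accessibilityTraits = .allowsDirectInteraction
    }
}
```

When a settings screen or other overlay appears on top, the canvas should no longer be reachable; one common approach is to mark the overlay as modal so VoiceOver ignores everything behind it.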
Please elaborate on your direct touch comments. Please give me an example of an accessible app that uses a drawing area, joystick-like controls or some other panel where direct touch is the only way to make it accessible. I would also appreciate any ideas you may have for testing these kinds of controls in the step-by-step manner as already outlined in this plan.
I have just edited step 1 to recommend that testing be conducted with the screen curtain enabled. Thoughts?
Once VoiceOver is on a sighted dev's device, they might not know how to get back to the correct place in Settings to turn it off again when they are done testing. So you should tell them to make sure triple-click Home is set to toggle VoiceOver, and also that they can turn VoiceOver on and off using Siri. But make sure you have a Wi-Fi or data connection before turning VoiceOver on if you plan on using Siri to turn it off.
I also think you should tell them to turn on VoiceOver hints so they can test whether their hints work or not. You should also explain what hints are and how helpful they can be for us.
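For developers reading along, hints are set with `accessibilityHint` alongside the label. A short sketch; the button and wording are made up:

```swift
import UIKit

// Hypothetical example: the label says what the control is,
// the hint says what activating it will do. VoiceOver speaks
// the hint after a short pause when hints are enabled.
let playButton = UIButton(type: .system)
playButton.setTitle("Play", for: .normal)
playButton.accessibilityLabel = "Play"
playButton.accessibilityHint = "Starts playback of the selected track."
```

A good hint describes the outcome ("Starts playback...") rather than the gesture, since VoiceOver users already know how to activate an element.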
Hi again. I also think you should make a test plan like this, but for Mac apps.
The vast majority of developers will ignore accessibility test plans because they feel no need for one. In school, we are pushed to consider accessibility. However, the typical university has a warped view of it. Out of an eight-week course, accessibility is covered for one week, and it is a limited and warped view at best. This leaves developers with a few impressions:
1. Accessibility is not very important.
2. If it is considered, it isn't very complicated, and there are only a few things to look at, such as colors or font size.
3. Once the developer sees the complete set of requirements, they give up on the entire idea because accessibility implementations take about 60% of a developer's time.
A good example is WCAG 2.0 or 2.1, which has over 200 factors to consider when creating content ready for universal access.
Not only should a test plan be generated as a resource for developers, developers should educate themselves on the subject. Unfortunately, people who require accessibility implementations will most likely be left out of the majority of apps because we are the minority.
Until Apple follows suit and makes all of their native apps completely accessible and promotes this fact, developers won't do it either. Since Apple fails to make all native Apple apps completely accessible, they have no social right to force everyone posting in the app stores to make their apps completely accessible. We can create test plans and educate people. However, a huge social change will not happen until Apple takes charge and leads the way by example.
Direct touch is used in many apps. It's how you can play drums in GarageBand, guitar in ThumbJam, and quickly select tracks in Loopy. As far as drawing on a pad goes, you'll have to wait for the release of Noatikl 3 to see that in action. In most cases where it's used, you know immediately because you hear a sound, but in Noatikl you have to be given textual feedback on the drawing you did in the envelope drawing area. Oh yes, and it's the reason you can play games like A Blind Legend without having to turn off VoiceOver. HTH.
I believe it is important to make things simple for the developers and for the users who want to contact the developers about accessibility.
I think an outline like this could be very helpful when sending an email or contacting the developer.
I think it would be even better if it was listed and put in the form of a table. That way we could all refer to individual items by number.
Links to further information could be provided, with back links to make sure it is easy to stay in focus.
I don't write well, so I'm reluctant to write to developers. I don't want to make them think it is too hard to make their apps accessible; I want them to believe it is worthwhile.
This is why I'm interested in having a simple form that I can customize to send to a developer to start the discussion.
The beauty of it is that many things, like direct touch, only involve a few lines of code that can be sent to the devs. It's one thing to say, "Could you please incorporate direct touch?" but quite another to show them how easy it really is: just a few lines of code. I think that's applicable to many differing accessibility solutions.
As an app developer, I find this post very helpful. And in general, I think that the easier the developer process can be made, the more likely the developer will implement VoiceOver in a way that is usable.
To me, using VoiceOver for anything gives me a headache. Maybe to someone who's used it for a while, it's not so bad, but I don't know. My biggest hurdle right now is that I don't know what's expected behavior and what is not.
I'm glad the accessibility testing plan is helpful for you.
At one point, you said, "To me, using VoiceOver for anything gives me a headache."
I'm just wondering if you could provide a little more detail about what makes VoiceOver difficult for you to use.
This is a question, rather than a way to make the test plan better. Partly I'm looking for better tools to find out what is going wrong in some cases. I can't show you this because it is not a public screen, but I wanted to describe the situation and then ask the question. I have a screen where I can swipe to all items, but by using a combination of the three-finger scroll and touch exploration, I can't explore to some items at the top of the screen, and in one case, an item will not work when I try to double tap on it. Now the question: does this possibly indicate that VoiceOver thinks the item or items are off-screen? I ask because the app in question is nothing more than a web site in an HTML view.
There doesn't seem to be a detailed explanation of how VoiceOver is supposed to work (a user guide or something like that), so learning VoiceOver itself seems much more difficult than it needs to be. A lot of what I found was by just trying things and guessing. Most of the rest was in write-ups like the one here. Some of the information I received was from talking to other users who would show me what they had learned.
Ultimately, with the help of a very determined engineer and a very helpful customer, I believe we have made significant progress.
Thank you again for your article.
This is an excellent plan. I do have one issue, and I'm sure you all have seen this with many app developers.
Many apps start off fairly accessible, but at some point, an update breaks accessibility. There is no way to know that until you update the app, and after this happens, you feel the need to let the app developer know there's a problem.
I noticed this with the app used by my house of worship. I've had to get back in touch with the developer, and they've been slow to respond. I'm especially frustrated about this, since it's a targeted app and may only be used by one or two church members. In my latest message to this developer, I reminded them that they need to be deliberate about good accessibility no matter the numbers.
I've also, in this update, encountered the problem where VoiceOver reads the labels, but double-tapping on an item doesn't work, so you have to turn off VoiceOver and take a shot in the dark. Once you've hit something, you then need to turn VoiceOver on again. It's very hit and miss. Once you're on the next screen, things work, but the main screen is where the problem lies. Evidently, the developer placed the labels above the buttons, and that spoiled it. At other times, I've had to manually replace the labels in the app with ones that made sense. However, the next update will probably wipe them out.
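One plausible cause of this "label reads, but double-tap does nothing" pattern is a separate text label sitting over an unlabeled button. A hedged sketch of how a developer might fix it by merging the two views into one accessibility element; the class and view names are hypothetical:

```swift
import UIKit

// Hypothetical fix: combine a text label and its button into a
// single accessibility element, so VoiceOver both speaks the text
// and activates the button on double-tap, instead of landing on
// the label and doing nothing.
final class LabeledButtonView: UIView {
    let titleLabel = UILabel()
    let actionButton = UIButton(type: .system)

    func configureAccessibility() {
        isAccessibilityElement = true
        accessibilityLabel = titleLabel.text
        accessibilityTraits = .button
        // Route the double-tap to the button's center rather than
        // the element's default activation point.
        accessibilityActivationPoint = actionButton.convert(
            CGPoint(x: actionButton.bounds.midX,
                    y: actionButton.bounds.midY),
            to: nil)
    }
}
```

Marking the container as a single element also keeps flicking tidy: VoiceOver stops once per control instead of once for the text and once for an unlabeled button.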
With that, I think we need to ask this question: If an app is accessible now, how can we ensure that updates carry the accessibility features over without resorting to "hard coding" features so that developers can still be flexible yet stay accessible? Also, what do you do if the developer and you are unable to meet face to face? How do I as the user hold them accountable, especially if, like the app I have in mind, it's targeted to a specific church?
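On catching regressions before they ship: from Xcode 15 onward, a developer can run Apple's automated accessibility audit inside a UI test, so an update that drops labels or clips text fails the test suite rather than reaching users. A minimal sketch, with a hypothetical test name; it is a complement to, not a replacement for, the manual VoiceOver testing in this plan:

```swift
import XCTest

// Sketch of a regression guard (requires Xcode 15 or later):
// the built-in audit checks the current screen for issues such
// as missing labels, low contrast, and small hit regions.
final class AccessibilityAuditTests: XCTestCase {
    func testMainScreenPassesAudit() throws {
        let app = XCUIApplication()
        app.launch()
        try app.performAccessibilityAudit()
    }
}
```

Running this on every screen of the app in continuous integration is one way to make accessibility survive updates without hard-coding anything.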
I wish we could have a blind person over every developer's shoulder reminding them about this, or a tool that constantly reminds them until they take it seriously, but I recognize this is not easy. Failing that, what can we do in a case like this? Can an automated tool be designed to help users report problems like this to the developer? What about a template, combined with a tool, so that I can describe the problems clearly? Any thoughts? Thanks.