iOS App VoiceOver Accessibility Teaching and Testing Plan

Last modified
Thursday, January 28, 2021


VoiceOver, a feature Apple has built into all iOS devices to enable Braille and speech access for users who are unable to see the screen, has revolutionized the lives of countless thousands of blind people around the world. It works best when apps are deliberately developed to ensure compatibility with VoiceOver, and when blind people are considered during development and included in all facets of the testing process.

If you are a developer who has been asked to ensure the full VoiceOver accessibility of your app, following a step-by-step plan will help you get it right the first time, and keep getting it right through each subsequent update.

If you are an educator, following an organized plan will help you determine which iOS apps will best meet your blind students' needs and effectively teach them how to use each new app they encounter throughout their studies and beyond.

If you are a blind person who is new to iOS, or you are an advanced user of many apps, following a coherent plan will help you quickly come up to speed with the built-in capabilities of your device and each new app you install.

The purpose of this step-by-step plan is to provide a straightforward way for advocates, developers, educators and others to quickly explore, learn and improve the accessibility of all apps in Apple's iOS ecosystem.

  • Advocates may use the plan to easily identify the accessibility issues they report to developers.
  • Developers may follow the plan to test their apps as they move forward to improve accessibility with each iteration.
  • Decision makers may incorporate the plan into their user-acceptance testing and other quality-assurance procedures to ensure their iOS apps comply with Apple's accessibility guidelines and other applicable laws, policies and regulations governing accessibility.
  • Teachers may use the plan as a framework for evaluating the non-visual accessibility of iOS apps and providing instruction to blind students.

The Plan

  1. Make sure VoiceOver is turned on for all testing. VoiceOver may be enabled in Settings > Accessibility > VoiceOver on any iOS device (Settings > General > Accessibility > VoiceOver on iOS 12 and earlier). Alternatively, VoiceOver can be activated by asking Siri to "turn on VoiceOver" or by triple-clicking the Home or Side button if the accessibility shortcut has been set to enable VoiceOver. In addition, visit the VoiceOver settings screen and make sure Speak Hints is turned on. It is also strongly recommended to triple tap the screen with three fingers to enable the Screen Curtain. This feature blanks out the screen, resulting in a more realistic environment for nonvisual accessibility testing.
  2. Open the app.
  3. Tap the top of the screen with four fingers. This moves VoiceOver focus to the first element on the screen, giving every test a consistent, known starting point.
  4. Flick to the right through all elements on the app's home screen.
    1. Are all controls labeled in a way that makes sense when you listen to VoiceOver without looking at the screen, i.e. would a VoiceOver user understand the purposes and functions of all on-screen elements? Make detailed notes of anything that you imagine a VoiceOver user would struggle with when using the app. This applies to charts, graphics, maps and any other elements that may require special consideration in order to provide equal accessibility.
    2. Are you able to choose all buttons and other appropriate controls by double tapping them as you hear them read by VoiceOver? In other words, if a sighted person would choose an element by tapping, does it operate correctly if double tapped by a VoiceOver user? Note any situations where a control does not function as expected when double tapped.
    3. Does VoiceOver stay focused throughout the user interface? Note any situations where VoiceOver is jumpy, seems unstable or is unable to retain its place while navigating in and out of lists and other similar controls.
    4. When one or more items in a list is highlighted or selected, does VoiceOver say "selected" or provide any other indication of its status? If we are selecting among the available side options for a food item, is the VoiceOver user able to select and deselect options and determine which options have already been selected? Note any lists where VoiceOver is unable to convey highlighted or selected status.
    5. Can the items in a list be refreshed using VoiceOver? If a list typically enables a sighted user to pull down with one finger, is a VoiceOver user able to update the list by swiping down with three fingers? Note any lists that can't be refreshed using VoiceOver.
    6. Are all elements available to VoiceOver? Pay special attention to anything that is skipped (not heard at all) while flicking. Note any skipped elements.
    7. Are there VoiceOver equivalents for all custom gestures? Note any app features that require the use of custom gestures that are not available to VoiceOver users through alternative techniques.
    8. If visual cues, such as color, are important, does VoiceOver provide the same information? If we are shoe shopping, does VoiceOver convey details such as color and style? If we are hailing a ride, does VoiceOver tell us the color of the car? Note all situations where VoiceOver is not able to describe important visual information a sighted user would take for granted.
    9. Are all elements presented in a logical order as you move through the screen? If the relationship between elements is important, is it clearly conveyed nonvisually? Note anything that seems out of place as you navigate the screen without sight.
    10. Listen for special hints, such as "double tap to play," spoken after the name of each element. If these hints are never heard, make sure hints are enabled in VoiceOver settings. Refer to step 1 for details. Note all situations where additional help could be supplied through this technique. Avoid using hints as the only way for the app to work successfully with VoiceOver.
    11. If audio is playing, does its volume decrease, or duck down, while VoiceOver is speaking? Note any situation where VoiceOver is difficult or impossible to hear.
    12. Does a two-finger scrub (Z-shaped gesture) activate the escape function, such as the Back arrow in the upper-left corner of the screen? Note any situation where a two-finger scrub does not navigate to the previous screen or otherwise perform the appropriate escape function for the VoiceOver user.
    13. Does the app offer accessibility enhancements such as direct touch, keyboard shortcuts, magic tap or specific support for Braille displays, switches or other forms of assistive technology? Note any opportunities to incorporate these features.
  5. Flick to the left through the same home screen. Note anything that does not seem to function as expected with VoiceOver enabled.
  6. Tap the top of the screen with four fingers.
  7. Flick to the right, one element at a time, and double tap the first item that should lead to another screen when chosen.
  8. Repeat steps 3 through 5 on every screen the app contains, testing and noting any issues found with all elements.
  9. Using the notes obtained from testing, make all bug fixes necessary to deliver a fully accessible experience for users who rely on VoiceOver. Consider prioritizing the correction of accessibility bugs according to the order suggested in step 4.
  10. Check your work using blind alpha testers, followed by a select group of beta testers from the blind community.
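For developers working through the checks above, the fixes for unlabeled controls (step 4.1), missing selection state (step 4.4) and missing hints (step 4.10) usually come down to a few standard UIAccessibility properties. The following is a minimal UIKit sketch, not a prescription from the plan itself; the class, control and string names are hypothetical:

```swift
import UIKit

// Hypothetical view controller illustrating steps 4.1, 4.4 and 4.10.
final class SideOptionsViewController: UIViewController {

    // An icon-only button that VoiceOver would otherwise announce only as
    // "button" or by its asset name; step 4.1 asks that its purpose be clear.
    let refreshButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()

        refreshButton.isAccessibilityElement = true
        // Step 4.1: a concise label describing what the control is.
        refreshButton.accessibilityLabel = "Refresh order"
        // Step 4.10: an optional hint, spoken after the label when
        // Speak Hints is enabled. It should supplement, never replace,
        // a sensible label.
        refreshButton.accessibilityHint = "Reloads the menu."
    }

    // Step 4.4: when the user toggles a side option, reflect the state in
    // the element's traits so VoiceOver announces "selected".
    func setOption(_ cell: UITableViewCell, selected: Bool) {
        if selected {
            cell.accessibilityTraits.insert(.selected)
        } else {
            cell.accessibilityTraits.remove(.selected)
        }
    }
}
```

Standard UIKit controls come with sensible defaults, so changes like these are typically only needed for custom or image-based controls.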


The following additional resources will serve to help you put this plan into action:


  • As always, I thank the sunshine of my life, Allison Hilliker for encouraging me throughout the completion of this project and for putting up with my constant prattling on about just how critical accessibility and full inclusion is for the economic and social prosperity of blind people.
  • I would also like to give a big shout out to everyone on this AppleVis forum post who challenged me and supplied me with useful feedback.
  • Finally, I thank Pat Pound for valuable suggestions that helped me push this project over the finish line.

Creative Commons License

iOS App VoiceOver Accessibility Teaching and Testing Plan by Darrell Hilliker is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.


The article on this page has generously been submitted by a member of the AppleVis community. As AppleVis is a community-powered website, we make no guarantee, either express or implied, of the accuracy or completeness of the information.



Submitted by Morgan Watkins on Sunday, May 8, 2016

Member of the AppleVis Blog Team

Thanks for helping raise awareness of accessibility on these extraordinary devices!

Submitted by DPinWI on Sunday, May 8, 2016

This plan is well thought out and should help achieve accessibility goals

Submitted by TJT 2001 on Monday, May 9, 2016

I think that a problem with this guide is that it actually doesn't say the purpose of performing the various actions. If I, an unsure iOS developer, were to use your guide, I would certainly hear some things, but I would not understand why I had to tap at the top of the screen with four fingers rather than just tapping on the first on-screen element, or what VoiceOver should ideally announce as I flick through the screen, and what "normal" behaviour is.

Step 4a of your guide says "Are all controls labeled in a way that makes sense when you listen to VoiceOver without looking at the screen? Make detailed notes of anything that does not make sense. This applies to charts, graphics, maps and any other elements that may require special consideration in order to provide equal accessibility."

I would rewrite this statement to be something more like "Are all controls labeled in a way that makes sense when you listen to VoiceOver without looking at the screen, i.e. would a VoiceOver user know the purposes and functions of all on-screen elements if they were using the app without sighted assistance? Make detailed notes of anything that you imagine a VoiceOver user would struggle with when using the app. This applies to charts, graphics, maps and any other elements that may require special consideration in order to provide equal accessibility. However, it should be noted that it is not desirable for VoiceOver users to know about on-screen elements that provide no actual value to the app. This could include extraneous images and text that is solely for visual appeal."

Submitted by Darrell Hilliker on Tuesday, May 10, 2016

Hello TJT 2001,

Thanks for the valuable feedback.

I have incorporated some of your language to clarify the step on making sure all controls are labeled.

I happen to believe virtually all controls should contain some sort of useful non-visual label, without judgment regarding which ones a developer might or might not think are helpful to me as a blind person, so I did not incorporate the last part of your statement in my revision.

I also removed the reference to sighted assistance, since I don't believe that has any place in the discussion of whether or how anything should be made accessible.

I don't think everything always requires exhaustive explanation. I believe the plan should be followed step by step to improve accessibility. The overall non-visual accessibility of any app should improve significantly by simply following all the steps and implementing all necessary bug fixes.

Having said this, I am open to strategic placement of explanations that would truly be useful in order for anyone to follow this plan more effectively.

Please keep the comments and feedback coming. They're greatly appreciated.

Submitted by Darrell Hilliker on Tuesday, May 10, 2016

Since it is critical for educators and others to evaluate iOS apps to determine their non-visual accessibility and ability to meet the needs of blind people, I have added language covering this fact in the plan. Thanks, Allison, for that wonderful suggestion.

Submitted by Darrell Hilliker on Tuesday, May 10, 2016

As each step is followed, I advise "make detailed notes..."

I received feedback indicating this is annoyingly repetitive.

I wish to keep this advice in each and every step, but I would be open to ideas on other ways to convey this important advice.


Submitted by Piotr Machacz on Tuesday, May 10, 2016

One thing that stood out to me while reading through this, especially when you mention going to other screens of the app, is that developers often don't use the stock "back button" control type. It might still be a button that VoiceOver announces as "back" when you flick through it, but because it's non-standard, the two-finger scrub gesture is broken as a result. I'm not sure where the best place to mention this in the guide would be, but developers should test whether you can scrub your way back to previous screens.

Similarly I occasionally see lists that support the pull to refresh gesture, and for sighted people they work fine. With VO, you're supposed to do a 3-finger swipe down while at the top of the list to initiate a refresh, and in many cases this works, but I have seen instances of apps where it should work (as noted by text in the app itself), but performing the 3-finger swipe down doesn't work and you have to use the double-tap and hold and swipe down pass-through to force an update.
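Both behaviors described here map onto standard UIKit APIs. As a hedged sketch with hypothetical controller names: overriding accessibilityPerformEscape restores the two-finger scrub when a custom back control is in use, and attaching a stock UIRefreshControl is what lets VoiceOver translate the three-finger swipe down into a refresh.

```swift
import UIKit

// If a custom back button breaks the two-finger scrub, overriding
// accessibilityPerformEscape lets VoiceOver dismiss the screen again.
final class DetailViewController: UIViewController {
    override func accessibilityPerformEscape() -> Bool {
        navigationController?.popViewController(animated: true)
        return true // tell VoiceOver the escape gesture was handled
    }
}

// A standard UIRefreshControl gives sighted users pull-to-refresh and
// VoiceOver users the three-finger-swipe-down refresh for free.
final class FeedViewController: UITableViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let refresh = UIRefreshControl()
        refresh.addTarget(self, action: #selector(reload), for: .valueChanged)
        tableView.refreshControl = refresh
    }

    @objc private func reload() {
        // Fetch new data here, then dismiss the spinner.
        tableView.refreshControl?.endRefreshing()
    }
}
```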

Submitted by Toonhead on Tuesday, May 10, 2016

This is really, really well-written! I do believe, however, that the phrase "without sighted assistance" could be really helpful. Remember, this guide is, for the most part I'm guessing, aimed at mostly sighted app developers. Keep in mind that because they're just learning how to use VoiceOver, they may not even understand all the things VoiceOver can actually do, but it's up to them to author the app properly. Using the phrase "without sighted assistance" could help them understand, because they're still going to be looking at it from a visual perspective, even if VoiceOver is turned on and the Screen Curtain is on. It just reminds them that we'll be able to use the app independently, without the use of sighted help. I see nothing wrong with using the phrase at all. Also, you may want to remove, or slightly modify, the bit about using four fingers to touch the screen before swiping through the different elements. Lots of people I know don't do that, even if you might. The idea is to let them know that not everyone does that, so it's not absolutely necessary to place four fingers on the screen before you swipe. Remember, they're not used to using VoiceOver; it's not a part of their daily routine like it is for us. They're using it for testing purposes and aren't relying on it for everyday use. Other than those little bits, this looks really good!

Submitted by TJT 2001 on Wednesday, May 11, 2016

Thank you for considering incorporating my ideas. My use of step 4a was an example; you should try to copy my suggestions for the other steps to make your guide as clear as possible. If you do not include sufficient explanation, a developer may try to infer the purpose of performing a particular step. If they infer correctly, then we can applaud them. If they infer incorrectly, they will not have any way to learn the correct way they can implement better VoiceOver support in their app. I too believe that all controls that can be interacted with should have proper text, and, if necessary, a VoiceOver hint. However, my comment about graphics and other visual elements was to say that not all graphics need to be described. For example, on many weather websites, there are graphics that show the weather conditions, e.g. sunny, cloudy, rainy, etc. A screen reader user would not need to know about these graphics (in fact, knowing about this would be unnecessary), and an equivalent would be to have a statement describing the weather which also included the temperature. Likewise, VoiceOver users do not need to know about other colours, visual effects or animations if they are not necessary for the use of the app. However, if they must be used, then VoiceOver users must have equal access to them, even if such functionality would only be very rarely used.

In another comment, you talked about how developers should make notes about behaviour that could be improved. Rather than saying "make detailed notes", you could ask developers to identify areas that could be improved. Then, suggest that they look at Apple's iOS Development Guidelines and choose an approach to render the area more accessible.

Hello Piotr,

I have revised the plan to test for the "pull to refresh" issue. Thanks so much for calling my attention to that oversight. Please let me know your thoughts on the update.

I did not include the two-finger-scrub test for the Back button because, while it is a nice-to-have capability, it is not a critical feature to have in order for an app to be accessible. I am open to having my mind changed, though, so please convince me. :-)

Thanks again for your valuable feedback. Please keep it coming.

Hello Toonhead,

Thanks for your thoughts on "without sighted assistance."

I feel I can't mention the concept of "sighted assistance" in the plan because it introduces the false notion that, if the developer does not incorporate accessibility, we always have the option of asking a sighted person to help us.

Instead, the correct assumption should always be that accessibility is a matter of inclusion. If it exists, blind people can use the app. If it does not, we are locked out.

I'm sorry, but any reference to utilizing "sighted assistance" in this context is a nonstarter with me because it goes against one of my core accessibility principles.

I also think anyone who reads this plan will realize we are talking about using an app from a blind person's perspective with VoiceOver enabled throughout the testing process, so the clarifying statement "without sighted assistance" should not be necessary.

Submitted by Darrell Hilliker on Wednesday, May 11, 2016

In reply to by TJT 2001

Hello TJT 2001,

Thanks again for your additional thoughts.

I am trying to keep the already-lengthy text in each step from blossoming out of control, but I understand the importance of describing the purpose behind following each step.

In order to address these concerns, I am considering adding an examples section to the plan that demonstrates a thoughtful real-world implementation of each step.

I would love any ideas any of you may have regarding this possible examples section, including, of course, descriptions of actual examples to include.

Submitted by TJT 2001 on Wednesday, May 11, 2016

To make an app fully accessible, blind people must not have to go and seek sighted assistance. However, I feel that mentioning this will clarify the common misconception that blind people are dependent on sighted people for help and that we are incapable of using our iDevices independently.

Submitted by TJT 2001 on Wednesday, May 11, 2016

I would like to help you with examples. What are you thinking exactly? Perhaps the comments section of this blog post is not the appropriate place for us to discuss this sort of thing as it could make the guide seem less professional. I understand the issue of the text getting too long. I do not think that asking the developer to make copious notes is necessary at every step of the process.

Submitted by TJT 2001 on Wednesday, May 11, 2016

Developers should implement the standard "back" control that can be activated with a two-finger scrub. This allows users of braille displays, as well as touch-screen users in apps with complex and lengthy screens, to more easily refocus on the previous page.

Submitted by Darrell Hilliker on Wednesday, May 11, 2016

It has been suggested that I should remove the step where I advise the use of the four-finger tap gesture at the top of the screen. I can't do this, because I consider this step to be one of the most important elements of this entire plan.

When one always starts at the top of each screen, one avoids the possibility of missing an important element in the testing process. In addition, for basic purposes of logical order of operations, I think it is always good to start at a consistent, known location with every step-by-step procedure.

I hope you all find this explanation helpful.

Thanks again for all your valuable suggestions. Please keep them coming.

Submitted by TJT 2001 on Wednesday, May 11, 2016

Keep the part about the top of the screen. Braille display users find this a logical place to start exploring and navigating, and it is a nice reference point. Also, it can help in apps that are quite complex. In some apps, I have seen that doing this gesture can take me into an area where there is no content, thus making that part of the app quite inaccessible.

Submitted by Michael Hansen on Wednesday, May 11, 2016

Member of the AppleVis Editorial Team


As others have said, this is a great plan...and the community involvement will only make it better.

Have you given any consideration to possibly including a "How to Use VoiceOver" section at the beginning of the guide? I think this would be helpful to developers who have never used VoiceOver. I don't see this being anything too lengthy, just telling them things like "Instead of single-tapping on an item to activate it, you double-tap when VoiceOver is enabled..."

I agree with Piotr that testing for the two-finger scrub gesture is important. My feeling is that if it is used universally across iOS and Apple apps, then the average user would expect that it would work in third-party apps as well.

In light of the comments of others, I started paying more attention to the flow of the wording for developers to take notes. I think, if a developer is committed enough to testing for accessibility, they are probably going to also take good notes without consistent prompting. For end-users not used to having to systematically document and track issues, more notation may not be as big of an issue.

Submitted by Darrell Hilliker on Thursday, May 12, 2016

I have just revised the plan to test for proper functioning of the Back button when the two-finger scrub gesture is used. As always, all ongoing feedback is appreciated.

Submitted by Darrell Hilliker on Thursday, May 12, 2016

I have also added View Controller Programming Guide for iOS: Supporting Accessibility to the list of resources.

Submitted by Darrell Hilliker on Thursday, May 12, 2016

The introductory paragraph includes a basic description of VoiceOver along with a link pointing to Apple's introduction. I have also provided a resource covering learning VoiceOver gestures.

I am open to ways of expanding a VoiceOver intro without bogging things down too much in the introduction section.

Thanks for all the feedback. Please keep it coming.

Submitted by Darrell Hilliker on Wednesday, June 15, 2016

Hello Everyone,

iOS 10 Beta 1 is already out to developers, and the public betas will start rolling out in July.

I hope everyone who decides to beta test iOS 10 will consider using the iOS App VoiceOver Accessibility Teaching and Testing Plan as a framework for exploring all aspects of the new iOS, documenting your findings and reporting bugs to Apple.

It is only through our diligent testing and bug reporting that most of us in the blind community can effectively influence the accessibility of the next generation of iOS.

Thank you,


Submitted by Andy B. on Wednesday, June 15, 2016

I'm sure the beta testers and developers already know this. Whether your framework is used or not may not be a factor or considered. From what I know, this plan is far from a time tested method of testing.

Submitted by Darrell Hilliker on Thursday, January 28, 2021

I have tightened up some of the language, added an example and included one more resource. I would be interested in anyone's thoughts on how to further update this guide to keep it relevant.
