If you’re new to or thinking about getting an iOS device, you may be wondering, as I did when I got my first one, “How, exactly, can a person who can’t see a screen use a device whose primary input surface is one?”
I would soon realize that it wasn’t nearly as crazy as it sounded, and that I could do things with it that I never could have imagined, things that would increase my level of independence and improve my overall quality of life. However, if you’re just getting started, the sheer amount of information on the Internet can be overwhelming. That is why, in this guide, I will provide a series of tips, organized by heading and subheading, along with links to more comprehensive guides and podcasts from across the AppleVis website.
Keep in mind that this guide is not intended to describe specific features, but rather to explain the central concepts of using iOS with VoiceOver. One quality that I’ve observed in my over ten years of learning and using iOS is that once I had a few things mastered, I could apply those skills to accomplish a wide variety of tasks in both first and third-party apps.
iOS is the operating system that powers the iPhone, iPod Touch, and, until 2019, the iPad. That year, Apple forked off the version of iOS for the iPad and now refers to it as iPadOS, which provides similar functionality to iOS with the addition of tablet-specific features. As the features of iPadOS are in some cases identical to those of iOS, the iPad may feel very familiar to you if you’ve used an iPhone or iPod Touch. However, I don’t have an iPad, so I cannot comment in this guide on what features and processes are the same or different between the two operating systems. Therefore, only iOS will be covered.
In addition to the iPhone, the iPod Touch, the last remaining device in the iPod line, runs iOS. This is a device with a similar design to the iPhone, with the ability to run the current version of iOS, but without cellular connectivity, an advanced camera, biometric authentication, or the same level of computational power as the iPhone and iPad. For reference, the processor in the current generation iPod Touch is the A10 Fusion, the same chip that powered the iPhone 7, which was released in 2016. The iPhone and iPod Touch, and to a lesser extent the iPad, are known collectively as iOS devices or, “iDevices,” for short.
Applications or, “Apps,” are pieces of software that add specific functionality to an operating system. In Windows vernacular, these pieces of software are sometimes referred to as programs, particularly in the days of Windows 7 and earlier. As you will see in this guide and in your own use of iOS, apps are an integral part of the user experience and can greatly expand the functional potential of your device.
iOS device physical layout
While specific device models have distinct hardware features, there are some key design characteristics that all iOS devices share.
The iPhone and iPod Touch are rectangular slabs with a flat glass touch screen on the front, two volume buttons on the upper left side, a power button, and a charging port on the bottom edge. On the iPhone, the power button is located on the right side, and is commonly referred to as the Side button. On the iPod Touch, the power button is located immediately to the left of the top right corner of the device. The iPhone also has a mute switch above the volume buttons on the left side; push it down to silence the ringer and other alert sounds.
On devices that retain the older design style, there is another button located at the bottom center of the screen called the Home button. On these devices, this button is used to, among other things, return to the Home Screen, discussed later. Devices that have a more modern design style rely on touch screen gestures to replicate these functions.
The earpiece, used primarily for hearing call audio, is located at the top of the screen, and another speaker, typically used for speaker phone and consuming other types of audio, is located at the bottom of the device. Note that the iPod Touch does not include an earpiece.
Functional differences between iPhone models with and without a Home button
As alluded to earlier in this guide, the iOS user interface is based around a grid of apps called the Home Screen. On devices with a Home button, pressing this button will return you to that screen no matter what you’re doing at that time. This button also serves other purposes, such as accessing a list of recently used apps, called the app switcher, and engaging Siri, the intelligent personal assistant built into iOS and other Apple platforms.
Starting with the iPhone X, Apple has been releasing iPhones that don’t include this button. On these devices, return to the Home Screen by placing your finger on the bottom edge of the screen until you hear a brief tone, and then sliding your finger straight up the screen until you hear a second tone. For more detailed information about the changed button functionality on devices without a Home button, check out this AppleVis podcast. For the purpose of simplicity in this guide, I will refer to returning to the Home Screen as, “Going home.”
Another difference between the two design styles is the implementation of biometric authentication, the act of proving your identity using inherent characteristics like your face or fingerprint. On devices with a Home button, biometric authentication is performed using a fingerprint recognition technology called Touch ID. Once set up, a user can simply rest their finger on the Home button and the device will unlock within seconds.
On devices without a Home button, a face recognition technology called Face ID is used. This involves holding the device in front of your face to unlock it.
While both technologies are fast, accurate and usable for people who are blind or visually impaired, I personally prefer Touch ID, as I find it easier to rest my finger on the Home button than to hold my iPhone in front of my face or try to approximate the location of the camera. However, everyone’s needs and circumstances are different, so I’d strongly encourage you to research the pros and cons of each technology before making a decision. The AppleVis forum features many conversations about this topic, where you’ll experience a range of opinions. For more general buying advice on iPhone models at any given time, check out the MacRumors Buyer’s Guide. Even more helpful, in my opinion, is to try one or more iPhone models in person if possible.
When you turn on and unlock an iOS device, you will be placed on the Home Screen, a grid of apps that can span multiple pages. If VoiceOver is on, moving a finger around the screen should cause it to speak what that finger is touching.
To open an app, move a finger to it, lift your finger, and then tap anywhere on the screen twice quickly. This gesture is known as a double-tap, and is used to activate the currently selected item; the equivalent of a single-tap for sighted users. In addition to exploring by touch, you can move VoiceOver focus directly to the next or previous item by swiping right or left with one finger.
At the bottom of the Home Screen, there is a row of apps that is present no matter what page you’re on, referred to as the dock. The composition and organization of this list can be edited, as can those of all apps on the Home Screen.
At the top of the screen, information such as the device’s cellular signal strength, Wi-Fi connection, and battery level is displayed. This area is referred to as the status bar, and is present whenever the device is in portrait orientation, where the charging port is pointing toward you. Unlike the Home Screen, the status bar cannot be customized.
While this section introduces you to some of the most essential VoiceOver gestures, there are many more that you can use to improve your experience and sense of comfort on iOS. From anywhere, you can access VoiceOver help, which allows you to practice any gesture without it having any effect on the system, similar to keyboard help on macOS or input help in NVDA for Windows. To enter this mode, quickly double-tap the screen with four fingers; perform the same gesture again to exit. While this gesture may sound difficult, it should get easier with practice.
If the device mistakenly interprets the gesture as a three-finger double-tap, VoiceOver will announce, “Speech off.” Perform a three-finger double-tap to restore VoiceOver speech. To help get you started, here is a list of a few other helpful gestures:
- Access additional options for the item under your finger, similar to a contextual menu on a computer: one-finger triple-tap (one-finger double-tap and hold also works)
- Start or stop something, like answer or end a call or play or pause media: two-finger double-tap, commonly referred to as a magic tap
- Read from top of screen: two-finger swipe up
- Read from item under your finger: two-finger swipe down
- Scroll down: three-finger swipe up
- Scroll up: three-finger swipe down
- Jump to top: four-finger single-tap near top of screen
- Jump to bottom: four-finger single-tap near bottom of screen
Besides exploring by touch and double-tapping, one of the most important concepts you’ll encounter in your use of iOS is the VoiceOver rotor, which is used to navigate by different levels of granularity, and can also be used to quickly change some VoiceOver settings.
Think of the rotor as a circular dial which you turn by placing two fingers on the screen and rotating them either clockwise or counterclockwise. Alternatively, you can place one finger of one hand on the screen, and make a circular motion around it with a finger of the other hand.
By default, the rotor includes Characters, Words, and Lines, among other options. When on a webpage, the rotor also includes headings, links, form controls, and other web element types.
Once you get to the list you want, navigate the available items by swiping up or down with one finger. For example, if you’re on the Home Screen and place your finger on the Mail app and turn the rotor to characters, swiping down once with one finger will cause VoiceOver to speak the letter M; further swipes will cause it to speak the following characters in the word. Swipe up with one finger to reverse the direction.
At various points, VoiceOver may instruct you to, “Swipe up or down to select a custom action, then double-tap to activate.” If you have hints disabled, the phrase, “Actions available,” will be spoken instead. This means that additional actions can be performed with the item under your finger. Typically, rotor actions serve as the equivalent of left and right swipe gestures for sighted users, and are used in a variety of contexts.
You can change what’s included in the rotor by going to Settings > Accessibility > VoiceOver > Rotor. For additional information and tips for making the best use of the rotor, check out this guide.
Try before you buy
Note: at the time of writing, the world is facing the COVID-19 pandemic, and as a result, open stores in your area may be taking special precautions to ensure distance between individuals. If you decide to travel to a store, check the business’s protocols beforehand and follow all health guidelines when in public.
Now that you hopefully have a basic idea of how to use iOS with VoiceOver, it may be a good idea to see how you do in practice. For me, that was a major factor in my decision to first get an iPod Touch in 2010, and an iPhone a year later, as my experience testing out a device in an Apple retail store and getting the hang of it fairly quickly simply blew me away. At that point, I had never used VoiceOver on anything but a computer, and that required the memorization of keyboard commands over a significantly longer period of time.
Once you get your hands on a test device, VoiceOver can be turned on without sighted assistance by pressing and holding the Home button, or Side button if the device doesn’t have a Home button, and saying, “Turn on VoiceOver,” once you feel a short vibration. When you’re done, engage Siri again and say, “Turn VoiceOver off.” As an alternative to using Siri, VoiceOver can be turned on with sighted assistance by instructing the person helping you to go to Settings > Accessibility > VoiceOver, and toggling it on for you. If you’re unable to go to a store, you can try this on any iOS device, like one belonging to a friend or family member.
Once VoiceOver has been turned on, if you’re using someone else’s device, it’s a good idea to ask them to unlock it so you can explore the Home Screen. It’s also wise to ask the owner before opening any apps, as they may contain sensitive information that the owner might not want revealed and spoken aloud.
Once you get your iOS device, if it wasn’t set up in store, you will be walked through a brief setup process.
If it is an iPhone, depending on your wireless carrier, you may need to insert a subscriber identity module (SIM) into the device to connect it to your carrier’s cellular network. This involves inserting the end of a paperclip or SIM-eject tool into the small recessed hole on the right side of the iPhone, and placing and aligning the SIM in the tray that pops out. However, you may have the option of having the SIM installed at the time of purchase, eliminating the need to perform this somewhat difficult task on your own. Therefore, if possible, this is what I recommend.
Next, for an iPhone, turn it on by pressing and holding the button on the right side, known as the Side button, for about five seconds. For an iPod Touch, turn it on by pressing and holding the button near the top right corner of the device for about five seconds. After about a minute, turn VoiceOver on either by pressing the Home button three times quickly, or if your device doesn’t have a Home button, by pressing the Side button three times quickly. VoiceOver should announce, “VoiceOver on.” Either press the Home button or slide your finger up from the bottom edge of the screen as described earlier to begin the setup process.
To start, you’ll be asked to select a language; move your finger around the screen until you hear your language, and double-tap to select it. Repeat this process to select your country, and if this is your first iOS device, select, “Set Up Manually,” when asked.
From here, Setup Assistant guides you through connecting to a Wi-Fi network, activating the device with your wireless carrier, signing in with your Apple ID, setting up biometric authentication, and configuring various other basic settings. If this is your first Apple product, it might be useful to create a free Apple ID on a device you’re more comfortable with prior to setting up your new iOS device.
Your Apple ID is the account used to access Apple services and sync your devices. If you’ve used the iTunes Store, for example, you already have an Apple ID.
At various points throughout the setup, you’ll be asked to type using the onscreen keyboard. As an alternative to finding and double-tapping on each character, you can hold one finger on the character you want to insert, and with another finger, tap once anywhere on the screen. This gesture is known as a split-tap, and can be used anywhere a double-tap can be used.
Once setup is complete, you’ll be placed on the Home Screen.
Some immediate post setup tips
Updating your software
Periodically, Apple releases updates to iOS and bundled first-party apps. As updates may have been released since your device was packaged at the factory, it is a good idea to check for updates once initial setup is complete.
To do this, go to Settings > General > Software Update. If an update is available, double-tap the, “Download and install,” button; note that you’ll need to restart your device to complete installation of iOS updates.
Updates to bundled first-party apps, as well as any third-party apps you install in the future, are available by locating the App Store app on the Home Screen and performing a one-finger triple-tap. Double-tap the, “Updates,” button in the menu, and then double-tap the, “Update All,” button if updates are shown. Alternatively, app updates can be viewed and installed by opening the App Store and double-tapping the, “My account,” button at the top right.
Locking device orientation
While using your device, you may notice that VoiceOver announces changes from portrait to landscape orientation, which can relocate interface elements and thus be incredibly frustrating.
Portrait orientation is when the charging port is pointing toward you; landscape orientation is when the device is turned to the side, and is most useful when extra screen real-estate is needed. However, if like me, you always use your device in portrait orientation, you can prevent it from being changed, regardless of the physical position of the device.
To do this, place one finger on the status bar and swipe up with three fingers to reveal the Control Center. Alternatively, on devices without a Home button, the Control Center can be revealed by placing your finger on the top edge of the screen until you hear a brief tone, and then sliding straight down until you hear the second tone. Double-tap the, “Lock rotation,” switch to turn it on, and go home to dismiss the view.
Disabling Raise to Wake
Note: if you prefer an audio demonstration, there is an AppleVis podcast that demonstrates this process.
For added convenience, iOS can display the lock screen when the device is raised, like when it is removed from a bag or pocket. However, some VoiceOver users, myself included, find this annoying, as it seems that even small movements can wake the device and cause VoiceOver to start speaking. To turn this off, go to Settings > Display & Brightness and double-tap the, “Raise to Wake,” switch.
Devices without a Home button also include a feature called Tap to Wake, where a tap of the touch screen will cause the lock screen to be displayed. This can be turned on and off by going to Settings > Accessibility > Touch and double-tapping the, “Tap to Wake,” switch.
VoiceOver settings can be customized by going to Settings > Accessibility > VoiceOver. In addition, certain VoiceOver parameters can be quickly changed using the rotor, discussed earlier, and VoiceOver Quick Settings, accessed by performing a two-finger quadruple-tap from anywhere in iOS.
Specific parameters to include in VoiceOver Quick Settings, and the order they’re presented in, can be changed by going to Settings > Accessibility > VoiceOver > Quick Settings. For an audio demonstration of this feature, check out this AppleVis podcast.
The lock screen
To lock your device, where the touch screen is unresponsive to finger input and authentication is required to unlock it, press the power button. By default, you should hear what sounds like a lock closing. You should do this before placing your device in a bag or pocket to prevent erroneous inputs from registering.
Press the power button again when you want to unlock the device. When you do this, the time and number of notifications should be spoken and the lock screen will be displayed. Navigate either by exploring or swiping left and right to move directly to the next or previous element. Elements on the lock screen generally include the time, date, and any notifications received since the device was last used, among other things.
To unlock the device, either rest your finger on the Home button or position your face in front of the camera, depending on your device model. If you did not set up any biometric authentication, or if your device lacks this capability, like the iPod Touch, attempt to go home and enter your passcode when prompted. You will then be placed where you left off when you last used the device. If you double-tap on a notification from the lock screen, you’ll be prompted to authenticate after which you’ll be placed in the app that sent the notification.
From the lock screen, you can access the camera either by double-tapping the camera button or swiping left with three fingers, depending on your device model.
Notifications are alerts delivered by apps and the operating system to signify that something requires your attention, regardless of whether or not you’re using the app that sent the notification at the time. Notifications include missed calls, texts, emails, social media activity, news alerts, and pending iOS update notices, among other things.
If you place your finger on the status bar and swipe down with three fingers, all notifications which you’ve yet to act on will be displayed. Alternatively, on devices without a Home button, notifications can be accessed by placing your finger on the top edge of the screen until you hear a brief tone, and then sliding straight down until you hear the third ascending tone. Double-tap on a notification to open it in the app that sent it, or use the Actions rotor to view additional options specific to the alert. For example, if you select the, “More,” action on an incoming text, a text field will be displayed, allowing you to reply to the message without needing to open the Messages app.
If you’d rather certain apps not send notifications, you can turn this capability off on an app-by-app basis in Settings > Notifications. Additionally, individual apps’ settings may include more granular controls for determining what events or types of content that app will notify you about.
Even notifications you find useful can, as you use iOS more, be quite distracting when received at the wrong time. To help you manage these common distractions, you can use, “Focuses,” profiles that allow you to configure which apps and people, and at what times, can cause a notification to show up on your device.
If, for example, you do not want to receive notifications for anything, you can turn on Do Not Disturb, which blocks most notifications; or, for increased customizability, you can create your own Focuses by going to Settings > Focus. For an audio demonstration of how Focuses can be created and used on iOS, check out this AppleVis podcast.
In addition to customizing iOS via the Settings app, some basic parameters can be changed by placing your finger on the status bar and swiping up with three fingers to reveal the Control Center. Alternatively, on devices without a Home button, the Control Center can be revealed by placing your finger on the top edge of the screen until you hear a brief tone, and then sliding straight down until you hear the second tone.
By default, you can toggle airplane mode, Wi-Fi, Bluetooth, sound volume, and other settings. For some items, you can select the, “Open controls,” rotor action to reveal additional options related to that feature. Additional settings for Control Center, such as what parameters to include and exclude, can be configured in Settings > Control Center.
Today View is intended to give you an overview of your day, displaying information such as calendar events, due reminders, top news stories, and suggestions to open apps based on your regular use patterns. Quick snippets of information are contained in extensions of apps called widgets. Both first and third-party apps can have widgets, and for added convenience, these can be added to the Home Screen. Further customization of what’s included in this view and how much information iOS can collect can be configured in Settings > Siri & Search.
This view can be accessed from the Home or lock screen by swiping right with three fingers; you may need to do this several times if you’re coming from the Home Screen, depending on what page you’re on.
The Home Screen
When you turn on and unlock your device, you will be placed on the Home Screen, which is a grid of apps. Out of the box, the iOS Home Screen consists of two pages of apps, but this number will expand based on how many apps you install from the App Store.
To change the page, either swipe left or right with three fingers, or navigate to the page picker and swipe up or down with one finger; swiping below page 1 reveals the Today View, and swiping above the last page of apps reveals the App Library, discussed later.
If you want a different view of all the apps installed on your device, you can access the App Library by swiping past the last page of apps on the Home Screen.
From here, you’ll be presented a list of all your apps, organized by category. Move between categories by navigating by containers in the rotor. Double-tapping the search field near the top of the screen will present an alphabetical list of all your apps. You can view this list and double-tap to open an app, or search using the keyboard at the bottom of the screen.
If you’d rather newly downloaded apps not show up on the Home Screen by default, you can configure them to only show up in the App Library by going to Settings > Home Screen and double-tapping, “App Library Only,” under the, “Newly Downloaded Apps,” heading.
Editing apps and widgets
As mentioned earlier, the organization of apps and widgets on the Home Screen, App Library, and Today View can be edited in various ways. To start editing, focus on an app or widget, swipe down with one finger to the, “Edit mode,” rotor action, and double-tap. You can then use the actions rotor to drag apps and widgets: focus on an item, select the action to start a drag session, move to the destination, and then select the action to drop the item at that location.
To add a widget to the Home Screen, double-tap the, “Add widget,” button and select the one you want to add. Note that an app can spawn multiple widgets to display different amounts of information and take different amounts of space on the screen.
If you opt to place one app directly on top of another, a folder is created, which iOS will attempt to name based on the categorization of the apps inside it; dragging all but one app out of a folder will disband the folder.
To delete or hide an app, triple-tap it on the Home Screen and choose the, “Remove app,” option from the context menu, then confirm whether you want to delete it from your device, move it to the App Library, or cancel the operation. Alternatively, you can delete or hide an app from within edit mode by moving to it and choosing the, “Delete,” rotor action.
When you first get your device, Apple includes several additional apps not part of the operating system that you may or may not find useful. To save space on your device, you should delete them if you don’t use them; they can usually be redownloaded from the App Store later.
In your use of iOS, you’ll undoubtedly come upon situations where you’re working in one app and need to quickly switch to another. There are two main ways to do this, the most direct being to swipe left and right with four fingers; swiping right will take you through your most recently used apps, while swiping left will move back to the previous app. Additionally, if you, for example, double-tap a link in an email message to load a webpage in Safari, iOS provides a handy, “Return to…” button at the top left of the screen which you can use to return to your last-used app, eliminating the need to swipe back with four fingers.
You can also view your open apps with the app switcher, accessed either by pressing the Home button twice quickly or touching the bottom edge of the screen and sliding up until you hear the third ascending tone, depending on your device model. From here, swipe left and right with one finger through your apps and double-tap the one you want to use. Swipe up with three fingers on an app to close it, which is generally only necessary if an app becomes unresponsive or repeatedly crashes. While iOS is pretty good about shutting down apps that are problematic or resource-intensive, not all apps are created equal, and thus anomalies can happen.
Typing on iOS
While typing on a flat slab of glass may take some getting used to, you have a number of options that can make it quite easy and straightforward for you. In this guide, I will give an overview of several, but keep in mind that I am only scratching the surface; you can find more information on the AppleVis forum, other Apple centric user lists, and of course, your own exploration.
When you double-tap a text field, VoiceOver will generally announce that the field is, “Editing,” which means the keyboard is displayed near the bottom of the screen. If it is a secure text field, like the kind used to enter passwords and other sensitive data, VoiceOver will indicate each key press with a click sound, rather than echoing what is being typed.
The default iOS keyboard is laid out like a tactile Qwerty keyboard. For me, it was helpful to glide my finger around the keyboard so I could gradually get a picture in my head of where different keys would be situated. I found I could then touch one key, and efficiently move my finger to the general location of my next intended key. As said earlier, rather than navigating to and double-tapping each key, it may be quicker to place one finger on the key and tap the screen with another finger.
If you need to type numbers or symbols, double-tap the, “Numbers,” key at the bottom left of the keyboard. From the numbers keyboard, use the, “Symbols,” key to access symbols that are not on that keyboard.
As you type, a small strip just above the top of the keyboard will be populated with words that iOS thinks you might be typing; double-tap a suggestion to insert it. As you get more comfortable with the iOS keyboard, you can explore the different typing modes VoiceOver offers, and decide what works best for you.
To increase the ease of use of the iOS keyboard, VoiceOver offers three distinct typing modes: Standard Typing, Touch Typing, and Direct Touch Typing. The mode can be changed by turning the rotor to Typing Mode and swiping up or down with one finger until you hear the mode you want. Standard Typing is the default, where you must navigate to and double-tap or split-tap on keys.
Touch Typing allows you to move to a key and lift your finger to insert it. This can help to improve typing speed, as long as you don’t accidentally lift your finger over keys you don’t want to type.
Optionally, if you rest your finger on a key and then slide your finger to another key, iOS will try to predict what you’re typing. For example, if you’re using Touch Typing and rest your finger on the letter H for about one second, and then slide it to the general location of the letter E, and then move it to the letter L, and then the letter O, iOS will predict the word, “Hello.” If the wrong word is predicted, simply press the delete key and it will be deleted. More information and an audio demonstration of this feature can be found in this AppleVis podcast. To turn this feature off, go to Settings > General > Keyboard and double-tap the, “Slide to Type,” switch.
Direct Touch Typing allows you to simply touch a key and have it inserted, as if VoiceOver wasn’t running at all. This may be useful for those who have enough usable vision to see which keys they are typing and don’t want to perform any extra VoiceOver gestures.
If you’d rather speak than type, you can use the iOS dictation feature. To dictate text, perform a magic tap (a two-finger double-tap) when you’re in a text field, and start speaking when you hear a ding sound. Perform a second magic tap when you’re done speaking; the text will then be inserted and spoken by VoiceOver. In addition to words, you can also dictate punctuation and line breaks.
Braille Screen Input
If you prefer to type on a Braille keyboard, like the kind found on a Perkins braille writer or Braille notetaker, you can use Braille Screen Input (BSI) on iOS to mimic the typing experience.
Braille Screen Input allows you to place your fingers on the screen the way you’d position them on a Braille keyboard and have VoiceOver predict which dots of a braille cell you’re typing. For example, if you tap the right side of your iOS device with one finger, VoiceOver will interpret a dot six being entered; if you place one finger toward the left or center of the device, it will interpret a dot one. To insert a space, swipe right with one finger; swipe right with two fingers to insert a line break.
To set up this feature, go to Settings > Accessibility > VoiceOver > Rotor, and select Braille Screen Input. Any time you want to type in braille, turn the rotor to Braille Screen Input and, if it is not already, turn your device to landscape orientation to maximize the space you have. This will work even if your device’s orientation is locked.
You can switch between contracted and uncontracted braille by swiping right with three fingers while in BSI, or change the braille table by going to Settings > Accessibility > VoiceOver > Braille. Additionally, you can change the typing echo for braille by going to Settings > Accessibility > VoiceOver > Typing > Typing Feedback and selecting an option under the “Braille Screen Input” heading.
For more detailed information on how to use this feature, check out this guide.
Emojis are special characters that can give a sense of personality to what you’re typing. These can be used to convey facial expressions, expressions of individual identity, objects, and more.
The emoji keyboard is located near the bottom left of the iOS keyboard and can be accessed from most text fields. A list of categories runs across the bottom, and you can search for specific emojis and symbols using the search field at the top of the keyboard.
No matter how easy typing on a touch screen becomes once you get the hang of it, there may be situations where a tactile QWERTY keyboard is preferred. For me, if I’m writing a large block of text quickly, I much prefer typing on tactile keys to any of the software keyboard methods.
Keyboards can be paired with your device via Bluetooth, and VoiceOver supports basic navigation with keyboard shortcuts. If you’ve used macOS, these commands will feel very familiar to you.
Similar to macOS, VoiceOver commands are denoted by the VoiceOver modifier, which by default is the Control and Option keys, referred to as “VO” for short. Therefore, if you are instructed, for example, to press VO-Space, hold down the Control and Option keys and press the space bar.
Move between elements with VO-left and right arrow, and activate items with VO-Space. For added convenience, you can turn on Quick Nav by pressing the left and right arrow keys together, allowing you to press the arrow keys without needing to use the VoiceOver modifier. With Quick Nav on, you can also adjust the rotor by pressing the up arrow with either the left or right arrow, and then use the up and down arrow keys to navigate the available items; press the up and down arrow keys together to activate items.
Other VoiceOver keyboard commands can be practiced by pressing VO-K to access VoiceOver help, and many third-party apps include their own keyboard shortcuts, many of which are similar to those on macOS. For more detailed information on using a hardware keyboard with iOS, check out this guide.
In addition to spoken feedback, VoiceOver on iOS supports a wide variety of refreshable braille displays. Braille displays can be connected to your iOS device via the Lightning to USB 3 Camera Adapter, or paired and configured over Bluetooth by putting your display into its pairing mode (the name for this varies by manufacturer) and going either to Settings > Bluetooth, or Settings > Accessibility > VoiceOver > Braille on your iOS device. Check your display’s documentation for specific pairing instructions.
While I am aware of iOS’s support for refreshable braille, I do not own a braille display, so cannot comment on the quality or usability of these features. For an introduction and overview of using refreshable braille on iOS, check out this guide, or if you prefer an audio demonstration, this AppleVis podcast.
If you have additional questions or problems with braille on iOS, your best bet is to post to the AppleVis forum or a similar user list, or to contact Apple or the display’s manufacturer for assistance.
As mentioned earlier, Siri is the intelligent personal assistant built into iOS and other Apple platforms. It can look up information such as weather forecasts, sports scores, stock prices, and calendar events and reminders; call or text people; turn some settings on and off; open apps; and more. It can be engaged by pressing and holding the Home button, or the Side button if your device doesn’t have a Home button; let go of the button when you’re done talking. Siri should then respond with an answer, a follow-up question, or, if it searches the web, an interface with the search results.
When you first set up your device, you were also probably asked to train Siri to recognize your voice, which allows you to activate it without pressing any buttons by saying, “Hey Siri,” followed by your command. For example, if I wanted to call a contact named John, I would say, “Hey Siri, call John,” and Siri should respond by calling that contact. Likewise, if I wanted to text them, I would say, “Hey Siri, text John,” followed by the content of my message. The following is a list of some other things you can say to Siri, though it is by no means comprehensive. Also, if you have third-party apps installed, Siri can perform additional functions related to those apps if the developer has programmed support.
- What’s the weather today?
- What’s the weather like this weekend?
- What time is it in London?
- What’s Apple’s stock price?
- How’s the Dow doing today?
- What’s the score of the Red Sox game?
- How tall is LeBron James?
- How many Super Bowls has Tom Brady won?
- How many ounces are in a pound?
- What’s 5 Dollars in Euros?
- How do you say, “Hello,” in Spanish?
- Set an alarm for 9 AM tomorrow.
- Turn my 9 AM alarm off.
- Remind me to get milk when I leave work.
- Read me my texts.
- Play Beyonce (supported streaming subscription or local song download required).
- What’s the song that goes something like “To the left, to the left?”
- Play Empire State of Mind (supported streaming subscription or local song download required).
- Turn on Do Not Disturb.
- Open VoiceOver settings.
- Find the nearest 7-11.
- Call Best Buy (it will automatically present listings around your current location).
Siri settings can be changed by going to Settings > Siri & Search.
As you use iOS, you may find that you do a lot of things repetitively. The Shortcuts app allows you to configure a phrase that when said to Siri, will cause it to perform a series of actions automatically.
For me, when I work out, I like to have a certain playlist playing and don’t want to be interrupted by notifications. Instead of manually turning on Do Not Disturb and then opening the playlist, I can just say, “Hey Siri, I’m working out,” and the playlist will be opened and Do Not Disturb turned on for me.
Siri shortcuts can also perform functions within third-party apps if the developer has programmed support for this feature in their app. For example, when I go to a specific fast-food restaurant, I always order the same thing. Rather than go into that restaurant’s app and place my order every time, I just say, “Hey Siri order my usual,” and a cart with my items is presented to me.
For a more in-depth description of how shortcuts can be created and used, check out this blog post.
If there is one thing iOS is known for, it is the vibrant market of third-party apps and the industry that Apple invigorated in the latter half of the 2000s. All iOS apps can be found in the App Store, itself an app on the Home Screen. Some are free, while others are paid; still others are free to download but require a purchase or subscription to continue using the app or unlock additional functionality.
Whether you need to check the news, read a book, watch a movie or TV show, get in touch with friends, join or host a video conference or webinar, communicate with a healthcare provider, or play a game, there’s most likely an app for that. The assistive technology landscape has also been affected, as there are now apps that can perform the functions of specialized hardware devices.
Additionally, many games have been released over the years that can be played without vision, such as interactive stories and audio games. Some of these are intended predominantly for blind and low-vision gamers, while others can be enjoyed by sighted and blind players alike.
Chances are, if you’re new to iOS, you already do a lot on your computer, such as listening to music and books, communicating with people, paying bills, banking, catching up on current events, and creating content such as documents or audio recordings. All these things can be done with iOS apps. Social networks like Facebook, Twitter, and others have apps, as do music streaming services like Spotify, video streaming services like Netflix, most major news organizations, and many banks around the world.
Unfortunately, however, not all of these apps are accessible with VoiceOver, which is one of the reasons why I’m deliberately not recommending specific apps in this guide; the fast pace of app development means that a single update can have wide-ranging and unpredictable effects on accessibility. Also, the logistics and economics of app development mean that not all apps are sustainable over the long term, so by the time you read this, certain titles may no longer be available or actively maintained.
Information about the accessibility of many apps can be found in the AppleVis iOS app directory, a place where blind and low-vision users can submit descriptions and information about apps they’ve used. If you’re looking for the best of the best in app quality and accessibility, check out the AppleVis iOS App Hall of Fame, where community members vote annually on apps that demonstrate excellence in usability and accessibility.
If you download an app that does not appear to be accessible with VoiceOver, feel free to submit it to the directory so that other users know. Even more importantly, you should contact the developer with your concerns, as many developers simply don’t know about VoiceOver and may become motivated to address issues when they hear from users who rely on the technology.
If you must use an app that is inaccessible or poorly designed, iOS may be able to analyze the interface and recognize text, buttons, and other elements that can then be read by VoiceOver. To enable this, go to Settings > Accessibility > VoiceOver > VoiceOver Recognition, and toggle Screen Recognition on; a small download will be required.
As this feature is largely a work in progress and relies on machine learning, your results may vary. Due to this inconsistency, Screen Recognition is not intended as a substitute for good accessibility practices by developers, but rather as a stopgap that can assist users in a pinch until the developer makes the changes necessary to make their app natively accessible. More information and an audio demonstration of this feature can be found in this AppleVis podcast.
Of all the different kinds of apps that exist today, one group has had a particular impact on increasing my independence: apps that describe the visual world around me. These apps serve a range of purposes, which I will give an overview of without endorsing any specific app. As mentioned earlier, apps and developers’ circumstances change too rapidly for me to ensure that information in this guide would remain sound going forward.
If you live without sight, at least if the US dollar is your primary currency, you’ll know that all bills feel exactly the same, with no tactile way to tell denominations apart. Before the advent of currency identification apps, someone who is blind or low-vision had to rely on sighted assistance or use a dedicated hardware device to identify the different denominations of bills.
There are currently several currency identification apps available in the App Store, which support a wide range of currencies and each have their own unique features. One of these is Microsoft Seeing AI, which is free to use, as it is a research project by Microsoft. Another is Cash Reader, an app that identifies money without needing an Internet connection, available with a number of subscription options. Whichever solution you choose, the central concept of placing a bill under your iOS device’s camera and having the denomination spoken by VoiceOver applies near-universally.
No matter how hard you press the people in your life to provide you with documents in an accessible electronic or braille format, some things inevitably slip through the cracks. However, document scanning apps and built-in iOS features that use the advanced camera technology on your device can help turn that useless piece of paper into something actually worth your time and energy.
That said, in my opinion, document scanners are not a substitute for good document accessibility practices. Instead, it is better to think of them as an impact mitigation strategy, making something that is completely inaccessible somewhat accessible.
Out of the box, the built-in Camera and Photos apps can recognize text in images and allow you to select and copy that text as it’s being scanned. This may be useful when, for example, scanning an image for a phone number, Wi-Fi password, or other code you want to retain without needing to save or rescan the image. For an audio demonstration of this feature, check out this AppleVis podcast.
The previous two categories refer to apps that perform a small number of standalone functions. However, apps like Microsoft Seeing AI and Envision AI have introduced a new style of assistive app: a Swiss Army knife, if you will, of functions.
In the case of Seeing AI, from one app, you can recognize short text, documents, product barcodes, currency, light, and more. Seeing AI, being a research project by Microsoft, is free to use, whereas Envision AI requires a subscription after its trial period expires; their respective feature sets are generally comparable.
Remote sighted assistance
While artificial intelligence has become incredibly powerful for certain things, there are still situations where help from a human is preferred. Over the years, services where a person who is blind or visually impaired can make contact with a remote sighted person have been introduced, the two most notable being Aira and Be My Eyes.
Aira is a subscription service that connects you to a professionally trained agent to help you with visual tasks and experiences. Be My Eyes has a similar purpose, but rather than being a paid subscription service, it is composed of a network of sighted volunteers and is thus free to use.
Conclusion and additional resources
Although this guide touches on numerous topics, I am truly only scratching the surface of what you can do with an iOS device; iOS is an incredibly complex operating system that has given rise to a number of unique and intimate uses that would be impossible to cover in one guide. However, having now read through all or part of this guide, you hopefully have an idea of what you can do and how to do it. More information is available on the AppleVis forum, and below are a few links to some more potentially useful resources:
- A Complete List of iOS and iPadOS Gestures Available to VoiceOver Users
- Found an Accessibility Bug in iOS, iPadOS, macOS, watchOS or tvOS? Here’s How to Let Apple Know and Why You Should
- How to Contact Apple for Accessibility Inquiries
- How to Set up your Emergency contacts and ID on your iPhone
- iCloud Explained
- iDevice Primer 105: How do I answer, Manage, and End a Phone Call?
- Quick Tip: Using the Text Selection Rotor to Select Text on iOS
- Spell checking using the misspelled words rotor option with VoiceOver on iOS
- Toggling VoiceOver On and Off Using the Accessibility Shortcut on iOS and iPadOS
If you have any other suggestions or want something clarified, sound off in the comments.