Fall is almost here, and in a year of unpredictability, one thing remains consistent: iOS 14.0 is arriving the same week it has in previous years. It brings a significant number of enhancements for users of the iPhone 6s and newer, as well as the iPod touch (7th generation). Mainstream changes include the ability to put widgets on the Home screen, an App Library, enhancements to Messages, new privacy settings, and much more. A lot of outlets will be covering those features in detail, so I will not be covering them here. This post looks specifically at the new accessibility features for those who are blind or deaf-blind. Please note that some features are not available on all devices. Where this is known, it will be noted in the appropriate section.
While I have worked with a team of people who have been beta testing iOS 14 and iPadOS 14 since the first betas were released, it is likely we have missed some features. Like last year, I do not feel I have enough information to comment on the low vision features, so it is my hope that a low vision user will take the time to review those changes, such as the new Magnifier app, along with the overall state of iOS 14 and iPadOS 14.
Let’s Jump Backward
If you find yourself deep in menus, such as Settings > Accessibility > VoiceOver > Braille > More Info > Commands, pressing the "Back" button repeatedly to get back to the main Settings screen can be a real chore. With VoiceOver, performing the equivalent of a double-tap-and-hold gesture on the "Back" button now opens a context menu with options to jump back to any point along your current path. In my example, I am able to jump back to any of the previous submenus, including going directly back to the main Settings screen.
Dude, Where’s My Emoji?
It’s now possible to search for an emoji and insert it wherever you may be composing text.
To do so, you will first need to jump to the Emoji keyboard. Then, find the search area, which is generally in the middle of the touchscreen, and double-tap it. Enter your search term, for example, “grinning,” then flick right and double-tap whichever emoji you would like to insert from the list of results. After double-tapping your desired emoji, go back to your original keyboard and continue writing as you would normally. Be sure you are editing in the correct text field, as sometimes the search field does not disappear and your VoiceOver cursor may be stuck there.
Occasionally, if you enable the Emoji keyboard and then flick left and right again, VoiceOver will announce that possible emojis are available, and these will appear as a rotor option. Turning the rotor to emojis and then flicking up and down will present you with a list of possible emojis based on the text you composed. After inserting the emoji you wish, switch back to your normal keyboard. Sometimes, you will find that VoiceOver places focus somewhere else. If the process seems convoluted, you are correct. After trying this feature, I have gone back to using Text Shortcuts for inserting emojis, as it is ultimately faster. Trying to use the emoji search feature with braille only is even more of a challenge. Though this is not specifically an accessibility feature, it would be a welcome enhancement if it functioned as expected.
The Translate App
One of the new apps bundled with iOS 14 and iPadOS 14 is the Translate app. Specific to accessibility is the option to translate from speech to text. If you set both your “from” and “to” languages to be the same, and then activate the “listen” button, it is possible for someone to speak into your device and have the text appear on your braille display. There are, however, some major limitations to this feature. First, once the person quits talking, your device will quit listening. This happens even if there is a brief pause between sentences. Second, as with any automatic speech recognition, accuracy will depend greatly on your environment. If more than one person is talking, if there is a lot of background noise, or if the speaker has a strong accent, the translation will be less accurate. I have found, after downloading the English translation package for on-device translating, that it is less accurate than using the app with Apple’s servers. This is also noted under Settings > Translate.
If you wish to use Translate in the manner described above, after setting the “from” and “to” languages to the same language, find the “listen” button located near the bottom of the screen.
Note that at the time of writing, VoiceOver indicates that both fields are “from,” even though the second is actually “to.” After pressing the “listen” button, the person wishing to speak can do so, but must start within about two seconds of the button being activated. Once the Translate app thinks that person is done speaking, one of two things will happen. If the translated text is spoken out loud, the user will be returned to the upper left corner of the screen and will need to scroll right to the first thing that has been translated. Continue scrolling to either type a message or, one option further, activate the “listen” button again.
If the text is not spoken out loud, it will flash up on the display once it has been translated. In this case, move right one option to find the “close” button and activate “listen” again. If the deaf-blind user cannot speak their reply, it is also possible to type text in return; to the left of the “listen” button, there is an option to enter text. After you are confident the individual reading the screen has read your reply, you can select “close” and then activate the “listen” button again. This could be a system someone may want to use for face-to-face communication with a hearing person they interact with often, but it seems too complex for an interaction between a deaf-blind user and a stranger in public. For my part, I will stick with a Bluetooth keyboard and my braille display, even with its challenges.
Photo Captions
It has been possible for quite some time to submit photos from your camera roll to a service such as Seeing AI or TapTapSee to attempt to determine what a specific photo shows. If you would like to give photos your own descriptions, one way of doing this is by adding captions. In the Photos app, find a photo you would like to describe further. Then, access “show details” through the VoiceOver rotor and select it. Next, scroll right to “Add a caption”. You will land on a text field where you can enter any details you wish about the photo. Finally, select the “done” button and your caption will be saved.
VoiceOver Recognition
In iOS 13, text detection was expanded to not only give braille users the option of reading the descriptions, but also to read more of them, such as when a button contained an image with embedded text. With iOS 14 and iPadOS 14, this has been expanded even further, with the potential to improve app accessibility. You can find these options by going to Settings > Accessibility > VoiceOver > VoiceOver Recognition. Please note that this feature is only supported on iPhone XS/XR models and later.
Image description is the first option. From what I can tell, this feature is unchanged from iOS 13 and iPadOS 13, other than that it moved from the Verbosity menu in VoiceOver to the Recognition menu. Your iPhone will attempt to provide a description of the images it encounters in apps and when using websites.
The second option is called “Screen Recognition.” This new feature can make some apps more accessible. It will also provide you with potentially useful information about screenshots you encounter in apps such as the App Store or Facebook. A friend of mine posted something about his pet dog on Facebook, for which Facebook itself simply told me that no description was available. I then enabled Screen Recognition and learned that he has a black and white dog with red eyes. Keeping Screen Recognition on will slow down your device and change how things are spoken and displayed with VoiceOver. At the time of writing, this feature seems to work well in some cases, such as the description of that picture, but not others. It seems to be a work in progress, and I’m encouraged by the progress made so far.
Though the settings allow you to enable Screen Recognition for specific apps, you can also easily toggle it by adding it to the rotor. There is also an option to assign a braille display command to toggle this feature, but at present, that command does not work.
The third option is Text Recognition. This seems to be slightly more responsive than it was under iOS 13 and iPadOS 13, unless you are a braille user: Text Recognition is once again not accessible with braille.
Under the “Verbosity” submenu within the VoiceOver menu, a few new options have been added. “Show more content” is an option that can appear on images that have been recognized. Flicking up and down when this option is available should give you more information about the image in focus. This could include the alt text of the image, the file name of the image, or perhaps other things I have yet to encounter.
Container descriptions is another added option. You can have a sound played when a description is available, have the text spoken normally, have the pitch of the speech lowered for the description, or turn the feature off entirely. Though the descriptions work with both speech settings, I was not able to get VoiceOver to play a sound. It’s also worth noting that the container information spoken by VoiceOver is not displayed in braille.
Where Was I?
Keeping up with text messages can be quite a chore, especially when text interaction is your primary way of communicating with the world. iOS 14 and iPadOS 14 now let you mention users, and there is a rotor option for that. “Pin conversation” is also a new option that will “pin” selected conversations to the top of the list of conversations.
Now, the last message you sent in a thread will be at heading level 3, so you can quickly jump to where you most likely left off.
Auto Scroll Comes to iOS and iPadOS
This feature has been available on some of the braille displays sold by the German company HelpTech, but Apple has now brought it to all braille displays. To set it up, you will first need to assign a braille keyboard command to start and stop auto-advance; you will find this option under the braille category when assigning a new braille keyboard command. There are also options to assign commands that control the speed of auto-advancing. If you are unfamiliar with how to assign braille keyboard commands to certain actions, this guide will walk you through the process.
If you don’t wish to assign braille keyboard commands to control the speed of auto-advance, you can also add this to your VoiceOver rotor and control the speed that way. I found this feature to work reliably and enjoyed using it. The longest stretch I read continuously was nearly an hour, and I experienced no issues.
Making Things Smaller
In iOS 14 and iPadOS 14, certain elements have been abbreviated in braille to conserve space on your display. For example, you will now see “TB” instead of “tab”.
Sound Recognition
Apple made a big deal about machine learning at earlier Worldwide Developer Conferences, and many wondered what it could do for those with disabilities. VoiceOver Recognition is one example, and Sound Recognition is another. To use Sound Recognition, go to Settings > Accessibility > Sound Recognition and turn it on. After a small download, you will be able to have your device recognize certain sounds. These sounds are organized into several categories, including alarms, animals, household, and people. Each category, including each type of sound, is briefly discussed below. In general, my testing found that for a sound to be detected, it must be louder than the surrounding sounds. When a sound is detected, it will show up as a notification and, on an iPhone, also send the user a vibration.
Under the “Alarms” category, the options are fire, siren, and smoke. I did not have the chance to test a fire alarm, other than playing one through a YouTube video; when I did this, the iPhone did not detect the sound. Siren detection required that the siren be quite loud. During a tornado siren test, my iPhone did not pick up the siren, as it was quite some distance from my location. When the siren was quite loud, the iPhone picked it up reliably and sent me a notification. The notification was not intrusive at all, though for alerts such as a smoke alarm, I would hope for a much more urgent type of notification, such as those delivered in an emergency alert. The two times I tested the smoke alarms in my apartment, I was alerted within two seconds. However, the alert is as casual as a notification of a change in a sports score, a new message on Slack, or many other notifications I wouldn’t feel an urgent need to investigate. As I have a high frequency hearing loss, the shrill sound of a smoke alarm is not something I can hear anymore, so I would feel safer using this as a backup to another already installed system I utilize.
In the animals category, there are options for both dogs and cats. I found that bigger dogs, with a louder bark, often could be detected from quite some distance away. However, smaller dogs, with a higher-pitched bark, seem to be reported less frequently. Cat detection works for meowing but doesn’t seem reliable. The higher pitched the noise, the less often the device picks up on it. I also barked at my phone and felt like a good boy, as my iPhone reported that it thought it had detected a dog.
Under the household category, the sounds available include appliances, car horn, doorbell, door knock, and water running. It was not clear to me exactly which appliances Sound Recognition is referring to, as running a dishwasher, washing machine, and dryer did not generate any alerts. I was often alerted to car horns when commuting through cities. Doorbell sounds from YouTube reliably set off the recognized sound notification. Door knocking generated a few false positives, such as when hammering or knocking on a table. Water running also produced a couple of false positives, but alerted me to running water on one occasion when I left the water running in my bathroom sink. Like VoiceOver Recognition, this seems to be a work in progress; it’s my hope that the progress will continue as iOS and iPadOS continue to evolve.
Headphone Accommodations
If you go to Settings > Accessibility > Headphone Accommodations, it is possible to adjust your listening environment by amplifying some sets of sounds and dampening others. This feature is only available on the AirPods Pro, the second generation of AirPods, some of the Beats headphones, and EarPods. When it is turned on, if you have a set of supported headphones, you will be walked through a form of hearing test which asks you questions about what you hear. Based on your results, Headphone Accommodations will adjust your audio experience to better suit your hearing situation. There is also a “transparency” mode which allows you to hear the environment around you; this mode is only available on the AirPods Pro. Once configured, you have the ability to further enhance low, medium, or high frequencies. I was not able to test this myself, as I do not have a pair of compatible headphones.
I Recognize You Are Signing This
FaceTime group calls have long had a feature which automatically detects when someone is speaking and focuses on their video feed; iOS 14 and iPadOS 14 bring this same prominence to people using American Sign Language.
Back Tap
Back Tap is a feature available on all iPhones from the XS onward. It allows you to double-tap or triple-tap the back of your iPhone to carry out a specific action. That action can be any number of things, ranging from toggling accessibility settings to performing VoiceOver functions or even running Shortcuts. To turn it on, go to Settings > Accessibility > Touch > Back Tap. From here, you can set any number of accessibility options such as VoiceOver, Zoom, AssistiveTouch, and so on. One way to look at Back Tap is as a secondary accessibility shortcut if you require the use of more than one accessibility feature.
There are also options to assign a specific VoiceOver gesture to Back Tap, but you will need to navigate to the appropriate menu: Settings > Accessibility > VoiceOver > Commands > Touch Gestures. Scroll to the very bottom of the screen, and you will find options for both double-tapping and triple-tapping the back of your phone. Options exist for any other item that can be assigned a touch gesture. As an example, I have a shortcut which turns Bluetooth off and back on assigned to the double-tap; for the triple-tap, I have VoiceOver’s Magic Tap assigned.
Regardless of your accessibility needs, Apple continues to innovate with new accessibility features in each major iOS and iPadOS release, and iOS 14 and iPadOS 14 are no exceptions. However, as with every release, bugs are present. Though there are actually more bugs in iOS 14 and iPadOS 14 than in 13, these bugs are not what many would consider showstoppers. Before upgrading, I encourage people to check out the AppleVis blog post detailing new and fixed bugs in iOS 14 and iPadOS 14 and the AppleVis iOS and iPadOS Bug Tracker. If you find a bug that is not tolerable, don’t upgrade, as returning to iOS 13.7 or iPadOS 13.7 may only be an option for a short time after iOS 14 and iPadOS 14 are released. If that is the case, stay tuned to AppleVis, as we will continue to update the status of bugs as the iOS and iPadOS release cycles progress. To download the update over the air, go to Settings > General > Software Update and follow the prompts on screen.