What’s New in iOS 14 and iPadOS 14 for Blind and Deaf-Blind Users
Fall is almost here, and in a year of unpredictability, we have something that remains consistent: iOS 14.0 is out the same week it has been in previous years. It brings a significant number of enhancements for users of the iPhone 6s and newer, as well as the iPod touch (7th generation). Mainstream changes include the ability to put widgets on the Home screen, an App Library, enhancements to Messages, new privacy settings, and much more. A lot of outlets will be covering those features in detail, so I will not cover them here. This post looks specifically at the new accessibility features for those who are blind or deaf-blind. Please note that some features are not available on all devices. Where this is known, it will be noted in the appropriate section.
While I have worked with a team of people who have been beta testing iOS 14 and iPadOS 14 since the first betas were released, it is likely we have missed some features. Like last year, I do not feel I have enough information to comment on the low vision features, so I hope a low vision user will take the time to review those changes, such as the new Magnifier app, and the overall state of iOS 14 and iPadOS 14.
Let’s Jump Backward
If you find yourself deep in menus, such as Settings > Accessibility > VoiceOver > Braille > More Info > Commands, pressing the "Back" button repeatedly to get back to the main Settings screen can be a real chore. Performing the equivalent of a double-tap-and-hold gesture with VoiceOver on the "Back" button now opens a context menu with options to jump backward to any point between your current location and the main Settings screen. In my example, I am able to jump back to any of the previous submenus, including going directly back to the main Settings screen.
Dude, Where’s My Emoji?
It’s now possible to search for an emoji and insert it wherever you may be composing text.
To do so, first switch to the Emoji keyboard. Then find the search field, which is generally in the middle of the touchscreen, and double-tap it. Enter your search term, for example, "grinning," then flick right and double-tap the emoji you would like to insert from the list of results. To continue writing, switch back to your original keyboard and proceed as you normally would. Be sure you are editing in the correct text field, as sometimes the search field does not disappear and your VoiceOver cursor may be stuck there.
Occasionally, if you enable the Emoji keyboard and then flick left and right again, VoiceOver will announce that possible emojis are available, and these become a rotor option. Turning the rotor to emojis and flicking up and down will present a list of possible emojis based on the text you composed. After inserting the emoji you want, switch back to your normal keyboard. Sometimes you will find that VoiceOver places focus somewhere else. If the process seems convoluted, you are correct. After trying this feature, I have gone back to using Text Replacement shortcuts for inserting emojis, as it is ultimately faster. Trying to use the emoji search feature with braille only is even more of a challenge. Though this is not specifically an accessibility feature, it would be a welcome enhancement if it functioned as expected.
Translate Speech to Braille

One of the new apps bundled with iOS 14 and iPadOS 14 is the Translate app. Of specific accessibility interest is the option to translate from speech to text. If you set both your "from" and "to" languages to the same language and then activate the "listen" button, someone can speak into your device and the text will appear on your braille display. There are, however, some major limitations. First, once the person stops talking, your device stops listening; this happens even if there is only a brief pause between sentences. Second, like any automatic speech recognition, accuracy will depend greatly on your environment. If more than one person is talking, if there is a lot of background noise, or if the speaker has a strong accent, the translation will be less accurate. I have found, after downloading the English translation package for on-device translating, that it is less accurate than using the app with Apple's servers. This is also noted under Settings > Translate.
If you wish to use Translate in the manner described above, after setting both languages to the same language, find the "listen" button located near the bottom of the screen.
Note that, at the time of writing, VoiceOver labels both fields "from," even though the second is actually "to." After pressing the "listen" button, the person wishing to speak can do so, but must begin within about two seconds of the button being activated. Once the Translate app decides the person is done speaking, one of two things will happen. If the translated text is spoken out loud, you will be returned to the upper left corner of the screen and will need to scroll right to the first item that has been translated. Continue scrolling to either type a message or, one option further, activate the "listen" button again.
If the text is not spoken out loud, it will flash up on the display once it has been translated. In this case, move right one option to find the "close" button, then activate "listen" again. If the deaf-blind user cannot speak their reply, it is also possible to type text in return; to the left of the "listen" button, there is an option to enter text. Once you are confident the individual reading the screen has read your reply, you can select "close" and then activate the "listen" button again. This could be a system for face-to-face communication with a hearing person you interact with often, but it seems too complex for an interaction between a deaf-blind user and a stranger in public. For my part, I will stick with a Bluetooth keyboard and my braille display, even with their challenges.
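For what it's worth, the listen/read/close loop described above amounts to a small state machine. The sketch below is purely a toy model; the state and event names are mine, not anything from the Translate app:

```swift
// Toy model of the face-to-face Translate loop described above.
// States and event names are illustrative, not from the app itself.
enum TranslateState {
    case idle          // waiting at the "listen" button
    case listening     // the hearing person is speaking
    case showingText   // translated text is on the braille display
}

func next(_ state: TranslateState, event: String) -> TranslateState {
    switch (state, event) {
    case (.idle, "listen"):           return .listening
    case (.listening, "speechEnded"): return .showingText
    case (.showingText, "close"):     return .idle
    default:                          return state  // ignore anything else
    }
}
```

Seen this way, the friction is clear: every exchange costs a full trip around the loop, which is workable with a familiar partner but slow with a stranger.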
Caption Your Photos

It has been possible for quite some time to submit photos from your camera roll to a service such as Seeing AI or TapTapSee to attempt to determine what a specific photo shows. If you would like to give photos your own descriptions, one way is to add captions. In the Photos app, find a photo you would like to describe further. Then access "show details" through the VoiceOver rotor and select it. Next, scroll right to "Add a caption." You will land on a text field where you can enter any details you wish about the photo. Finally, select the "done" button and your caption will be saved.
VoiceOver Recognition

In iOS 13, text detection was expanded to not only give braille users the option of reading image descriptions, but also to read more descriptions, such as when a button's image contained embedded text. With iOS 14 and iPadOS 14, this has been expanded even further, with the potential to improve app accessibility. You can find these options under Settings > Accessibility > VoiceOver > VoiceOver Recognition. Please note that this feature is only supported on the iPhone XS/XR and later.
Image description is the first option. From what I can tell, this feature is unchanged from iOS 13 and iPadOS 13, other than that it moved from the Verbosity menu in VoiceOver to the Recognition menu. Your iPhone will attempt to provide a description of the images it encounters in apps and when using websites.
The second option is called "Screen Recognition." This new feature can make some apps more accessible. It can also provide potentially useful information about screenshots you encounter in apps such as the App Store or Facebook. A friend of mine posted something about his pet dog on Facebook, for which Facebook itself simply told me no description was available. I then enabled Screen Recognition and learned that he has a black and white dog with red eyes. Keeping Screen Recognition on will slow down your device and will change how things are spoken and displayed with VoiceOver. At the time of writing, the feature works well in some cases, such as the description of that picture, but not in others. It seems to be a work in progress, and I'm encouraged by the progress made so far.
Though the settings allow you to enable Screen Recognition for specific apps, you can also easily toggle it by adding it to the rotor. There is also an option to assign a braille display command to toggle the feature, but at the time of writing, that command does not work.
The third option is Text Recognition. This seems to be slightly more responsive than it was under iOS 13 and iPadOS 13, unless you are a braille user: Text Recognition is once again not accessible with braille.
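Why does Screen Recognition help some apps dramatically and leave others alone? One way to think about it (this is a toy model of the behavior described above, not Apple's API or implementation; `Element` and `spokenDescription` are names I made up) is that a developer-supplied label always wins, and the machine-generated guess only fills in when no label exists:

```swift
// Hypothetical toy model: VoiceOver prefers a hand-authored accessibility
// label, and Screen Recognition's guess is only a best-effort fallback.
// None of these names are real Apple API.
struct Element {
    var accessibilityLabel: String?   // supplied by the app developer
    var recognizedText: String?       // guessed by Screen Recognition
}

func spokenDescription(for element: Element) -> String {
    if let label = element.accessibilityLabel {
        return label                              // developer label wins
    }
    if let guess = element.recognizedText {
        return "\(guess), possibly a button"      // recognition is a guess
    }
    return "unlabeled element"
}
```

This is why well-labeled apps sound the same with the feature on or off, while poorly labeled apps can improve noticeably.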
Under the "Verbosity" submenu within the VoiceOver menu, a few new options have been added. "Show more content" is an option that can appear on images that have been recognized. Flicking up and down when this option is available should give you more information about the image in focus. This could include the image's alt text, its file name, or perhaps other things I have yet to encounter.
Container descriptions are another added option. You can have a sound played when a description is available, have the text spoken normally, have the pitch of the speech lowered for the description, or turn the feature off entirely. Though both speech settings work as described, I was not able to get VoiceOver to play a sound. It's also worth noting that the container information spoken by VoiceOver is not displayed in braille.
Where Was I?
Keeping up with text messages can be quite a chore, especially when text interaction is your primary way of communicating with the world. iOS 14 and iPadOS 14 now let you mention users, and there is a rotor option for that. “Pin conversation” is also a new option that will “pin” selected conversations to the top of the list of conversations.
Now, in a thread of messages, the last message you sent will be at heading level 3, so you can quickly jump to where you most likely left off.
Auto Scroll Comes to iOS and iPadOS
This feature has been available on some braille displays sold by the German company Help Tech, but Apple has now brought it to all braille displays. To set it up, you will first need to assign a braille display command to start and stop auto-advance; you will find the option under the braille category when assigning a new command. There are also options to assign commands to control the speed of the auto-advance. If you are unfamiliar with how to assign braille display commands to actions, this guide will walk you through the process.
If you don’t wish to assign braille display commands to control the speed of auto-advance, you can also add this to your VoiceOver rotor and control the speed that way. I found this feature to work reliably and enjoyed using it; the longest stretch I read continuously was nearly an hour, and I experienced no issues.
Making Things Smaller
In iOS 14 and iPadOS 14, certain elements have been abbreviated to conserve space. For example, you will now see "TB" instead of "tab".
Sound Recognition

Apple made a big deal about machine learning at earlier Worldwide Developers Conferences, and many wondered what it could do for those with disabilities. VoiceOver Recognition is one example; Sound Recognition is another. To use Sound Recognition, go to Settings > Accessibility > Sound Recognition and turn it on. After a small download, your device will be able to recognize certain sounds, organized into several categories including alarms, animals, household, and people. Each category, including each type of sound, is briefly discussed below. In general, my testing found that, to be detected, a sound must be louder than the surrounding sounds. When a sound is detected, it shows up as a notification and, on an iPhone, also triggers a vibration.
Under the "Alarms" category, the options are fire, siren, and smoke. I did not have a chance to test a fire alarm, other than playing one from a YouTube video; when I did, the iPhone did not detect the sound. Siren detection required that the siren be quite loud. During a tornado siren test, my iPhone did not pick up the siren, as it was quite some distance from my location. When the siren was quite loud, the iPhone picked it up reliably and sent me a notification. The notification was not intrusive at all, though for alerts such as a smoke alarm, I would hope for a much more urgent notification, like those delivered with an emergency alert. The two times I tested the smoke alarms in my apartment, I was alerted within two seconds. However, the alert is as casual as a notification of a change in a sports score, a new message on Slack, or many other notifications I wouldn't feel an urgent need to investigate. As I have a high-frequency hearing loss, the shrill sound of a smoke alarm is not something I can hear anymore, so I would feel safer using this as a backup to an already installed system.
In the animals category, there are options for dog and cat sounds. I found that bigger dogs, with louder barks, could often be detected from quite some distance away; smaller dogs, with higher-pitched barks, seem to be reported less frequently. Cat detection works for meowing but doesn't seem reliable. The higher pitched the noise, the less often the device picks up on it. I also barked at my phone and felt like a good boy, as my iPhone reported that it thought it had detected a dog.
Under the household category, the available sounds include appliances, car horn, doorbell, door knock, and water running. It was not clear to me exactly which appliances Sound Recognition refers to, as running a dishwasher, washing machine, and dryer did not generate any alerts. I was often alerted to car horns when commuting through cities. Doorbell sounds from YouTube reliably set off the recognized-sound notification. Door knocking generated a few false positives when I hammered or knocked on a table. Water running also produced a couple of false positives, but alerted me to running water on one occasion when I left the water running in my bathroom sink. Like VoiceOver Recognition, this seems to be a work in progress; it's my hope that the progress will continue as iOS and iPadOS evolve.
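The observation that a sound must stand out from its surroundings can be illustrated with a toy energy comparison. To be clear, this is only a sketch of the general principle; Apple's Sound Recognition uses machine-learned classifiers, not a loudness heuristic like this:

```swift
// Toy sketch of why a target sound must stand out from background noise:
// compare the root-mean-square (RMS) energy of a candidate window against
// the background. This illustrates the principle only; it is NOT how
// Apple's Sound Recognition actually works.
func rms(_ samples: [Double]) -> Double {
    guard !samples.isEmpty else { return 0 }
    let meanSquare = samples.reduce(0) { $0 + $1 * $1 } / Double(samples.count)
    return meanSquare.squareRoot()
}

// Flag the candidate only when it is markedly louder than the background.
func standsOut(candidate: [Double], background: [Double], ratio: Double = 2.0) -> Bool {
    return rms(candidate) > rms(background) * ratio
}
```

Anything that raises the background term, such as a running television or a crowd, raises the bar the target sound has to clear, which matches my experience with distant sirens and quiet appliances.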
Headphone Accommodations

Under Settings > Accessibility > Audio/Visual > Headphone Accommodations, it is possible to adjust your listening experience by amplifying some sounds and dampening others. This feature is only available on the AirPods Pro, AirPods (2nd generation), some Beats headphones, and EarPods. When it is turned on and you have a set of supported headphones, you will be walked through a form of hearing test that asks questions about what you hear. Based on your results, Headphone Accommodations will adjust your audio to better suit your hearing. There is also a Transparency mode, available only on the AirPods Pro, which allows you to hear the environment around you. Once configured, you can further enhance low, medium, or high frequencies. I was not able to test this myself, as I do not have a pair of compatible headphones.
I Recognize You are Signing This
FaceTime group calls have already had a feature that automatically detects when someone is speaking and brings their video feed into focus; iOS 14 and iPadOS 14 bring this option to American Sign Language.
Back Tap

Back Tap is a feature available on iPhones from the XS onward. It allows you to double-tap or triple-tap the back of your iPhone to carry out a specific action, ranging from toggling accessibility settings to performing VoiceOver functions or even running Shortcuts. To turn it on, go to Settings > Accessibility > Touch > Back Tap. From here, you can assign any number of accessibility options such as VoiceOver, Zoom, AssistiveTouch, and so on. One way to look at Back Tap is as a secondary accessibility shortcut, useful if you rely on more than one accessibility feature.
You can also assign Back Tap gestures through VoiceOver, but you will need to navigate to the appropriate menu: Settings > Accessibility > VoiceOver > Commands > Touch Gestures. Scroll to the very bottom of the screen, and you will find options for both double-tapping and triple-tapping the back of your phone, alongside every other item that can be assigned a touch gesture. As an example, I have the double-tap set to a shortcut that turns Bluetooth off and back on, and the triple-tap assigned to the VoiceOver Magic Tap.
Conclusion

Regardless of your accessibility needs, Apple continues to innovate with new accessibility features in each major iOS and iPadOS release, and iOS 14 and iPadOS 14 are no exception. There is also no exception to bugs being present. Though there are actually more bugs in iOS 14 and iPadOS 14 than in 13, most are not what many would consider showstoppers. Before upgrading, I encourage people to check out the AppleVis blog post detailing new and fixed bugs in iOS 14 and iPadOS 14 and the AppleVis iOS and iPadOS Bug Tracker. If you find a bug you cannot tolerate, don't upgrade, as returning to iOS 13.7 or iPadOS 13.7 may only be possible for a short time after iOS 14 and iPadOS 14 are released. If that is the case, stay tuned to AppleVis, as we will continue to update the status of bugs as the release cycles progress. To download the update over the air, go to Settings > General > Software Update and follow the prompts on screen.
Comments
Just to clarify, when you say that something is available on Xs and newer, do you classify the SE2020 as newer than the Xs? I ask because although it is chronologically newer it shares the same form factor as the iPhone 8 and I wasn't sure if this made a difference.
I think so, because it has the A13 chip that the iPhone 11 series has.
Just a clarification: Back Tap is also available on my iPhone 8; I'm not sure how far back it goes. I understand the confusion though, because in earlier betas what you wrote in the post was indeed the case.
Also, one more option you have in Verbosity is to choose how actions will be announced. You can have VoiceOver speak "actions available" as always, just play a sound, or do nothing at all. The sound is also played after the current icon is spoken, with a brief delay; it is what I currently use.
Also, under VoiceOver commands, it is now possible to assign a gesture to quickly toggle audio ducking. For me at least, this is a lot more useful than the rotor, since if something starts playing, you can perform a single assigned gesture to turn on audio ducking.
Thanks for a great post as always!
Yes. The XS, SE 2020, and newer phones currently out support the accessibility features mentioned in this post.
Hello everyone. Do we have new voices for UK and US English? Thanks.
Still waiting. It is past time for it: 1:00 PM US Central time, and nothing.
It's almost bedtime for me and still nothing. I'd like to get the download started. I don't want such problems tomorrow, as I'll be away from Wi-Fi.
I'd be curious to know if iOS 14 is more stable and less buggy than iOS 13 is now. I usually tend to wait to update until others have taken the leap to the other side and reported back.
I don't think there'll be new voices.
Very stable. Every time I saw a video on YouTube, they mentioned how stable it was. I think it will be better than 12.
I've heard that too from sighted tech journalists, but here's hoping the same is true for VoiceOver users.
If iOS 14 is better than our current iOS 13.7 and less buggy, then I'll for sure update my iPhone.
Hi all, I'm responding to several comments with one.
Yes, Back Tap and VoiceOver Recognition are supported on the iPhone SE 2020. I do not find Back Tap on my iPhone 8, though.
I am finding it about as stable as 12 thus far. I'm even considering upgrading my primary device to 14 from 12. There are, of course, bugs, but we'll never see a bug free release. The sighted population never does either.
Yes. However, sighted people get their bugs fixed ASAP, and we do not so much. It takes a major release to fix our bugs, if we are lucky.
Here it is. I am downloading it now. Next will be OS 7.
You can hide/unhide Home screens using VoiceOver by going into "jiggle mode" on the Home screen (VO+Shift+F or the "Edit Mode" action via an external keyboard, or double-tap and hold via the screen), then tapping the "Pages" switcher. You'll see (after moving to the left) page toggles that say either "visible" or "hidden." Tap a toggle to hide or unhide that screen.
Once you're finished, tap Done.
It's thanks to a blind/VI YouTuber for the tip in the comments.
Hello all! I am really liking iOS 14 so far. However, I wanted to ask: does anyone know how to use the Picture in Picture feature? Thank you!
I set up the triple Back Tap for the accessibility shortcut. It works to turn off VoiceOver but will not turn it back on. Anyone else?
My understanding is that when a video is running, and the feature is on, if you just go to the Home screen it will keep playing. If I am wrong, someone will let us know, but I don't think I am.
I have a comment to make on Screen Recognition. One app it really works well with is Smule, a karaoke app. With Screen Recognition turned on, VoiceOver now correctly labels the buttons, making the app more accessible. For example, on the activities screen, the first button VoiceOver recognized is "chat," followed by "Search," etc. Even though the developers have not labeled most of the buttons, the majority of the app is now accessible thanks to Screen Recognition, and it'll improve over time with every iOS update. It's only the beginning of something new, and Apple is finally doing something right for once.
In Messages, when I try to send a voice message, it does not work. The screen locks and I have to open Messages again. I tried several times without success. However, using Siri to send a voice message worked.
Happy, happy. I am no longer getting the time at the end of every notification, be it Messages or News. Fingers crossed that it continues. After a while, it is a major pain if you get tweets, messages, or news and VoiceOver says the time at the end of each one. I hope it is no longer an issue.
I thought Back Tap only worked when VoiceOver was running, so if you turn off VoiceOver, the Back Tap won't be recognized to turn it back on.
That's very weird and surprising that you don't see it on your 8. On mine, it's there and it works as well, both double and triple tap. I wonder what makes your device not have the feature.
I think what's going on here is that you set up a VoiceOver gesture for the Back Tap, which therefore wouldn't work when VoiceOver is turned off. Think about it: if you set up a VoiceOver touch gesture of a two-finger swipe right for Notification Center, it won't work when VoiceOver is off either. Try setting up the Back Tap gesture from the Touch accessibility options instead.
I set up a Back Tap gesture on my phone to toggle audio ducking. I can get it to work sometimes, but not all of the time. How hard should I actually need to tap my phone for this to work? I'm using an iPhone 11 and the Apple Smart Battery Case. The case could be an issue; I'm not sure. I hesitate to take the case off and try, because I managed to break a Lightning port on a case doing that once.
Hey guys! While I have not played with the new OS yet, because I just got it a few hours ago, I'm loving what I have seen so far.
I've got a few questions though.
1. Where are the Screen Recognition settings for VoiceOver? I'm running an iPad 7. I've looked everywhere I could think of, and these settings seem to be eluding me.
2. Where is the search-for-emoji thing? I just wrote a Facebook post and used emojis and was going to try this feature, but couldn't find the emoji search box. I also looked on the rotor and couldn't find the emoji setting where you can flick up and down and double-tap to select your desired result.
Other than that, that's all I can think of so far. I'd appreciate any help. I figured I'd try asking here before calling Apple Accessibility, or at least start compiling a list of questions so that when I finally do call, I can get some answers.
How do we assign the double-tap gesture on the back of the phone to turn Bluetooth on and off? Do we need to create a shortcut first?
Back Tap is not a VoiceOver feature, so whether VoiceOver is on or not shouldn't make a difference.
For what it's worth, I was able to set the triple tap Back Tap to toggle VoiceOver, and it did work most of the time. The only occasions it didn't were when I performed the two sets of triple taps too closely together (so, not leaving any gap between turning VoiceOver off and then attempting to turn it back on again) and when I wasn't firm enough with my taps (I have a fairly thick case on my iPhone, so this might interfere with things here).
I can't speak to devices running the A11 chip, but I can sadly say for sure that you will need to purchase or trade in your iPad 7 for the 8th generation iPad when it comes out.
There'll be a few people I know who are gonna be taken aback by that.
I have an iPad Mini fifth Generation, and since it's got the A12 chip, it definitely supports VO Recognition.
Some of my friends have discovered a lot of interesting functions through exploring Screen Recognition during this period.
First, it can read the subtitles displayed in a video. As long as you turn on Screen Recognition and place the VoiceOver focus in the video area, VoiceOver can read subtitles as they refresh in real time. Of course, sometimes this function is not reliable; it may read some content that has nothing to do with the subtitles, but this is already very gratifying.
Second, text captured by the camera lens can be read aloud by Screen Recognition. Through this function, my friends have found many usage scenarios, such as reading the numbers displayed on an air conditioner's remote control at home, or reading the tracking number on a package. If you think there are too many elements on the camera page, or it is too complicated, you can also try using the Magnifier with Screen Recognition to achieve the same effect.
Third, it can make irregular controls in apps easier to operate. If you use enough apps, you will definitely encounter some that are not very accessible: some buttons and controls don't have correct labels, the focus order of some controls is messy, and in some apps many options share a single focus point. If you encounter these situations, you can try using Screen Recognition to work around them, and there may be unexpectedly good results.
Of course, as the post said, this feature is still under development. There are still many unstable situations, and the iPhone's battery drains very quickly with Screen Recognition turned on. However, we have reason to believe that Apple can further improve this function and make it more powerful.
How do we hide pages in iOS 14? I was listening to a podcast by AppleVis but cannot find it, and when I went to Overcast to listen, it was not there. Help.
Said podcast can be found here:
Found it. Thanks. Much appreciated.
I've noticed three things about Braille Screen Input on my iPhone XR and iPad Pro 9.7:
1. I normally have it announce both characters and words (to ensure my fingers are placed right). Every single time I finish a word and swipe to add a space, it says "space," followed by the word I typed, which makes typing very slow. I've worked around this by having it announce only words when I type, which leads me to the following:
2. I noticed some of my words were running together while typing. It turns out there are times when swiping right to add a space does not register. It's like you have to be super precise about it now, whereas before it was never a problem.
3. Even without using Bluetooth, the input lag between brailling a letter and the announcement of that letter is great enough to be distracting. It's actually as bad as when I'm brailling while using my AirPods. I've heard others have been experiencing VoiceOver lag too, so here's hoping that's addressed in an upcoming update. But this Braille Screen Input nonsense is a problem I haven't heard about. Anyone else?
I've also noticed that when I have direct touch typing enabled and try to unlock my phone, the passcode entry behaves as if standard typing were enabled. In other words, I have to select the number, then double-tap to enter it. This too happens on both my devices.
I downloaded iOS 13 the day before iOS 14 was released so I've probably got the last version of iOS 13 there will ever be, which suits me just fine. I avoided iOS 13 up until now because I'd seen so many reports about iOS 13's problems and decided to just keep waiting. But when it was announced that iOS 14 would be released on September 16th, I decided to upgrade to iOS 13 on the 15th.
I doubt I will wait so long to upgrade to iOS 14, but I will hold off until I can be reasonably certain that I won't regret the upgrade. I do not want to turn my iPhone SE (original, not 2020) into a sluggish and laggy brick.
First of all, I feel that iOS 14 is turning out to be quite stable. I have taken the suggestion of making the secondary voice on my rotor my primary voice for now, because the default voice seems to have a few milliseconds more lag, which was really bugging me. Using the secondary voice, which for me is the same Samantha voice (not enhanced), made VoiceOver quite snappy again.
Second, the Back Tap feature is really cool. I use the iPhone XS and have always had an attitude problem with the side button, but no more. A simple triple tap on the back of my phone reliably returns me to the Home screen from anywhere. It should be noted that I have no issue with the sliding gesture to get back to the Home screen.
Now for my question: does anyone know why my phone is saying "screen dimmed"? Low Power Mode is off, the battery percentage is good, and screen lock is set to never. Since the update to iOS 14, I see this behavior and cannot track down why. Any ideas?
All the information I found said that Back Tap wouldn't work on an iPhone 8, but I have discovered that it certainly does work just fine on mine. Woohoo!
I'm not sure if this is a feature that existed before without me noticing, but when I open the Camera, I hear a constant stream of information from VoiceOver about what is in the viewfinder. Again, this is an iPhone 8, so it doesn't have the new VoiceOver Recognition features for screen objects that iOS 14 brings to later phones. I was extremely impressed with how fast and accurate the object recognition in the camera view was. If technology didn't tend to fail whenever it was needed most, I would suggest using this VoiceOver camera feature to find things in big rooms. It didn't try to read any text; I suppose that would not be particularly normal or useful in a camera app. But no tool I've tried comes anywhere near the speed and accuracy of this object recognition, and it's live.
I figured out how to do it. It is very limited right now, but hopefully it will expand and get even better! Thank you so much for all your help!
VoiceOver saying "space" for every space typed in Braille Screen Input is unnecessary and disturbing. Setting the typing echo to words only is also risky, as the keyboard sometimes needs recalibration and one won't know one is inputting wrong characters. So any other workaround for this would be great!
I, too, set the Back Tap up for the accessibility shortcut, which is set to toggle VoiceOver.
It turns VoiceOver off, but not on. After rebooting my phone, it will turn it on once. It would be really good if this were fixed, as it is far more convenient than a triple tap of the power button.
VoiceOver is no longer reading my notifications, be it Twitter, News, or Messages. The phone is not muted, but when the phone is locked, nothing. I do get them when it is unlocked. It worked great the first day with no issues; now, nothing. All settings are set to show notifications on the lock screen. Is anyone else having this issue?
I am finding that the Screen Recognition feature can mess me up when I don't need it. For instance, I had it turned on along with YouTube TV and couldn't get it to recognize anything. The same thing happened in the Messages app: the cursor jumped all over the place and I couldn't figure out where I was. I'm sure this feature will come in handy, but if things start to go quirky all over the place, try turning off Screen Recognition to see if it clears up.
"Barking at my phone": that has to be the funniest scene I've ever imagined.
I've checked all of my notification settings, and everything is as it should be. I depend on this for calendar notifications before meetings and other activities while I'm on the phone.
One thing I've noticed about iOS 14 that I really dislike is that VoiceOver does not read the caller ID information when a call comes in, nor do notifications about phone calls and voicemails have any information about the call; it just says there was a notification from the Phone app. I believe iOS 13 also had this problem. It makes me wish I were still running iOS 12.
OK, after some more playing around, I have determined that VoiceOver is reading the caller ID information as expected. The downside is that this renders the new "silence unknown calls" feature useless, because VoiceOver reads the caller ID information regardless of whether you have that feature turned on or off.
I have also noticed that VoiceOver reads a fair number of excess strings it shouldn't. For example, when I select the "Call Aira" button on the home screen of the Aira app, besides telling me about the button, VoiceOver also tells me about a document called "Animals."
Since it has been released, I can tell you that none of these issues has been fixed in the iOS 14.0.1 update.