What's New In iOS 13 Accessibility For Individuals Who Are Blind or Deaf-Blind
It's September, and that means a lot of things. For users of iOS devices, it's time for a new major iOS update.
Just like other releases, this latest build brings a lot of new features and enhancements to supported iOS devices. Major changes include a system-wide Dark Mode, the QuickPath keyboard, support for thumb drives and other USB external storage devices, call blocking for spam callers, and much more. Many blogs will be highlighting these enhancements, so I will not discuss them in great detail here. This article deals with enhancements pertaining to accessibility: specifically, those changes which impact individuals who are blind or deaf-blind.
Those of you who pay attention to such things may have realized that I did not include low vision in the title of this article. This, sadly, was not an error; rather, I could not get what I felt was sufficient feedback from low vision users to comment on the changes related to low vision. As such, it is my hope that someone from the low vision community can cover these features in more detail.
One of the joys and curses of getting a new release from Apple is that they do not actively document the changes in accessibility with their products. This is good for me, because it gives me the chance to share new features with my readers, but it is also a challenge. While I have taken care to work extensively with iOS 13 since the first beta release in June, there will be things that I have missed. I'm confident this will also be the case with other people attempting to do the same thing. This is also part of the fun. Whenever I discover a new feature that was not previously written about, it's almost like solving a puzzle or getting an early birthday present.
Check Before You Update!
Unfortunately, along with all of the new features discussed below, there are some bugs that users should be aware of before deciding to upgrade; this AppleVis post lists many of the bugs that have been found in the final release of iOS 13.0. While these bugs are present, Apple is working on iOS 13.1 which will hopefully resolve many of the issues found in 13.0.
In General, Apple Has Removed Accessibility
That heading may have you wondering whether this will be a very short blog post, but never fear: I'm just as long-winded as in previous years. If you scroll through the Settings app, you will see that Accessibility has been moved out of the General menu and up to the top level of Settings.
Another iOS Update, Another New Siri Voice
iOS 13 brings a new Siri voice to your iOS device. This voice is only available as the female Siri voice. It adds more inflection, giving the speech a more natural quality, but otherwise sounds similar to the iOS 12 version. At this time, the new Siri voice is only available in U.S. English.
Dictate From Anywhere
There is now a “dictate” button beside almost all search boxes. Selecting it allows you to dictate whatever you would like to search for within that application.
Speak To Me
Another new feature which may benefit some users is the new Voice Control feature. By default, it is turned off. You can either tell Siri to open settings for Voice Control, or navigate to Settings > Accessibility > Voice Control and begin setting it up.
If you choose to turn Voice Control on, you will first need to be on Wi-Fi to download 250MB of data so that Voice Control can work whether or not your iOS device is connected to the internet.
After the download has completed, moving to the right one option will give an explanation of what Voice Control is. Continuing to the right, you will find the selected language. Further along in this menu, you will encounter several buttons: Customize Commands, Command Feedback, Vocabulary, Show Confirmation, Play Sound, Show Hints, and Overlay, along with an option to help Apple improve Voice Control. A brief explanation of each option follows.
The Customize Commands option not only allows you to customize what will cause Voice Control to perform a specific function, but also lets you create commands of your own. Selecting each option from this menu shows a full list of the available commands. Within each command, you will also find the ability to turn that specific command off if you wish.
Continuing down the Voice Control menu, you will find an option called “vocabulary”. This is a section where you can type in a word or phrase and have it be recognized by Voice Control.
Show Confirmation and Play Sound are features which will show or play a sound to confirm that Voice Control has heard your command.
Hints, the next option, come in handy when you are first starting to use Voice Control. For example, I wanted to select the Mail app with VoiceOver enabled. I said “select Mail,” and was instructed to say “VoiceOver, select Mail.” Following this suggestion selected the Mail app as requested.
Overlay is an option which can speed up your voice interaction, depending on your use case. You can choose to display numbers or names over the contents of your screen. With VoiceOver enabled and the numbers overlay active, each element on your screen is announced along with its associated number. Instead of saying “select 23,” I could then just say “23,” and this would select the element designated by that number. The same is true with the names overlay. The Grid Numbers overlay is also an option, which seems to behave the same way with VoiceOver. This menu also allows you to control how long the overlay remains on screen through a feature called “Automatic Dimming,” and there is an option to control the opacity of the overlay.
For VoiceOver users, I have found Voice Control to work quite well; it somehow almost always knows when I'm talking to it and not to something or someone else… or, even worse, to myself. I have also found that VoiceOver speech through the internal speaker doesn't appear to interfere with Voice Control very often. That said, turning speech off completely, which I almost always do anyway, makes it work even better, with almost zero false positives when I'm using braille support.
For more detailed examples of how to use Voice Control as a VoiceOver user, I recommend checking out the AppleVis podcast done by Thomas Domville.
Sounds and Haptics
One of the first things you may notice upon successful installation of iOS 13 is that your iPhone will start clicking and giving you other tactile feedback. Another thing you may immediately notice is that there are new VoiceOver sounds.
If the sounds or haptics are not something you enjoy, you can modify these settings by going to Settings > Accessibility > VoiceOver > Audio > Sounds & Haptics. In this menu, you can turn sounds and haptics on or off independently, and you can choose from one of three intensity levels for the haptic feedback. Beyond toggling these options on and off, you can also choose which individual sounds are heard and which haptics are felt. However, if you want any feedback at all, you will need to turn the master toggles on, and then turn off any individual sounds and haptics you do not wish to use.
It’s All About Context
Since Apple is dropping support for 3D Touch on its newest devices, it had to provide another way for users to access the menus that 3D Touch generated. On the Home Screen, or wherever one of the 3D Touch options existed, there is now a new rotor option called “show context menu.” Selecting it will bring up the menu as if you had performed a 3D Touch.
Scrolling Made Easier
On most screens, with VoiceOver enabled, you will now find what is called a "vertical scroll bar." This scroll bar lets you quickly move to different parts of lists, messages, books, and so on. You will typically find it on the right side of the touchscreen. When VoiceOver speaks or displays "vertical scroll bar," it also reports a page number and a percentage value. These represent how far down in the content you are; for example, at 99%, you are near the bottom of the content displayed. Flicking up and down controls how far down you are in the content. If you flick up with one finger to 50% and then flick left, you will be halfway through the list of items on that page.
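To make the percentage concrete, here is a toy sketch (my own illustration in Python, not anything Apple ships) of how a position in a list corresponds to the announced percentage:

```python
# Toy model of the vertical scroll bar's percentage read-out: the announced
# value is simply how far through the list the focused item sits.
def scroll_percentage(item: int, total: int) -> int:
    """Return the rounded percentage announced when focus is on `item` of `total`."""
    if not 1 <= item <= total:
        raise ValueError("item must be between 1 and total")
    return round(item / total * 100)
```

For example, item 50 of 100 reads as 50%, which is why flicking left from there lands you halfway through the list.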
Easier Snapping Of Pictures
When taking photos in the Camera app, VoiceOver will now give you a lot more hints to help you get the picture you want.
VoiceOver will indicate the level of the camera and also describe the location of people in the viewfinder. For example, if your camera is tilted 15 degrees to the right, it will let you know. If a person is in the bottom right of the viewfinder, this information will also be voiced. Further, VoiceOver can identify who the person in the viewfinder is, if that person is known in your photo library. Finally, iOS will also attempt to identify the main object in the scene. This is one example of machine learning and AI coming together to make a task much easier for users who are blind.
Have It Your Way
In iOS 11, Apple gave braille display users the ability to customize braille keyboard commands. This made it so that a braille display user could not only assign existing commands new keyboard shortcuts, but could also choose from a wide range of other options that may not have had a command associated with them. In iOS 13, this capability has expanded to include customization of Bluetooth keyboard commands as well as gestures for the touchscreen, braille screen input, and handwriting. Head over to Settings > Accessibility > VoiceOver > Commands to check out all that you can now customize.
For example, if you frequently navigate web pages by heading, you can set up a gesture to jump to the previous or next heading. This saves time: instead of fumbling around in the rotor to reach the headings option, you can perform a single gesture. This is a very powerful feature with numerous options; I will illustrate below how to configure a new gesture, and let you explore the rest at your leisure.
Since I discussed headings above, let's use jumping to the previous heading as an example. To assign this gesture, once in the Commands menu, I would first select "touch gestures." Then, in the search box, I would type "two-finger swipe left," which should bring up that gesture. Alternatively, scroll down to "two-finger swipe left" and double-tap. After you have selected the gesture, flick right until you encounter another search box, where you would type "previous heading." You will then be taken to the "advanced navigation" heading; flick right one more time and double-tap to assign the gesture. Alternatively, you can navigate manually to the "advanced navigation" heading and double-tap "move to previous heading." Once done, the new gesture is ready to use. If a gesture is already in use, VoiceOver will tell you what its current action is; you can either cancel the new assignment or make it anyway.
If you really mess things up by re-assigning too many gestures, there is a reset option as well. Whether you wish to add new gestures, or if you just have trouble performing certain ones such as getting back to the Home Screen, this amount of customizability can be to your benefit.
Note that Braille Screen Input and Handwriting gestures are customized the same way as the process outlined above. Those custom gestures will only be available while Handwriting or Braille Screen Input is in use.
Watch This, Please?
A new feature that you will currently only find through the Commands menu is called “Watch Item.” You will find it by going to Settings > Accessibility > VoiceOver > Commands, picking a gesture, and locating “Watch Item” under the “VoiceOver” heading.
After assigning it a gesture or command, you can then put VoiceOver focus on an item anywhere in an app and have VoiceOver notify you when that element has changed. This can come in handy while tracking a download in your web browser, ETA in a rideshare or delivery app, or many other settings. Though VoiceOver will notify you with speech of these changes, this feature is currently inaccessible to braille display users.
There are also a few new gestures that don’t currently have actions associated with them. These include double-tapping with three fingers, holding, and then flicking in any given direction. While I was able to assign commands to these newer gestures, I was not able to get them to carry out the assigned task. When I went into VoiceOver Practice mode, my iPhone 8 wouldn’t recognize these gestures either. Whether they are not functioning yet, or whether this is just something my device doesn’t support, I’m not sure.
What Does All Of This Data Mean?
In certain areas of iOS, VoiceOver can now attempt to interpret charts and other graphical data for the user.
This function is currently available in Health, Stocks, and with the battery life diagram. To use this feature, you will need to find a chart within one of these three apps. You will see a new rotor option called “data comprehension.” Flicking up will take you to a verbal description of the chart; flicking up again will provide you with an audio representation; flicking up again will provide you with a description of the data series; and one more flick up will provide you with a summary of the data.
For braille users, these messages are provided in the form of VoiceOver announcements, which can be read at your own pace by pressing space with N once the desired option is selected. Obviously, the audio interpretation is not accessible when using braille exclusively.
I Can See What You Hear
One of a few new VoiceOver options that has been ported over from the Mac Operating System is the Caption Panel. The purpose of this feature is to get a visual read-out of what specifically VoiceOver is saying. This can come in handy for someone who wishes to observe what a VoiceOver user is doing, but cannot hear or understand VoiceOver’s speech.
To turn this feature on, go to Settings > Accessibility > VoiceOver > Caption Panel and turn it on. You will find it near the very bottom of the VoiceOver menu, next to “Double-tap Timeout.” VoiceOver users will not notice a difference, but on the screen there is a bar at the top which will display speech output.
Faster Touch Typing
In June, Apple announced that there would be a new gesture-based keyboard available on iOS called “QuickPath." QuickPath is a native swipe keyboard that allows you to type by sliding your finger across the onscreen keyboard.
It is enabled by default, so you should find it available in any text field system-wide after installing iOS 13. If you wish to disable QuickPath, head over to Settings > General > Keyboard and toggle off “Slide to Type.”
The QuickPath keyboard works well with VoiceOver in most situations, but there are occasions where using traditional touch typing may be faster or more efficient. Fortunately, Apple has made switching between QuickPath and touch typing easy; all you need to do is add “Slide To Type” to the VoiceOver Rotor. To do this, go to Settings > Accessibility > VoiceOver > Rotor, and turn it on. This will now be an option in any text field within your rotor. To turn the QuickPath keyboard on or off, change your rotor setting to “Slide To Type” and then flick up or down and double-tap.
To explain how to use this keyboard, I will illustrate by example. If I’m in a text message, and wish to type “Hello,” I would first find the letter ‘h’ on the onscreen keyboard. I would then hold my finger on the letter ‘h’ until I hear a three-tone clicking noise; VoiceOver users will recognize this tone as the sound VoiceOver makes when invoking a double-tap-and-hold gesture. After hearing this tone, I would slide my finger to the approximate location of each other letter in the word. VoiceOver will speak the word it currently believes I want to type. If this is the correct word, and I have only gotten to the letter ‘e’, I can raise my finger off of the touchscreen and the word “hello” will be inserted into the text field.
Understandably, this method of typing can take some time to get used to, but it seems to be more efficient for those using the touchscreen to type. Apple allows you to configure the amount of time before it interprets what you have typed. You can find this setting by going to Settings > Accessibility > VoiceOver > Typing > Keyboard Interaction Time. The longer the duration is set, the slower VoiceOver will be to react to your moving around the screen.
Activities: The Customization Continues
The theme for VoiceOver users in iOS 13 seems to be customization. Another new feature, which Apple calls “Activities,” allows you to set up VoiceOver profiles that will permit you to quickly switch from one set of VoiceOver settings to another. They can be invoked by the opening of an app, or any time you wish through the rotor.
The options you can customize include the default VoiceOver voice, speaking rate, VoiceOver volume, amount of punctuation, whether emojis are spoken, the verbosity level of tables, general and text status cells within braille, the braille table being used, type of braille output and input, and the VoiceOver modifier keys on a Bluetooth keyboard.
As indicated above, you can specify within each Activity whether you would like it to switch on automatically depending on any number of variables. You could, for example, have an app in a language other than your default and a corresponding set of VoiceOver settings for that language: a different VoiceOver language, a different braille table, and so on.
Another simple use for this feature may be if you use VoiceOver to read books in the Apple Books app. You may wish to have a different VoiceOver voice read books to you than the default, or you may have status cells turned on for some situations but find they get in the way of your reading.
To create a custom Activity profile, go to Settings > Accessibility > VoiceOver > Activities and select “Add Activity.” From there, you can give the Activity a name, and then customize any of the options listed above to your liking. When done designing your Activity, simply move back to the main Activities screen, and it will be listed. If you specified an app, you can then launch that app and the Activity will automatically kick in to change VoiceOver’s behavior.
To manually switch to your own Activity profile, you will first need to enable Activities as a VoiceOver rotor option. Go to Settings > Accessibility > VoiceOver > Rotor and select Activities. It is located near the bottom, next to “Slide to Type.”
Customization Of Punctuation
Under the Verbosity options in the VoiceOver menu, you will find that punctuation settings have been revamped, and many options can now be customized. You will find an “Edit” button, followed by the “Active Punctuation Group” and then the system options. These will be familiar if you have ever adjusted VoiceOver’s punctuation between None, Some, and All. Moving beyond these three groups, you will find the “Add Punctuation Group” button. To begin creating your own group, activate this button and start customizing which punctuation symbols VoiceOver will read. In every new group, you can choose to do one of three things with each symbol:
- replace: this will tell VoiceOver to speak whatever text you supply when it encounters the symbol;
- ignore: this tells VoiceOver to not handle the symbol in any special way, allowing the current speech synthesizer to process it instead; or
- remove: this suppresses the symbol entirely, making VoiceOver skip over it as though it were not there.
For example, let's say you create two groups, one called "Reading" and the other "Proofreading." In your Reading group, you set parentheses, braces, brackets, and other symbols to be ignored entirely. After all, you don't want them to be spoken when you're just reading an article or email. But your Proofreading group is different. In this one, you set all those symbols to be spoken, because you need to know where they are when you're proofreading a document.
Another example is programming. Let's say the programming language you use has the exclamation point as a symbol you need to hear in your code. Anytime you encounter it, though, it means "not" rather than the usual English meaning. You could make a Programming punctuation group in which you set the exclamation point to be spoken as "not"; you might also replace the parentheses, braces, number sign, and other symbols with shorter text so you can more efficiently listen to code.
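To make the three behaviors concrete, here is a toy model (my own illustration in Python, not Apple's implementation) of how a punctuation group transforms what gets sent to the speech synthesizer; the "Programming" group below is hypothetical:

```python
# Toy model of a custom punctuation group. Each symbol maps to one of the
# three behaviors described above:
#   ("replace", text) - speak the supplied text instead of the symbol
#   "ignore"          - pass the symbol through for the synthesizer to handle
#   "remove"          - suppress the symbol entirely
def apply_punctuation_group(text, group):
    spoken = []
    for ch in text:
        action = group.get(ch, "ignore")
        if action == "ignore":
            spoken.append(ch)        # synthesizer decides what to do with it
        elif action == "remove":
            continue                 # skip the symbol entirely
        else:                        # ("replace", replacement)
            spoken.append(action[1])
    return "".join(spoken)

# A hypothetical "Programming" group: '!' is spoken as "not", and the
# parentheses are replaced with shorter words for faster listening.
programming = {
    "!": ("replace", " not "),
    "(": ("replace", " open "),
    ")": ("replace", " close "),
}
```

With this group active, a line like `if (!done)` would be voiced with "open," "not," and "close" in place of the symbols.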
When you first make a group, you can base it on one of VoiceOver's default groups: All, Some, or None. You customize the symbols from there. Switching between groups can be done on the fly, using the new Punctuation setting in the rotor. Just spin the rotor to Punctuation, and swipe up or down to choose the group you want to use. Also, if you set up a new punctuation group, you will be able to use it as part of the Activities feature discussed above.
Speaking Detected Text Expands
In iOS 13, the Speak Detected Text feature has been expanded to include both the text and the image of a button. The option is automatically active whenever VoiceOver detects an element in an application that does not have a label. If you then create a custom label, VoiceOver will recognize this and no longer attempt automatic recognition on that button.
Unlike in iOS 11 and 12, this feature is also accessible to braille users. If you wish to disable it, go to Settings > Accessibility > VoiceOver > Verbosity > Speak Detected Text and make the change from there. You can also have VoiceOver play a sound along with the detected text, or play only a sound.
Watch Your Language… Or Not
iOS 13 brings the ability to specify whether you wish VoiceOver to automatically switch to another detected language. Because some websites and email clients declare a default language, VoiceOver will sometimes switch to a voice for that language even when the content you are reading is actually in a different one. You can now choose whether this automatic switching is on or off. You will find the setting in Settings > Accessibility > VoiceOver > Speech.
iOS 13 Brings a Lot to the Table
iOS 13 brings a number of changes and enhancements for braille users. One of these is the addition of the Liblouis series of tables; this gives iOS braille users access to over 70 braille languages. These are under a new option in the braille settings called “Braille Tables.” To access this feature, navigate to Settings > Accessibility > VoiceOver > Braille > Braille Tables. After the “edit” button, you will be presented with a list of tables which have already been added to your iOS device. After that, you will see any other braille tables you have downloaded, followed by an “Add Braille Table” button.
Selecting the “Add Braille Table” button will present you with a list of supported languages. You can type the language you wish into the search box, or browse the list of available braille languages. If, for example, I wanted to add U.S. English, I would select English, and after selecting the U.S. variant, I would be returned to the list of available English tables. At the end of the list of choices under the System heading, you will find the Liblouis tables as well.
I’ve found some issues with translations in the Liblouis tables, so I typically stick with the System braille codes; these are the tables iOS has been using since braille support became available a decade ago.
How Far Along Am I?
Whenever you are in a list of options, such as in the Settings application, braille users can now easily determine how far down the list they are. For example, if there are ten options in a menu and VoiceOver focus is on the first item, the braille display will show “1/10.”
Slow Those Commands Down!
Another new option available for braille users is called “Key Debounce,” which you will also find in the braille settings menu. This setting controls how quickly a key press is registered as a command after a letter is entered. If, for example, you have difficulty with pressing the spacebar too quickly, you can lengthen the delay so that pressing the spacebar registers as a space instead of a command. The duration ranges from 0 to 1 second.
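As a rough sketch of the idea (my own toy model in Python, not Apple's code), a key press arriving within the debounce window after the previous keystroke is treated as ordinary input rather than as a command:

```python
# Toy model of Key Debounce: a press that follows the previous keystroke
# within the configured window registers as text (e.g. a space), while a
# press after the window has elapsed is eligible to be a command.
class KeyDebouncer:
    def __init__(self, window: float):
        if not 0 <= window <= 1:
            raise ValueError("the iOS setting ranges from 0 to 1 second")
        self.window = window
        self.last_key_time = float("-inf")

    def classify(self, time: float) -> str:
        elapsed = time - self.last_key_time
        self.last_key_time = time
        return "text" if elapsed < self.window else "command-eligible"
```

With a 0.5-second window, a spacebar press 0.2 seconds after typing a letter registers as a space, while one arriving a full second later could be read as a command chord.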
Apple continues to make changes and enhancements to its mobile operating system for everyone. Their work toward inclusive design continues to keep them ahead of many other platforms in terms of built-in accessibility options; certainly, the enhancements in iOS 13 prove this trend continues. Just like previous iOS releases, whether you should upgrade or not depends on whether the bugs present in the new release will impact you on a greater level than you can tolerate—and whether you feel the new features are worth the upgrade. If possible, it may be best to try out iOS 13 on another device before installing it on your own.
To download the update over the air, go to Settings > General > Software Update, and follow the prompts on screen. Alternatively, you can update your device through iTunes.
If you want to read about the general changes and new features coming with iOS and iPadOS 13, MacStories has you covered with what's become its annual ‘must read’ review by Federico Viticci.
Thank you for this post.
It is amazingly written and informative.
Also, I had one question:
Has Apple made any improvements to its image-describing feature, in which we tap once with three fingers to have the image described with VO?
I love this post
Thank you Scott for the excellent blog post! I just have one question, does this mean that the new American English Siri female voice can only be used with Siri, and voiceover only has the earlier version of the voice? I’m still a bit confused here. Thank you very much!
I was curious about these, and could not get them to work myself either during the entire beta process on an iPhone 8. Either they are iPad-specific, in which case I don't know why they are showing up on iPhones, or they just don't work.
Something weird I noticed is that in the list of gestures to be customised, there's a 2 finger single tap and hold, which is dimmed and is not listed as mapped to anything. Wonder if that's a bug on Apple's side. Additionally, 2 finger double tap and hold is not listed at all, and if you perform that gesture, it plays the familiar sequence of sounds but no longer appears to be mapped to the label item command since it does not seem to do anything. That being said, if you assign another gesture to the label item command it works just fine. I also really like the watch item command, it can be useful in many situations. Additionally, something worth listing is that if you have multiple braille tables added, you can switch between them using the rotor in a similar way to switching languages. You can also switch between them using a gesture while in BSI. I'm not sure though how you set one up to be the default table, meaning it will continue to be used even when you restart VO.
You have done it again! I found this post extraordinary and learned quite a bit from your effort. Although I have experimented with the Beta, I came nowhere close to the knowledge you have just shared. The amount of time you poured into this document will greatly benefit me and many others. Fantastic!
Thank you sir!
Scott, thank you for this excellent post. I originally thought iOS 13 was going to be very problematic for my sister, but then I remembered the new Voice Control. She has had trouble finding things on her iPhone, and I think VC will be a Godsend for her. It will for me too, but my hand coordination is slightly better than hers. I love her dearly, but this is just how it is. The other features of iOS 13 will be great too, and I cannot wait to get it.
Scott, thank you for this information. It is clear that Apple has spent significant time and resources on VoiceOver for iOS this year, which is fantastic. If only the Macintosh received this level of attention. The only thing I would have liked to see this year is an automatic scroll feature for Braille displays. It could work using a timer that would pan the display backward or forward after a user defined set of seconds. Hopefully this will come next year.
Is it true that Apple has finally fixed contracted braille input? I heard that there weren't supposed to be any more weird translation issues, especially if you are typing quickly. From my initial tests in the macOS beta a few months ago, this appears to be true. Has this carried over to iOS as well?
iOS 13 is going to be a major update, and I am definitely upgrading tomorrow! I want SMB file share support and these awesome VoiceOver features! None of the bugs seem to be dealbreakers, so I can live with them until the 30th.
I'm blown away by the number of new accessibility features and the customization now possible with VoiceOver. Has anyone checked to see whether these settings are backed up in the iCloud backup? I'd hate to spend hours configuring all of this to my liking only to have to do it again when iOS is updated, or when I purchase a new device and want to transfer my configuration over.
Thank you so very much for this outstanding article. It's going to be so very helpful when I update my iPhone to iOS 13.
Didn't check it, but VoiceOver settings were always backed up along with all the others before iOS 13, so this should be the case.
In order for QuickPath to be enabled on an iPad, you will need to enable the floating keyboard at this time. I must say, sadly, at least for me, turning off the floating keyboard and going back to the normal full keyboard may be difficult for some. Unless there is a way to do it via Voice Control?
Thank you for an excellent writeup, Scott!
As always thank you for this post Scott!
As always, I've put it in my favorites and will re-read it after I get iOS 13. It'll probably come later for me since I have the iPad mini 4, but hopefully the bugs get squashed, or progress is made towards squashing them.
I do have 1 question for you though.
I saw an article that said VoiceOver users will get the ability to have VO speak from their device's speakers instead of a Bluetooth speaker. How can we turn this setting on, if possible? That's something I have always wanted to be able to do: have whatever audio I'm playing, like a book, play from my Echo via Bluetooth, but have VO speak to me from the iPad's speakers. Thanks again Scott!
The thing I think they should implement in VO is the ability to search for a word on the current screen, and therefore also on a web page, exactly as is done with JAWS; but Apple has not yet made this possible.
Besides VO+F for the normal find command, ya can now press VO+G for going to the next occurrence, and VO+Shift-G to go back to the previous occurrence.
Otherwise, ya can use the “Item Chooser” of course.
Just installed iOS 13 a bit ago. I'm currently short on time, but love it so far. I'm currently using the new Indian-English male voice for VoiceOver. I did pass by some of the new settings, but haven't played around with them yet. My phone prompted me to set up Apple Pay, which I did not do only because I don't use it at least yet. But I noticed some extra sound cues when on that screen. I will have more time this weekend to play around with this stuff, but I can already tell things have been enhanced.
What is the title of the podcast on how to set up VoiceOver gesture customisations? I cannot find it.
One nice thing, if you do opt to try Voice Control, is that it includes several VoiceOver commands for activation, navigation, and reading. Also, the correction process for fixing text dictation errors with your voice works fairly nicely with VoiceOver.
I just see Indian English male. When I double-tap it, nothing happens. Very odd.
Holger, this is only for Bella to listen to. The link is:
Bella uses Voice Control. I got it. I could not find the podcast. She is watching me from somewhere.
Ah, OK, I didn't know about being able to find a text string with VO on the Mac. But can it also be done on the iPhone?
After upgrading to iOS 13, I've noticed one major new feature that wasn't listed here. It appears you can assign VoiceOver gestures and/or Braille/keyboard commands to Siri shortcuts. This is extremely powerful, as it lets you quickly execute useful shortcuts on the fly. Now I'll have to look at creating shortcuts I can execute with a single gesture or keyboard command. This is beyond awesome! iOS 13 is possibly the best version of iOS I've ever used for a variety of reasons.
Thank you, Scott, for the great article. You are awesome. Everybody have a great day.
Thank you very much, Scott!
Hi. I can't use the new haptic feedback option on my iPhone 7 plus. Is this feature only available for newer devices, maybe since iPhone 8? If so, I can't understand the restriction because the iPhone 7 plus has the system haptics (settings > sounds and haptics) too.
Whatever I do on my phone, he says dollar dollar dollars. Apple! Thank you for giving me a lot of dollars on my iPhone. *smile*
Great post, thanks. You must’ve taken a lot of time to put it together. My respect.
I haven’t figured out how to control my device by voice. I hope someone can do a podcast.
There is a podcast on this site.
Hi. I was having a problem with the Voice Memos app: the option to send a voice message via the Messages app is gone. So I did a reset of the phone, and now I do not have VoiceOver; I will need to get sighted help to turn it back on. Is this normal for iOS 13? Also, iTunes will not let me do a restore because Find My iPhone was on.
Another change in iOS 13 appears when you do a screen recording. The audio from VoiceOver is now included in the recording. If you are running with screen curtain on, be sure to turn it off before you start recording or you will have nothing to show in the video.
Sticks, just use Siri to turn on VoiceOver.
I notice that VO sometimes does not work in notifications. When I slide right or left, nothing happens. Also, whenever I get a notification, VoiceOver reads the time at the end. This did not happen in the beginning, but now it is consistent. I hope iOS 13.1 fixes this.
I cannot do that because Siri is not working either, so I will get some sighted help today to turn VoiceOver back on. The shortcut of triple-pressing the home key will not load speech either. iOS 13 has really messed up my accessibility.
To add to post 30, it also records in stereo.
I tested it while playing Feer and you can hear yourself moving from lane to lane.
To the person having trouble turning on VoiceOver: I was going to suggest turning it on using iTunes, but it seems they broke that as well.
I just tried it, and while it goes through the motions, the checkbox is checked indicating that VoiceOver is on, but when you tab to the OK button, nothing happens.
Hopefully that gets fixed as well, because if neither Siri nor the accessibility shortcut will turn on VoiceOver, then you're stuck until a sighted person can do it for you.
I had a time when nothing I did seemed to turn on VoiceOver. I did a forced reboot of my phone and still had no speech. It turned out that somehow the system volume had been set to zero. Once a sighted person looked at the controls, we just turned the volume back up and everything worked. Just something to consider.
When I get a call with the EarPods connected, and I hang up the call with the button on Apple's wired headphones, sometimes VoiceOver stops talking.
The only way to solve it is to unplug the headphones, then receive a call and answer it with a two-finger double-tap.
Another way is to force the iPhone to restart.
This only happens when I receive calls; I have had no problems so far when I make a call.
iPhone XR and iPhone 11 with iOS 13.1.