I'm back yet again. Just like in years past, September brings us a new major release of iOS. This latest edition includes many mainstream changes, such as a revamped Lock Screen; enhancements to privacy and safety features; Focus mode enhancements; new functionality in Messages and Mail; and many other improvements. For Apple's official list of iOS 16 features and changes, see this webpage, which highlights many of them. A lot of articles will cover these changes in detail, but far fewer will cover the accessibility enhancements and features for individuals who are blind or DeafBlind.
Noteworthy Mainstream Changes
Hello Automatic Verification, Goodbye Captcha!
CAPTCHAs may soon become a thing of the past thanks to a new feature in iOS 16 called Automatic Verification. By allowing iCloud to automatically and privately verify your device and account, it may soon be possible for users to bypass CAPTCHAs. For more detailed information on how Automatic Verification works, please see this TechCrunch article.
This capability will need to be further developed, so it could take some time before we see it working in practice, but it's a wonderful idea which will hopefully gain the support of developers. Though this feature was enabled by default for me, you can ensure it is set up for you by heading over to Settings > Apple ID > Password & Security > Automatic Verification and setting it to "on".
Each time I cover what's new in iOS, Siri gets a mention, and this article is no exception. This time around, Siri gains the ability to carry out new tasks: you can shut down your phone, restart it, hang up a call, and more. With "Hey Siri" enabled, simply tell it to carry out any of these tasks.
If you are someone who prefers to collect your thoughts before dictating, you now also have the ability to control the amount of time Siri will wait for a response. You can adjust this by going to Settings > Accessibility > Siri and choosing among the 3 options. The default setting is what we have always had, but there are now options for "Longer" and "Longest".
Finally, there are new sounds for when Siri is listening and when it stops. These new sounds are much lower pitched and may be easier for someone with a high frequency hearing loss to detect.
With iOS 16, it is mostly no longer necessary to speak the punctuation needed to formulate a proper sentence. Even when I spoke with no inflection in my voice, the dictation was still able to correctly insert punctuation marks in many instances. That said, I've also had random times where no punctuation shows up at all, so it is always best to verify what you are sending before doing so.
If you want to add or delete punctuation, though, you can edit on the fly as the onscreen keyboard remains available during dictation. It's also possible to dictate emojis. For example, saying "smiling face with smiling eyes emoji" would produce the correct emoji.
I have found that punctuation and emoji insertion are about 80% accurate on my iPhone SE 3, but a bit higher on an iPhone 12 Pro Max. It's worth noting that the automatic insertion of emojis and punctuation is only available on the iPhone XS and later, which does include the SE 2, SE 3, and XR.
There Are More Voices Inside My Phone!
iOS 16 brings a large number of new speech options to those who use them for VoiceOver, Speak Screen, or Speak Selection. According to the above-linked Apple page, speech options are now available in over 20 additional languages and locales, including Bangla (India), Bulgarian, Catalan, Ukrainian, and Vietnamese.
These additions aren't only new voices; newer versions of various software synthesizers are also included, such as a higher sampling rate for the voice called Tom, a new premium version of Ava, and new voices such as Nathan, Joelle, Noelle, and others. Explore these and other new options by going to Settings > Accessibility > VoiceOver > Speech > Voice.
Read Eloquently With Reed
You can listen to your iOS device with any other variant of Eloquence you would like as well. In English, this includes both the U.S. and U.K. variants. Many prefer this speech synthesizer to others, as it is the default with JAWS and is very responsive. Its comparatively robotic and more predictable nature can also help those who are hard of hearing, as it can sometimes be difficult to understand the more modern concatenative synthesizers at faster rates.
Much like food, music, books or many other things in life, this is highly subjective. With iOS 16, your choice of voices and languages is larger than it ever has been. Even the Novelty voices macOS users are familiar with are now available.
A complete list of U.S. voices and audio demonstrations can be found in this podcast by Thomas Domville.
There is a second podcast which demonstrates and lists the English voices for those outside the United States.
Less Is Sometimes Better
When iOS 15 was released, it brought more Verbosity options to both speech and braille users, including how to convey the availability of the Actions rotor. The challenge was that these messages were either always on or always off. Always on caused issues in the Mail app, since the Actions rotor is almost always available there, but it was very helpful for learning about Actions rotor availability in unfamiliar contexts. In iOS 16, there is a new setting that, when enabled, only displays or speaks this information once until it changes. Those using Mail and similar apps will no longer find themselves regularly disrupted by these repeated messages. Find it under Settings > Accessibility > VoiceOver > Verbosity > Actions; "First Item Only" is the only new option.
Time To Start
A new feature for Apple Maps is sound and haptic feedback for VoiceOver users to identify the starting point for walking directions. As I understand it, this feature is automatically turned on whenever VoiceOver is running.
The Magnifier app does more than magnification. One of its new capabilities is detection. After opening the Magnifier app, on the lower right side of the screen you will find a button called "Detection". Activating this mode reveals a new screen with 3 buttons on it: People Detection, Door Detection, and Image Description. Any of these 3 buttons can be activated, and your iOS device will then find and report what it thinks it has detected.
Walking around my neighborhood with all 3 buttons activated, the amount of information was overwhelming. This is partially because there were many businesses on each block, people going about their afternoon, and lots of advertising and information to detect. Turning only Door Detection on seemed to slow things down some.
While we are on the topic of slow, that is exactly the speed you will need to travel for the LiDAR and other hardware to give you accurate information. This makes sense: when identifying objects whose distance is measured in feet, 1 or 2 steps can change the entire perspective. However, if you are looking for a door or a sign, you are probably going to move at a slower pace anyway. My issue was that the information was without context unless I already knew someone or something was there. For example, I was often informed of the presence of others, but had no information about where they were in relation to me. I was also able to read address numbers, which was nice, but there were often 2 or 3 doors detected at the same time, so figuring out exactly what was being reported took work.
There is also the issue of not having any free hands when conducting this exercise. I was walking with my cane in one hand and the phone in the other. The phone has to face outward, which could be problematic if someone bumps into you, or thinks they would like a new iPhone and tries to steal it. A totally DeafBlind person would most likely have to secure their phone via a lanyard or some other method, as they would have the phone, a mobility aid, and a braille display to juggle when utilizing this technology.
Navigating indoors with this new set of tools, I found they worked much better and were able to provide me with useful information. For example, when I stopped in front of a door in a random office building, the iPhone was able to identify the room. It also read me the name of the person who probably thought I was a weirdo for hanging outside their door.
As is the case with other kinds of machine learning, I feel that we are approaching the time when these tools will provide extremely valuable information. It is my view, no pun intended, that such technology would function far better in the form of glasses or some other head-worn device. Detection in the Magnifier app is only available on the Pro and Pro Max versions of the iPhone 12, 13, and 14.
iOS 13 brought Activities to VoiceOver, and iOS 16 extends that same type of contextual flexibility to the Magnifier. You can save your preferred Magnifier controls, including camera, brightness, contrast, filters, and more, as an Activity, and quickly switch between them. This makes switching between optimized configurations for recurring tasks and situations, like a restaurant menu versus an airport flight board, more efficient.
New in iOS 16 is a Live Caption feature which not only provides captions for apps like FaceTime, but is also available system-wide. You can enable the feature by going to Settings > Accessibility > Live Captions (Beta).
You can control the appearance of the captioning panel: options are available to make the text bold and to adjust the text size, text color, background color, and idle opacity.
For braille users, the Live Captions feature is useful, though a bit frustrating to navigate. When it is turned on with system-wide captions, you can typically find its icon near the lower right corner of your Home Screen by pressing space with dots 4-5-6 and then space with dot 4 to get to the icon. However, sometimes the Live Captions icon isn't there.
For those wishing to make regular use of this feature, I recommend setting up a keyboard command to jump directly to the Live Captions panel from anywhere. The command that needs to be customized is "move to next app". To set up a customized command for the Live Captions feature, follow the steps below:
- Head over to Settings > Accessibility > VoiceOver > Braille > More Info (under the device you have connected), and then Commands.
- Activate the Device button.
- Find "move to next app" among the list of choices.
- Scroll to "Assign New Braille Keys".
- Have a command in mind so that you can immediately press that keyboard combination; for example, dot 8 with dots 2-6. If the combination you have chosen doesn't already have something assigned to it, you will be done at this point. If the combination does already have a command associated with it, you will get an alert telling you what the already-assigned action is and asking if you wish to change it.
- Choose "OK" to reassign the command, or "Cancel" to keep the existing assignment.
Once the command is assigned, you can jump to the captions panel by pressing it. In the panel, in addition to the captions, you will encounter a few controls. If no captions are available, you will find "Listening...". After that is a "Pause" button, followed by a microphone button and a "Minimize" button. Turning on the microphone enables live captioning of the environment around you. If the microphone is not selected, you will receive captions of any audio other than VoiceOver playing in other apps. For example, I pulled up a stream of WCBS AM in the OOTunes app and received captions of what the host was saying.

Frustratingly, the arrival of new text returns focus to the beginning of the line and disrupts the flow of reading. This happens even when using Live Captions with those around you, and it may prove to be a nuisance for even fast braille readers.

When in a FaceTime call with VoiceOver running, you can go to the "More Info" button and turn on Live Captions for FaceTime calls. On my iPhone SE, I was only able to get Live Captions to work when making a video call, not with FaceTime Audio. Even standard phone calls are supported, which means Live Captions could eventually replace apps such as ClearCaptions, which places new captions on separate lines so that the text in focus does not change each time new text arrives. The advantages of Live Captions over a third-party app are that a separate phone number is not required and that there is no complex identity verification, which FCC regulations require for relay and Internet Protocol Captioned Telephone Service.
The captions themselves are about as reliable as dictation, meaning that they sometimes come out well and other times, not so much. I called a friend in South Carolina who has a very thick Southern accent, and it did not do so well with his voice. However, when calling my brother who lives in Michigan, it was much more accurate. At launch, Live Captions is only available in the U.S. and Canada, and it requires an iPhone 11 or later.
Your Sounds Identified
With the release of iOS 14, Apple introduced Sound Recognition. A few new sounds it could identify were featured in iOS 15, but now you can teach your device what to listen for.
To set this feature up, go to Settings > Accessibility > Sound Recognition and choose whether you would like to customize an alarm or an appliance. After selecting "Custom", you will be walked through a setup process that first has you assign a name to the sound and then play it 5 times. I had some issues getting all 5 attempts to register, but after playing the sound 7 or 8 times, it seemed to work.
Another new feature in iOS 16 is the ability to choose a sound and a custom vibration for each alert. So, if you have an urgent notification like a smoke alarm, you may wish to use one of the more persistent vibration patterns. For the few sounds I have active, I recorded my own vibration in the form of a 3-character Morse code pattern so that the alert would be immediately recognizable.
Performance seems to have improved, and appears especially reliable for custom sounds. I found, for example, that the doorbell alert was not too reliable in previous versions of iOS. Now that I can record my own, the reliability has improved, with no false positives. I still find, though, that the sound in question has to be quite loud in order for Sound Recognition to alert me. The noise threshold with customized sound recordings also seems to be a bit lower than with the standard sounds available from Apple.
Am I ready to throw out my smoke detector and other alarms? Not at all, as my default alerting system feels safer and more reliable.
I've Been Notified
For users of AirPods and Beats headphones, it has been possible to have notifications delivered directly to those devices via Siri when announced. iOS 16 brings this ability to users of MFi hearing aids.
In iOS 15, users were given the ability to have Headphone Accommodations compensate for any type of hearing loss. The user would complete a short hearing screening that would then be taken into account when using compatible headphones. iOS 16 expands this by allowing the generated audiogram to be imported into the Health app.
Hanging It Up
By default, it is now possible to end calls by pressing the Side button on your iPhone; however, I'm not able to get it to work on my iPhone SE. I tried leaving a Bluetooth audio device connected, disconnecting the Bluetooth audio and holding the phone to my ear, and using the feature with VoiceOver turned off. Others have had success with this feature, but I still can't seem to hang up on people reliably. It is possible to turn this setting off if you wish; it can be found at Settings > Accessibility > Touch > Prevent Lock to End Call.
Apple continues to innovate and bring new accessibility features to users who need them. Having new features that I can specifically use as a VoiceOver and braille user each year makes me feel included in the process. Though there are certainly bugs in the iOS 16 release, I have chosen to upgrade my main device, since I'm able to work around the bugs present. Many of the bugs I have found are minor in nature. While this is true for my circumstance, it may not be for yours. I would recommend checking out iOS 16 on another device before installing it yourself, especially as a low vision user; Apple has indicated, for example, in the article linked in the first paragraph, that there are many new designs. Further, I would recommend checking out the list of bugs on the AppleVis website. The original post and comments often provide valuable information which may help in deciding whether iOS 16 is right for you.

If so, you can get it on the iPhone 8 and later. Sorry, iPod touch lovers: iOS 15.6.1 was the end of the line for the iPod touch 7. You can download and install the update by going to Settings > General > Software Update. Remember, though, that you may only be able to go back to 15.6.1 for a very limited time, and that the process of doing so is far from simple.
UK English Premium voices
At least for me, Ava Premium sounds as artificial as the enhanced version, just in another way. However, the new UK English premium voices (there are two of them, one male and one female) sound really great. They come very close to matching the best online voices available from Amazon and Microsoft, which would cost you hundreds of dollars per year with heavier usage... I would say they are above any other voice currently available offline, including Acapela.
For those who are wondering…
For those who are wondering how responsive VoiceOver is on older devices, such as the iPhone 8, which is now my testing device since I've retired it as my primary: I didn't notice any lag at all, which honestly surprised me, given that this is probably the last year this phone will be supported. When I tested it, I set it up as a new device, so your mileage may vary.
Very happy about some upcoming features
Thank you for posting this article. I can't wait to get home from work tomorrow so that I can update my iToys! Looking forward to the new voices, of course: most notably Evan and Nathan. I'm very happy to hear that the actions available message will go away in Mail. That has driven me bonkers for ages now. I'm also optimistic about the possibility of ending a call with the lock button. I seem to remember that being standard with my iPhone 5S, but somewhere along the line, that feature either went away or stopped working. So, if it's come back now, and if it actually works on my iPhone 13 Pro, that will make me a very happy camper. I am curious, though: have they fixed any of the weird pronunciation with the Siri voices, especially what is currently known as Voice 4? A few updates ago, they did something to change the way she says the word "button", and I know it's a little thing, but it was just too weird for me to deal with, so of course, back to Alex I went. Anyway, it sounds like this is going to be a great update, and I hope I won't encounter too many quirky frustrations.
the lock button for ending calls
Thank God the lock button for ending calls is an option, because I work with headphones and I cannot afford to hang up on clients; I cannot get them back, and I depend on how long the call lasts. The longer it lasts, the more I get paid.
In the past, if I did not want a call and wanted to send it to voicemail, I pressed the power button to do so, and then checked whether they left a message. Now how do we send a call to voicemail?
I opened my Magnifier (I'm running the iOS 16 beta on an iPhone 13 Pro), and I did not see the Detection button. Is there some other setting I have to enable first?
As for sending calls to…
As for sending calls to voicemail, I would assume swiping to the Decline button would do the same.
Did you activate the Start Detection button? That's the one on the right side of the screen.
Start Detection button missing
The subject says it all. I do not find such a button in my Magnifier. Do you have to find it via explore by touch? I sure did not find it by flicking through the options.
I can't find how you edit and unsend text messages or emails.
I also can't update any of my apps, because when I hit the Agree button I get an error message. I can only find one page out of 15, and there are some unlabeled buttons.
I don't think this feature is in Mail, only text messages. You can send an email later, but I don't think you can unsend them.
Unsending messages is a rotor option. All you need to do is put focus on the message you wish to unsend. The option is called "Undo Send". For editing, do a 1-finger triple tap to launch the context menu and select Edit. Note that you only have 15 minutes after a message is sent to edit or unsend it. Good luck!
Unsend in Mail
You can unsend in Mail. From what I’ve seen though, it’s only available for a few seconds after sending it.
After hitting Send on the email, tap near the bottom of the screen where the “Last updated” status normally is. There, you will find the Unsend button, but it doesn’t hang around for long.
Warning: Do Not Sample the Jester Voice Without Sighted Assistance
After installing iOS 16, I was sampling all the new voices that we now have. All was going nicely until I sampled one of the novelty voices called Jester. This is an evil-sounding laughing voice. The voice is strange, but that's not the issue. As soon as I sampled it, I could do absolutely nothing with my device, because no matter what I touched, it would only play that strange laughter.
I tried some common-sense things, like hitting the Back button in hopes of being able to select another voice, but this failed miserably. I consider myself to be an excellent VO user, but I could do nothing until I phoned a friend with my primary device, and she had to come and disable the Jester voice.
Please, do not touch this voice unless you have someone right there to fix the issue that I faced.
Search Button Now Missing From Today View Screen
Does anyone know how to make the search button, normally located in the upper left-hand corner of the Today View screen, become available again? After updating to iOS 16, it's simply not there anymore. I utilize this method heavily, and I hope that there is a way to restore this feature.
Any assistance is much appreciated.
Not experiencing this
But yes, I'd still keep my hands off of it; it sounds evil. I tried it and it worked just fine, but I'm not trying it again, because that laughter is just too creepy!
I had also played around…
I had also played around more with the novelty voices just for fun. Below are my thoughts:
Albert doesn't sound like a frog. I think that's pretty obvious. He sounds like he's struggling to breathe or has a really bad cold.
Bad News was exactly that. Meaning, I didn't like him very much.
Bells: good. I could actually tell what it was saying, but maybe that was just because I had my braille display providing translation; otherwise I might not have been able to exit any of those novelty voices, because you have to try really hard to understand what they're saying.
I could also barely tell what Bubbles was saying, although once I got used to it, it was okay. It just sounded like it was whispering and/or underwater. A good combo.
Cello didn't really sound like a cello; I liked its sample, but I didn't like it in action.
Good News was the best of the novelties. While not good exactly (I use Nicky Enhanced as my default voice and I think that's the best one), it was the most human-sounding novelty, other than Superstar, which actually sounds human and not like any of the others. That said, since it (and all the musical voices) always starts on the same note, if you're flicking through stuff really fast it definitely sounds robotic. Not Eloquence or Fred robotic; more like those cartoon robots that only talk on one single note and more often than not sound very tinny, if you know what I mean, haha.
Jester: yeah, as I said in my previous comment, he is creepy, and I can see why people have issues with him.
Organ is like Good News, but less coherent, and like most of them, I wouldn't want to use it without a braille display translating.
Trinoids and Zarvox sound sort of the same to me, and they sound like I could give them a shot with VO, but probably not.
Wobble also sounds like he has a cold, and I wouldn't want to use him with VoiceOver.
RE: I had also played around…
Personally, I think the novelty voices are nothing but a waste of memory space. Unfortunately, I don't think we can delete them, as we can the enhanced voices. I feel that Apple just added a bunch of unnecessary, childish crap in there. Why? I can't figure that one out. After all, who, in normal daily usage of these devices, would ever choose those novelty voices in the first place? I want my precious memory space back. I'm not trying to disrespect anyone who may truly enjoy the novelty voices, just expressing my own thoughts about them.
Now, with my rant over, to my surprise, I think there are a couple of new candidates for me, including my beloved JAWS/Reed voice, Nathan, and Evan. I have yet to test drive these in practical application, though; I have only sampled them. Although many will disagree with me, despite all the VO improvements, I have not yet been able to change my VoiceOver voice away from the compact (not enhanced) Samantha. I use my devices at a high rate of speech, 85%, for my daily usage, and all of the other voices, including the Siri voices, simply break apart and become disjointed when I try to use them in my normal workflow. It will be interesting to see how these new voices turn out. I have been wanting the JAWS/Reed voice for a long time, but we shall see.
Thanks for reading my long post.
Yes, when I went to the App Store, I got an error message when updating the apps. After several hours it went away and I was able to update.
Siri VoiceOver bug?
There used to be a bug where, if you had any of the Siri voices as your VoiceOver voice and you restarted the phone, you wouldn't have speech until you entered your passcode.
Luckily I use Fred so I don't run into that problem.
Has that bug been squashed?
You either had to restore from a backup or get someone you trust to enter your passcode, unless you're good at memorizing where to tap the screen.
All in all, I like this update.
I tried the novelty voices, and I really had to listen hard with the Bells voice so I could get my settings back to my normal voice, but they were mostly fun. Slowing them down might have helped.
Playing around with the other voices
Some of my candidates are Joelle and Noelle, which I think sound exactly the same (anyone know the difference?), and maybe Zoe might be nice too. In my opinion, Ava Enhanced and Ava Premium sound pretty much the same. I'm also glad you can now play a sample without downloading the voice. That's nice, especially for the new ones that I might not necessarily want but still want to know what they sound like. For now, I'm sticking with my Nicky Enhanced, but I'm looking more into these others.
PS: I read on one of the iOS 16 beta topics that the premium voices crashed a lot. Does anyone know if this is still the case?
Something I noticed
On every major update, it goes to the "Hello" screen after it's done, where you click the Get Started button and set up new features, right? At least if I leave it for too long. Yes, it was annoying: I had my iPhone downloading the new update and left it charging while I was out, so maybe it would update even though it wasn't night, and when I came back it was spouting the welcome message in all the different languages; since I was gone for a long time, who knows how long it had been doing that. But that screen was way more accessible, I think, than it used to be. I could easily swipe to the "Press Home to Unlock" button without having to scroll through the other parts of the message like I used to.
Something that I find…
Something that I find interesting about the Hello screen: it did that on my iPhone 8 when I updated, but not my SE 3.
The novelty voices bring me…
The novelty voices bring me back to my junior high days, when we used Macs in computer class. I think Apple owns them, so that's probably why they have been added.
I do like Nathan, I'm…
I do like Nathan; I'm currently using him. He does have some weird inflections, but he's got a good, strong, firm voice, and that's what I like about him. I used to use Samantha, but I could not take the new voice. I'm not sure if it's a bug or if she's supposed to sound like this, but she sounds as if she had some teeth knocked out.
Re: No speech after reboot?
Just wondering, is it that VoiceOver is off and won't turn on, or that it's on but just won't talk? I'm wondering if someone like me who uses a braille display could just enter the passcode with the display.
It doesn't appear to be on…
It doesn't appear to be on at all.
Some Have Been Renamed
Some of the novelty voices have been renamed. I know this because I actually used them under previous versions of macOS. Having said that, I will have to wait to upgrade until I get one of the newer iPhones. However, I will be able to update to Ventura, since I am now on an M1 MBA. Thank you, Scott, for yet another well-rounded post. I think there's a lot to look forward to in these new updates. I, too, used Eloquence under JAWS, and it was very good.
The “first item only” option isn’t working
I turned on the new first item only setting in the VO settings.
It's not making any obvious difference for me. In Mail, it is still saying "actions available", or in my case playing the sound, every time I swipe right and left in an email message. Have you guys had any luck with it?
Finally found it!
Okay, so another individual helped me out. I had to add Detection Mode to my Magnifier controls. Lo and behold, the Detection button is now there. I'm not sure if I'm intrigued, impressed, or disturbed by the image detection part. I turned it on, and the next thing I knew, it was describing everything in my room. It was wild! I have not been able to truly test door detection and such just yet, as I just figured this out last night and have not been out and about with the new settings.
Hi, all! I haven't played around with the novelty voices very much, since I've heard them on my Mac. However, I noticed something very weird. I wanted to try Bruce and Vicki, but when I chose them, I just got compact Samantha, even though I had downloaded the voices. Maybe I should try again, as I've rebooted my phone since then? I was really looking forward to Nathan and Evan; I use Nathan in JAWS and like him pretty well for the most part. Unfortunately, on iOS, his pronunciation quirks were just too much. I mean... I Mess Edge? Actually, all of the new voices seem to have that pronunciation issue, and none of them can say "button" either. So, I'm sticking with my good old buddy Alex. I was pretty sure that would end up being the case, but I am disappointed that Nathan and Evan didn't sound as good as I would have hoped. Also, as others have said, I can't hear any real difference between Joelle and Noelle. I think Noelle might be a smidgen higher, and the voice in general is somewhat pleasant, but not pleasant enough to make me abandon Alex. LOL!
Auto detection of different languages is not happening.
I have upgraded to iOS 16, and I have English (UK) for English and Lekha to read the Hindi language.
However, VoiceOver is not automatically switching from English to Hindi when it encounters the other language.
Please provide me some solution to this.
Re: Auto detection of different languages is not happening
Unfortunately, it's broken even in the first 16.1 beta. The same is happening with Persian.
Re: New Voices
Yes! I have experienced this too, but with Alex only. I mentioned it on the Accessibility Bugs blog post. Have you noticed, too, that when you get to the screen where you play the voice sample as Samantha, it won't say the suffixes (such as "button" and "heading") after anything, and it will be very sluggish and almost stop speaking altogether if you leave it too long? A workaround I found, at least for Alex, is to get out of the preview while it's downloading. That works for me. I'm so glad someone else has experienced this.
And for the next thing: yes, if you listen carefully you can hear that Noelle is a bit higher and a bit scratchier, but that's my opinion. As for the newer voices, not many of them are very good. Agnes and Kathy, as far as I can tell from the samples, both have a weird staticky sound whenever they say the S sound. Joelle and Noelle are the best of the new ones. Evan Default sounds like Nicky Default's male counterpart (and I will never, ever touch Nicky Default again after it inhabited my Siri voice somehow and required a force restart to return it to Voice 2, the one I prefer to use; I didn't like that one bit). Bruce and Ralph are nice, low, quirky guys, but I've found that they say their vowels much louder and clearer (which is not very clear) than any other letter. Nathan and Zoe, in my opinion, sound better than the original NeoSpeech versions. Ava Enhanced and Ava Premium sound pretty much the same.
I personally have noticed…
I personally have noticed the sluggishness of VoiceOver in the voices section since iOS 15.
Haha Albert says "I have a…
Haha, Albert says "I have a frog in my throat"; I thought he was saying "I am a frog", haha. Which just proves how hard he is to understand, and he definitely sounds like he has a real frog stuck in his throat.
On Wednesday, after I went to a hearing test, I decided to use Door Detection to find Buffalo Wild Wings to drown my sorrows over being diagnosed with mild conductive hearing loss. Yes, I am older, and I knew that mild hearing loss was coming, but conductive hearing loss?
Anyway, Door Detection worked very well, finding doors, telling me how far away I was from them, and reading what the text said. Denise and I found the restaurant with the help of someone, as this one is connected to the NASCAR Hall of Fame here in Charlotte. The door actually said "elevator"; that entry doubles as an elevator entrance to go elsewhere in the building. I set a place in BSQ for next time. Yummy wings and beer, and oh yeah, Braves baseball!
One thing I'd like to see…
One thing I'd like to see changed, which is still present in 16, is the phone keypad layout: I'd like the delete key placed back below the 9. I think iOS 7 is where the change was made, so that the delete key is no longer directly below the 9. Not sure why they changed this.
Delete is located below the pound or hash key. If it were placed under the 9, it would sit between the 9 and the pound key. Instead, it is on the row with the call button.