5 Years of VoiceOver: Look How Far We've Come
In June 2009, Apple changed the accessible smartphone market forever with the announcement of the VoiceOver screen reader on the iPhone 3GS. The device was officially released to the public on Friday, June 19, 2009; five years later, I thought it would be fun to take a look at my own early experiences with the iPhone, reflect on how much VoiceOver has changed (hint: more than I realized), and offer some thoughts on—and hopes for—the future.
For those new to how VoiceOver works, VoiceOver provides a way for blind and low-vision users to use iOS devices—iPhones, iPads, and iPod Touches—without seeing the screen. (VoiceOver is also available on all modern Macs.) Using a modified set of gestures (for example, double-tapping on an icon to activate it instead of single-tapping), blind users are able to access all of the built-in device functions—as well as an increasing number of third-party apps—with ease. To say that VoiceOver revolutionized accessible smartphone technology for the blind is, in my opinion, putting it mildly.
A Personal Reflection
I can remember quite clearly my reaction when VoiceOver on the iPhone was first announced: "How the heck is a blind person supposed to use a touchscreen?" "It’s a great idea, but I don’t see it working." (Pardon the pun.) And the best one of all: "I don’t want an iPhone!"
For those brave enough to venture into the world of touchscreen devices—uncharted territory up until that point—VoiceOver seemed to be everything Apple said it was...and more. While I don’t have any documentation as to the early features of VoiceOver, I remember reviews from early-adopters being overwhelmingly positive. (Here's one review by Mark Taylor dated June 29, 2009.)
Even though people who had iPhones generally really liked them, I still did not want one; the idea of trying to enter text without a physical keyboard, in particular, was just too daunting. (Is there such a thing as touchscreen phobia? If so, I had it.)
Fast-forward a year and a few iOS (then called "iPhone OS") updates. At this point, I had a Windows Mobile 6.5 device with Mobile Speak. Before the iPhone, Windows Mobile was where it was at—as far as I was concerned, anyway. The device did just about everything I wanted, but there was that nagging voice in the back of my head telling me that Windows Mobile was quickly fading into obsolescence in favor of newer—and, I assumed, less-accessible—technology.
Of course, I could always switch to that touchscreen iPhone that an increasing number of blind people said they were really happy with. My attitudes towards the iPhone had softened over the past year, and I was becoming increasingly curious about the device and what all one could do with it. (The rapid decline in my current phone's performance, combined with the decreasing availability of accessible Windows Mobile handsets, probably had something to do with that.)
While attending a summer camp at the Illinois School for the Visually Impaired in June 2010, I had the opportunity to see an iPhone, try out VoiceOver for myself, and talk to a blind person who, like me, had used Windows Mobile pre-iPhone. After figuring out just how easy the touchscreen really was to use (making a phone call was a huge milestone), it was then that I knew I had to get one—and embrace the technology of the future.
I began my iOS journey with a new 16GB iPhone 3GS running iOS 3.1.3, which I quickly upgraded to iOS 4. When I finally got service on my device (I was using it on T-Mobile), I wanted to do everything: I was downloading apps, surfing the internet...you name it. I also got to be pretty quick at typing on the touchscreen, until I got my first Bluetooth keyboard and realized how much faster that was to use.
One of the things I really liked about iOS 4, believe it or not, was the original text-to-speech voice. Perhaps it is because hearing that particular iteration of Vocalizer brings me back to my very enjoyable first days using an iPhone, or perhaps it is because the voice was lower pitched and less whiny—I think the compact voices since iOS 4 have not been as good. The iOS 6 compact voice (for U.S. English) was about as close to the original VoiceOver voice as I think we’ll ever get, but that was replaced in iOS 7 in favor of Vocalizer Expressive. I remember reading on the AppleVis forum that the iOS 6 Compact Voice used Vocalizer for Automotive—a speech engine that is no longer in production. Ah well...
VoiceOver Through the Years
With each major release of iOS, Apple continually adds new features to VoiceOver. While researching for this article, I came across three great blog posts by Scott Davert, detailing what was new and changed with VoiceOver in iOS 5, iOS 6, and iOS 7, respectively. There is a lot more information than I have space to list, so be sure to visit the links above for a complete rundown. Below are the highlights as I see them:
- iOS 5: For me, the big new feature in iOS 5 was the premium voice, now called the Enhanced Quality voice. Of special note, too, was the introduction of many new rotor options, the Accessibility Shortcut, and custom element labeling.
- iOS 6: iOS 6 brought a better compact voice (see above), more rotor options (custom actions comes to mind), keyboard improvements, and Assistive Touch support—as well as some bug fixes for Braille users.
- iOS 7: iOS 7 included the ability to have more than one premium voice on your device, a long-requested feature to not automatically speak incoming notifications while on a phone call, Handwriting Mode, and even more rotor options.
- iOS 8: While not much is known about what new VoiceOver features will be included in iOS 8, Apple announced at WWDC that VoiceOver would include the Alex voice—the default text-to-speech voice from Mac OS X—as well as a method for direct Braille input on the device. While not specifically a new accessibility feature, Apple also announced that third-party keyboards would be able to be integrated system-wide, opening up even more keyboard options for all. (MBraille, anyone?)
Over the last five years, I have again and again seen demonstrations of Apple’s continuing commitment to accessibility. Besides the continual VoiceOver features and improvements with each major update to iOS and OS X, one of the best examples of this commitment came from CEO Tim Cook at a shareholder meeting this past March. While responding to a request from the National Center for Public Policy Research that Apple give an account of the costs of its energy sustainability programs and continue only those programs that were profitable, Cook said that a return on investment (ROI) was not Apple’s only motivation.
"When we work on making our devices accessible by the blind, I don’t consider the bloody ROI," Cook said.
"But There Are Accessibility Bugs..."
While iOS and Mac OS X do have some accessibility bugs—keep in mind that no operating system is bug-free—I do believe that Apple is actively working on solving these issues. You can help report bugs by writing to email@example.com if there’s an issue you’re having in iOS or OS X. (Users in the U.S. may also call Apple Accessibility at 877-204-3930.) You are your own best advocate, and Apple Accessibility won’t know your needs and your specific situation unless you tell them. In your e-mail/phone call, tell the representative how you expected the feature to work, what is happening instead, and how the issue is impacting your use of the device.
The same idea goes for iOS and OS X feature requests; Apple won’t know what you want unless you tell them. While the feature sets of iOS 8 and OS X Yosemite are likely already finalized, a request one makes now could very well lead to a new feature in a future update.
In June 2009, VoiceOver revolutionized the accessible smartphone market in a way nobody thought was possible. I know I certainly never expected that five years later I would have an iPhone, let alone be a member of the editorial team for a community-powered website for iPhone (and other Apple product) users.
To Tim Cook and Apple Accessibility: my hat is off to you for a job well done. I look forward to many great advancements in the months and years to come.
Regarding VoiceOver history, what did we have before VoiceOver on the 3GS?
I remember reading somewhere about a special app or something you had to install to make an iPhone speak; there might have even been a need for some hardware, but I'm not sure on that one.
Anyway, I would be interested in knowing more, just for nostalgia really, if what I read was correct.
Hi Michael. Well done on your blog. My own experience went like this:
I used a phone like everyone else, not texting and the like, always needing someone to delete calls. Suddenly the iPhone comes out, and I too didn't think I wanted one. Note: anyone who wants to test out an iPhone, good luck trying it in the store. I had problems even entering text. It all came down to money. Upgrading my Mobile Speak device was ninety bucks, a better voice a hundred. I got a hundred as a birthday gift, so I jumped in head first. Never looked back.
I don't own an i-device but I would like to comment here. I really enjoyed reading this blog post. While I can't say for certain whether an i-device is in my future, reading this makes me want to go purchase one just for the experience if nothing else. I'm wondering if outSpoken was ever included on an i-device, or perhaps that was before they even came out? In any case, hopefully I can at least get my hands on an iPhone one of these years.
Nice post. As you mentioned, we will soon have a default braille input on-screen keyboard on iOS, and in my opinion this, plus the general opening up of third-party default keyboards, will be the most important change since VoiceOver itself. As you have already described, typing on the touchscreen has always been the one flaw of iPhone accessibility. I remember the first iOS update I ever did was 4.3, which gave us the ability to use Bluetooth keyboards. Perhaps even more important than Bluetooth keyboards are the third-party alternative text input apps we have seen flourish in the past few years. I have used every single one of these, including some early ones that are not even commonly known, such as Type Brailler Learn Braille. This clumsy app was my first experience using six-key entry on the touchscreen, and I knew right then that such a typing method would be transformative. I would be interested to read a recap of the progress of VoiceOver on the Mac side of things. I have not been a Mac user as long as I have been an i-device user, so I know less about the progress we have had. I know OS X has had a lot of improvements, although I feel as though VoiceOver progress on the Mac has been a little more stagnant than on the phone. PDFs, anyone?
Actually, I prefer touch typing, as I can go very fast with it. OK, not 80 wpm, but it is still good brain training for me to do touch typing in various situations.
Hi Mary. As soon as I restore, buy, or otherwise configure a device I'll use personally, touch typing is the first thing I change, along with turning the compact voice off and stopping that annoying smoker's-sounding pitch change. I understand others need to learn, but man, it takes so long to split-tap.
Funny this is brought up about touch typing, because that is exactly what I also change the first time I get a new device. I do agree that touch typing should be the default.
I just wanted to thank you for all of your kind feedback.
Tree: I, too, think Braille input and third-party keyboard integration will be huge.
Mary and Siobhan: Touch typing is one of the first VoiceOver settings I change on a new device...and downloading the Enhanced Quality voice is right up there as well.
Well, the Mac OS isn't that bad. In fact, Apple did a special update pack for VoiceOver in the last version of Mavericks (they said "VoiceOver fixes" or similar in the change notes).
Because of many factors, I have only now been able to update to Mavericks, and I can say that, for me, 10.9.3 is way faster and more stable than anything before it that I can remember.
In fact, I love Mac OS. I love it much more than my phone; if I had to give up an Apple product, it would not be my Mac, I can guarantee you. My point was simply that if Apple is amazing enough to do things as awesome and as proactive on accessibility as giving us braille typing on the phone, there is really no excuse for the state of things such as PDFs on the Mac. I am sounding too negative, though, and I apologize. Both iOS and Mac OS have seen amazing improvements. My system is running much slower under Mavericks, but it's all worth it to me to have the new Pages. It is, in my opinion, one of the greatest accessibility improvements we have had on Mac OS; it was high time for an accessible, fully featured word processor on the Mac. But I apologize, for I have strayed far from the original topic of this well-written post.
Strangely, for me it is faster than it was with 10.8. I am using a MacBook Air from late 2013, and it's more stable.
What is slower is iOS 7 on the iPhone 4S, especially when opening apps. I have heard it's because an animation is played; it would be great if we could disable it.
My phone has been a bit slower under iOS 7 as well, although I think it has balanced out with the updates we have seen. The fact that your Mac is running faster is a comforting thing for me to hear. I am running a MacBook Pro with a mechanical hard drive, and I am hoping to eventually get a machine with a solid-state drive. I wonder if that is the major difference between our experiences. Can anyone else back up this idea that Mavericks runs slower on mechanical drives and faster on solid-state drives? Sorry, I know this question is very far from the original point of this post. If people want to ignore this and talk about more relevant things, that's cool. Or, if people think it's worth going on about, we could make a new post about our experiences with Mavericks either slowing down or speeding up our systems.
Yes, VoiceOver is amazing.
I was only yesterday reading a blog post by someone called Marko in which he was saying how he was going back to Windows after five years on a Mac. The post itself was interesting to me because he was complaining about the interaction system and how braille didn't work for him as it should because he had a Revolution braille display. What fascinated me about his entry was his overall sense of bitterness about the platform's progress. It was a complete 180-degree turn, and I'll be interested to see if he actually comes back to the Mac after 10.10. We, as blind people, realised that the Mac would be hard going and that we would need to think very differently about the UI and such. It was just amazing to me that someone with Marko's level of computing experience seemed to now want the comfort of Windows and NVDA. Folks, don't get me wrong: NVDA is very good, but VoiceOver it is not. The Mac platform, once adjusted to, offers us blind folks so much; the automation facilities within OS X alone are worth it and are what keep me here and will continue to keep me. I haven't touched a Windows box since 2007, and I refuse to acknowledge Microsoft as viable any longer because of the disaster that is Windows 8, with 30 charms and whatever else coming up on our screens every 30 seconds, not to mention that terrible Surface tablet idea that is just, well, yes!
I know this sounds like a fanboy, but tough: I'm a fanboy. The iPad mini was the best portable computer I ever bought; the Mac Pro I use for my scientific and music work is still running great after six years, and once I upgrade to something more modern, I'm sure I'll be impressed again. People are indeed right when they say Windows machines are boring. They are boring; we are now in a lifestyle age in which digital delivery and content awareness are key. So what, PowerPoint? That's not going to entertain us now, is it? Even in big business, presentations are not just charts most of the time; it's flashy production. I know, I have been to enough conferences, and you spend more time convincing yourself that the person knows what they are talking about than being impressed by graphics and interactive nonsense. New media has changed how we view the world, but this is off topic.
Seriously, great post though, and you're right: we've come really quite far in such a short time. Apologies for the length of this; I got carried away, but it's a free country, isn't it?
1- Marcos is someone who should be taken seriously. I personally don't think he is right, but maybe something he is pointing out is worth looking at more closely and reflecting upon.
2- The internet is not a US-only service. Free country? Maybe, or maybe not, so care must be taken here.
3- Automation? The key point which is almost making me return to JAWS and Windows is the possibility of scripting things that are accessible but not as usable as they could be.
I don't switch back, however, because I am almost sure there must be a way of doing the same thing on the Mac, but I was not able to find resources on it. Making keystrokes to avoid 17 steps to jump from one place to another, in some cases, notably Xcode but with plenty of other examples, would be great, as would the possibility of moving the VO cursor to given components identified consistently in some way.
This, and the possibility of watching components for changes and taking action on them, would be great. Can you point me to where I can find these resources?
http://macosxautomation.com is just one of them. Funny how we should take Marko's blog seriously because you say so; however, I personally think it was flame bait, quite honestly, and I won't ever consider his work of any value again. Just because you are a programmer doesn't preclude you from being hasty with an opinion, does it? It's very easy to regard computer scientists as gods; they're not. They are as fallible as the rest of us, making stupid, assumptive opinions based on personal bias because he felt the OS wasn't being developed according to his standards. Out comes the dummy. Typical! He chose to use this platform; we all did. Whining after the event just makes you look like a typical blindy.
As to your comments about my mentioning that it is a free country: it was an expression. I am from the UK, not the US.
Keyboard Maestro might also do some of the things you want; it's actually quite good when coupled with scripts. Xcode is getting better than it once was, anyway. It's not as if VO doesn't have hotspots and the like to aid in some of what you want; not ideal, but better than paying hundreds for JAWS.
Oops... let's keep it easy.
1- The US thing was more like a joke.
2- Marcus is someone who has worked extensively in accessibility and is perhaps worth reading. I personally do not agree with his statements in that post, but he is someone who must be respected.
Automation is cool, but I am talking about VoiceOver dictionaries specifically. I am using Macs more than Windows, but I can imagine what we could do with a simple dictionary.
3- I know Xcode is getting better. But we could make it even better with such a dictionary, and it could be a big productivity boost.
4- I don't think, and will never think, that what FS did with JAWS, allowing us to script and enhance the screen reader, is wrong. It is right, and Apple should give us the very same opportunity and the very same power.
There are functions specific to a screen reader, such as automatically announcing whenever parts of the screen change, that are not part of the business rules of applications. These cases should be handled by the screen reader, and we need a way of doing this with VoiceOver for OS X as soon as possible.
While Apple is doing a great job and making people like myself use its screen reader more than Windows, we still must ask them for more.