I have a love-hate relationship with Siri.
When I tell Siri to set my alarm just before I go to bed, I appreciate the convenience of simply telling my phone to do something and having it done. I’m the type of person who might sleep through my first alarm, so I set several as an extra precaution. When I wake up and am sure I’m out of bed, I tell Siri, “Turn off all of my alarms,” and go on with my day. The same goes for setting my monthly haircut appointments; almost always, it just works.
When I ask Siri things like, “What’s zero divided by zero?” I laugh, because its response is funny and because whoever programmed it clearly has a good sense of humor.
But when I dictate certain words, such as “horn” (something I do regularly, as I am a railroad enthusiast and often send train horn recordings to friends), and Siri hears and inserts another four-letter word instead, I get really, really mad. I know, though, that text dictation will never be 100% accurate, so I either check all of my texts (paying particular attention to the ones I think it might have difficulty interpreting) or enter them by hand.
My biggest gripe with Siri isn’t about anything it does or doesn’t do, however. Rather, it is that Siri’s potential to assist blind and low vision users is generally misunderstood and overstated. Too often, I read articles about iOS accessibility in the mainstream media (that is, media not written specifically for an audience with visual impairments) that extol the access Siri provides blind users. While Siri is one of many accessibility tools, the sighted public’s idea of how we use it is rife with misconceptions.
Many of the articles I’ve read over the years tout Siri’s ability to dictate text, thereby eliminating the need for typing on the touchscreen. But as we saw with the “horn” example, dictation is far from perfect. Since a blind user relies only on speech (rather than being able to quickly scan the text Siri has dictated), closely reviewing the text—either by word or, better yet, by character—is necessary if there is any concern about erroneous dictation. If time is of the essence, I must decide whether to send an imperfect text (hopefully with most of the correct words), take the time to proofread it and make corrections, or just enter it by hand from the outset.
That the sighted public, more often than not, assumes blind people would benefit from dictation rather than typing is nothing new. I’ve been asked many a time if I use Dragon NaturallySpeaking on my computer, and my response is now usually a variant of, “Why would I need that?” Dragon and other speech-to-text programs are great for people who aren’t able to type, but the idea that a blind person, with no other disabilities or extenuating circumstances, cannot type and therefore needs dictation software has always bothered me.
Sadly, this “voice-controlled technology is easiest for blind people” mentality has seeped into people’s perceptions of how the blind use iPhones. In a recent iMore article, “Making the iPhone camera accessible for the blind,” the author asserts that the first step Apple took in making camera use accessible to blind users was ensuring everyone could easily get to the Camera app:
The first step in making the Camera app accessible is making Camera.app accessible. In other words, making sure everyone can get to it whenever they need it.
You can navigate to Camera using VoiceOver, but Apple's made it even easier. Simply tell Siri "Open the camera". Even "Siri, take a selfie" works. It doesn't automagically switch it to the front-facing mode or take a picture, though—at least not yet—but it gets you where you want to be.
While I hate to destroy the image of the courageous blind man who, just by talking to his iPhone, has a whole new world opened up to him...Siri generally does not make things easier for me. Anyone who has asked Siri a question and gotten the response, “Here’s what I found...take a look!” will know what I mean; there is often a lot more involved than asking Siri a question and having it directly tell you the answer—enough so that I am often faster at looking the information up myself. The same goes for launching apps; when all is said and done, I’m quicker using my iPhone by hand than I would be asking Siri to do things for me.
While on its face this may look like a criticism of Siri (and, by extension, Apple), I think my quarrel is really with automation and "smart" technology in general. What it comes down to is that I usually don't want to talk to my phone, and I usually don't want my phone to automate things I could simply do better myself. (Don’t even get me started on “smart” canes.)
But I digress.
Coming back to the iMore article, I think it’s wonderful that mainstream journalists are giving much-needed coverage to accessibility. The more developers who are made aware that blind people use their apps, the better for everyone. There’s no shortage of work to be done, and that one article may be the one that makes a developer realize, “Hey, I should really make sure my app works for blind and low vision users.”
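For any developer wondering what that first step actually looks like, it is often just a few lines. Here is a minimal sketch in Swift using Apple’s UIAccessibility properties (the screen and button here are hypothetical, purely for illustration): an icon-only control, which VoiceOver would otherwise announce unhelpfully, is given a spoken label and hint.

```swift
import UIKit

// A hypothetical screen with an icon-only button. Without a label,
// VoiceOver has nothing meaningful to speak when it lands on the control.
final class RecordingViewController: UIViewController {
    private let playButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        playButton.setImage(UIImage(systemName: "play.fill"), for: .normal)

        // The accessibility work itself: give VoiceOver a concise name
        // and an optional hint describing what activating the button does.
        playButton.accessibilityLabel = "Play recording"
        playButton.accessibilityHint = "Plays the selected recording."

        view.addSubview(playButton)
    }
}
```

Labels like these are invisible to sighted users; they simply let VoiceOver present the same interface everyone else already has.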
On the other hand, when I read articles that (albeit indirectly) suggest the easiest way for a blind user to accomplish a task like opening the camera is to ask Siri, I can’t help but think that the expectations for my best possible user experience have just been lowered. Reading as a blind user, I sense an assumption, even if a subconscious one, that I need voice-controlled devices because I can’t see.
Of course, I know that lowering accessibility expectations was not the author’s intent, and that misconceptions almost always arise because people simply do not know any better. But I can’t help but wonder: how much more productive could the accessibility conversation be if sighted journalists and developers better understood how people with disabilities actually use assistive technology? What would happen if a person looking to develop an assistive app started the process by asking real users with disabilities, “What areas of daily life could an app help you with?” What would happen if a journalist writing a story, instead of assuming that Siri was the simplest way for a blind user to accomplish a task, researched the topic or asked a group of blind users for their opinions? (On this latter question, I informally surveyed my Twitter followers, and about half of those who responded said they launch apps with Siri at least some of the time. The rest said they did not, for reasons ranging from not having adjusted their habits to take advantage of the new Siri features, to finding it faster to locate apps with VoiceOver, to simply preferring to rely on themselves.)
In the end, I want the same great experience that sighted users have. I want fully accessible app interfaces, not simplified “blind-friendly” apps with voice control. While there is certainly a market for that type of app (for very new users, say, or those with physical disabilities), simplification and voice control should never be the first things a sighted person thinks of when considering accessibility or how the blind use iPhones. Accessibility is inclusion, and inclusion is designing a great user experience for as many people as possible—from those who cannot type at all to power users like me. Thankfully, Apple has demonstrated time and time again that they get it, and that accessibility for blind and low vision users goes well beyond voice control.