For years, I have used your products. I took the path that many people take: from an iPod Touch, which slowly drew me into the joy of a simple, powerful, and overall great operating system; to a Mac, which gradually replaced my Windows computer thanks to OS X's lack of crashes and errors, its syncing with iOS apps, the ease of downloading apps from the App Store, and other features; to the iPhone, because it's a more powerful iPod that can get online anywhere and make phone calls, so why wouldn't I want one? The ease of use, power, usefulness, stability, apps, and seamless syncing have done their work, and I am a (mostly) happy Apple user. Oh, one thing before we go on: I'm blind, so I use Apple's wonderful VoiceOver screen reader on iOS and OS X.
Now, in 2014, I am an experienced user of your products and software, and am (mostly) happy with VoiceOver on the mobile and desktop operating systems you produce. I also have a Bachelor's degree in computer science, plus the time, knowledge, and desire to produce iOS and OS X apps, and I have been a casual Xcode user since version 4.5. To date, however, I have not yet released any of my work. The reasons for this are numerous, but the main one is this: Xcode, which is the only way to produce apps for your products, is only mostly accessible, and only partly efficient, for VoiceOver users.
Before we go on, I want to be sure you know this. I am, as I said, a satisfied customer, and I always recommend Apple products to other people. I admire the accessibility work you have done so far, and the strides you take with each new update to your products. I love that I can enable VoiceOver on anyone's Mac or iOS device and instantly know what I'm doing, and I love the independence that comes from being able to update, re-install, and troubleshoot my own computers without sighted help. I love my Mac, and it's fair to say that my iPhone has profoundly changed my life.
I don't say all that to butter you up, or to soften the blow. Rather, I say it so you won't see me as some Windows developer who grudgingly got a Mac, found it to be not what I was used to, and now wants to unload on you for not being Windows. I also say it so you will know that my perspective is that of an experienced user who would love to bring apps to the App Store that enhance the lives of others, just as Apple's own software and hardware have enhanced my own life. I want to help you help us all, and you can't fix what you don't know is broken. I might sound frustrated at times in this letter, and trust me, I am. However, I realize that these things take time, and that Apple is not an accessibility company, but rather a mainstream company that holds universal access as one of its central beliefs. At the end of the day, you have many priorities across dozens of departments and engineering teams, spread over several product lines and hundreds of millions of users. I'm realistic about all this, but from this end user's perspective, the issues below are in serious need of being addressed.
Interface Builder (IB) is at the core of user interface design in Xcode. Without it, you are hand-coding your interfaces, which is certainly doable, but far from desirable, for a few reasons:
- If you share code, or collaborate with others on a project, most of your fellow coders will be using IB and will not want to wade through dozens of lines of code defining something they could lay out in seconds with a mouse. If they want to change something in the UI, they must update your code, instead of relying on the IB mechanisms already in place.
- If you need to look up how to do something on the internet, you'll almost certainly find examples and tutorials using IB, not walking you through the necessary code. In other words, the documentation for coding apps in Xcode is mostly geared toward the use of IB.
- Debugging is hard, and removing an entire category of potential errors is always good. By hand-coding the UI, you are opening up one more avenue for mistakes and unforeseen problems to come up and bite you later on.
- Using IB is faster. As mentioned above, dragging items around and setting constraints visually is fast and easy; coding it all by hand is slower. Doable, of course, but not as efficient.
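To make the comparison concrete, here is a rough sketch, in present-day Swift with illustrative names, of what hand-coding even a single centered button involves; in IB, the same result is one drag from the Object Library and a couple of constraint clicks:

```swift
import UIKit

// A minimal hand-coded interface: one button, centered with Auto Layout,
// wired to an action in code. All names here are hypothetical.
class StartViewController: UIViewController {
    let startButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()
        startButton.setTitle("Start", for: .normal)
        // Without this line, Auto Layout and autoresizing masks conflict.
        startButton.translatesAutoresizingMaskIntoConstraints = false
        startButton.addTarget(self, action: #selector(startTapped),
                              for: .touchUpInside)
        view.addSubview(startButton)
        // Constraints that IB would let you set with a drag or two.
        NSLayoutConstraint.activate([
            startButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            startButton.centerYAnchor.constraint(equalTo: view.centerYAnchor)
        ])
    }

    @objc func startTapped() {
        // Respond to the tap here.
    }
}
```

Every control, constraint, and target-action pair in the app needs this kind of boilerplate when IB is off the table, which is exactly the maintenance and collaboration burden described above.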
Now that we've established why the ability to use Interface Builder is important, allow me to remind you how the process works for a sighted user: open your interface file (a .xib or .storyboard), find the UI control you want in the Object Library, and drag it onto the canvas. To connect that control to code, open your view controller in an assistant editor, control-drag from the control on the canvas to the part of your code you want, fill out the information in the window that pops up, and hit enter. Simple, intuitive, and extremely well-documented. From official Apple articles to all the tutorials, blogs, and books on the internet that deal with Xcode and UI design, these instructions remain basically the same.
VoiceOver users have a very different experience, however. It was worse in Xcode 5 and earlier; in Xcode 6, more controls now have labels and hints, and I'm guessing that more work on IB's accessibility will be coming in future updates. As things stand now, though, IB is difficult, sometimes impossible, to use.
Adding Controls with VoiceOver
Here's how to add UI controls with VoiceOver: open the UI file, find the UI control you want, copy it with cmd-c, and paste it into the outline table, where it may or may not land where you wanted it, or may do nothing at all. This is still pretty simple, of course, assuming it works in the first place. If it fails, find the control in the Object Library, route the mouse to it, and perform a mouse click. Also easy to do, but the problem is that neither method is documented on any official Apple site. Therefore, the only way blind developers will find out about them is through sites like AppleVis.com or Maccessibility.net. Following Apple's own tutorials, which are otherwise great for getting started with Xcode and programming, will leave most visually impaired developers frustrated and confused, since the instructions for sighted users and those for VoiceOver users are not at all similar.
Connecting Controls to Code
Once the controls are in place, it's time to make them do stuff - the button starts the game, the scroll view fills with data, the other button becomes unavailable, all that. If you're sighted, the steps are simple: open your view controller in the Assistant Editor, then control-drag the UI control onto the view controller. A window will appear where you configure the connection; do so, press enter, and you're done.
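For reference, what those connections produce is plain annotated code in the view controller. A sketch, with illustrative names, of what a wired-up button looks like in today's Swift:

```swift
import UIKit

// Hypothetical view controller after two IB connections have been made.
class GameViewController: UIViewController {
    // An outlet: a reference IB fills in when the interface loads,
    // normally created by control-dragging from the control to this line.
    @IBOutlet weak var startButton: UIButton!

    // An action: a method IB wires to the button's tap, created the same way.
    @IBAction func startGame(_ sender: UIButton) {
        startButton.isEnabled = false
        // Start the game, fill the scroll view with data, and so on.
    }
}
```

Xcode is supposed to generate these stubs for you as part of the drag; the connection itself lives in the .xib or .storyboard file, not in the code.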
VoiceOver users face a very different, and more tedious, process here. They must find the control to be connected in the Outline Table of controls, then stop interacting twice, go to and interact with the Inspector Group, and select the Connections Inspector. Next, they jump to the bottom of the current item, interact with the scroll area there, and locate the button for the connection type they want (referencing outlet, click action, getting different values, whatever it is). Once on that button, they route the mouse to it, disable cursor tracking, lock the mouse button down, and move back to the Source Code Group; find and interact with the source code of the view controller, move to where the code should go in the file, route the mouse there, and unlock the mouse button. Finally, they re-enable cursor tracking, fill out the information in the dialog that should appear, and press enter. In Xcode 6, this process is very unreliable, seeming to work only if you type the code beforehand (which misses the point: Xcode should fill it in automatically). Most times, the mouse pointer will not route correctly, and the connection will fail.
Wow, that's a lot of steps, and they don't even work all the time! As a coder, which would you rather do: perform a single dragging operation, or follow this convoluted list of instructions, which may or may not work? Anyone would rather drag, right? While VoiceOver does include its own set of commands for drag-and-drop operations, any attempt to follow the instructions you'd find in most Xcode tutorials using said commands will fail. Below are the steps one might take while trying to do just that.
- Find the canvas holding your UI. Move to the item you want to hook to code, and hit vo-comma to start dragging it. Is it a button, checkbox, or other clickable item? If it is, VoiceOver will inform you that it can't drag items that have a click action, so you can't hook up your control that way. Again, this is how most online tutorials will tell you to proceed.
- Next, find the item in the table of controls, and try to drag it to your view controller. Drag with vo-comma, drag by routing and locking the mouse; try both on the table row, on the edit field of the control's name, and on the disclosure triangle. Nothing will work.
- Try the Connections Inspector. Go there, vo-comma on the connection type you want (specifically, the button next to it), and vo-period on the part of the source code you want to connect to. No good; VoiceOver reports that the item is no longer available. In other words, it will fail to drag the connection button, but unlike the control on your canvas, the failure happens not when you start dragging but when you try to drop onto your view controller's source code.
How, then, are VoiceOver users supposed to use IB at all, let alone quickly and efficiently? Short answer: we don't, because we can't. As is pointed out in Maccessibility's response to this post, VoiceOver users can employ the Trackpad Commander to quickly locate the items with which they need to interact, saving several keystrokes. Hotspots can also be used to save some time, and using both tricks together is a big help. Still, the instructions VoiceOver users must follow are radically different from what they will encounter online, and (as of Xcode 6.0.1) do not always work.
Random Busy Messages
Xcode is a large piece of software, and it has a lot going on most of the time. You might think, then, that having it beachball every so often is normal, and not worth mentioning. I would too, except that it happens fairly regularly to me and other VoiceOver users and, from reports I've seen on Xcode user lists, to no one else. (For VoiceOver users: "beachball" is used here as a verb to describe what happens visually when an app goes "busy".)
Imagine this: you've just added a couple controls to your canvas, and you click the Attributes Inspector to change a few things. Suddenly, as you put your mouse in the edit field for the control's name… Beachball! You wait patiently, but nothing changes, until you hit command-shift-2 to open the Project Organizer window. You then immediately switch back to your project, and everything is working normally again.
This is a very common occurrence for VoiceOver users. Usually in the Navigator or Inspector, though not always, Xcode will go "busy" and won't respond to anything unless you switch windows (the Project Organizer is the most reliable one for temporarily fixing this).
Other inefficiencies exist within Xcode that are not bugs per se, but whose removal would drastically improve the experience for VoiceOver users.
First, every code file has what VoiceOver calls an Annotations Ruler beside the text field holding the file's contents. This item isn't a ruler, as far as I can tell, but rather a list of any issues Xcode finds in the associated file's code. Each issue is a button complete with the issue message and line number, and most users would probably assume that activating the button takes you right to the line in question. However, activating any button in the Ruler does nothing at all. You can't interact with the button text to see exactly how something is spelled, or which punctuation symbol Xcode says is missing, because interaction is not supported either. You can't even copy the text with vo-shift-c to paste it somewhere else and examine it that way. There is a "jump to line" command that brings up a dialog where you can type the issue's line number to go to that line. Still, if something is a button, and already carries full information about the problem it describes, one would think that activating it would do something to help you reach or solve the problem.
Next is the Issue Navigator. Select that in the Navigator Group, find an issue in the table, and go to the source code to fix it. Instead of the click that a sighted user would perform to go to the erroneous code, you have to stop interacting two or three times, vo-right to the Source Code Group, interact, find the group holding the code file that has the error, interact, and vo-right to the file's text field. Only then can you address the problem, and if you forget why you came there… Reverse the process, look back at the issue, and then reverse things again to come back to your code. A keystroke to quickly jump from the table of issues to the code file in question and back would be enormously useful. As previously mentioned, using VoiceOver's Hotspot feature helps here, though I've found it to be somewhat unreliable in the past.
Interface Builder is far and away the biggest hindrance for VoiceOver users trying to use Xcode. Drag and drop, even though VoiceOver fully supports it, fails, and hooking UI controls to code is difficult at best. Worse, the instructions in Apple's own documentation and tutorials cannot be adapted by VoiceOver users, who must instead find the alternative ways of doing things on their own. This goes against the rest of the Mac, where VoiceOver provides equal access to nearly everything in the operating system with little to no change to any commands a user might be following.
While there are certainly other areas where Xcode needs accessibility and efficiency work, IB is the worst offender. Things were terrible in Xcode 3.x, better in Xcode 4, better still in 4.5 and 5.x, and have continued to improve in 6. While there is certainly work to be done, this is a promising trend, and I ask you to please complete the journey to full accessibility. Build on what has already been done:
- allow dragging of controls right from the canvas using VoiceOver commands;
- indicate the state of a connection;
- allow dragging of the connection buttons from the Connections Inspector;
- identify the items in a .xib or .storyboard canvas so they are not "unknown" and so their help tags properly inform users of their purpose and content;
- bring the instructions VoiceOver users follow in line with what sighted users do, just as most of OS X already does;
- stop Xcode from locking up while VoiceOver is running.
Allow blind programmers to make more apps that help blind Mac and iOS users. Let blind developers enjoy the app creation process just as much as sighted users do by making Xcode and Swift fun and easy tools for everyone, sighted or blind. Let VoiceOver users follow the same instructions as everyone else, instead of hunting through the internet for obscure lists of convoluted steps, or simply giving up in frustration. The fixes so far suggest that work in this area will continue, and I can only hope I'm right in that assumption. I urge you to keep moving in that direction, and I thank you for the work that has been done so far.
If you are thinking about trying your hand at app development on the Mac, and you read the above letter, you may come away discouraged and feeling like it's not worth the bother. Please remember that there are several blind professionals using Xcode successfully already, so it can be done. What I'm calling for is the conclusion of a years-long path Apple has been moving down, where each Xcode release has been more accessible than the last. I hoped to see Xcode 6 be the culmination of this work, finally bringing complete VoiceOver compatibility to Xcode. That hasn't happened yet, and it really needs to, but that isn't to say you can't use Xcode as it is now. You can, but you need to know the tricks and shortcomings first. The fact that you still need to know any special information just to use Xcode as a VoiceOver user is what my letter is meant to address.