Why I'm still on a Mac: a guide for skeptics and those who feel the grass may be greener with Windows
This article will show you, the reader who uses a Mac with the VoiceOver screen reader, exactly why I haven't jumped ship and fled to Windows 10. I'll compare the state of accessibility of Windows and the Mac as operating systems, the features of first-party apps, advancements in first-party screen readers, and the outlook for accessibility at both companies. I will attempt to be objective, so as to present my arguments in a logical and useful way.
The core of Windows and Mac accessibility
Windows and Mac have grown up together, battling for the space on our desks and the pleasure of our minds. While accessibility has been a small part of both companies' focus, their differing ways of interacting with the user have translated into differing levels of accessibility. Windows gives assistive technology vendors the ability to run on the platform with similar access to its own screen reader, Narrator; VoiceOver, by contrast, is the only screen reader that currently exists on the Mac. While this may look to some like the stifling of innovation, most Mac users are happy with it, possibly because they are used to having only VoiceOver on iOS, Apple's mobile platform, which is extremely popular among the blind.
While Microsoft's accessibility APIs focus on one window at a time, the Mac allows the user to know when a background app has been launched, and even when that app has opened a new window. This extends to so-called "system dialogues," which are common on both systems; whereas Windows automatically tries to put focus on them, the Mac merely alerts the user and allows them to navigate to the dialogue at any time.
Windows relies on universal keyboard access, as many assistive technology companies have screen readers on Windows. Tab, Shift+Tab, the arrow keys, F6, and Alt+Tab are most of the keyboard commands one uses to navigate Windows. This lends itself well to those who are good at memorizing where everything is in the tab order, but the visual layout information is lost, as focus can land either at the top of an app or wherever an app developer wishes it to. On the Mac, with VoiceOver, one can explore the screen completely, either with the VoiceOver keys and arrow keys or with a trackpad. Not only does this give the user a sense of where things are, it also frees up some of that memorizing brain power for the more important tasks involved in using a computer to its full extent. App developers can still set where VoiceOver focus will initially land, although that too can be configured via the VoiceOver Utility, so a blind user of the Mac isn't nearly as likely to get lost. Add to that the ability for VoiceOver users to interact with, or focus in on, a section of the screen, such as the headers column in Mail in order to read only that particular column, and the Mac user may be more productive than their Windows-using counterpart.
Windows uses voices in many languages created by Microsoft. Apple has purchased the right to use voices provided by Nuance, but also has the famous Alex voice, which sounds realistic, can recognize parts of speech based on whole paragraphs rather than phrases or sentences, and even breathes, which may cue the listener to the start of the next sentence. While Microsoft's voices are well made, they are also very small and have a noticeable synthetic buzz when they speak. Apple's voices, along with the Nuance ones, do not suffer from these problems. While you can acquire other speech engines for Windows, you must often purchase them, whereas all voices that come with the Mac are free for Mac owners, and there are plenty of high-quality voices, in many different languages, to choose from.
Windows has a system of navigation wherein a user can type the beginning of an item within the list they're in, and system focus jumps to that item. This is usually called first-letter navigation, even though nowadays one can usually type multiple letters to "narrow down the list," as it were. This system works well, and I use it often. However, it isn't always available, making it tricky and frustrating to have to type one letter over and over again until the right object is found. On the Mac, this works everywhere, as far as I've found. You can type "IT" on the Dock to move directly to iTunes, or "IB" to move to iBooks. (Note: "iBooks" may change to "Books" in a later macOS update.) In the Finder, where you browse files, you can begin typing the name of a file or folder to jump to it. In System Preferences, you can begin typing the name of a pane you wish to explore, after interacting with the scroll area. Best of all, you can open System Preferences by simply putting focus on the menu bar, pressing Down Arrow, typing "SY," then pressing Return (or Enter). This makes navigation on the Mac a breeze compared to the plentiful key presses required on Windows.
Writing is an essential part of using a computer. Word processors, notepads, email clients, Twitter apps, web browsers, and the Terminal all take written input from the user. How much help do we get while we write? On Windows 10, spell checking is available in many Universal Windows apps, but nearly all desktop apps have no spell checking at all. Word completion is making its way to Windows 10, although the current implementation isn't very productive, as words are spelled out rather than spoken and then spelled out. Punctuation marks, such as quotes, ellipses, and other such malleable symbols, are simply printed out plainly, without care for position or style. There is no way, in Windows itself, to type symbols which are not on your keyboard without memorizing Unicode numeric values for each symbol. On the Mac, spell checking and auto-correct are available system-wide, as are a comprehensive dictionary, a thesaurus, and search mechanisms for Wikipedia and other such databases. One can also have the Mac attempt to complete partially typed words, helpful for long words which are easy to pronounce but hard to spell. Symbols like quotes are paired into their left and right variants, making text on the Mac not only a joy to type but also a joy to read. Symbols like • (bullet), ≥ (greater-than or equal-to), é (E acute), … (ellipsis), and π (pi) are easily created by holding down the Option key and typing a letter or symbol already on the keyboard. If this isn't enough, there is an emoji and symbols picker which allows for the choosing of just about any symbol one can imagine. All of these options, and more, such as text replacement, are configurable in the Keyboard pane of System Preferences, and all usable in any app.
First-party app accessibility
App accessibility has come a long way. From the invasive screen-capturing techniques of the last century to the accessibility APIs utilized today, apps have grown not only more powerful, but also more accessible. Within Windows, apps are caught between legacy desktop implementations and the new Universal Windows Platform standards, and these two standards handle accessibility differently. Legacy apps contain menu bars, toolbars, and plentiful keyboard commands. Universal Windows apps contain no menu bar and plenty of navigation buttons; some have keyboard commands for many options, while others have none at all. The Mac in its current form, macOS, has one kind of app, bringing consistency across the platform. There are menu bars, toolbars, and plenty of keyboard shortcuts in all of Apple's apps.
Advancements in Narrator and VoiceOver
Screen readers have been advancing since the early 1980s. From text-based programs which could output text to a speech synthesizer the size of a modern desktop tower, to screen readers built from web technology, third-party screen readers have come a long way. It wasn't until the beginning of the twenty-first century, though, that Microsoft boldly stepped into the assistive technology game. Narrator was a minimalist screen reader designed to let users set up their computers on their own before receiving their full screen reader, likely JAWS or Window-Eyes at the time. Over the next few versions this minimalist design was refined, and it was fairly complete by the time of Windows 7. During the creation of Windows 8, though, something changed. Narrator became more robust, defying its earlier reputation of being a mere crumb and growing into a snack. In the years which followed, Narrator's power grew; it can now hold its own with the likes of NVDA when browsing the web, has Braille support (for now tied to BRLTTY), and can be used in many more circumstances than just waiting for your JAWS disks. VoiceOver, however, took a different route. Starting out as a prerelease module for Mac OS X 10.3, VoiceOver was designed from the beginning to be a full screen reader. Released to the public with Mac OS X 10.4, VoiceOver could not only read the screen, it could navigate it, allowing users to know exactly what was on their Mac's screen and act on it efficiently. A year later saw the release of Mac OS X 10.5, giving the Mac the Alex voice, which is still being updated today. Over the years, VoiceOver gained the ability to work with Braille displays, use the newly acquired Nuance voices, set different "activities" which let one define a set of preferences for a single app or website, and read complex web pages and emails, and Mac apps grew more accessible along with it.
The outlook of accessibility at Microsoft and Apple
Microsoft started out simply with accessibility: they allowed third-party companies to make technology for their customers. Apple also started out this way. Since Apple is very secretive, I don't know whether they were researching accessibility before OS X or not. Microsoft took a small but crucial step forward in creating Narrator, and Apple took a much larger one in providing VoiceOver. Narrator stayed simple for many years as VoiceOver grew, but now Narrator is leaping forward as VoiceOver's growth has slowed. Microsoft now speaks of accessibility as one of its core values; Apple is silent about this much of the time, but still pushes forward on features for VoiceOver and iOS. Microsoft's accessibility has improved over the last six years, but still hasn't caught up with Apple's. Microsoft still allows third-party screen readers to dominate on their platform, whereas Apple never had third-party screen readers on Mac OS X to begin with.
This article has shown why I, as a user of assistive technology, have not turned from Mac to PC. I’ve discussed the accessibility of the Mac and Windows operating systems, first-party apps, how the first-party screen readers have advanced, and the outlook of the two companies regarding accessibility and their screen readers. I hope this will help someone choose an operating system to stick with, or an operating system to try.
The article on this page has generously been submitted by a member of the AppleVis community. As AppleVis is a community-powered website, we make no guarantee, either express or implied, of the accuracy or completeness of the information.
Great article, but don't forget the Infovox voices as well. Smile.
Where is employment in this discussion? How many of you who use the Mac are working somewhere outside of your homes, and are you using the Mac to do so?
I invested enormous effort in macOS over the years; I even did quite a few podcasts for this site. However, my own conclusion is that accessibility has nosedived since the earlier years, with the interaction between VO and macOS becoming increasingly buggy and unreliable. I have lost count of the number of times I have had, increasingly, to abandon tasks started on my iMac and complete them on my Windows machine. I cannot reliably use my banking website or my shopping websites. Basic features like reliable feedback on cursor position in editing windows and the ability to complete a Safari session without constant busy messages all contributed to a negative experience for me. A big issue for me is that if something does not work with VO, there are no other options. My iMac has 16GB of RAM with a 256GB SSD, but I could not rely on it. I did a clean re-install four times last year in a desperate attempt to improve matters, but to no avail. Even the install process was flaky, with VO not speaking at critical stages and me having no option but to seek sighted help. This never used to happen, and Apple seem to have slipped badly. With a heavy heart I have now donated my Mac to my son, and apparently even he is now using it mainly as a Windows machine. I am incredibly sad about this, as it was my intention back in 2011 to leave Windows and enter the new world of Apple for good, but I have finally given it up. Things will have to radically improve if I am ever to invest in macOS again.
There are several inaccurate statements in this article about what a screen reader user can and cannot do in Windows. Please do more research if you are someone looking to genuinely compare screen reader experiences on Apple versus Microsoft products. I don't have time to go through all the incorrect statements, but just know that there are several. Also, the focus on Narrator is misleading, because almost no blind Windows users use Narrator.
Please provide me with examples of my inaccuracies. The reason I chose to focus on Narrator is that it is Microsoft's counterpart to VoiceOver. Third-party screen readers are covered lightly in the article.
The main problem with your assessment of working with Windows vs. Mac as a visually impaired user is that you are really comparing two screen readers, VoiceOver vs. Narrator, and then jumping to the conclusion that one operating system is inherently more friendly to a visually impaired user than another.
The fact is that there are many more access options available for visually impaired users of Windows systems than there are for Mac users. Also, although VoiceOver may be perfectly workable for many users who work with applications developed by Apple, I don't believe this is necessarily the case for some applications developed by third parties that run on the Mac and haven't been as tightly integrated with VoiceOver.
The special needs of some users make it imperative for them to have options that suit their particular needs and work flow. The variety of screen readers available in Windows as well as the wide variety of custom scripts and plug-ins available for these screen readers to enable the user to work efficiently with third party applications do not have as many counterparts in the Mac world.
Also, I always make the distinction between applications being "accessible" vs. "useable". Just because a user can navigate to and "see" all elements of an application doesn't necessarily mean the application is useable efficiently and productively. For example, the customized scripts and plug-ins available with many Windows screen readers can make it a simple task to perform functions that would otherwise take a visually impaired person many keystrokes to perform.
Basically it all comes down to what one intends or needs to do with their computers. The more options available to people the better and more efficiently they will be able to perform required tasks.
I agree with Pete on this wholeheartedly. The Mac works very well for some, but not for others. As a long-term Mac user, having used the OS full time since Mac OS X Snow Leopard, I can say that for my needs the Mac works great. But for others, it may not necessarily be a good choice. Also, while we are on the subject of screen readers, Windows only has two options, excluding Narrator: JFW and NVDA. As Window-Eyes has folded, there really is only one decent option in my book if I ever went back to Windows: NVDA.
Like I said before, for my needs the Mac works very well. It does what I want it to do, and in my book that's all that matters. It's a use-case scenario.
Subject line pretty much sums it up. I've had my Mac since the end of 2013, and prior to that I used Windows for many moons. Both have their places. I used NVDA for awhile in Windows, and I definitely agree it is fantastic. As a matter of fact I just might be using it again if I get this new tech job. But VoiceOver is also quite good, and has been advancing rapidly thanks to Apple's excellent implementation. So I'm not going to tout one over the other either, since everyone has their own likes and dislikes.
This article points out that Microsoft is forging forward with accessibility, while Apple has largely stalled. Let's see where we are in a few years. Mac vs. PC, and one PC screen reader over another, will always be a religious rather than a useful debate among blind people.
There are some good points in this "guide," and I appreciate that it's well-written. As others have noted, it boils down to what tasks one needs from a computer.
I tried for 4 very expensive years to love the Mac. Unfortunately, I got in right after the close of the "big cat" era, which saw so many amazing accomplishments by the VO team while JAWS just kept-on-a-crashing and Windows got less and less usable out of the box. Since then, though, it's rather plain that the core developers of macOS and iWork don't hit Command-F5 at any point during their development process, much less consult adequately with the VO team. Six months after the bug appeared, I still can't hit VO-Space to activate Finder and desktop files, while even Pages keeps having odd moments of not reading selected text or skipping lines of text that contain comments. Even so, I fully appreciate all the things about the Mac that others do, many of which stem from the Unix origins of the OS. Meanwhile, Windows 10 is at least as snappy as the Mac now, and NVDA has finally become my primary screen reader, though I also maintain a JAWS subscription through my employer.
After owning a MacBook Air for several years and going back and forth between Boot Camp and the Mac, I got my employer to get me a 2016 MacBook Pro with Touch Bar. I really hate the Touch Bar and keyboard. They look great in the store, I'm sure, but seem to symbolize an attitude that design is all about the look rather than usability. One of my favorite aspects of Macs used to be the start-up chime. Apple took it away. Then someone figured out a terminal command to bring it back. Then Apple took *that* away, too! Apple really doesn't want my computer to bong...
I tossed my MBP back this week and recovered my old XPS 13. I'll probably hang onto the MacBook Air to keep an eye on the Mac. Even after becoming very comfortable with VO commands and using a Mac exclusively for a year, simple tasks like file management and web browsing were so much slower for me to complete on the Mac than on Windows that it was embarrassing at work. While all the other bugs basically make for a usability tie with NVDA on Windows, and I loathe the Windows "screen reader shuffle" when trying to use a funky web page, it does indeed come down to employment. I spent more than a hundred dollars looking for word processing apps that I might be able to use on the Mac to replace Word, but no go, because I have to use Word's commenting and track-changes features with my colleagues. Microsoft claims Office works with VoiceOver; however, Excel doesn't read column or row headers and, last time I booted it, Word was silent when cursoring up and down a line. Next week, Word will doubtless behave differently, with a new set of bugs. MS told me they don't support VoiceOver because it's "third party" software, which it isn't. Meanwhile, Apple isn't going to rework VO to accommodate Microsoft's oddities. Only a third-party screen reader is going to invest the time to keep on top of whatever weird crap Microsoft or Google want to do next. I'm convinced this fact isn't likely to change.
For anyone who can make a Mac work for their purposes--programmers, music composers, bloggers, etc.--it's a great choice. It'll stay accessible. For me, it was one of the biggest time and money sucks I've experienced.
The Mac did actually once support a third-party screen reader, by the way: it was called outSpoken, by Berkeley Systems. It was pretty rudimentary.
The outlook for the future after 2018 is worrisome. Windows is moving quickly to become a cloud-based OS with progressive web apps that supposedly run locally like regular apps. How will NVDA do with PWAs? Meanwhile, Apple is moving toward their own version of universal apps by incorporating iOS into macOS, according to rumor. I predict some rough times ahead for desktop screen reading on both platforms.
Windows has a powerful accessibility API and screen readers made by blind people, for blind people, taking the needs of blind people into consideration.
1- Want to make JAWS more usable for you in places where you might need it, to keep your job or advance your productivity? Open the JAWS script editor, customize, relax.
2- Want to make NVDA more usable for you in places where you might need it, to keep your job or advance your productivity? Open a text editor, write some Python, customize, relax.
3- Don't know how to do it but need it anyway? Pay someone to make it for you, or look on the internet for already-built solutions for your case, which might exist for both screen readers and may or may not be free.
4- Is there a bug in your screen reader? You have somewhere to file a bug report, and you can be pretty confident that it will be heard.
5- Do you happen to have been born in a place where English or other languages from rich countries are not spoken? No problem: choose from several voice synthesizers to find one which speaks your language (eSpeak probably does). If there isn't one, build it, or pay someone to build it for you and others.
6- Is there a new version of the screen reader around? Great, try it! Oh, does it have a show-stopper bug that you only realised after upgrading? Roll it back until the bug is fixed!
7- Is there a new version of the operating system around? Great, try it! Oh, does it have a show-stopper bug that you only realised after upgrading? Roll it back until the bug is fixed!
8- Is there a new version of a given essential piece of software you use for a daily task? Great, try it! Oh, does it have a show-stopper bug that you only realised after upgrading? Roll it back until the bug is fixed!
9- Are you unsatisfied with your current screen reader? Switch to another!
10- Are you having to use different screen readers because each one performs best to help you with different tasks? Switch them while running the same operating system!
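To illustrate the "open a text editor, write some Python" point above: NVDA add-on scripts are ordinary Python classes that bind keyboard gestures to methods. The sketch below is a simplified, stand-alone stand-in for that pattern, so it runs without NVDA installed. A real global plugin would subclass globalPluginHandler.GlobalPlugin and call ui.message() to speak; the class name, gesture identifier, and script here are hypothetical, chosen only to show the shape of the dispatch.

```python
# Simplified sketch of the NVDA add-on pattern: a "global plugin"
# maps gesture identifiers (like "kb:NVDA+shift+t") to script methods.
# Real NVDA add-ons get this dispatch from globalPluginHandler; this
# stand-alone class only mimics it for illustration.

class GlobalPluginSketch:
    def __init__(self):
        # Gesture-to-script table, analogous to NVDA's __gestures dict.
        self._gestures = {
            "kb:NVDA+shift+t": self.script_reportStatus,
        }

    def script_reportStatus(self, gesture):
        # In a real add-on this would call ui.message(...) so the
        # screen reader speaks; here we just return the text.
        return "Custom status announced"

    def execute(self, gesture):
        # Look up the gesture and run its script, if any is bound.
        script = self._gestures.get(gesture)
        return script(gesture) if script is not None else None


if __name__ == "__main__":
    plugin = GlobalPluginSketch()
    print(plugin.execute("kb:NVDA+shift+t"))  # the bound script runs
    print(plugin.execute("kb:NVDA+x"))        # unbound gesture: nothing
```

The point of the pattern is that the customization layer is plain Python: binding a new hotkey to a new behavior is one dictionary entry and one method, which is exactly the kind of per-job tweak the list above describes.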
macOS has a very poor and mainly undocumented accessibility API, and the only screen reader made for it is not likely made by blind people. As it stands, it is definitely not taking the needs of blind people into consideration, and it is either not being tested by blind people, or is being tested by very unqualified blind people (not likely), or the tests, if any are being made, are having their results simply ignored by the responsible QA department.
1- Want to make VoiceOver more usable for you in places where you might need it, to keep your job or advance your productivity? Sorry for you. Not everything can be scripted, and the VoiceOver scripting dictionary, compared with the JAWS scripting API or the NVDA API, is ridiculous. The hotspots implementation is buggy and (from what I could deduce, because its implementation is of course not documented) appears to be based on counting the children of windows. This makes hotspots fail if other windows are injected dynamically into the window hierarchy.
Even using the accessibility APIs, a significant number of the apps themselves do not expose components appropriately, and I have not been able to track events such as new text being written in specific windows. Even if I could, making the screen reader speak and controlling the speech queue doesn't seem possible. To summarize, forget it.
2- Don't know how to do it but need it anyway? Well, there might be a macOS guru able to efficiently script VoiceOver. He or she will either have worked for Apple, or have had access to undocumented material or to someone who has or had access to the internals. If such a person exists (and I have never heard of them, because Xcode is really poor in terms of productivity for blind people, and we really are in need of someone like this), then be prepared to pay a very high price, because this knowledge is definitely not available to everyone. To summarize, forget it.
3- Do you happen to have been born in a place where English or other languages from rich countries are not spoken? Sorry for you, because integrating other speech synthesizers is nearly impossible. Building new synthesizers is nearly impossible. And if it is possible, then see item 2 to understand why it is not likely possible at all ... the summary is: forget it.
4- Is there a bug in your screen reader? You have an e-mail address. Nothing else. No guarantees, no bug tracker ... and history shows us that important bugs might never be fixed, since we do have several long-standing bugs that have never been fixed. Want to fix that via scripting? Well, read item 2 to understand why it is not likely possible ... To summarize, forget it.
5- Is there a new version of the screen reader around? Great, try it! Oh, does it have a show-stopper bug that you only realised after upgrading? Sorry for you. Learn to live with it. If you depend on that functionality, then you are in trouble, because guess what ... there is no way to roll back without rolling back the whole system, and if your corp decided to upgrade, then rolling back the system might not be ** it will likely not be ** an option. To summarize, forget it.
6- Is there a new version of the operating system around? Sorry for you. Learn to live with it. If you depend on that functionality, then you are in trouble, because guess what ... there is no easy way to roll back the whole system, and if your corp decided to upgrade, then rolling back the system might not be ** it will likely not be ** an option. To summarize, forget it.
7- Is there a new version of a given essential piece of software you use for a daily task? Sorry for you. Learn to live with it. If you depend on that functionality, then you are in trouble, because guess what ... there is no way to roll back apps installed via the App Store. To state it yet another time, forget it.
8- Are you unsatisfied with your current screen reader? Sorry, no other option. To summarize, forget it.
9- Are you having to use different screen readers because each one performs best for different tasks? Impossible. Again, sorry for you. No competition, no creativity, no new ways of approaching scenarios and coming up with innovation.
You can, though, use the Mac for Mail, Finder and Safari. Out of the box you will have trouble. Audacity works best on Windows, Sonar works best on Windows; compilers, text editors, PDF, the command terminal, web navigation ... all work best on Windows. Not because Windows is super better ... but because the community has the means of producing assistive technology made by blind people for blind people, taking blind needs into consideration.
To be fair, Emacs, with Emacspeak, does work on the Mac, while the last compiled version of Emacspeak for Windows was 42, last I checked, and I check often. That version used SAPI5, which didn't allow much for voice-lock and aural-highlighting to work well. I get around Mac accessibility problems with terminal stuff, like TDSR for Mac, and Emacs for writing local stuff.
Note though that TDSR is new and was not likely available when this guide was written.
Note also that Emacspeak is hard to get on macOS, especially if you compare the setup process with the one on Unix. But Emacspeak is a different case, because the package is fully self-voiced, so you do not really depend on a screen reader. The definition of a screen reader is software generic enough to read apps in a way transparent to users; any self-voiced package is not included in this analysis. We cannot forget that the main reason, or likely the only reason, one must have a Mac instead of a PC now is to code for iOS, interestingly one of the most accessible platforms I have used so far. If the guys from Cupertino just took care of Mac accessibility using the same criteria they do with iOS, then things would be somewhat, though not much, different in this case. Even so, coding for iOS in any serious working environment requires deep Xcode usage, and using Xcode without sight, made by developers for developers, just the guys who with no excuse should think about accessibility, is one of the hardest challenges I have ever had in my almost 20 years of work with software development.
What frustrates me more is that we could help to improve things, but Apple at the same time thinks that we are not good enough to help, and won't solve the problems we currently have themselves. Nothing under the sun justifies the Playground, remarkably the most interesting REPL system I have seen to date, and one that could introduce so many new blind programmers to the world of programming, just as it does with sighted newbies, not being accessible. I obviously do not have access to their source code, but even so I know that it could be made accessible, especially after so many years of availability. If even they cannot produce accessible software and tools, then how can they require third-party developers to do just that? I wonder how many amazing apps could be produced for the blind, by the blind, on iOS, should the development environment be more usable and accessible.
Emacspeak wasn't too hard to get on the Mac, it just took a while: get Homebrew, get the latest Emacs, git clone http://www.github.com/tvraman/emacspeak, run make config and make emacspeak, and load-file emacspeak-setup.el in your ~/.emacs.el file.
Of course, I've been doing that for a while now, so every time I have to reinstall macOS I install Emacs and Emacspeak soon after. Yes, it is easier on Linux, but not too much more so. I do agree with everything you've said otherwise, though. Even their "learn to code" course isn't accessible enough with VoiceOver, and this even as they've promoted it as accessible for students in blind schools, which to me shows a lot about how disorganized they are on accessibility, or how closed their eyes are. As Microsoft gains even more accessibility, as access becomes a deeper and deeper part of them, they become a much more inviting option for me as well. Shoot, I may even be able to give up Emacs if I do switch to Windows full-time, although I'd need a laptop with a much more substantial battery. Now that Windows has system-wide spell checking and text predictions using much more modern dictionaries and AI than Apple's, I come that much closer to having all I need, although Windows doesn't have an actual dictionary app, nor the ability to display definitions of words on the fly as Apple does. I'm sure, though, that Microsoft will outpace Apple; it's just a matter of time.
While Swift Playgrounds is accessible on iPads, VoiceOver and Xcode on the Mac don't work well together at all. It is not even close to what you can do with Visual Studio and JAWS or NVDA on Windows. Now there is even an extension called Code Talk that makes Visual Studio even more useful. Unless Apple does some serious work on VoiceOver, I don't know how much longer I will be using a Mac.
Note that NVDA is a free screen reader that works extremely well with VS Code, which, by the way, is not accessible on macOS, possibly because of Apple's poor VoiceOver implementation, though I would need to research a little more to be sure.
Even so, if you install NVDA plus an add-on called IndentNav plus VS Code, your experience will be very fluid when programming in a lot of languages. Add to that the ease and fluidity of internet navigation with NVDA and Firefox, way better and more productive than what VO and Safari can offer, and you will see how pleasant it is.
Now we need to wait until ALSA drivers are supported on WSL (Windows Subsystem for Linux) ... and once they are, there will be no more need for Emacspeak for Windows ... one can use the original Emacspeak for Linux on Windows.
As for Apple saying that everyone can code in Xcode, this is ridiculous, because there is a huge difference between being able to code and being able to code accessibly, in an equally productive manner as sighted people, spending the same time.
Also, not everyone is able to have access to devices at the workplace. Sighted people use the simulator ... what about us? Well, as I said several times before, forget it.
Again, the debugger does not put the editor cursor on the line being executed; instead, a visual marker is shown. This means that VO stays silent unless you press yet another command, after pressing the next-statement command, to bring the two into focus together. And this, of course, is not documented either. Chances of newbies succeeding? Well, they are more likely to succeed on Windows ... if we could just use it to develop for iOS, that would be so cool.
Most of the time, no one talks about the screen reader suite from the UK company Dolphin Computer Access called "SuperNova," since the more popular US ones get the recognition.
People in the USA, particularly, are more familiar with "Dolphin Guide," however; both products can be obtained from the same company.
The point I'm making above is that no one likes talking about Dolphin products when it comes to comparing Windows screen readers to "anything."
Lastly, the latest pre-assembled Emacspeak archive for Windows, by the way, is v43.
Feel free to chat with the creator of that, as I haven't played with Emacspeak on anything lately. A new Windows 10 machine is coming to me soon.