Considering a Mac, maybe

By Josh Kennedy, 24 July, 2025

Forum
macOS and Mac Apps

I had a Mac 15 years ago, back in 2010. At the time I quickly went back to Windows 7 because of all the interacting and stopping interacting I had to do with VoiceOver on the Mac. It was like using NVDA object navigation or the JAWS touch cursor, constantly. Is it still the same today? Would I have to constantly interact with stuff, going down several levels just to get at what I want? Or has VoiceOver improved so that it is more like Windows or Orca for Linux, where you mostly navigate using something like the PC cursor?


Comments

By Josh Kennedy on Thursday, July 24, 2025 - 19:01

Also, is it faster to use a Mac with local offline models and koboldcpp? As in, do the AI models respond faster on the Mac's hardware than on an HP Windows computer with an AMD Ryzen 5 CPU and 32GB of RAM?

By Voracious P. Brain on Thursday, July 24, 2025 - 19:04

You don't have to dig deep on the Mac forum to find a zillion "Mac or PC" threads you would want to examine. No, VoiceOver hasn't changed all that much, though saying that its model is "worse" will get a lot of pushback. Pluses and minuses. If it's just a small-footprint desktop you're after, you can get cheap mini PCs on Amazon. I have one sitting on top of my Mac Mini, in fact, and that's what I use. My Mini stays turned off 95% of the time now. It's also been buggy since I bought it in January, which I did mainly because they're cute and I was able to get a good trade-in price for a useless iPad. The only really nice thing I can say about the Mini over my AMD mini PC is the built-in speaker. On the other hand, the $15 USB speaker attached to the PC is better.

By Josh Kennedy on Thursday, July 24, 2025 - 19:34

OK, so in other words, if I was disappointed with the Mac's user interface and the Mac's version of VoiceOver back in 2010, I will probably still be disappointed with the Mac's version of VoiceOver today in 2025? I ask because I do like my iPhone and I like how VoiceOver works on iPhone and iPad, and I was kind of hoping that VoiceOver on the Mac had improved so that I no longer had to do all that interacting and stopping interacting and going up and down the object hierarchy, so to speak. It seems to me the Mac's version of VoiceOver is complex, while VoiceOver on iPhone and iPad is more user friendly.

By jim pickens on Thursday, July 24, 2025 - 19:40

I'm unsure why this isn't better known in the community, but interaction isn't a requirement. It's helpful in some apps and very much unhelpful in others, so you can either create Activities that disable interaction in certain apps or, as I did, set it to be disabled by default unless you invoke a specific Activity. The setting is in the Navigation pane of VoiceOver Utility.

By Khomus on Thursday, July 24, 2025 - 20:10

Yeah, what would be useful is to give examples of where you had to do this. I know that might be hard because it was so long ago. But people just assert things like: oh, you have to interact with everything (not true); oh, you have to hit fifty trillion keys at once (set your VO modifier to Caps Lock like a sensible human being, which it is by default, and you'll really cut down on that).

The thing is, if you're really married to the Windows way of doing things, stick to Windows. In my opinion, a *lot* of the problems people have with the Mac come from trying to do things the same old way. Take quick nav, for example. It works the same on both Mac and iOS. So you turn it on, and you can just hit arrow keys. Great! It's like Windows!

Then people complain that, oh, it does this weird thing and that weird thing and it doesn't work here, or... I only use it in limited circumstances. Otherwise, it's Caps Lock and the arrows for me. I knew that was the basic VO gesture for navigating, so I just started doing it when I switched, and it works everywhere. So sure, other people will be all, "OMG yes, you need to submit twelve thousand bug reports a week or don't even bother, because this needs fixing!"

Meanwhile, I'm just using VO plus arrows 99% of the time, getting on with my life, and doing things on the Mac just fine. For some history, I switched back in late September or early October, and I haven't used Windows since. Before that I had used Windows since Windows 95, with JAWS for quite a while. I switched to NVDA exclusively in maybe 2015; I think my last authorized JAWS version was 14 or so. I'm excluding the Apple IIe and DOS because they're not really relevant, but I used those too, since the mid-80s.

Will you like the Mac? I have no idea. I'm not saying you should get one. It's entirely possible that VO will still drive you nuts in exactly the ways it did however many years ago. All I'm saying is that I've used computers pretty extensively, I've been doing nothing but Apple for nearly a year now, and I'm able to do the stuff I need to do just fine. But my use cases aren't your use cases. I'm just counteracting the posts that have already started saying you shouldn't touch one ever because it will set your house on fire and you'll die, that's how bad the bugs are.

P.S.

Thanks to this place, my VO welcome message is "You will never be productive again"!

By Josh Kennedy on Thursday, July 24, 2025 - 21:00

Is it faster to get output from local offline AI language models on a modern Mac? I would run the models using koboldcpp.

By Daniel Angus MacDonald on Thursday, July 24, 2025 - 21:31

There's a podcast here on the site which demonstrates the other options if interacting is undesirable. But really, set VoiceOver to ignore groups and, poof, it's just like Windows in that sense. People complained about how QuickNav worked in the early days of Sonoma, and Apple changed it back. Personally, I think the Sonoma behaviour made much more sense and should have been kept, with the distinction between arrow-key QuickNav and single-key QuickNav like on iOS; it brought things more in line with Windows screen readers. Also, on macOS only at this point, there is an option to have VoiceOver speak text to the right of the cursor. That should be brought to iOS and iPadOS as an option.

By Maldalain on Friday, July 25, 2025 - 04:09

Can you link to that podcast, or could you please mention a search phrase for it?

By João Santos on Friday, July 25, 2025 - 08:34

I tried the 22 billion parameter quantized 12.7GB model mentioned on the koboldcpp GitHub page on my 2025 Mac Studio. In LM Studio it processed and generated natural language text at roughly 30 tokens per second, while in koboldcpp it only managed roughly 22 tokens per second, so LM Studio might be a better option, as it is known to support Apple's optimized MLX framework. I also tried running it on my 2024 M4 iMac with just koboldcpp and it didn't even load due to insufficient memory, which is to be expected given the sheer size of the model.

Next I tried the 7 billion parameter quantized 4.14GB model, also mentioned on the koboldcpp GitHub page. While the Mac Studio processed and generated content at 25 tokens per second (86 tokens per second in LM Studio), the iMac could only manage 16 tokens per second (I did not test LM Studio there). I can also try this model on my 2020 MacBook Air if you wish.
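For anyone curious, the Terminal invocation is roughly along these lines. Treat this as an illustrative sketch rather than my exact commands: the model filename is a placeholder and the flag values are ones I believe koboldcpp accepts, so check its --help output on your build before relying on them.

# assuming a koboldcpp source checkout built with Metal support,
# pointed at a GGUF model file already downloaded locally
python3 koboldcpp.py --model ./some-model.gguf --contextsize 8192 --gpulayers 99 --port 5001

That starts a local web interface, by default at http://localhost:5001, where you can type prompts; the statistics lines quoted below are what koboldcpp prints in the terminal after each generation.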

Koboldcpp results

128GB 40 core GPU M4 Max Mac Studio with the 22 billion parameter quantized model

My question: How much wood would a woodchuck chuck if a woodchuck could chuck wood?

LLM answer: This is a classic tongue twister, but it's not meant to have a factual answer. It's just for fun and practice in pronouncing tricky words!

Statistics: CtxLimit:59/8192, Amt:38/512, Init:0.00s, Process:0.48s (22.9ms/T = 43.66T/s), Generate:1.24s (32.7ms/T = 30.57T/s), Total:1.72s (22.04T/s)

128GB 40 core GPU M4 Max Mac Studio with the 7 billion parameter quantized model

My question: How much wood would a woodchuck chuck if a woodchuck could chuck wood?

LLM answer: As much as he wants!

Statistics: CtxLimit:38/8192, Amt:7/512, Init:0.00s, Process:0.21s (6.7ms/T = 148.33T/s), Generate:0.07s (10.1ms/T = 98.59T/s), Total:0.28s (25.00T/s)

16GB 8 GPU core M4 iMac with the 7 billion parameter quantized model

My question: How much wood would a woodchuck chuck if a woodchuck could chuck wood?

LLM answer: A woodchuck would chuck as much wood as he could. (or she, if you're being gender-inclusive.)

Statistics: CtxLimit:59/8192, Amt:28/512, Init:0.00s, Process:0.56s (18.1ms/T = 55.26T/s), Generate:1.18s (42.0ms/T = 23.79T/s), Total:1.74s (16.11T/s)


Edited to add the LM Studio token rate for the Mac Studio, which performed over three times faster than koboldcpp with the smaller model, and to remove the irrelevant local timestamps from the statistics. I also tried to post the console commands for running these models with koboldcpp as a command-line utility in Terminal, without having to mess with system-wide privileges, but once again Cloudflare didn't let me post them for some reason, which is why the sketch above is only approximate.

By mr grieves on Friday, July 25, 2025 - 14:04

Just to agree with what was posted above: I have groups turned off by default, and the apps I tend to use all the time work fine that way, to the point where I hardly ever have to interact with anything.

Many first-party Apple apps don't work so well like this, so the only way to use them is to grit your teeth and be prepared to get frustrated. So it probably depends on what apps you use most.

My main apps are PyCharm, Safari, Chrome, Mail, Spotify, Terminal, a few different text editors, and that kind of thing.

Apps that do force interaction are the App Store, Shortcuts, Home, Podcasts, etc. I don't use any of these often, but I absolutely hate the experience when I do.

There are some apps like Finder, Settings, and VoiceOver Utility where you do a little light interacting, but they are fine. In my opinion it's the apps that just say "Collection" or something, where you're supposed to know what that means, that really confuse me.

By Josh Kennedy on Friday, July 25, 2025 - 15:22

Does the Mac version of VoiceOver have screen recognition and image descriptions like VoiceOver on the iPhone does?

By mr grieves on Friday, July 25, 2025 - 16:04

The Mac doesn't have screen recognition. There is an app called VOCR which you can use to navigate some inaccessible things on the screen, which is probably as close as it gets.

For images, pressing VO+Shift+L will give a very basic description. This works in most places but not all apps (e.g. if I have an unlabelled image in PyCharm it won't help me out). I find it quite good for OCR, but not much help otherwise. I tend to look at different images on my Mac than on my phone, so I'm not sure if it is the equivalent. I don't think there is a way to get descriptions automatically like on the phone either; I think it always requires you to ask for them.