Hi, I'm a lawyer and I want to start learning to code as a new hobby. I am a Mac user. JAWS is the worst, so I'd love to never touch a PC again in my life. Are there any programming languages that are particularly good to learn as a Mac user? Are there any that are off limits? Do I have any hope of being a programmer someday? Do any of y'all have tech-related careers?
Comments
Swift
Hello,
I recommend Swift, which is Apple's own programming language.
You can start learning Swift by downloading the Xcode app.
I am not sure about…
I am not sure about recommending Swift as the first language. I'd say, and I know it's weird, go either high level with Python, Java, or the web stack, or go directly to C (not C++). But the most important challenge you have to overcome is tutorial hell :).
I'm also very interested in why…
I'm also very interested in why you dislike JAWS, which I can somewhat agree with, as I'm an NVDA user through and through. Never touching a PC again? I really want to know why. If you want, don't reply here as it's off topic, and don't create another topic either, but take one where we discuss the Mac positively and revive it. It's up to you, of course. Nowadays it's more popular to rant the other way around...
My Views
I've been coding for 28 years, most of them sighted, but these days I do it without any sight. While macOS is my main platform, both because I'm used to the Apple ecosystem as a power user and developer, and because I don't particularly like the bloatware, lack of attention to detail, stance on privacy, and lack of accessible firmware in the PC world, I still don't think that macOS offers the best experience to most people, so you might want to re-evaluate and reconsider your experience with third-party screen readers on Windows in particular.
As for programming languages, in the long run they barely matter. It's a good idea to experiment with the mainstream paradigms, imperative and declarative being the main categories, along with their subcategories like procedural, object-oriented, functional, and logic programming, as well as the static, dynamic, strong, and weak typing disciplines, giving you a solid theoretical foundation. Once you feel comfortable with those paradigms by building things with all of them, getting familiar with mainstream programming patterns, data structures, and algorithms is also fundamental to building a solid engineering foundation. And while you did not ask about this, experience with multidisciplinary applied math is somewhat important. While the average programmer gets by with lots of knowledge gaps, I can tell from personal experience that having a solid foundation matters a lot, since feeling confident in your ability to tackle challenges makes you far less prone to burnout, and makes people feel generally safer with you around, ultimately raising your value as a human resource.
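To give a feel for what those paradigms look like in practice, here is a minimal Swift sketch (the names are illustrative) of the same computation written imperatively, spelling out the steps, and then functionally, describing the result:

    // Imperative style: mutate state step by step.
    var doubledEvens: [Int] = []
    for number in 1...10 {
        if number % 2 == 0 {
            doubledEvens.append(number * 2)
        }
    }

    // Functional style: declare the result as a pipeline of transformations.
    let doubledEvensDeclarative = (1...10)
        .filter { $0 % 2 == 0 }
        .map { $0 * 2 }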
Although I said that programming languages barely matter in the long run, some languages are more pedagogical than others, and there really is no consensus on which languages should be taught first, as you can see for yourself if you compare the curricula of multiple college courses. This is because people have different needs and tastes, so there's no criterion that can be objectively applied to decide what's best for everyone. I started with C in the late 90s, and although I don't think C would be a good introductory language even for someone exactly like me, I do think that learning C should be a rite of passage for software engineers, that learning a Lisp dialect should be a rite of passage for any computer scientist, and that achieving fluency in both within the first 5 years of learning is a rare trait that I regard as a sign of extreme potential (I did not meet this criterion myself, so I'm not bragging or making up criteria biased towards emphasizing my own experience). From a practical perspective, and contrary to earlier replies, I do think that Swift can be a decent introductory language, as it's actually designed with the progressive disclosure principle in mind, and that matters a lot when it comes to cognitive accessibility.
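To illustrate what progressive disclosure means here: a complete Swift program can start as a single statement, with functions and explicit types introduced only when the learner needs them. A minimal sketch:

    // Day one: a complete Swift script is a single statement.
    print("Hello, world!")

    // Later on: functions, parameters, and explicit types appear gradually.
    func greet(name: String) -> String {
        return "Hello, \(name)!"
    }
    print(greet(name: "world"))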
As for what is possible to accomplish as a blind programmer, I can tell from personal experience that there really is no limit, because as a blind jack of all trades I've experimented with highly optimized 3D computer graphics, digital signal processing, psychoacoustics, 3D rigid body physics simulations, video games, bare-metal applications, reverse engineering, specialized neural networks for computer audio and vision, 2D graphical user interfaces, and web applications, all completely from scratch. The next objective in my pipeline is learning to develop digital hardware circuits that can either be deployed on FPGAs or turned into actual integrated circuits, as I have an old dream of cloning the functionality of the Sound Blaster 16, which was my first sound card back in the mid-90s. These are all highly technical fields that few people actually get into even if we only consider the universe of software developers, and most of them also require college-level math.
As for my background and occupation, I am a fully self-taught developer. I dropped out of high school during my senior year at 17 after landing my first job in the industry in January 2000, in the final days of the dot-com bubble, and quit my last job in September 2011 due to the degradation of my vision, with real effect in November 2011 due to contractual obligations with my employer. In 2014 I lost all my useful vision, and only reentered the workforce in November 2024. Due to the huge gap in my resume I had to be upfront about my blindness, and as a result never heard back from any job applications, even during the pandemic when the demand for programmers peaked. I knew that, if given an opportunity, I could easily demonstrate value, and that was exactly what happened when the opportunity to work remotely for a start-up in California finally came last year.
I think that, in order to stand out in the tech field, you either need to be truly passionate or extremely disciplined, as things move very fast and the learning never stops. Fortunately knowledge grows exponentially, so the more you know the longer your learning stride becomes, giving you the ability to absorb as much knowledge in a week as a rookie would absorb in a couple of years. In my experience, beyond a certain point, people assuming that you are a genius becomes a very common occurrence, when in reality you are just applying your experience as a highly trained professional to make the right call based on patterns that you learned to recognize after years of exposure to software engineering problems.
Lastly, the job market for software engineers is in crisis right now: it is shrinking rapidly with no end in sight, and is actually brutal for people trying to get their foot in the door these days. Able seniors can still find jobs, but even for them the situation is nowhere near what it was during the pandemic, so as a blind industry senior, I consider myself quite lucky to have been given the opportunity to reenter the workforce last year. Several factors contribute to the current situation: companies reacting to a perceived shortage of developers during the pandemic by over-hiring, then laying people off to optimize costs during a short period in which big tech lost some market capitalization; lots of people gaining interest in the field for the compensation alone; and finally AI, the big elephant in the room eating all the entry-level meals. I don't think it will take long until state-of-the-art large language models become a threat to me as well.
wow this is awesome.
Thanks everybody. College level math is not a problem and neither is the job market. I love my job now, and was mainly asking about my potential. If I can't be awesome at something, I don't see the point of doing it, which is why I asked about career potential.
Programming today
Programming today seems so much more complicated than when I learned it as a kid, back in the late '70s and early '80s. Everything was text-based CRT displays then. Now it's all GUI IDEs and APIs. It seems like there's so much you would need to know before you write one line of code.
But, if I were going to write code today, I'd use it as an opportunity to beef up my Braille skills, because I can't imagine how I would use a screen reader to read production code. The only coding I do these days is a little bit of HTML tweaking, and it's a real struggle, doing it by screen reader alone.
Good luck. Have fun.
Well, it's essentially the…
Well, it's essentially the abstraction problem, and the exponential learning curve of a dude writing questionable-quality code with Node.js server-side versus the understanding of the good old LAMP stack? ... For the web at least.
Challenges and Commitments
One of the fundamental challenges in software development is maintaining proficiency across multiple programming languages. The technology landscape today covers many languages, each with unique syntax and applications.
During my computer science education, I had the privilege of learning from a former Department of Defense contractor who demonstrated incredible proficiency in over 30 programming languages. His ability to create identical solutions across several languages while maintaining syntactic accuracy was awe-inspiring.
In my experience, the principle of "use it or lose it" applies significantly to programming skills. Without consistent practice, even well-learned concepts can fade from memory. My own experience is a perfect example of this: despite formal education in Python, Java, Scala, C++, assembly language, HTML, SQL, PHP, and CSS, my current confidence is limited primarily to web technologies and Python.
For those pursuing programming expertise, consistent practice is essential. This is particularly important for careers in information technology, where technical interviews and assessments often rigorously evaluate coding proficiency and problem-solving skills.
TLDR:
Practice often and maintain your skills; otherwise it is very easy to forget what you have learned. Unless you have an eidetic memory, of course. :)
Let's not overthink
If you are just looking at this as a hobby, then I wouldn't get too hung up on some of the points made above. I would first ask what you want to get out of it.
Do you want to be able to write apps that run on your Mac or iOS? If so, then Swift is probably the way to go, in this case with Xcode on the Mac.
I used Xcode and Objective-C a number of years ago, but I had sight then, so I don't have experience with doing this blind.
Or do you want to write little utility scripts? Or do you just want to learn solid theory?
I don't think the Mac is necessarily worse than Windows for coding, even if it does have issues. The IntelliJ suite of apps runs well and covers most bases - I use PyCharm, but they have IDEs for JavaScript, Go, Rust, Java, and countless others. And they work better on the Mac than on Windows. But there's always Visual Studio Code, or if you are masochistic you can just use a text editor.
From my own personal point of view, I have been coding since the 80s when I had my BBC B and was writing BASIC. I later got a coding job and have been doing it ever since. I have worked with a number of languages like C#, JavaScript, TypeScript, Java, VB.Net and nowadays Python. I never went to uni.
I was mainly self-taught and wrote fairly bad code. But then I went through a bit of a patch of reading lots of books and learned how to do it properly.
However, for a hobby, writing bad code may be fine depending on what you want from it.
I have never had to learn a language from scratch since going blind, and really the only language I use extensively now is Python. I find punctuation-heavy languages like JavaScript and even HTML quite tricky, as my brain fogs over after the fifteenth open bracket in a row. Which is a shame, as code that was elegant and clean when I had sight is now a horrible mess I struggle to understand. But that's probably a lack of practice, maybe compounded by losing my sight later in life. Still, tools like PyCharm, which can at least point me at my numerous typing errors, do help a bit.
I am able to make a living from coding blind with audio only and I'm not as good as I was with sight, but I get by. I'm sure braille would help but I don't think I will ever get there.
If you are confident in using a screen reader, and can type reasonably well and enjoy using your Mac, then you should be fine.
You guys are so awesome.
I'm loving all of this.
SwiftUI is the best!
So I started programming for Apple platforms in 2010, and made several apps. We've come a long way since then, and we can now make apps very simply and accessibly with SwiftUI. Here's an example.
import SwiftUI

// A minimal SwiftUI view that displays a single line of text.
struct ContentView: View {
    var body: some View {
        Text("Hello World!")
    }
}
That is an entire app.
Of course there is a lot more to it, but that is the start.
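As a taste of that "more", here is a rough sketch of adding interactivity, using a hypothetical counter view:

    import SwiftUI

    struct CounterView: View {
        // @State makes SwiftUI re-render the view whenever the value changes.
        @State private var count = 0

        var body: some View {
            Button("Tapped \(count) times") {
                count += 1
            }
        }
    }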
I believe that anyone can learn to build for Apple platforms, and a native mobile and/or desktop app will be much better than any Electron or web app will ever be.
I taught a course last year called SwiftUI Basics, and it was amazing, and I will be teaching another free course on app development this year starting in September.
You can go to https://techopolisonline.com/courses to find them and sign up.
I would urge everyone to give Swift and SwiftUI a try even as beginners. Swift Playgrounds are great on iPad as well.
Abstraction bait
I think that baiting people into software engineering using abstraction is the wrong approach. I mean, it might even work, but you're attracting people with the wrong kind of mentality. Software is quite complex, and I believe that using a top-down approach to teaching it only attracts people who want things done without actually caring about how, whereas in my opinion what the industry needs most is the exact opposite.
When I wrote my first lines of C back in 1997, one question that immediately popped into my head, after having the need to include headers explained to me, was how printf could seemingly accept any number of arguments, contrary to all the other functions that I had written up to that point. This was my first encounter with magic in coding, which is the term I apply to anything used to teach software engineering at a specific level of skill but whose complexity far exceeds any reasonable expectation of understanding at that level. The answer in that case was variadic functions, an advanced C topic that requires going a level lower to understand implementation details.
SwiftUI suffers from exactly the magic problem that I mentioned above, but elevated to absurdity. The simple example posted to this thread declares a value type that conforms to an insanely magic protocol called View, and inside that value type there's a computed variable of a generic type with just a getter that produces an object of yet another value type conforming to the same View protocol. Not only that, but it completely skips another type conforming to yet another magic protocol, called App, which provides the entry point, scene, and window functionality of the application. The Xcode build process hides quite a lot of important details as well, like the Info.plist file that's not even part of modern templates, how the code is actually built, and how the application is structured in the end. Swift itself is a pretty good language pedagogically speaking, but starting from SwiftUI is one of the worst ways to use it to teach, in my opinion.
The problem with teaching people this way is that they are being instructed to suppress their curiosity and to memorize rather than understand software engineering, so they'll be clueless when they need to solve the issues that they will inevitably run into. In SwiftUI the simplicity is merely superficial, because the framework is far from complete, so people will quickly find themselves needing to tap into AppKit or UIKit to do anything that SwiftUI can't do properly. Rich text editing is the most glaring example of a reasonably expectable basic feature, because the TextEditor SwiftUI view type is only minimally functional, and last time I checked it didn't even work well with VoiceOver. Last time I used SwiftUI on iOS, even things like controlling when the keyboard is shown or hidden, or accessibility details like making announcements and moving the VoiceOver cursor to a specific view, were not possible without UIKit, so in my opinion SwiftUI over-promises and under-delivers.
In my experience, SwiftUI is pretty good for quick layout development, as it's much easier to reason about than auto-layout constraints in AppKit and UIKit, but it falls short when it comes to anything else. The idea is pretty good, and I genuinely like the design of the framework and how Swift itself evolved to make it possible; however, to me its value proposition is conceptually comparable to gold-plated lead. The compiler fails spectacularly to diagnose problems even with just moderately complex views, and sometimes even crashes, making compilation errors a guessing game, especially when the local compiler version differs from whatever is being executed in a project's continuous integration pipeline.
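For reference, the omitted entry point looks roughly like this; a minimal sketch built around the ContentView from the earlier example (the HelloApp name is just illustrative):

    import SwiftUI

    // The piece the earlier example skips: a type conforming to the App
    // protocol supplies the entry point, the scene, and the window.
    @main
    struct HelloApp: App {
        var body: some Scene {
            WindowGroup {
                ContentView()
            }
        }
    }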
Within a code, things are so complicated
A few years back, I learnt Python. Not just hello world or concatenation, but functioning programs like a YouTube downloader with an accessible interface. I played with a lot of libraries, and learnt to read their documentation and use them in my programs.
Then I started reading the NVDA code. Oh my god, it's a code within a code. I thought I could work on simple things, enhancing the UI here and there, but you have to actually learn how the application's existing code is structured and what does what, and NVDA being NVDA, even though the code is nicely commented and explained everywhere, I couldn't even make a single functioning modification to my local copy of NVDA, forget the GitHub repo.
Sad to say I gave up.
But today, these AI tools have made things kind of simple. If you know what you want and how you want it to be done, and I mean at a technical level, you can pretty much start with ChatGPT, tell it to use the libraries you want it to use, and start adding functions prompt by prompt. It might feel slow, but slow and steady gets you through the race. Test everything, and troubleshoot and fix everything before moving on to the next thing. Ask ChatGPT to troubleshoot errors if you can't find the cause.
Speaking of that, has anyone tried GitHub Copilot?
You are right about one thing.
With Integrated Development Environments (IDEs) and LLM availability, coding is stupid easy these days. I kind of wish I had access to all of that back when I was in school learning to code.
Python was one of the languages I learned as well; unfortunately, I have pretty much forgotten everything I learned. Maybe one day I will brush up on my education, if I ever need to learn it again or use it for a job. It's kind of a shame, because I really liked Python over the other languages I learned in school.
On a side note, as much as I loathed learning Java, Scala was kind of neat. It's an offshoot of Java, used for crunching ridiculous numbers, such as the digits of Pi or Euler's constant, for example.