Learning to program has gotten significantly easier in recent years, but one could also say it's gotten quite a bit harder as well. Hear me out.
In recent years, getting into software development (web development, to be precise) has gotten so much easier. Now anyone can become a web dev in less than 3 months, compared to back then. But yeah, when I say becoming a web dev in less than 3 months, I am of course referring to just getting to play around with HTML, CSS and a wee bit of JavaScript, then maybe Node.js if you're looking to go full-stack.
This is basically how it is presented to a lot of newbies and aspiring web devs out there. Yeah, there are hundreds of thousands of tutorials and classes that could help you become one. In fact, a shortage of resources to learn from is a nonexistent issue nowadays, not to mention the insane number of "helper tools" out there promising to make your work over 70x easier.
Let's talk a little about web development in the very early 2000s. Web development could be said to have been harder then; mostly only computer gurus and computer science students could easily develop a website. This was because the web was still relatively new, the layman had little to no idea of how it actually worked, and its learning resources were mostly shared only at college level. Learning web development then might have seemed extremely technical, but let's take a brief look at what a web developer would have been required to learn to build a fully functional web application, taking for example the popular eBay website of the time:
- An understanding of HTML, CSS and JavaScript would be expected, as those are the pillars of the web.
- Knowledge of how to set up and configure web servers.
- Basic server-side scripting with PHP.
- CSV (comma-separated values) files as a database.
And of course, the result would look like the early eBay homepage did: obviously not so appealing. Recent web development has introduced a ton of better design principles, more dynamic interactions and better security for web applications. Along with these new introductions came a ton of new tools and techniques required to achieve them. So let's take a look at what it would take to create that same eBay web application in more recent years. You would basically have to learn about (a small code sketch follows the list):
- The same HTML, CSS, and JavaScript
- Responsive web design
- Frontend JavaScript frameworks like React
- Asynchronous programming
- Database management systems (DBMS)
- A server-side JavaScript runtime like Node.js
- Server-side JavaScript frameworks like Express.js
- API integrations
- WebSockets
- Version control systems like Git
- CI/CD
- Testing frameworks
- User authentication and authorization tools like OAuth
- Cloud services like AWS or Microsoft Azure
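To make the contrast concrete, here is a minimal sketch of just one slice of that modern stack: a tiny Express.js endpoint of the kind a React frontend would fetch from. The route, port and data are made up for illustration; a real application would still need the database, authentication, tests, CI/CD and cloud deployment from the list above.

```js
// Minimal Express.js API sketch (assumes Node.js and `npm install express`).
const express = require('express');

const app = express();

// Hypothetical in-memory data standing in for a real DBMS.
const items = [
  { id: 1, name: 'Vintage camera', price: 120 },
  { id: 2, name: 'Mechanical keyboard', price: 80 },
];

// JSON endpoint a frontend could call with fetch('/api/items').
app.get('/api/items', (req, res) => {
  res.json(items);
});

app.listen(3000, () => {
  console.log('API listening on http://localhost:3000');
});
```

And that is just the server's "hello world"; everything else on the list still sits on top of it.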
Sheesh, I personally find all these quite overwhelming.
Well, we all know that these tools and techniques exist to create more dynamic web applications, offer a higher level of abstraction over the original development technicalities, and overall make the development process a lot easier and faster. All of this is great, but now let us look at it from the beginner's point of view.
Knowing you have to learn all these things to build a web application might not even be the main issue. But with new tools, frameworks, practices, techniques, libraries and the rest popping up left, right, front, back and center every single day, it's getting quite hard to keep track, even for the experienced developer.
Someone just getting into web development might go ahead and learn the three pillars (HTML, CSS, JavaScript); after that comes the confusion about everything else.
Having to figure out what you actually need to learn, and to pick one tool or framework and stick with it, might actually be the most challenging aspect of it all. With the vast number of tools, frameworks and development techniques out there, coupled with an abundance of learning resources, filtering out the information you actually need and what is actually necessary to learn is the most crucial part of it all. This is also one of the reasons a lot of newbies get stuck in tutorial hell: they try to consume information about the million and one tools and techniques out there, and later realize they have actually not learned anything in the process at all.
While learning a particular tool, for example, they might come across the new and latest framework that is taking the development world by storm and divert to go learn that, to make sure they are learning the latest thing. But along the way, another cool, shiny new tool or technique comes out, and now they feel they have to learn that as well; it's just never-ending. This makes the process of learning to program nowadays much more taxing than before.
The field of software development is ever changing and ever growing, keeping developers on their toes with new trends and adoptions. The question now is: "Are we overcomplicating the development process with all of these new adoptions and supposed helper tools? And when exactly does it all become unnecessarily too much?"
What do you think? Share your thoughts in the comment section below.
Top comments (18)
I started programming about 50 years ago, in 8-bit assembler. No operating system or network in those days, and everything was hand-built. Was it easier? Well, yes and no. What we were doing was building the foundations most of today's world rests on. By comparison, the work of today seems more like jamming Lego blocks together than real programming. Everything seems designed to prevent mistakes and avoid the need for any truly creative input, but that takes all the fun out of it. To me it feels like a straitjacket, so I'm glad I'm safely into retirement and able to do things the way I want rather than having to slavishly follow endless, forever-changing rules and procedures.
I've used the "Lego blocks" analogy before, too. I feel like it holds up, under certain conditions. "Lego blocks" programmers generally build things with rough edges (which you may or may not see at first glance). The analogy works in that a Lego model looks similar to a McLaren or a Ferrari, but it's just a rough approximation thereof. The more refined Lego gets with its parts over time, the better the car looks.
Lego blocks programmers are making approximations of websites. I bet that the really good sites, under the hood (to continue with more car analogies), are less Lego and more custom parts holding everything together.
Maybe that's the essential difference between programming and development. The latter is less technological and more pragmatic, following economic considerations like the 80/20 principle instead of overengineering and perfectionism where they no longer make sense. (I understand your point that micro-optimization does make sense in low-level programming.) All of this leaves more room for another kind of creativity, on a visual and aesthetic level for example.
Extending the car discussion, the Lego block analogy feels a bit off. Modern cars don't look edgy and pixelated when you zoom in, but they aren't hand-crafted like a Ford Model T either; they're put together using a lot of standard components built by third parties.
That's the problem with analogies - they are never exact. If they were they wouldn't be analogies :-)
Yeah, you just gave it the most accurate description: "jamming Lego blocks together". These days it's just about knowing how to use a specific tool perfectly in context (Lego blocks). Although learning to program in those days might have required a much higher level of technical understanding, I would imagine there wasn't really any confusion about what you needed to learn, compared to today.
Thanks for that. I'm conscious of how easy it would be for me to sound complacent from my safe, lofty perch in retirement, and I have no intention of dissing the undoubted talents of the young programmers of today. But then I was never really a programmer; I started as an engineer and just drifted into coding with the advent of the microprocessor. To me, a programmer is someone who follows programs and knows how to persuade a computer to do the same, whereas an engineer is somebody who makes things work, if necessary by breaking virtually every rule in the book save one: Occam's razor, which is to look for simplicity wherever possible. My approach to coding works for me but it irritates most programmers, who see me as an unpredictable iconoclast. From which you may safely conclude that my career had its ups and downs.
I understand your point here, but it's the same as saying that the washer/dryer takes the fun out of cleaning clothes, which should be analog work done by hand. I couldn't have worked on such tough tasks like binaries and building stuff the way you did, because I am only 34 years old and I can only work the way things work today. So I am really glad (not ironically) that super smart people like you came before to help me achieve my dream of working in engineering. The Lego blocks were heavy and huge before, but now they are spread out into a million pieces, almost a point cloud. So my goal is to become as smart as you are and help the newbies understand those new small Lego blocks and create something helpful to make life easier.
I think it's true that as well as the nature of programming having changed, so have the kinds of people who go into it. In the "old days", people, including myself, were hackers (in the positive sense of the word) because the rules were yet to be written. Such people still exist, but they don't cope well with the discipline required for mainstream programming. The introduction of modern techniques has opened up the field to a wider cohort such as yourself, with a broader range of skills. The price to pay for this is the need to be multidisciplinary, covering languages, frameworks, testing, build tools and more. All but the first of these were almost unknown in the early days of the microprocessor, so we were able to focus on coding. The systems we built were simpler than those that followed, so we also had to look after ergonomics (how well the product fits consumer needs) rather than that being decided elsewhere.
I'm happy to remain a fossil and let the real work be done by younger and better-adapted people. There's still plenty of space for hacking; although it generally doesn't pay well it's as much fun as it ever was.
With all the information and all the stacks around, it's easy to think that you need to learn them all before entering the field, but that's not the case.
With a plan, a newbie will be good after HTML, CSS and JS (Fetch, Promises, design patterns), any one frontend framework, and Git. Then get an entry-level job and let that guide you. After that, with those foundations, you just need to read the documentation to pick up a new tool, because if we try to cover the entire current web development stack, we will be lost.
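As a rough illustration of the JS foundations meant here (Fetch and Promises), a minimal sketch against a made-up endpoint:

```js
// Minimal sketch of the Fetch + Promises foundation mentioned above.
// The URL is made up; any JSON API would work the same way.
async function loadItems() {
  const response = await fetch('https://example.com/api/items');
  if (!response.ok) {
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json(); // resolves to the parsed JSON body
}

// async/await is just syntax over the same Promise chain:
loadItems()
  .then(items => console.log(items))
  .catch(err => console.error('Could not load items:', err));
```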
Well yeah, you would think so, but when it comes to actually getting the job with those skills alone, it takes a bit of luck. Just look at the laundry list of requirements on a typical junior-level job posting nowadays.
Yeah, but what that job description really says is:
"We need someone who's used bootstrap with tailwind before to make a website go live. We also heard React's what all the cool kids use nowadays, so please have that skill too. Oh and use github so you don't lose our code."
As an applicant, I'd only harp on that skillset, since that's all they really need. The rest is nonsense fluff written by someone who doesn't understand what their website is and/or does.
Point! Thanks
Looking back on my experience as a web developer since the 1990s, I can only acknowledge sahra's perspective. While the web is still built on the same simple and robust foundation technologies, both the local development stack and the typical libraries and frameworks have changed a lot. Everything has become more complex and there are more alternatives, which is a good thing, but it can be overwhelming, not only for junior developers but also for companies and their recruiting departments.
This is a question that we should ask ourselves before and while starting a new project. Sometimes the best choice is to go for a framework / library / stack that has been popular for several years, is still actively developed, and makes it easy to find developers and start working together quickly in new teams. Sometimes it isn't.
Personal projects are a good use case for experimenting. I built my personal portfolio site using 11ty, and I might give ClassicPress a chance as an alternative to WordPress for a small side project. But I might still choose React or Vue for a corporate project where I will have to onboard and collaborate with developers I have never worked with before.
Part of the problem is that it is so easy to reinvent the wheel today. Things would be much easier if people tried to use the standards we already have. But everybody needs to introduce a new language along with their new tool. As much as I like Svelte, they needed to introduce, among a pile of other things, #if and #each. So you find yourself in code that uses JavaScript, but a few lines later you need a different syntax to do the same thing Svelte-style. JavaScript already knows more than enough ways to write a loop, like for, for...in, for...of, while, .map, .forEach, .filter, .reduce, ... just to name some.
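To illustrate (with a hypothetical item list, not code from this thread), the same loop written twice, once in plain JavaScript and once in Svelte's template syntax:

```js
// Plain JavaScript: one of the language's many existing loop forms.
const items = [{ name: 'one' }, { name: 'two' }];
const listHtml = items.map(item => `<li>${item.name}</li>`).join('');

// The equivalent inside a Svelte template needs its own #each block:
// {#each items as item}
//   <li>{item.name}</li>
// {/each}
```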
For me, this feels much like the situation we had in the '80s, when every programmer tried to build a user interface without any standards (and often without good knowledge of how to build interfaces). I once used a program in which we counted 7 different ways to input values into an edit field, and the user had to remember which page used which method. It was kind of a "graphical user interface", but you had to navigate using CTRL codes. Each page used different codes, so it was virtually impossible to use the program, as you had to look in the manuals to find the right code.
This is pretty much what we see today: every little toolbox introduces a new set of commands, regardless of whether it is a big framework or simply a small add-on.
I feel the same overwhelming feeling of being behind. It's so annoying that I decided to stick with the React and Next.js stack and not even bother about other things. I can't handle more!
Honestly, shutting your mind off from all the rave out there and just picking the things that are actually necessary to learn for the job you want to do is essential.
Software development is not the same as web development. IMHO, HTML, CSS and JS are a terrible way to learn to program. Learn a more structured language first (e.g. Python, Ruby, Java), which will give you the basis of a solid foundation for a career. I once met someone in their 50s who admitted that they had 'just finally understood software' despite doing it for 30-odd years (because they just went C, SQL, Perl/CGI, PHP, JS, etc. and never landed on something that grounded them). That person actually found it in Ruby.
The omnipresent, yet utterly wrong, idea that coding has become easier!
While in truth the exact opposite is the reality!
As your cartoon suggested, the given examples are just the tip of the iceberg!
Coding has NEVER before been as hard and counterintuitive as it is nowadays, and it's getting worse and worse!
40 years ago, you could easily understand a given piece of code without documentation, and start learning just from that! Even assembler was WAY more intuitive than all the cluttered syntax shenanigans with dozens of ',' or ''' or '.' or ':' and whatever!
10 PRINT "HELLO WORLD!"
(Setting up the IDE for coding this required nothing more than flicking the power switch, btw, and it would take anyone (even someone seeing a computer for the first time!) about 30 seconds max, from the decision to start coding until having the code ready to run, while nowadays setting up most IDEs is a science of its own!)
Falsify me if you can!
I even coded a very simple 3D 'engine' just spontaneously by trial and error back then, as an eight-year-old kid without any teacher, mentor or internet explaining anything.
I wouldn't even remotely know how to start trying this in any (supposedly!) 'modern' language, despite trying to get into the all-samish Java/C(++/#) and-so-on syntax dozens of times!
My claim is: the whole programming scene's meta has moved in the worst possible direction! It's 100% optimized towards enterprise needs, and has thus moved away from programmers' needs. For example, things like OO have become the default, while they ARE utterly terrible for 99% of applications. All the performance gains computers have made are completely wasted, with nearly nothing actually contributing to improving coding instead of making it more 'corporate friendly'.
Even BASIC would have been a MUCH better choice than any other language that has been 'invented' in at least the last 20 years, and by far! Because there are more than enough resources, and it would be compiled and auto-optimized anyway.
The time is more than ripe for coding to finally get the modernization it has been screaming for for at least 20 years now! And finally, with AI, there's salvation!
It will free humanity from all the silly and irrelevant stuff, like headers, definitions, declarations, methods and whatever, and enable coding just the part that matters!
So: How can you even state such obviously and utterly wrong ideas?
Name me any modern coding language and I'll say: it's reinventing the wheel, just worse!