Pretty much everything.
When I started coding as a kid, I had a Commodore 64. IBM PCs were already out there, but they were really scarce on this side of the Iron Curtain. Seen from that perspective, the whole architecture has changed, and a lot of options are now available for a relatively low price.
On the Commodore 64, the language you got out of the box was BASIC. That doesn’t mean you couldn’t access the hardware directly: the POKE command did just that, writing a value straight into a memory address. Now we can choose from a whole range of languages if the goal is to create an OS, and another big range if we want to write applications for such OSes.
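To make that concrete, here is what direct hardware access looked like in C64 BASIC. Address 53280 is the VIC-II chip’s border-color register and 53281 the background-color register (both well-documented addresses):

```basic
10 POKE 53280,0      : REM write 0 (black) into the border-color register
20 POKE 53281,1      : REM write 1 (white) into the background-color register
30 PRINT PEEK(53280) : REM read the register back with PEEK
```

No drivers, no permission checks: any BASIC program could write straight into memory-mapped hardware registers.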
The architecture change is definitely a good thing. The C64 was sluggish: accessing data on cassette, floppy, or even the first hard drives was really slow (yet still faster than reading it from paper, which was the other alternative).
The diversity, both in hardware and programming languages, is not necessarily a good thing, though.
- On the hardware side, Device Driver Hell is a thing; there must be a reason Apple allows you to install their OS only on select hardware (i.e. Macs).
- OS diversity can also be seen as bad: any OS backed by commercial support has an unfair advantage, be it Windows, OS X, or any enterprise-grade Linux. Although such support is good for any OS, a lot of awesome projects are struggling out there without it.
- Programming language diversity is also a real hell nowadays; just look at all the questions asking “which language to learn if I want to do X development”.
- I don’t even dare to mention applications. Does the term “editor wars” ring a bell? And that’s just one kind of app.
On the other hand, diversity is generally a good thing, if you can use it to your advantage. If you can choose the right hardware, the right OS, the right language (and probably the right framework, if that applies to you), your project will probably become a success. That, however, is a really hard task that only true polyglots can pull off, and there are only a few of those out there.
The bottom line is, you have to be really well informed to be a good software engineer. Coding is one thing, and there are a lot of good coders around; planning the whole thing from bottom to top has become really hard.
One good thing: in the old days, a lot of software (most?) was built using the waterfall method. Today we use more agile methods.
One bad thing (though arguably a good one): frameworks and platforms today evolve so fast that you can learn something this month and find it changed or out of favor the next. We have so many choices of languages and frameworks that it can be daunting.
Another thing that has changed: while free software and open source have always been around, they were not very prominent in 1994. After the 2000s, open source really started to take off, and today it's common to see large and popular open-source projects.