Do you think the barrier to entry into programming today is much lower and less complex than it used to be in 2000? My father is a programmer from an era when programs ran on 8MB of RAM, and he keeps sharing his stories with me.
I started programming in an era where personal computers had only 640kB of RAM. That's right, kB, not MB.
Today, it's easier to get something done because there are so many solutions and libraries available on the internet from which you can copy-paste. In those days, you had to _know_ the language or the tool very thoroughly and come up with your own solutions.
OTOH, in those days, programming languages were simpler. JavaScript, for instance, is a lot harder to learn and master (think closures, promises, ES6 classes, etc.) compared to, say, C.
So, I would say it cuts both ways. Today, it's harder to learn the tools, but easier to get things done. In those days, it was easy to learn the tools but you were on your own, without much help from the internet.
Definitely! Everything that was available in 2000 is still available now; the difference is 17 years of additional progress!
Even if your goal is to learn some pre-internet technology like FORTRAN, there are going to be more resources available now than in 2000: more tutorials, more source code to read, and the tools you use to write, edit, and run code have improved as well.
In the meantime, we have also developed new technologies and improved versions of what we used to use, so there are plenty of new ideas and improvements that make everything easier as well.
RAM and resources are one thing, but knowledge accessibility has also evolved by orders of magnitude since 2000. I remember the pre-Google days: offices were the only places with always-on internet access (i.e. it was much more difficult to learn on your own in your spare time), and there were far fewer communities, with much less structured knowledge sharing.
Back then (around '97 to 2000), we subscribed to plain-text mailing lists to 'enter' a community. Incomplete or poor software documentation wasn't filled in by community-led efforts in a structured fashion, because the tools weren't there yet.
On the other hand, though, for better or worse, things evolved fast back then, but not as fast as today. I mean, I cannot imagine stepping away from today's world for 6 months without anticipating some serious catching-up when I come back. Back then, fewer things happened in 6 months regarding languages, frameworks, security, or potential solutions to leverage in your architecture.
All this to say: it was much more difficult to learn and become an expert back then, but once you acquired that knowledge, it didn't become deprecated as fast as it does today!
It is. There are many more resources and much more knowledge to build on top of. One of the initial challenges in a field like this is simply accessing information. Additionally, more languages now exist that are designed to abstract away complex development. For example, Processing is a language built on Java, designed as a simpler entry point for non-programmers.
Sébastien Portebois
Software architect at Ubisoft
Nathaniel Ng
I graduated from university in 2000. At that time, I had done a bit of coding in QuickBasic/Visual Basic, C/[Visual] C++, Fortran, and Matlab. There was a lot less to learn then (picking up the basics of C, Fortran, Matlab, or Basic could be done in a weekend, and a reasonable proficiency could be achieved within a month or so). Documentation, however, was really sub-par, and there was no Stack Overflow (and finding an answer on the web via search engines like AltaVista / Infoseek / Yahoo wasn't easy). So while the basics were easy (e.g. code a 100 line program in C/Fortran/Basic), much of the advanced stuff was hard--for example, trying to figure out some user-interface thing in Visual C++ via the Microsoft Foundation Class (MFC) documentation was a real pain for me. I had a lot of trouble figuring out how to use libjpeg to read a JPEG file (I never succeeded).
Now it's 2017. I still code in C/C++ for number crunching stuff, but I've moved to Python for all the high level stuff. It's now a lot easier to do certain things (use Python to read image files, make plots with Matplotlib, interface with SQL/NoSQL databases, or do statistics/machine learning). Building a basic GUI or web interface is generally easier. There's Stack Overflow, plenty of online MOOCs, and countless other places to go on the web for information. On the other hand, there's a lot more to master, and so many competing languages and frameworks/libraries to choose from. Even for a simple language like C, knowing how to optimise it can be difficult to really master (think cache optimizations, #pragma simd, hybrid MPI+OpenMP, or compilation/optimisation for NVIDIA's GPUs or Intel's Xeon Phi). And for any language, if you want to do things properly, unit tests and version control are often standard practice (unlike in 2000).
So I think to become an "expert" in some particular area, it's probably not very different from 2000. On the other hand, if you just "want to get something done", or you want to do a "minimum viable product", for many things, it's going to be considerably easier now. 1000 lines of code will get you a lot farther in 2017 compared to 2000.