As a teacher (informal; I'm talking about teaching coworkers/friends not in a formal institution), I often hear people telling newbies to "learn python" or JavaScript first because it's "easier to grasp the fundamental concepts of programming before dealing with advanced OOP concepts or even low-level concepts like pointers/bitwise operators/etc..."
There are variations on that advice, but I'm sure you've heard it before too. The question is: has this actually ever been evaluated and shown to be true? Do folks who start on a language which does more for the programmer and abstracts more away actually, for reals, have a better learning experience? I've taken the opposite approach and taught several people C instead of Python, and so far it's worked out better than expected.
One of the reasons I ask is that there was a time when none of the newer languages existed, yet the human race still wrote very effective programs. So I can't help but think, "At one time, newbies *had* to learn on assembly language," or C, or Pascal, or punchcards, etc. And apparently it wasn't *so hard* as to block us from progressing.
Keep in mind that my question here is specifically about learning experience. I don't believe how fast someone learns or how effectively someone learns has much to do with how good of an actual career programmer they are, so we are not addressing "which way makes x person a better programmer in the long run." We are also not addressing "Which language is best?" or "Are higher level languages better than lower-level languages?"
I had to explicitly state the above because I could see this topic accidentally heading in those directions. Also note that, in this context "more abstract" means "languages which do more behind the scenes for the programmer without them having to worry about it"; those which come with built-in data structures, garbage collection, and other "automatic" features.
I think a good question to address here is "Does programming fundamental data structures/algorithms and dealing with manual memory management teach some part of programming which cannot as easily be learned when only coding on more abstracted/macro level systems?" For example, does coding a hashtable from scratch exercise the brain more than implementing an OAuth API?
Like you, I'm "teaching" my coworkers (primarily apprentices), and I'm hitting the same spot. With 10+ years of experience in strongly typed languages, and as much as I love JavaScript and its style, I would not recommend JS for teaching fundamental knowledge. Maybe for showcasing, since "if" or "while" are quite common across languages.
But (and I admit I don't know Python that well) JS has its sometimes rough edges (e.g. the "this" context), which become more of a disadvantage for the trainee. In school, Java is often used (at least in the ones my apprentices attend) for its low-effort approach (no heavy IDE necessary) and because it is a very widely used and accessible language.
But I think C would also be a bad choice. It's far less accessible and has more than enough rough edges of its own. To be fair, back in the day there was far less choice of language than there is today.
But what I also notice is that manual memory management is becoming a niche over time. Don't get me wrong, I'm not saying it will go away; there will always be a low-level layer to program against. But measured against the total number of developers and their different professional "targets," it will become less common. As a comparison (please don't hit me :) ), take the automatic transmission: 20-30 years ago it was less efficient than a human; today it's the opposite. I think memory management at the application level is on that road too. We're not quite there yet (Electron, anyone?), but we'll get there, maybe in the next 10 years.
So in the end, what's the right choice? I have no absolute answer. It also depends on the environment the trainees are in. In my case JS is mandatory but not the only language, so I try to teach as language-neutrally as possible and also let my trainees do things in other languages. They certainly have to be flexible.
It isn't necessary to learn a programming language to grasp abstract concepts. Languages simply help exercise the type of thinking required.
I compare it a lot to manufacturing, which is a complex process with many components that work together. I don't need to learn code to understand the fundamentals of building a car -- I just need good docs. It'd help to know how or why we use pistons, but as long as I put them in the right place, we're good.
Programming methodologies and patterns like OOP are extensions of logical thinking. How do I handle this data? And how do I effectively organize and architect my process to ensure minimal effort and best results? When you learn how to think critically and abstractly at the same time, you start to fall into the same patterns everyone else falls into.
It's as you said: we coded through most of our existence without C or Pascal. The process of human innovation isn't unique; anyone who's had a great idea knows that great minds have thought, and will think, alike.
If I want to expand my programming knowledge or create bigger+better apps, I'll dig past my framework/language and see how it's inhibiting the process. That often requires me digging all the way to the V8 engine, learning about HTTP/2, or dropping a language entirely for another because of the holes you find (PHP anyone?).
TLDR: Learning how to make a hashtable is great and exercises the right kind of thinking, but I could glean similar concepts off more practical code (like OAuth APIs) if I cared to dig a little deeper.
I'm a junior developer, so I've no experience of mentoring, but I was a school physics teacher before starting this career. In my teacher training, I learned that the three questions anyone planning a lesson should ask are:
It seems to me like neither 1) nor 2) is completely clear here, but let's assume the answer to 2) is "Nothing about programming at all."
What do you want them to learn?
My guess is that you want them to become productive members of your team, but what does that mean? I figure a software developer has to know about the following things:
In addition to these, you'll also want them to develop the skills of analysing and solving problems.
Now, how does your choice of language affect any of these? Most of these concepts are either language-agnostic (a binary tree is a binary tree in any language) or their implementation is language-specific without being transferable (is understanding Maven useful if you're using npm? What about if you're vendoring a Ruby gem?). The upshot is that if there's one language you particularly work with, it obviously makes sense to start with that. Focus on one idiom, one ecosystem, one API, and teach them how to write working and maintainable code.
But as you're asking this, I guess that doesn't describe your situation. So what then? I'm not sure I'd describe choosing between say, Python and C as a better or worse learning experience, but I'd say there are tradeoffs to be made.
A statically typed, compiled language will teach you computer science concepts that a dynamically typed, interpreted language will not. I'm pretty certain that discussions about pass by reference or pass by value, for example, would be completely opaque to me if I hadn't studied C(++) when I was first learning to program, back when I studied physics and just needed the computer to give me a damn number lots of times so I could draw a graph. Similarly, such a language makes you think about your code's journey from your text editor to an executable. I never wrote C programs long-running enough to really worry about freeing memory, as the scripts I wrote finished so quickly that the OS would clean up after me, but these concepts have been useful.
By contrast, progress will in other respects be slower in these languages: partly because you're learning computer science as well; partly because all those variable declarations put an extra strain on your working memory until they become second nature; and partly because whacking code into the interpreter and hitting enter gives much more rapid feedback than compiling, linking, and running. (The importance of this feedback in the learning process shouldn't be underestimated, and a learner's working memory should be allocated more sparingly than RAM. On a related note, regarding your observations about learning on punchcards: I once wondered how much difference the ubiquity of soap dispensers or kitchen disinfectants actually makes to human health, given that we evolved just fine without them over a few hundred thousand years. My brother, who's a doctor, observed that life expectancy back in 1800 was 32. These advances matter.)
You can push these arguments as far as you like. I've often thought that C-style syntax is so ubiquitous you'll make up for any lost time as soon as you learn a second language: of the top 7 TIOBE languages, Python is the odd one out in terms of syntax (not that I know PHP). However, I've learned a huge amount from working on real live production code. It's one thing to read some guy in a book telling you to keep your functions short and DRY; it's quite another to actually be faced with a 200-line method and identify the 5 places you need to change it. If Python (or Ruby) means you can actually read and modify real, non-trivial production code early in your career without causing a memory leak, there's probably a lot to be said for that.
The other aspect of a language to think about is paradigms. JavaScript has C syntax and the browser console, but in spite of everything I've said so far I'd advise against learning it as a first language: not because of its dependence on global variables, or its ever-expanding ecosystem, but because Eric Elliott estimates that 99% of JS developers don't know what they are doing. Prototypal inheritance, first-class functions, its approach to asynchronous programming... all of these make JS a wildly different beast from most other commonly used languages, and it's best treated as such. When I first decided to make the career switch to programming, a friend advised me to learn Java: partly for the jobs market, and partly because you learn to write modular code (everything is a class, and one class per file).
Ultimately, no single language is going to teach you everything: C will teach you pointers, but not functional programming. Your "learning experience" is incomplete if it's limited to any one language, and the optimal route through them probably depends both on the person learning and on the goal they're working towards.