In short, why is Java predominant in high schools and colleges? Other languages are equally important these days. What do you guys think?
Heh, I wish I had learned Java when I was at school. We learned some nerdy garbage like Pascal or VB, and I didn't even understand what it was for or how it should be used.
Depends on the school/uni/college and often the specific lecturer; e.g. some teach .NET, some will accept a range of languages. Java has longevity and stability in its favour, and there's consistent demand for it. So it's a sort of "safe bet", which is probably why it's popular.
An ideal course would teach principles and de-emphasise the language used to illustrate them; but reality bites back... it's usually very hard, slow work to get a curriculum updated. Lecturers often just don't have the time to get a range of languages accepted, nor to mark assessments in several different languages. Plus it can be a challenge to prove different students were assessed fairly/equally if they used different languages. Basically, academics have a lot to deal with beyond the pure code considerations.
Ultimately there's also some responsibility on the student - if they want to learn something else, learn it! The industry is full of autodidacts who run rings around people who just trudged along a set curriculum.
At university we did a semester-long VB course and then later a .NET project that spanned a year. I did well in the VB course, but the project was a problem, because we did all the coding exercises and workshops at the beginning and were then left completely on our own, with an exam right at the end. We had many other subjects, so we had to fit the coding, the project, and basically teaching ourselves in around all our other work. The teams were in competition and fellow students mostly kept to themselves, because the projects were entered into the Imagine Cup. It was horrible. Our project did fine, though, but most of my dev experience was gained while working, and afterwards while doing Honours. Teams are great, but you do not always have the luxury of choosing who you work with and what you are doing, and it is extremely difficult and takes a lot of dedication during undergrad to be or become a strong developer, especially if it is just one of your courses.
I think it's for a mixture of reasons.
1) C# and Java are the two enterprise-level languages that a lot of employers are looking for.
2) Learning a concept in one language is almost always going to help you do the same thing in a different language. Concepts are fairly ubiquitous.
3) Learning something you aren't interested in helps teach you perseverance. There are many libraries and tools that are not fun to use in a corporate environment; if you don't have the ability to learn something you don't enjoy (or that isn't pretty), then you are doing yourself a disservice as a programmer.
That's my take on it anyway.
In school we started with C#. Then we made stuff with Arduinos, so C / C++ (I forget every time which it is). After that PHP (I want to forget everything I know about it), and after that a little bit of Java.
A big problem with my class was that out of 15 students only 3 could program, because they were actually interested in it. The problem is that the other 12 students pulled the good students down.
Well, it depends on the school, its vision, and the career you're pursuing. For example, I studied Digital Art and Animation; the first contact I had with programming was Java, and I and every colleague hated it, because we had a very bad teacher and because they put that class together with the ITC students, so we had a really bad time with our first approach. In that class, in theory, we learned the basics and then started using them in other languages; fortunately I had AS3 by my side, and taught myself that to understand everything in a visual way. And when I learned JavaScript, I realised that everything I knew was kind of wrong, at least the Java part.
My experience tells me that teaching Java to first-timers can be a very bad idea, especially if they will not pursue a programming career, which was the case for every colleague of mine except myself. For four and a half years of college I basically did not know enough to do stuff with Java, only AS3 and JavaScript, and I have not touched or done anything with Java since my first year of college.
In high school I did Pascal, and taught myself Delphi so that I could actually build useful stuff with Pascal.
I also taught myself BASIC so that I could understand example code written in BASIC and actually use it in Delphi.
At varsity I had to use C, C++, Matlab, Assembler, Python, VHDL, and some other lesser-known languages. Java we only used in two or three courses, either in third or fourth year, and we didn't need to know much Java to get by (since the courses could have been done in any language). We were never taught any languages; we just got assignments, and getting up to scratch with the language was your own problem. Sometimes the assignments said "use any language", sometimes "use this language" (often not a language you'd worked with before).
First job, I wrote an entire system in PHP, without knowing PHP before I started. Second job, Perl developer; didn't know Perl before I started. Third job, Python developer; I just knew enough Python to get the job. Only four years after working in PHP, Perl, and Python did I work professionally in Java.
So long story short: I was never taught any language other than Pascal, but the principles remained the same throughout every language, so it didn't matter (for me) what language I was taught in school.
At schools, it really depends on what the teachers know. When I was at school, I learned Delphi, PHP, and JavaScript. Other people told me that they learned C#, Java, or Visual Basic.
But if you want to know why Java is such an important language, the answer is quite straightforward: Java is very easy to use (e.g. no pointers, no type-hell, ...) and it was made for enterprise usage. As a programmer, chances are high that you will be employed by an enterprise using Java.
Aside from the enterprise thing, Java is widely used by a lot of people and readily available for nearly every platform you might ever encounter. So by teaching Java, teachers (and later professors at University) give you a good start for anything you might want to do.
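To illustrate the "no pointers" point with my own toy example (names and values here are made up, not from any course): in Java you allocate objects with `new` and never free them yourself; the garbage collector reclaims them once nothing references them, and there's no pointer arithmetic anywhere.

```java
import java.util.ArrayList;
import java.util.List;

public class Greeter {
    // Build greetings for a list of names. The List and the Strings it
    // holds are heap objects; the garbage collector reclaims them
    // automatically once they are no longer referenced.
    static List<String> greet(List<String> names) {
        List<String> out = new ArrayList<>();
        for (String name : names) {
            out.add("Hello, " + name);
        }
        return out;
    }

    public static void main(String[] args) {
        // No malloc/free, no pointer arithmetic, no manual cleanup.
        for (String greeting : greet(List.of("Ada", "Linus"))) {
            System.out.println(greeting);
        }
    }
}
```

Compare that to C, where the same function would need you to decide who owns each string and when to free it; that's the kind of bookkeeping beginners trip over.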
Python is taught too, and JavaScript will get more important through IoT. But you may learn Haskell and C as well; it depends on the university and what you're specialising in.
Java is taught because it supports the currently dominant paradigms and is used in industry as a de facto standard for business applications. Banking systems use Java, and ANTLR4, for example, can be used to teach the basics of building languages, and so on...
Java has a GC and simplifies a lot of problems while still using sophisticated techniques like JIT compilation and inheritance... in the end, as with everything, it's about popularity and taste :)
In the 1990s and up to 2010, it was something of a tradition to start programming with C and C++ and then end the syllabus with Java. No other languages had much of a place in the educational curriculum at most universities and colleges around the world.
In the last few years I have noticed that some colleges are now teaching Python and C#. Some of them are also opening up towards Salesforce and other workforce-specific tech stacks; for example, Selenium WebDriver is pretty important in the Java world.
It all comes down to the people who set the syllabus. Most of them are not yet aware of the changing dynamics of the workforce. I am hoping that Go is likely to be successful in the near future.
However, this will soon change as more and more jobs get automated and people are forced to learn new languages. So JavaScript, Python, and others are likely to get some exposure soon; Java and C# are only good in some contexts.