During my dual studies (meaning I was sponsored by a company!), we had to build a solar system with OpenGL in the C course. We were using GL 1.0 (from 1992) with GLUT and GLEW, while 4.2 or 4.3 was the latest version at the time (this was 2012). I asked the prof why we were using such an old version and outdated libraries, and whether we would also take a look at later versions and at Direct3D. The answer was that "the version of OpenGL we use is enough and we should just start learning from the start. Also, Direct3D is not really useful for most situations, so we will not take a look at that." Wow.
For those of you who don't know: GL 1.0 came out in 1992. Try to remember what kind of graphics hardware existed back then and how much it has changed since. Do you think any company out there wants someone who only knows 1992-era tech? NO!!! Software that old does not reflect how the world works today. It is not even good learning material, because the model has changed completely: we no longer draw lines on the screen call by call. We define data up front and tell the GPU to render it.
Since most of you are web devs, the advice above is like someone telling you to start by learning JS 1.0 (instead of ES2015) and to skip TypeScript entirely, because who would want to use that at all? That way you can learn about all the changes and the history of the language, right?
Ain't nobody got time for that. The industry is competitive: every enterprise and startup wants to build the next innovation with the latest and greatest tech. The history might be interesting for understanding how we got here, but it will never land you a job if you are a student starting out your career.
My advice for everyone, contrary to what my prof said: whatever you learn, go for the latest version. The world won't wait for you to catch up.