I would say it's "Information hiding" and "building everything as layers of abstractions".
Abstractions are good, but too many abstraction layers are often a disaster.
Sometimes you want certain implementation details to be hidden or ignored, but these cases are rare, and the most common case is that you want the implementation to be a "white box" that you can peek into and change.
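To make the "too many layers" point concrete, here's a minimal sketch (all names invented for illustration): three layers of indirection to read one config value, versus just reading it from a plain object you can peek into.

```javascript
// Over-abstracted: three layers just to fetch a single value.
class ConfigSource {
  get(key) { return { timeout: 30 }[key]; }
}
class ConfigRepository {
  constructor(source) { this.source = source; }
  fetch(key) { return this.source.get(key); }
}
class ConfigService {
  constructor(repo) { this.repo = repo; }
  value(key) { return this.repo.fetch(key); }
}
const timeout =
  new ConfigService(new ConfigRepository(new ConfigSource())).value("timeout");

// "White box": the same thing, directly visible and directly changeable.
const config = { timeout: 30 };
const timeoutDirect = config.timeout;

console.log(timeout === timeoutDirect); // true
```

Each extra layer here hides nothing worth hiding; it just adds places to look when something breaks.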
"It's ok, test in production". What the absolute hell!
It's not so much a specific piece of advice as a mindset. It's one that plagues a number of programming languages in their design... and really it boils down to the "when all you have is a hammer, everything looks like a nail" situation.
ONLY use objects, use objects for EVERYTHING
Ignore objects, use functions for EVERYTHING
XML can do everything, use it for EVERYTHING
MVC is a great concept, use it for EVERYTHING even in programming languages where it's unnatural (like PHP), regardless of how simple the task is.
You will hear this time and time and time again on different subjects, and it's almost always bullshit. I suspect the same mentality (emphasis on the mental) is why, a decade ago, you'd have greenhorns coming into HTML/CSS forum areas asking "how do I make a link turn red on hover" and some joker would chime in with "use jQuery", and why the mouth-breathers' and halfwits' answer to everything "responsive"-related is "use Bootstrap".
A good second is "fill_in_the_blank is dying". 99% of the time someone says that, it's their dislike for it talking more than reality. See the constant "PHP is dying" we're hearing right now, or even "Java is dying" -- as much as in some cases (like Java) we might wish it were true, more often than not it isn't... and even when it is, such changes are rarely an overnight occurrence (see Pascal's slow fade from the limelight), or the writing was on the wall from the start (see Prolog, or on a smaller scale PHP's idiotic mysql_ functions).
These are all from a big company's style guide/conventions (Java):
During my dual-study program (meaning I was backed by a company!), in the C course we had to create a solar system with OpenGL. We were using GL 1.0 with GLUT and GLEW at a time (2012) when 4.2 or 4.3 was the latest. I asked the prof why we were using such an old version and outdated libraries, and whether we would also take a look at later versions and at Direct3D. The answer was that "the version of OpenGL we use is enough and we should just start learning from the start. Also, Direct3D is not really useful for most situations, so we will not take a look at that." Wow.
For those of you who don't know, GL 1.0 was released in 1992. Just try to remember what kind of graphics hardware was available back then and how much it has changed since. Do you think any company out there wants someone who knows a bit of the tech of 1992? NO!!! Software that old does not reflect the current world. It's not even good as learning material, because things have changed so much: we don't go and draw lines on the screen anymore, we define data and tell the GPU to render it.
Since most of you guys are web devs, let me compare the advice above to someone telling you to start by learning JS 1.0 (instead of ES2015) and to ignore TypeScript, because who would want to use that at all? That way you can learn about all the changes and the history of the language.
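To make that comparison concrete, here's a rough sketch (toy example, not a full history lesson): the same tiny class written against mid-90s-style JavaScript, and then in ES2015. Both still run today, but only one is what you'd be asked to write on the job.

```javascript
// Old-school style: var, function expressions, manual prototype wiring.
var Greeter = function (name) {
  this.name = name;
};
Greeter.prototype.greet = function () {
  return "Hello, " + this.name;
};

// ES2015 style: class syntax, const, template literals.
class ModernGreeter {
  constructor(name) {
    this.name = name;
  }
  greet() {
    return `Hello, ${this.name}`;
  }
}

const legacy = new Greeter("world");
const modern = new ModernGreeter("world");
console.log(legacy.greet() === modern.greet()); // true
```

Same behavior, but the second version is what current codebases, tutorials, and interviews expect you to read and write fluently.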
Ain't no one got time for that. The world wants to be competitive. Every enterprise and startup wants to create the next innovation using the latest and greatest tech. The history might be interesting for understanding some things, but it will never land you a job if you are a student starting out your career.
My advice for everyone would be, contrary to what my prof said: if you learn something, go for the latest version. The world won't wait for you to catch up.
Arthur Brown
Mostly Backend guy
Generally, I hear my managers saying something similar to the above when they are under immense pressure to complete a project.