I think that while this may be true in many areas, there are a number of exceptions. In scientific computing, for example, you can run into situations where a trade-off has to be made between resolution/accuracy and memory. Another area is the processing of huge datasets of text/images/video (examples: Tesla releasing 1.3 billion miles of driving data, training an image classifier on the >100 GB ImageNet dataset, or natural language processing on large text corpora; industrial IoT data can also be pretty huge). There's only so much that can be squeezed onto a single compute node (or GPU), so memory-efficient algorithms certainly help. An inefficient algorithm is also going to cost more if the data processing is done in the cloud (e.g. on AWS). Lastly, there's the Internet of Things (think "smart devices", "smart grids", "smart homes", "smart cities", etc.), where storage and RAM limitations are pretty important. How do you get a "smart device" to capture as much data as possible (e.g. streaming, time-series data) into its limited memory, while at the same time keeping the device secure?
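To make the resolution/accuracy-versus-memory trade-off concrete, here's a minimal JavaScript sketch (the sample count is made up for illustration): holding the same million samples in single precision takes half the memory of double precision, at the cost of accuracy.

```javascript
// Hypothetical example: one million samples of streaming sensor data.
const n = 1_000_000;

// Double precision: 8 bytes per sample, more accurate.
const fine = new Float64Array(n);

// Single precision: 4 bytes per sample, half the memory, less accurate.
const coarse = new Float32Array(n);

console.log(fine.byteLength);   // 8000000 bytes
console.log(coarse.byteLength); // 4000000 bytes
```

On a memory-starved device, that factor of two can be the difference between keeping the whole time series in RAM and not.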
Modern programmers are more expensive than hardware: if using 10% more hardware because of inefficient code saves me 300 hours of development time, then it makes economic sense to go for the cheaper code that uses more hardware (in the majority of cases).
An interesting question. Computer hardware now evolves in months rather than years. Shortage of storage space is a problem of the past, and RAM is getting cheaper and cheaper (see Memory Prices (1957-2016)). A modern mobile phone is more powerful than a computer from the early 2000s.
Looking back at the early years of programming, when the tools and languages were far more limited than today and computers had 16 kB to 256 kB of memory, programmers didn't have the luxury of using resources freely.
Of particular note, yes, we can say that programmers are lazier than before. We produce code that would have been problematic 15-20 years ago, wasting plenty of resources along the way. Take web developers, for example. They work in an isolated environment (the browser) and don't care, or don't know, about the cost of their code, because "hey, my client is using a 2015 MacBook Pro with a 2.9 GHz processor, 256 GB of storage and 8 GB of RAM, so he should be OK". Whether that's right or not, I leave to you to decide. :)
I wouldn't necessarily say lazy. But because resources are relatively inexpensive these days, it's just not as high a concern as it used to be. The web app I work on is written in PHP, and is constrained via the php.ini file to 128 MB of memory per request. I've intentionally kept it there so as NOT to become quite so unconcerned. There are times when I have to break that limit, but those are rare occurrences and intentionally designed to do so.
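For reference, the cap described above is the standard memory_limit directive in php.ini (the 128M value here mirrors the limit mentioned above; the file's exact location varies by installation):

```ini
; php.ini — per-request memory cap for PHP scripts
memory_limit = 128M
```

For the rare, deliberate exceptions, the limit can be raised for a single script with ini_set('memory_limit', '256M') rather than loosened globally, which keeps the discipline intact for everything else.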
Coming from the days when storage was not cheap, I am always concerned about database storage and try to design things so as not to duplicate data unnecessarily. Sometimes I do duplicate data, but that's more for convenience in other areas (e.g. reporting). If I can avoid doing so, though, I will.
I'm a modern programmer, probably, and probably only a modern one. I started learning programming with Python and C, but CoffeeScript was the first language I learned to use properly. When teaching JavaScript, almost all the guides say nothing about memory cost or CPU cost, because it's a scripting language. I prefer not to call that lazy, because we have our own problems to solve, and storage is sometimes not the biggest one.
When it comes to large single-page apps in JavaScript, the performance issue becomes significant too. Yeah, we are JavaScript programmers, and most of us don't have a solution beyond turning to the browser vendors for help: making V8 faster, or praying that React, Angular or Vue will be faster. You call that lazy? On this aspect, maybe I would.
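As one concrete instance of the memory costs those guides skip over, here's a small sketch (the function name is made up for illustration): a tiny closure can silently pin a large buffer in memory, which is exactly the kind of thing that accumulates in a long-lived single-page app.

```javascript
// A small closure keeping a large allocation alive.
function makeReader() {
  const big = new Uint8Array(50 * 1024 * 1024); // ~50 MB buffer
  big[0] = 42;
  // The returned function closes over `big`, so the entire 50 MB
  // cannot be garbage-collected while `read` is still reachable.
  return () => big[0];
}

const read = makeReader();
console.log(read()); // 42 — one byte used, ~50 MB retained
```

Nothing here is "wrong" by the language's rules; it's just a cost that's invisible unless you go looking for it in a heap profiler.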
In short? The whole industry is lazy because of that. :) We waste resources and don't care about it because horizontal scaling is so cheap. :)
Klevis Miho
Frontend Developer
Yes, they are. But I think that's how it should be. With hardware becoming cheaper and cheaper, we can allow ourselves to be lazy.