This is Hashnode, a community for developers. Surrounded by other developers, our view of the world can become a little biased. So here's some food for thought, and I want to hear your voices in response to these users' comments:
It's 2019. I have little interest in using a website that can't fetch updates without a full page reload. JS is necessary, full stop.
I find that sites that rely on JS for updates often end up sucking up too much memory and slowing the browser down, so I rely on explicit full page reloads to clean up the memory footprint.
I think about intentions. Needing JS for a news/information page is just stupid and a waste of energy. We invented HTML for this, so why add extra complexity where it doesn't need to be?
For an application, on the other hand, where there is interactivity and data is not idempotent, it can help a lot. Although even there the layers should be in place, so that the fallback is the classic request/response model.
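The layered fallback described here can be sketched in plain JavaScript: a standard HTML form keeps working through the classic request/response cycle, and a submit handler upgrades it to fetch() when JS runs. This is a minimal sketch, not anyone's production code; the endpoint and field names are made up, and `fetchImpl` is injectable so the sketch also runs outside a browser.

```javascript
// Progressive-enhancement sketch (all names hypothetical): the <form>
// submits normally without JS; with JS, we intercept submit and send
// the data with fetch() instead of triggering a full page reload.
function enhanceForm(form, fetchImpl = globalThis.fetch) {
  form.addEventListener('submit', (event) => {
    event.preventDefault(); // stay on the page; no full reload
    // In a real browser you would build this from `new FormData(form)`;
    // a plain `fields` object keeps the sketch runnable outside the DOM.
    return fetchImpl(form.action, {
      method: 'POST',
      body: new URLSearchParams(form.fields),
    });
  });
}
```

If the script never runs (blocked JS, old browser, network hiccup), the form is simply never enhanced and the classic round trip still works, which is exactly the layering the comment asks for.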
Speaking from a web perspective, it seems stupid to move away from a robust declarative model to a fragile imperative language.
But well ... in the end we can find arguments to bend the world to our views anyhow. I personally want things to work without JS because I think most JS coders are script kiddies who don't know what they are doing. Hence I don't trust their applications.
Backend people are less dangerous; they mainly break their OS, not the client's. ;D
I feel it is, strictly speaking, the better solution, and there are no fundamental technical restrictions preventing it.
That said, I can see that the majority of projects wouldn't see their investment earned back, and so probably won't do it.
Which is fine, there's always a gap between technical ideals and what's affordable.
I think if the user becomes actively aware of the technology being used, we've failed to create a great experience.
I'd also say both of those quotes are atypical user quotes. Most people will say they like a page that's fast, hate one that's slow, and really hate one that makes their browser crash. But if they're talking about fetching and memory footprints, they're far more tech-savvy and tech-engaged than the average punter.
JS isn't required, but it does help your page/website stand out with animations etc. Even those can be done with transforms and SVGs. After that, it's about tracking, functional features, and so on.
I agree that a lot of developers overlook the actual impact on the user's computer, or at the very least don't consider it as much as they should.
Having a JS website and a non-JS website would be interesting from a caching and offline perspective, though, as well as for a prepare-the-page approach where you load only the fundamental components/resources first, then introduce the heavier stuff afterwards.
I also feel that relying on explicit full page reloads to clean up the memory footprint is an extremely bad practice. In my opinion it suggests being lazy and expecting stuff to just work. Proper, graceful shutdowns/teardowns are best practice. I think the following example illustrates my point quite well:
You're a software engineer and you've had a big day; your desk is covered in energy-drink cans and coffee mugs, a complete mess. Now you can do one of two things: either clean it up, so you know you have a clean environment to work in tomorrow, or let the cleaners sort it out. But there's always a chance that not all the mess gets cleaned up by a third party. Always do what you can yourself.
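The clean-desk analogy translates directly into code. Here is a sketch, assuming a hypothetical `PollingWidget` (not from any real framework): it registers a timer and an event listener when constructed, and releases both in `destroy()` rather than counting on a full page reload to sweep everything up.

```javascript
// Explicit teardown instead of relying on a full page reload.
// PollingWidget is a hypothetical example component.
class PollingWidget {
  constructor(target) {
    this.refreshCount = 0;
    this.controller = new AbortController();
    // The listener is removed automatically when the controller aborts.
    target.addEventListener('tick', () => { this.refreshCount += 1; },
      { signal: this.controller.signal });
    // Periodic work that must be stopped explicitly.
    this.timer = setInterval(() => target.dispatchEvent(new Event('tick')), 1000);
  }
  destroy() {
    clearInterval(this.timer);  // stop the periodic work
    this.controller.abort();    // detach every listener in one call
  }
}
```

After `destroy()` the widget holds no live timers or listeners, so it can be garbage-collected without waiting for a navigation to clean the desk for you.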
I wrote this a few years back...
~ ~ ~
- Who constitutes the expected, potential customer base?
- What is the probability of a significant opportunity loss from refusing to provide any/all content to a small fraction of visitors?
- How much will the parallel development of content cost?
- What will be the additional costs and risks during development and ongoing support of alternative approaches to presentation?
With the answers to these questions in hand, a rational decision can be reached.