I refer you to the Chrome DevTools Network tab - it shows you exactly how long each resource takes to fetch and when content lands in the DOM. Extremely useful. Below are a few things I know about, and I have made a couple of assumptions (don't shoot me haha).
Photoshop's "Save for Web" export is a good way to compress images while checking the quality trade-off, and never ship images at a higher resolution than you actually display. Most of the time, 1920x1080 is the practical maximum you'll need.
With JS files: minify, then gzip if you can.
For general project bundling and compression, have a look at Rollup. It applies a load of optimisation techniques (tree-shaking in particular), which you can read about on its website/GitHub page.
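A minimal Rollup config might look something like this (a sketch, assuming the official `@rollup/plugin-terser` package is installed for minification; file paths are placeholders):

```javascript
// rollup.config.js - minimal sketch, not a definitive setup.
import terser from '@rollup/plugin-terser';

export default {
  input: 'src/main.js',          // entry point (placeholder path)
  output: {
    file: 'dist/bundle.min.js',  // single bundled output
    format: 'iife',              // self-executing, suitable for a <script> tag
    plugins: [terser()]          // minify the bundled output
  }
};
```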
I was also told about PSON the other day, a more compact binary encoding of JSON, which could be useful. If you're working with SOAP or similar, I'm sure there's an equivalent tool out there too.
Get rid of jQuery. (Personal preference).
If you're using something like Bootstrap, then include only the bits you need, NOT the whole package.
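For example, if you build from Bootstrap's Sass source you can pull in just the components you use instead of the full stylesheet (a sketch assuming Bootstrap 5's file layout; import paths may differ between versions):

```scss
// custom.scss - import only what you need.
// These three are required before any component:
@import "bootstrap/scss/functions";
@import "bootstrap/scss/variables";
@import "bootstrap/scss/mixins";

// Then only the components you actually use:
@import "bootstrap/scss/grid";     // just the grid
@import "bootstrap/scss/buttons";  // and buttons
```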
You can also do a lot with asynchronous loading of the page. Unfortunately I can't remember any of the articles I've read on this, but for script loading you can use the `defer` and `async` attributes so scripts don't block rendering.
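The script attributes look like this (file names are placeholders):

```html
<!-- defer: download in parallel, execute in order after the document is parsed.
     async: download in parallel, execute as soon as it arrives (order not guaranteed),
            so it suits independent scripts like analytics. -->
<script defer src="app.js"></script>
<script async src="analytics.js"></script>
```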
I would, however, ask yourself: is there any point in doing all this?
Most of the time, if the page loads and is displayed in under a second, you'll be fine. As long as you provide user feedback there should be no worries: something as simple as a spinner showing that content is loading is usually sufficient, and users are far more likely to abandon a site that gives no feedback at all. Again, I have no citations for this, but if you Google it you'll find articles on the subject.
Hope this helped :)
And as @maruru said, please add tags to the post :) (top right-hand side, via edit post.)