How do you speed up a basic web app?
Depends on what you mean by "speed up". Often, speed is a matter of perception. Let me explain. I wrote an application with a feature that loaded a whole bunch of data for a patient when displaying a patient view; it put all the available patient data at the user's fingertips. Each data type lived in a different tab, and the idea was that the user would just click the tab they wanted and look at the data.
The load time was relatively quick considering the numerous data sources that had to be pulled from, but the perception was that it was too slow. The simple trick was to not load all the data when the page loads, but to load it as the user needs it. So, instead of loading everything up front, each tab's data loaded on demand when the user clicked it. The end result was the same, but the perceived speed-up was immense.
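The tab trick above can be sketched as a tiny load-on-demand cache. This is a minimal illustration, not the original app's code; `fetchTabData` is a hypothetical stand-in for the real data source:

```javascript
// Sketch of on-demand loading: fetch a tab's data only the first time
// the user opens it, and cache it so repeat visits are instant.
let fetchCalls = 0; // counts real loads, for illustration

function fetchTabData(tab) {
  // hypothetical stand-in for the expensive per-tab query
  fetchCalls += 1;
  return { tab, rows: [`data for ${tab}`] };
}

const tabCache = new Map();

function openTab(tab) {
  if (!tabCache.has(tab)) {
    tabCache.set(tab, fetchTabData(tab)); // load only on first click
  }
  return tabCache.get(tab); // cached on every later click
}
```

Nothing loads at page render time; the cost moves to the first click on each tab, which is exactly where the user expects a short wait.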
Another trick is to give the user feedback during a longer process; progress bars are a nice touch. Even if the process itself is slow, feedback gives the perception of quickness.
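One way to sketch that feedback idea: report a percentage after each unit of work, and let the UI wire the callback to a progress bar. The function and callback names here are assumptions for illustration:

```javascript
// Sketch: run a batch of work while reporting 0-100 progress through
// a callback, so the UI can update a progress bar as items complete.
function processAll(items, workFn, onProgress) {
  const results = [];
  items.forEach((item, i) => {
    results.push(workFn(item));
    // report percentage complete after each item
    onProgress(Math.round(((i + 1) / items.length) * 100));
  });
  return results;
}
```

In a browser, `onProgress` would set the `value` of a `<progress>` element; the work feels faster because the user can see it moving.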
Of course, there are the obvious ones: optimizing database queries, using indexes well, caching, etc.
And sometimes there are the necessary ones: long processes that must either be rewritten or moved to the back end (e.g. reports that are too long-running for real time and are better generated by a queued back-end job).
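The queue idea can be sketched with an in-process array standing in for a real job queue (RabbitMQ, a jobs table, etc.); the function names are hypothetical:

```javascript
// Sketch of moving slow work off the request path: the request handler
// only records the job and returns immediately; a separate worker
// does the slow part later.
const queue = [];

function requestReport(params) {
  queue.push(params);          // fast: just enqueue
  return { status: "queued" }; // respond without waiting for the report
}

function runWorker(generateReport) {
  const done = [];
  while (queue.length > 0) {
    done.push(generateReport(queue.shift())); // slow work happens here
  }
  return done;
}
```

The user gets an instant "queued" response instead of a spinner, and the report shows up when the worker finishes.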
There are usually two things you can do.
Usually people cache before they optimize. "Runtime caching" is often helpful: in PHP you can use the clone keyword, which copies an object without re-running its constructor. Note that clone is a shallow copy by default; implement __clone if you need a deep copy.
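The same "build once, clone instead of reconstructing" idea can be sketched in JavaScript (a JSON round-trip stands in for a deep clone here, which is fine for plain data; the names are illustrative):

```javascript
// Sketch: pay the expensive construction once, then hand out copies.
let constructions = 0; // counts how often the expensive path runs

function buildExpensiveConfig() {
  constructions += 1; // pretend this does heavy work
  return { limits: { rows: 100 }, features: ["a", "b"] };
}

const template = buildExpensiveConfig(); // constructed exactly once

function getConfig() {
  // deep copy via JSON round-trip, so callers can't mutate the template
  return JSON.parse(JSON.stringify(template));
}
```

Each caller gets an independent copy, but the constructor cost is paid only on the first build.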
There are a lot of ways to optimize code, but the main focus should be classic time complexity, even just the superficial "how can I reduce the number of loops?" (and I don't mean that you should rewrite everything as recursion ;D).
For your application's internal calls, if you use Linux, try switching from TCP to Unix domain sockets for the connections between the web server and the database.
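As one hedged example of what that looks like in practice, an nginx server can talk to PHP-FPM over a Unix socket instead of a TCP port (the socket path below is an assumption; it must match your PHP-FPM pool configuration):

```nginx
# Illustrative nginx fragment: use a Unix domain socket instead of TCP
# for the app backend. The socket path is an assumption.
location ~ \.php$ {
    fastcgi_pass unix:/run/php/php-fpm.sock;  # instead of 127.0.0.1:9000
    include fastcgi_params;
}
```

Similarly, many MySQL clients already use the local socket when you connect to "localhost" rather than "127.0.0.1", skipping the TCP stack entirely.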
Caching is hard in general, but a safe rule is that things that don't change can be cached. Try proxy caches; they are usually low-hanging fruit for reducing payload work.
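A proxy cache can only help if your responses say they are cacheable. A tiny sketch of a header policy (the rule here, long max-age for static assets and no-store for dynamic pages, is an illustrative policy, not a universal one):

```javascript
// Sketch: pick Cache-Control headers so proxies/browsers can serve
// unchanging assets without hitting the app at all.
function cacheHeaders(path) {
  if (/\.(css|js|png|jpg|svg|woff2)$/.test(path)) {
    return { "Cache-Control": "public, max-age=86400" }; // static: 1 day
  }
  return { "Cache-Control": "no-store" }; // dynamic/user data: never cache
}
```

Attach the returned header to each response; everything matching the static pattern then becomes the proxy's problem instead of yours.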
If it gets more complex, employ memcache for datasets (redis is nice, but memcache is faster, to my knowledge, and you don't always need redis). There are even more solutions, and it depends on your architecture... but this could be a good starting point.
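Whichever store you pick, the usual access pattern is cache-aside: check the cache, fall back to the slow source on a miss, and store the result with a TTL. A self-contained sketch using an in-process Map as a stand-in for memcache (the time is passed in explicitly so the expiry logic is easy to follow):

```javascript
// Sketch of the cache-aside pattern with a TTL. A Map stands in for
// memcache/redis so the example is self-contained.
const store = new Map(); // key -> { value, expiresAt }

function cacheGet(key, now) {
  const hit = store.get(key);
  return hit && hit.expiresAt > now ? hit.value : undefined;
}

function cacheSet(key, value, ttlMs, now) {
  store.set(key, { value, expiresAt: now + ttlMs });
}

let dbHits = 0; // counts expensive reads, for illustration

function loadUser(id, now) {
  const cached = cacheGet(`user:${id}`, now);
  if (cached !== undefined) return cached; // cache hit: no DB work
  dbHits += 1;                             // miss: simulate the DB read
  const user = { id, name: `user-${id}` };
  cacheSet(`user:${id}`, user, 60000, now); // keep for 60 seconds
  return user;
}
```

With a real memcache client the Map calls become network get/set operations, but the hit/miss/TTL logic is the same.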