Page on-the-fly compression (as pictured above) adds an extra step to the page generation pipeline so that the data can be transmitted faster. For it to pay off, the time saved on transmission has to outweigh the extra compression time:
dataGenerationTime + uncompressedTransmissionTime > dataGenerationTime + compressionTime + compressedTransmissionTime
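As a rough sketch of this trade-off, the following Python snippet estimates total delivery time with and without gzip compression. The bandwidth figure and payload are made up for illustration; real numbers depend on your network and data:

```python
import gzip
import time

# Hypothetical link speed (bytes per second) -- an assumption for illustration.
BANDWIDTH = 1_000_000  # ~1 MB/s

def delivery_time(payload: bytes, compress: bool) -> float:
    """Estimate total delivery time: optional compression time plus transmission time."""
    start = time.perf_counter()
    data = gzip.compress(payload) if compress else payload
    compression_time = time.perf_counter() - start
    transmission_time = len(data) / BANDWIDTH
    return compression_time + transmission_time

# Highly repetitive markup compresses very well, so compression should win here.
payload = b"<li>item</li>" * 10_000
print(delivery_time(payload, compress=False))
print(delivery_time(payload, compress=True))
```

For this payload the compressed variant is faster, because gzip shrinks the repetitive markup to a fraction of its size while the compression itself only takes a few milliseconds.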
When can compression go wrong? Most of the time it is because the data compresses poorly, so the transmission time does not decrease enough to make up for the additional computation. In that case, compressing the content and sending it takes longer than just sending the uncompressed data.
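This failure mode is easy to demonstrate. In the sketch below, random bytes stand in for data that is effectively incompressible (already-compressed formats such as JPEG behave similarly): gzip cannot shrink them, and its header overhead can even grow the output:

```python
import gzip
import os

# Random bytes are incompressible: deflate falls back to stored blocks,
# so the gzip output is at least as large as the input.
random_payload = os.urandom(100_000)
compressed = gzip.compress(random_payload)

print(len(random_payload))   # 100000
print(len(compressed))       # slightly larger: gzip header + block overhead
```

Here the CPU time spent compressing is pure waste, since the transmission time does not decrease at all.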
Which data can trigger such behavior? There are two types of data that are hard to compress and will lead to an increased delivery time:
What can be done to improve performance? Usually, all data should be optimized at build time, not at run time. For example, Webpack can optimize your static website files and trigger image optimizations, and CI can automate that step. Only data that has to be generated by a server should be compressed on the fly, and only if the file size is big enough. Profiling different sizes and types of data will tell you more precisely when compression pays off.
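A server-side gate for on-the-fly compression can be sketched like this. The size threshold is a made-up value you would tune via profiling, and `prepare_response` is a hypothetical helper, not part of any framework:

```python
import gzip

# Hypothetical threshold: below this, the compression overhead is
# unlikely to pay off. Tune it by profiling your own payloads.
MIN_COMPRESS_SIZE = 1_024  # bytes

def prepare_response(payload: bytes) -> tuple[bytes, bool]:
    """Return (body, is_compressed), compressing only when it is worthwhile."""
    if len(payload) < MIN_COMPRESS_SIZE:
        return payload, False
    compressed = gzip.compress(payload)
    # Keep the compressed body only if it is actually smaller.
    if len(compressed) < len(payload):
        return compressed, True
    return payload, False

body, was_compressed = prepare_response(b"ok")            # tiny: sent as-is
body, was_compressed = prepare_response(b"x" * 50_000)    # large, repetitive: compressed
```

The second check matters for incompressible payloads: even above the size threshold, the server falls back to the raw body when gzip fails to shrink it.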