Hi,
I’ve had a good search around for this, so apologies if it has already been discussed, but I can’t find anything on it.
We keep an eye on Google PageSpeed Insights (Lighthouse) and recently noticed that we have started to be marked down on “Enable text compression”, which never used to happen. We haven’t changed any settings in CF, or any config on our servers, that we think would affect this. (Note that we do have, and always have had, CSS/JS minification enabled in CF.)
We seem to have compression on our HTML (www.viovet.co.uk) but not on our static assets (CSS/JS). Let’s take this asset as an example: https://static1.viovet.co.uk/frontend/scss/Viovet/aggregate.css?1590506055
I’ve noticed that if I clear the CF cache for this file, I then see it served with a content-encoding: gzip response header, see screenshot below. We gzip at our origin server. I don’t understand why CF didn’t compress this further with Brotli. I’d like it if we were taking advantage of Brotli, but I can cope with the gzip being passed through.
If I then wait a short while and request the file again, I no longer get a content-encoding header at all.
It seems that when the CF cache is hit there’s no compression at all. There’s also a difference in the transfer size, so it looks like it’s not just a missing header; the file really is no longer compressed.
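For reference, here’s roughly the check I’ve been running (a quick Python sketch using the requests library; I run it once right after purging the file from the CF cache and again a little later, and compare the output):

```python
import requests

# Request the asset while advertising both gzip and Brotli, then inspect what
# comes back. stream=True keeps the body undecoded, so the byte count below
# reflects what actually came over the wire rather than the decompressed size.
URL = "https://static1.viovet.co.uk/frontend/scss/Viovet/aggregate.css?1590506055"

resp = requests.get(URL, headers={"Accept-Encoding": "gzip, br"}, stream=True)
body = resp.raw.read()  # raw, still-encoded bytes

print("cf-cache-status: ", resp.headers.get("cf-cache-status"))   # MISS vs HIT
print("content-encoding:", resp.headers.get("content-encoding"))  # gzip / br / None
print("bytes on the wire:", len(body))
```

On the first run (cache MISS) I see content-encoding: gzip; on the later run (cache HIT) the header is gone and the byte count goes up.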
I’ve tried disabling the Brotli compression option in CF but that doesn’t have an impact as far as I can see, so I’ve re-enabled it.
My understanding is that Lighthouse uses the content-encoding header as its sole way of determining whether a file is compressed, and then compares the transfer size to what it thinks is achievable with gzip (not Brotli).
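In other words, I picture the audit doing something roughly like this (this is only my reading of it, not Lighthouse’s actual code):

```python
import gzip
import requests

# Rough picture of the "Enable text compression" audit: look only at the
# Content-Encoding response header, and estimate the saving against plain
# gzip (not Brotli).
URL = "https://static1.viovet.co.uk/frontend/scss/Viovet/aggregate.css?1590506055"

resp = requests.get(URL, headers={"Accept-Encoding": "gzip, br"}, stream=True)
wire_bytes = resp.raw.read()                      # bytes as actually transferred
encoding = resp.headers.get("content-encoding")   # missing on a CF cache HIT for me

if encoding is None:
    gzipped = gzip.compress(wire_bytes)
    print(f"Uncompressed transfer: {len(wire_bytes)} bytes")
    print(f"Achievable with gzip:  {len(gzipped)} bytes")
    print("-> this is what gets flagged as 'Enable text compression'")
else:
    print(f"Served with content-encoding: {encoding}, so the audit passes")
```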
So, my question is: how can I ensure my assets are served with the best compression the connecting client supports?
Thanks for your help,
Luke