I started adding some customers to Cloudflare, but have now discovered that it breaks content for some older clients.
The thing is that the cache key seems to be URL only - not e.g. the HTTP Accept header.
So when a modern client requests a JavaScript/CSS file, it will be compressed by the server, and when it requests an image the server will serve e.g. WebP/AVIF - if the browser's Accept header allows it.
Then, when an older client requests the same cached resources, it gets the files with "modern" encodings that the older browser cannot understand - thereby completely breaking JS, CSS and images.
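To make the failure mode concrete, here is a minimal illustration (the URL and file name are placeholders, not our actual setup) of two clients sharing one URL-only cache entry:

```ts
// Sketch only: both requests map to the same URL-only cache key at the edge.
const modern = await fetch("https://example.com/hero.jpg", {
  headers: { Accept: "image/avif,image/webp,image/*" },
});
// The origin negotiates AVIF, and the edge caches the AVIF bytes under /hero.jpg.

const legacy = await fetch("https://example.com/hero.jpg", {
  headers: { Accept: "image/*" },
});
// Cache hit on /hero.jpg: the legacy client receives AVIF bytes it cannot decode.
```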
How to solve this? Can I make Cloudflare add the HTTP Accept header to the cache key somehow? I am not interested in:
using “Enterprise” plan
altering the server software
using Cloudflare Workers
It seems like such a simple (but very important) issue.
Unfortunately, according to our knowledge base article Creating Cache Keys, some headers can’t be part of the cache key as they could have high cardinality and risk sharding the cache. So without modifying any code via Workers or on the website itself, this isn’t supported at the moment.
Thanks for the answer! Q: When I look at Caching -> Overview, it reports WebP under content type - is this based on the response Content-Type only, then? That would mean it has no effect on how requests hit the cache.
I just want to confirm that this is actually a general, well-known problem with Cloudflare (before I start disabling the cache / removing the customers again, etc.).
No. This is based on the request headers from the client. You should not perform image manipulation on your origin based on the presence of WebP or AVIF capability in the request to your origin if you expect Cloudflare to cache the response.
If you enable WebP in Cloudflare, Cloudflare will convert the image it sends to the client. However, if you do the conversion on your origin, you will end up sending WebP images to clients that do not support WebP. Cloudflare will not convert WebP back to JPEG if the client does not support WebP.
All proxies act/behave like that. Serving different content (on the same URL) differentiated by some headers has always been a bad idea, and it only works if the differentiation happens at the server you are communicating with directly (in this case Cloudflare, not your origin server).
But there is a workaround for this - or better said, the real solution to this, since, as I said before, this never was a good idea.
Use the <picture> tag and overload it. Of course these changes have to be made in the template/DOM, but this works for all browsers - see the sketch below.
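A minimal sketch of that overload (file names are placeholders): browsers pick the first <source> whose type they support and fall back to the plain <img>:

```html
<picture>
  <!-- Browsers that understand AVIF/WebP pick the matching <source>; -->
  <!-- older browsers skip both and load the plain <img> fallback. -->
  <source srcset="hero.avif" type="image/avif">
  <source srcset="hero.webp" type="image/webp">
  <img src="hero.jpg" alt="Hero image">
</picture>
```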
Serving content based on the request (including headers) is not a bad idea - Cloudflare does this with Brotli and Polish too. The HTTP Accept header is a perfectly fine way of varying the content. We use img with srcset as the markup for responsiveness because it is smaller - picture gets enormous once every size is multiplied by each new modern format. Also, when building the image path via JS templates, the browser has a "hard time telling" which image format is supported.
Also, as I stated, this is a problem for compression too - so my.js?gzip, my.js?br, my.js?plain vs. just my.js + the HTTP Accept-Encoding header - I prefer the latter.
So a cache key based on URL + the formats the request supports is totally legit IMHO… I just find it strange that you need to go "Enterprise" for such a thing. Adding Cloudflare like we did and later finding out that a lot of clients were served bad data by default - there must be many encountering this problem - maybe they are just not aware of it.
It's btw. odd that we only got reports on this matter after Safari started supporting WebP on updated devices (now some iPads support it, others don't - before, none did), so something tells me there has been some sort of cache control keyed on supported clients before this - otherwise Apple devices would have had a major problem all along.
Well, you can overload a picture tag with just two images.
You can also overload img tags with multiple srcset entries… so this is not an argument.
How much you overload it is your choice.
Cloudflare (like ALL reverse proxies) has always been URL based, and as Cloudflare compresses everything proxied through it with the best compression, it serves everything with Brotli (to clients that accept it).
But with a Page Rule you can bypass Cloudflare for URLs containing *?gzip*, so they are not processed (or cached) by Cloudflare. They are then served and behave exactly as if you were not using Cloudflare.
That should solve your problem, but for those files you would also not benefit from using Cloudflare.
If you want to use Enterprise features, you have to go Enterprise. That's it. I think people here are just too spoiled and think everything should be free, even though I do not know any other CDN or internet company that gives you as much for free as Cloudflare does!
Spoiled? FYI, there are two subscription levels between "Free" and "Enterprise" ("Pro" and "Business") - and the latter doesn't even have a price tag on it. Just saying it's odd that the cheap "Pro" gives you a lot of unlimited CPU power (compression and re-encoding of all resources), but for a simple thing like "Vary" or just a direct flag for the Accept header you need to go "talk to us $$$".
Most of them, and Nginx is one of them. At their core they are URL based. In addition, they (with some configuration) are able to use the Vary header. Cloudflare (as far as I know based on Nginx) also does this, but the header-based cache-key features are only added on the Enterprise plan.
If you don't like this, complaining about the pricing policy here is most probably useless, but you could indeed open a support ticket and complain about it.
Yes, correct, that's my personal opinion.
Yes, at Cloudflare you really get a ton of features and stuff for your money.
Have you already talked to them, or are you complaining before even trying? If so, that's weird. Also, if you have talked to them and decided it does not fit your budget, that's OK as well. If the feature is not worth the money you have to pay for it, that's completely OK; everyone is fine with that.
Also: there is actually a solution to your problem.
The support answer above ("without modifying any code via Workers or on the website itself, this isn't supported") implies that it is possible to work around this problem with Workers - see the sketch below.
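For completeness (even though the OP ruled Workers out), here is a minimal sketch of what such a workaround could look like, assuming module-syntax Workers and the Workers Cache API. The format buckets and the __fmt key parameter are illustrative choices, not a Cloudflare feature, and a production version would need guards (e.g. cache.put refuses partial-content responses):

```ts
// Hedged sketch: fold the client's Accept header into the cache key by
// caching under a synthetic URL. Bucketing keeps the key cardinality low.
export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    if (request.method !== "GET") return fetch(request);

    const accept = request.headers.get("Accept") ?? "";
    const bucket = accept.includes("image/avif") ? "avif"
                 : accept.includes("image/webp") ? "webp"
                 : "legacy";

    // Synthetic request used only as the cache key; clients never see this URL.
    const keyUrl = new URL(request.url);
    keyUrl.searchParams.set("__fmt", bucket);
    const cacheKey = new Request(keyUrl.toString(), request);

    const cache = caches.default;
    let response = await cache.match(cacheKey);
    if (!response) {
      // Miss: the origin still negotiates on the real Accept header.
      response = await fetch(request);
      ctx.waitUntil(cache.put(cacheKey, response.clone()));
    }
    return response;
  },
};
```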
“Free”: Wow.
“Pro”: Get a ton of extra features very cheap and lots of free CPU time.
“Enterprise”: Make the content delivery actually work.
I am not the one who should talk to Cloudflare about pricing - that's our customers. I am simply relaying that none of them will onboard "Enterprise", given their size. "Pro"? Sure. "Business"? Maybe, but not likely - most are heading towards the cloud anyway.
The Page Rule only works if the backend system puts "?gzip" on all resources, and only if the browser supports it. I know of no system that does this - at least not the systems our customers are running.
The simple fact is that I have to opt out of Cloudflare, as the solution will not work for 90% of our customers. Pretty much what I said at the start: either there's an obvious simple fix (no Workers, no change of content/server) or I have misunderstood something - but there isn't one. Thanks for underlining this after my initial confirmation.