I’m using Backblaze B2 (a cheaper alternative to AWS S3) to store JSON data (content type `application/json`) that is compressed using gzip. Fetching it from a private bucket should be rather straightforward, but unfortunately Backblaze B2 doesn’t allow setting a `Content-Encoding: gzip` header on the files. From what I’ve noticed, Cloudflare assumes (correctly, going by the headers) that those JSON files aren’t compressed, so it compresses them again (I think) before storing them in cache.
I know I could work around it with `Cache-Control: no-transform`, but again, that isn’t supported by Backblaze. Is it a lost cause, or is there some setting/flag in Workers that could get around this issue? Ideally I’d like to ‘tell’ Cloudflare that the returned response is gzip-compressed, so I could access the uncompressed JSON inside the worker (as far as I can tell, that’s what happens when `Content-Encoding: gzip` is present on the origin response). A less ideal but still acceptable solution would be to simply cache the unmodified content and pass such compressed files through the Cloudflare network as-is, but I couldn’t figure out how to do that.
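To illustrate the “access the uncompressed JSON inside the worker” part, here’s roughly what I have in mind — a minimal sketch assuming the body bytes reaching the worker are still the raw gzip bytes (the helper name is made up; `DecompressionStream` is available in Workers and in Node 18+):

```javascript
// Sketch: treat the fetched body as raw gzip bytes and decompress it
// manually, instead of relying on Content-Encoding negotiation.
async function readGzippedJson(response) {
  // Pipe the raw body through a gzip decompressor...
  const decompressed = response.body.pipeThrough(new DecompressionStream('gzip'));
  // ...then let Response.json() handle text decoding and JSON parsing.
  return await new Response(decompressed).json();
}
```

Inside a worker this could be called on the `fetch()` result for the B2 object URL — the open question is whether the bytes are still gzip by the time they reach the worker.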
If I understood correctly, it sounds like the JSON files are compressed when served by the origin, but lack a `Content-Encoding: gzip` header. Cloudflare’s cache is likely not decompressing and recompressing them, but rather just storing the response bodies as-is — i.e., compressed.
It sounds like this is already what our cache is doing. A simple pass-through `fetch()` should pass the compressed, cached responses through to the client.
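If simply forwarding doesn’t help because something downstream treats the body as uncompressed, one thing worth trying is re-wrapping the origin response with the missing header set explicitly. A rough sketch (the helper name is invented, and whether the runtime honors a manually set `Content-Encoding` on a constructed `Response` is exactly the open question here):

```javascript
// Sketch: re-wrap an origin Response whose body is gzip bytes but which
// lacks the Content-Encoding header, so downstream consumers can see it
// as compressed. Purely illustrative.
function labelAsGzip(originResponse) {
  const headers = new Headers(originResponse.headers);
  headers.set('Content-Encoding', 'gzip');       // the header B2 can't set
  headers.set('Content-Type', 'application/json');
  // Reuse the original body stream untouched; only the metadata changes.
  return new Response(originResponse.body, {
    status: originResponse.status,
    headers,
  });
}
```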
Perhaps I am misunderstanding the situation?
@harris thanks for looking into this. I think you’ve got it right; unfortunately, I still found that Cloudflare was compressing those files when serving them to end users (I guess that’s separate infrastructure from Workers). In any case, I’ve decided to stop using Backblaze B2 and find an alternative that supports setting HTTP headers on files. Thanks again for your time.