Please compress static content with brotli level 11

I have brotli enabled on my Cloudflare website, but the brotli-compressed JavaScript file I receive is much larger than when I compress the same file with brotli level 11 on my machine. I assume this is because you are not using level 11 of brotli compression. In my case the JavaScript file is 124 KB when gzipped with zopfli, 106 KB when compressed with brotli level 11, but 119 KB when served with brotli by Cloudflare.

This doesn’t make any sense to me since I believe level 11 is strictly superior to lower levels for static content. It is slightly slower to compress but this doesn’t matter when serving static content which does not need to be compressed at runtime.

Turn it up to eleven

Since Cloudflare rewrites the Accept-Encoding header, we’re also (as far as I can tell) prevented from sniffing browser support for brotli on the server. This effectively prevents any use of brotli level 11 on a site using Cloudflare.

Could you fix this by compressing brotli files with level 11 for static content? If that’s not possible maybe there is some other solution like letting your users serve their own brotli compressed files?

PS. If someone knows of a good workaround I would love to hear it!
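For anyone looking for that workaround on the origin side: the third-party ngx_brotli module for nginx can serve pre-built .br files directly. A minimal sketch (the location path is just a placeholder), which only helps if `Accept-Encoding: br` actually reaches the origin:

```nginx
# Requires nginx built with the third-party ngx_brotli module.
location /static/ {
    # Serve a pre-built foo.js.br next to foo.js when the client
    # advertises brotli support; fall back to the plain file otherwise.
    # Compression happens once at build time (e.g. "brotli -q 11 foo.js"),
    # never per request.
    brotli_static on;
    gzip_static   on;   # same idea for pre-built .gz files
}
```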

Is there any news about this problem?
The situation still appears to be the same. I would like to compress at level 11 too, and serve that file directly, instead of having Cloudflare fetch the gzipped one and convert it to brotli level 6.

As far as I know if you send Cloudflare a compressed gzip or brotli file, it will not compress it again, unless the client doesn’t happen to accept the encoding. So I’d say compress it yourself. https://support.cloudflare.com/hc/en-us/articles/200168086-Does-Cloudflare-compress-resources-

As for why Cloudflare uses a lower level, it’s a trade-off between output size, the huge number of connections they need to compress, and speed. See this article about their experiment with Brotli: https://new.blog.Cloudflare.com/results-experimenting-brotli/

As I remember it, Cloudflare prevents me from knowing whether the user’s browser accepts brotli, thereby making it impossible to know whether to serve the brotli-compressed version or the gzipped version of the file.

Are you suggesting that Cloudflare will decompress brotli files and compress them with gzip for browsers that don’t support brotli?

Thank you for the pointer about Cloudflare decompressing files. The same article actually says:

Cloudflare only supports the content types gzip and None towards your origin server

This leads me to think that if we force our server to serve a .br file, Cloudflare won’t decompress it and recompress it for clients that don’t support br.

As for your other comment: it’s totally normal for Cloudflare to use a level lower than 11, for the reason you mentioned. But static files can be compressed once for everybody: save the .br on the server and serve it directly, without compressing on the fly, getting the maximum compression with no CPU used at all during the file request.

Yes, Cloudflare strips the Accept-Encoding header and uses only gzip and deflate towards the origin, but no brotli.

I don’t think they will decompress .br files, since they don’t support that encoding between their edge and our servers.

Actually, according to my custom nginx logging of Cloudflare requests to my nginx origin, Cloudflare ignores your gzip/br pre-compressed assets altogether, and requests from CF to your origin are uncompressed.

I believe you can override that for pre-compressed assets with a no-transform Cache-Control directive from the origin (Cache-Control - HTTP | MDN):

No transformations or conversions should be made to the resource. The Content-Encoding, Content-Range, and Content-Type headers must not be modified by a proxy. A non-transparent proxy might, for example, convert between image formats in order to save cache space or to reduce the amount of traffic on a slow link. The no-transform directive disallows this.
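To emit that directive from an nginx origin, a one-line sketch would be (the location path and max-age are just example values):

```nginx
location /static/ {
    # Ask intermediaries not to re-encode or convert these assets.
    add_header Cache-Control "public, max-age=31536000, no-transform";
}
```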

As far as I’ve seen (from tests I made) and read in this article: https://support.cloudflare.com/hc/en-us/articles/200168086-Does-Cloudflare-compress-resources- Cloudflare actually does request resources in gzip from the origin, and if the server has gzip enabled it will deliver the gzipped resource.

OK for the no-transform response header, but the server cannot determine whether the final client (i.e. the user’s browser) supports brotli, because the Accept-Encoding header gets changed, so it’s not really useful in this situation.
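That is exactly the piece that breaks: the usual content negotiation on the origin looks something like this sketch (a hypothetical helper, not Cloudflare or nginx code), and it can only ever pick gzip once the proxy has stripped br from Accept-Encoding:

```python
def pick_encoding(accept_encoding: str) -> str:
    """Pick the best pre-compressed variant the client says it accepts."""
    offered = {token.split(";")[0].strip().lower()
               for token in accept_encoding.split(",")}
    for encoding in ("br", "gzip"):   # preference order
        if encoding in offered:
            return encoding
    return "identity"                 # serve the uncompressed file

# What the browser originally sent:
print(pick_encoding("gzip, deflate, br"))  # -> br
# What the origin sees after the proxy rewrites the header:
print(pick_encoding("gzip"))               # -> gzip
```

With the rewritten header, the origin has no way to know the .br variant would have been acceptable.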

Interesting, then, as all my nginx-logged origin requests from Cloudflare have been non-gzip.

true that

https://gtmetrix.com will tell you whether or not gzip was used. As for brotli compression, using it as is, that is, without first applying proper minification / optimization, won’t give you the speed you’re interested in.

GTMetrix only tells you whether a gzip-compressed response was used on the visitor-to-Cloudflare connection. We (or at least I) are referring to the Cloudflare-to-origin request side: whether it is gzip- or brotli-compressed, or whether Cloudflare just requests the uncompressed assets from the origin server. My custom nginx origin logging, at least for me, showed that all of Cloudflare’s requests to my origin nginx were uncompressed: https://community.centminmod.com/threads/cloudflare-custom-nginx-logging.14790/

Example from custom nginx logging; the 17th field is the gzip_ratio nginx log metric:

tail -1 /home/nginx/domains/domain.com/log/cfssl-access.log | awk '{print NR": "$0; for(i=1;i<=NF;++i)print ""i":  "$i}'
1: 180.76.15.158 - - [13/May/2018:22:41:32 +0000] GET /tags/php-54/ HTTP/1.1 "200" 45399 "-" "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)" "180.76.15.158" "-" "39" "1" "0.100" 41a8a20c1c953343-HKG TLSv1.2 ECDHE-RSA-AES256-GCM-SHA384
1:  180.76.15.158
2:  -
3:  -
4:  [13/May/2018:22:41:32
5:  +0000]
6:  GET
7:  /tags/php-54/
8:  HTTP/1.1
9:  "200"
10:  45399
11:  "-"
12:  "Mozilla/5.0
13:  (compatible;
14:  Baiduspider/2.0;
15:  +http://www.baidu.com/search/spider.html)"
16:  "180.76.15.158"
17:  "-"
18:  "39"
19:  "1"
20:  "0.100"
21:  41a8a20c1c953343-HKG
22:  TLSv1.2
23:  ECDHE-RSA-AES256-GCM-SHA384

So Cloudflare pulls the uncompressed asset from my nginx origin and checks the origin’s encoding header to see whether it would be served gzipped; Cloudflare then serves that asset to visitors, gzip-compressing it on the CF edge server. If the visitor supports brotli encoding, Cloudflare takes that uncompressed asset from my nginx origin and brotli-compresses it on the CF edge server for the visitor. That’s what I think is happening? @ryan?

I’m pretty sure that we are always supposed to request the compressed file first, but I’m checking to see if there are caveats.

@eva2000 - If you can file a support ticket with the details and message me the ticket number, I can have one of our more knowledgeable folks take a closer look.

Remember also (while not directly related to what you are requesting) that Brotli 11, despite having a higher compression ratio than gzip or Brotli at its default level, always takes longer to compress, which makes Brotli 11 slower than, or at best about even with (but never faster than), the other two methods mentioned, if speed is what you’re after for your customers.


will do 🙂

yup totally true


Looks like Cloudflare does make gzip requests to the origin. The issue was on my nginx origin end: I double-checked, and the nginx vhost config had, within the web root / location context:

gzip off;

LOL

With gzip on; enabled, the 7th field from the right (field 14) now shows a gzip_ratio value of 3.99:
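For reference, nginx’s $gzip_ratio is simply the original response size divided by the compressed size; the figure above can be sanity-checked with a sketch like this (the sample body is made up, and zlib stands in for nginx’s gzip filter):

```python
import zlib

# A made-up HTML body standing in for a real response.
body = b"<html><body>" + b"<p>hello world</p>" * 1000 + b"</body></html>"

# nginx's gzip filter defaults to level 1; edge networks and origins
# typically use mid-range levels, so the exact ratio varies with config.
compressed = zlib.compress(body, level=6)

ratio = len(body) / len(compressed)   # what nginx logs as $gzip_ratio
print(f"gzip_ratio is about {ratio:.2f}")
assert ratio > 1.0                    # any compressible response logs > 1
```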

1:  66.249.90.241
2:  -
3:  -
4:  [14/Sep/2019:13:28:31
5:  +0000]
6:  GET
7:  /threads/remove-menu-option-21.18309/
8:  HTTP/1.1
9:  "200"
10:  15098
11:  "-"
12:  "Mediapartners-Google"
13:  "66.249.90.241"
14:  "3.99"
15:  "582234"
16:  "1"
17:  "0.100"
18:  5162b45d9eb2d26a-SJC
19:  TLSv1.3
20:  TLS_AES_256_GCM_SHA384

Remember also (while not directly related to what you are requesting) that Brotli 11, despite having a higher compression ratio than gzip or Brotli at its default level, always takes longer to compress, which makes Brotli 11 slower than, or at best about even with (but never faster than), the other two methods mentioned, if speed is what you’re after for your customers.

True, but what we are discussing in this thread is the possibility of serving a statically compressed file to the client, so there is no compression time at all and the size is the lowest possible.

Apparently there is no way of doing this.

If done in the manner you expressed, there would be a single resource that does not change over what is essentially a very, very long time. But even static files often change: .css, .png, .svg, and, if served as static, the .html will change more often than the aforementioned file types, generally speaking.

Also, a single payload, compressed as .br and delivered to essentially everyone, may not be the best way to serve webpages, especially for sites requiring user input, from the most innocent cases up to the level of severe security / privacy concerns (and I’m not commenting on what you want to compress once and be done with). Why? Because a single payload continually served as one brotli-compressed file, to be decompressed upon arrival no matter the person, cannot possibly meet such needs.

There is almost no such thing as a static website. In fact, I’d theorize that the percentage of truly static sites, compared to sites with resources that change from time to time, is in the single digits. Something to think about for anyone wishing to serve a compressed payload that will never change over time, or will change perhaps a few times over the course of a site’s lifetime.

There will always be a length of time spent compressing a payload, whether on the server’s end or on Cloudflare’s end. Moreover, the time it takes to decompress the .br files is paid for by the end user; there is no way around that. Unless Cloudflare were to decompress, but then they’d be paying for the time the decompression takes, and an overly large file would be served to the end user.

No computational resource, of which the brotli compression algorithm is one, is free. They all cost someone, whether in time, CPU power, memory, even electricity, so really this is circular reasoning; there is no way out of or around it. Write good (read: concise) code, use only the resources you need (from .css, e.g.), nix the files that push a site’s loading time beyond 2 seconds, etc. That’s as close as you’ll get to a compressed static payload by doing it yourself, but then you pay for it in time, understanding, computational power, and so on.

Brotli and zlib decompression speeds are comparable, and brotli decompression doesn’t get that much more expensive as you dial the compression level up.

You can go really far into the weeds analyzing whether certain compression settings are worth it, but it plausibly could be. For that matter, this is what Google made it for.