Automatic Signed Exchanges (SXGs) Beta Launch

Yes, I did the test from your article using the Search console, here is the http response:

HTTP/1.1 200 OK
Date: Mon, 25 Oct 2021 07:25:49 GMT
Content-Type: text/html; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
CF-Ray: 6a39b68ac8475a34-IAD
Cache-Control: public, no-cache
Vary: Accept-Encoding
CF-Cache-Status: DYNAMIC
expect-ct: max-age=604800, report-uri="
referrer-policy: strict-origin-when-cross-origin
status: 200 OK
surrogate-control: max-age=86400, stale-if-error=26400
x-content-type-options: nosniff
x-download-options: noopen
x-frame-options: SAMEORIGIN
x-permitted-cross-domain-policies: none
x-powered-by: Phusion Passenger(R) 6.0.10, cloud66
x-request-id: 214aabe9-ec20-4f45-9eb4-d70eb80fb21c
x-runtime: 0.048943
x-xss-protection: 1; mode=block
Report-To: {"endpoints":[{"url":"h/ttps://"}],"group":"cf-nel","max_age":604800}
NEL: {"success_fraction":0,"report_to":"cf-nel","max_age":604800}
Server: cloudflare
Content-Encoding: br
alt-svc: h3=":443"; ma=86400, h3-29=":443"; ma=86400, h3-28=":443"; ma=86400, h3-27=":443"; ma=86400

I used Cloudflare Nameservers with orange clouded DNS records. The certificates for the zone are managed by Cloudflare (SSL/TLS encryption mode is Full).

Cloudflare doesn’t show the CAA record they add. Is there any way to verify it has been added? Maybe that’s the issue.


Do I have to remove “no-cache”? Is just “public” fine?

Sorry, I’m a noob regarding HTTP headers…

Thanks for your help!

See the 1st post in this thread and also my blog post regarding the cache-control max-age requirement: SXG needs at least 120 seconds of cache freshness/lifetime, i.e. cache-control: public, max-age=120. Technically public isn’t needed, but I add it anyway.
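For anyone wondering how to set that at the origin, here is a minimal sketch assuming an nginx origin (the location pattern is just an example; adjust for your setup):

```nginx
# Sketch: ensure HTML responses carry at least 120s of cache freshness
# so Cloudflare's signing worker will accept and cache them as SXG.
location ~* \.html$ {
    add_header Cache-Control "public, max-age=120";
}
```

Apache or the Cloudflare dashboard’s Browser Cache TTL can achieve the same thing; the key part is max-age >= 120 with no no-cache/no-store/private directives.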

It works, thanks a lot!


great to hear :slight_smile:

I’ve updated my Cloudflare Automated Signed Exchange article with my Google Analytics-tracked Core Web Vitals metrics, comparing Google Android search referral traffic from SXG-cached vs non-SXG-cached sources.


Does SXG make sense for a website with content that updates every few minutes, like flight schedules?

For example, if part of the content, like the flight schedules themselves, changes every few minutes?

In the end, I had issues with the authenticity tokens that I use with Ajax. I guess SXG is only useful for websites that have many static pages and don’t use Ajax requests or POST forms.

@rommes As a rule of thumb, I’d say if the frequently updated content is a large portion of your site (visually), then SXG is not a good fit. However, if it is small (e.g. sidebar content), then you can use lazy-loading JS to add it after the SXG loads. You may want to pre-allocate space for it to avoid hurting your CLS metric. You can also make a page-by-page decision on this using the Cache-Control header.

That said, this is just a rough rule of thumb; a more precise rule may require doing some experimentation. For instance, one thing that affects this decision is how often people view the SXG (at which point the cache may fetch an updated copy). If you create SXGs with a smaller max-age, then they will often be expired by the time somebody visits the page. If you can monitor how often SXG is served, then perhaps you can tune that parameter to make the trade-off between cache hit rate (latency) and freshness. This is just an idea; I don’t know for certain if such an experiment would work (e.g. have high enough signal-to-noise ratio).
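To make that trade-off concrete, here is a rough back-of-envelope sketch (my own illustration, not anything official) that estimates the fraction of visits finding a still-fresh SXG copy, assuming visits arrive as a Poisson process:

```python
import math

def fresh_hit_fraction(max_age_s: float, mean_visit_interval_s: float) -> float:
    """Rough estimate of the fraction of SXG-cache visits served a still-fresh
    copy. Assuming visits arrive as a Poisson process, the gap since the last
    fetch is exponentially distributed, so P(gap < max_age) = 1 - exp(-max_age/mean)."""
    return 1.0 - math.exp(-max_age_s / mean_visit_interval_s)

# With a 120s max-age and one visit every 10 minutes on average,
# most prefetches will find an expired copy:
print(round(fresh_hit_fraction(120, 600), 2))    # prints 0.18
print(round(fresh_hit_fraction(86400, 600), 2))  # prints 1.0
```

Under these simplistic assumptions, a 120s max-age with one visit every 10 minutes leaves most prefetches expired, matching the intuition that short lifetimes mostly defeat the prefetch benefit.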

@user2890 It may be possible to make such pages compatible with SXG, by creating an initial ajax request for a CSRF token (no ACAO header; same-origin only) as this StackOverflow topic suggests. But this may slow down your non-SXG pages by adding another request to the waterfall. If you can do it on SXG pages only (e.g. using the Accept header) that might be best.


See requirements for SXG caching outlined in 1st post or my blog post section at

You control SXG cache requirements, freshness, and SXG cache lifetime via your cache-control max-age headers, set either through the Cloudflare dashboard’s Browser Cache TTL or through origin cache-control headers you configure yourself on your web server.


Hey Firat,
Since it’s been a month, I just wanted to know if you guys are still working on a way to set a cache period specific to SXGs?

You can do like Automatic Signed Exchanges (SXGs) Beta Launch - #89 by eva2000

Same way, via the origin server, as @twifkak stated at webpackager/cmd/webpkgserver at main · google/webpackager · GitHub. You can also access the ASN of the request via a Cloudflare Transform Rules request header modification: tag Googlebot requests (ASN = 15169) with a custom request header that gets sent back to the origin for detection, I suppose.

This helps if you are unable to derive the ASN of the request from your origin server - let Cloudflare do that part via a Transform Rule, and then your origin server can look for the custom tagged request header, i.e. X-Googlebot in the screenshot.
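As a sketch of the origin-side detection, assuming a hypothetical Transform Rule that adds X-Googlebot: 1 when the request ASN equals 15169 (the header name and value here are my own invention for illustration):

```python
def wants_sxg(headers: dict) -> bool:
    """Origin-side sketch: decide whether to emit SXG-friendly cache headers.
    Assumes a hypothetical Cloudflare Transform Rule adds "X-Googlebot: 1"
    for requests from ASN 15169, and that the crawler's Accept header
    advertises signed-exchange support."""
    accept = headers.get("Accept", "")
    return (headers.get("X-Googlebot") == "1"
            and "application/signed-exchange" in accept)

# Tagged Googlebot SXG fetch vs a normal browser request:
print(wants_sxg({"X-Googlebot": "1",
                 "Accept": "application/signed-exchange;v=b3,*/*;q=0.8"}))  # prints True
print(wants_sxg({"Accept": "text/html"}))                                   # prints False
```

With this in place, the origin could serve a longer max-age only to tagged requests, effectively giving SXGs their own cache period.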


Thanks for your comment, buddy. Actually, that’s too technical for me. I want to try an SXG cache period of 3 minutes, but Cloudflare by default offers either 2 minutes, which causes an SXG error, or 5 minutes, which is quite a high value for the type of site I run.

That’s why I am looking forward to an SXG-specific cache period from the Cloudflare team :slight_smile:


@firat just noticed Cloudflare’s own blog’s SXG requests are returning a “cache lifetime too short” error :wink:

Request URL:
Request Method: GET
Status Code: 200 
Remote Address:
Referrer Policy: origin
alt-svc: h3=":443"; ma=2592000,h3-29=":443"; ma=2592000,h3-Q050=":443"; ma=2592000,h3-Q046=":443"; ma=2592000,h3-Q043=":443"; ma=2592000,quic=":443"; ma=2592000; v="46,43"
cache-control: private
content-length: 347
content-type: text/html; charset=UTF-8
date: Mon, 22 Nov 2021 04:06:07 GMT
server: sffe
warning: 199 - "debug: content has ingestion error: Error fetching resource: Content is not cache-able or cache-able lifetime is too short"
x-content-type-options: nosniff
x-silent-redirect: true
x-xss-protection: 0

Also interesting: if my SXG-enabled blog’s Google search result is a featured snippet, it doesn’t get SXG cache prefetched; instead, the first regular search result does. @twifkak, so Google Search featured snippets are treated differently than regular results for SXG?


The first regular search result was Cloudflare’s blog, heh, which has SXG enabled but isn’t cached due to its cache lifetime being too short.


If my blog is a regular search result, then Google does prefetch and cache the SXG version for faster response times.

Yes, we launched SXG support for the most common result types first, but we’d absolutely like to extend coverage to more types like this one.


Cheers @twifkak, hopefully the extension to Google featured snippets lands while I still have pages promoted as featured snippets :smiley:

We are testing SXG on a couple of sites, and here is the picture we see in Google Analytics:

Every time we enable SXG, Avg. Page Download Time grows to at least 5x the “normal” value.
Furthermore, these spikes happen only for Chrome clients, and if I filter out all browsers except Chrome, the difference between the normal value and a spike is almost 20x.
I don’t have a clear explanation for why this is happening. Has anyone noticed the same behavior?

You might want to narrow your traffic down to just Android mobile Chrome referrals from Google search/Googlebot to investigate.

Remember, Google Analytics reflects real-world users, so your average download times depend on real-world users’ ISP speeds and device/browser CPU capabilities. It only takes a sufficient proportion of very slow devices/ISP connections to drag your average times up.

But make sure your SXG cache lifetime is adequate; otherwise visits from Google Android Chrome search referrals will most likely hit an SXG cache miss and be redirected to the non-SXG-cached version of your page. See the 1st post of this thread and my SXG guide at Testing Page Speed With Cloudflare Automatic Signed Exchanges & Google Search Cache - Centmin Mod Blog on how to inspect the headers. Also, from my Webpagetest runs, a non-cached, non-prefetched SXG is slower than a regular CF CDN-cached HTML page - that’s what happens when the SXG cache lifetime is too short to be of use for visitors coming from Google Android search. The real benefit of SXG comes when Google Search prefetches and stores the SXG version of your page before a visitor clicks through from Google search on Android. If your pages don’t even rank on the 1st page of Google search results, then that SXG cache prefetch is probably less likely to happen, as I understand it? @twifkak ?

But yes, I notice average page download time differences, though mainly higher for non-SXG Android Google search Chrome referral traffic. Mine seem to have gone up ever since some of my Google search results were promoted to Google featured snippets, as those currently seem to be excluded from SXG prefetch cache serving.

Google Analytics with custom segments to narrow down the traffic I want insights into.

Avg Page Download Time breakdown - you can clearly see real-world download times depend on connection speed and the type of device used. Desktop is fastest, as average device power and speeds are much higher. Android mobile without SXG cache is slowest, as expected, while SXG-cached Android Google search was 2nd fastest overall and fastest among the mobile traffic segments we are looking at.

  • Google Search Android Mobile Referrals = 3.34s
  • Google Android Search SXG Cached Referrals = 1.40s
  • Mobile Chrome = 2.06s
  • Desktop Chrome = 0.10s

Segment definitions


My current GA data source connector in Google Data Studio for custom Average Page Download Time vs Server Response Time, just for Google Android search referrals:

A 123+ second average page download time entry will probably drag my averages up too. I found the slow Nov 11 request via GA drilldown: it’s my WordPress blog page on faster tar/rsync backups using zstd compression, Fast Tar And Rsync Transfer Speed For Linux Backups Using Zstd Compression - Centmin Mod Blog, which has a lot of charts and benchmark result images/tables :slight_smile:

According to my custom Google Datastudio dashboard, the slow request came from Northern Europe

Precisely, Bristol, UK

Using Chrome 94

And it appears to be a Lenovo Motorola Moto G5S device, so it shouldn’t be that slow :man_shrugging: Though it is a 4+ yr old Snapdragon 430 device (Motorola Moto G5S - Full phone specifications), which is 1 yr newer than Google’s mobile test device equivalent, the Moto G4 (Compare Motorola Moto G5S vs. Motorola Moto G4).

Guess real world isn’t the same as lab metrics like PSI

PageSpeed Insights really only measures above-the-fold page load, whereas Avg Page Download Time accounts for the entire page download, so I guess lots of images = slower page download times. The question then is: do you even care about Avg Page Download Time when it isn’t really measured by Google’s page experience signals like the Core Web Vitals metrics? SXG is meant to improve the CWV LCP metric by improving TTFB.

I might disable CF SXG for this month and see how it compares.


I think there are a few potential benefits to SXG:

  • Big speed improvement if the SXG is prefetched from Google Search. This currently only occurs for the 1st SXG result of every search results page, but that may evolve over time to best meet user & publisher needs.
  • Small speed improvement for the other SXGs on the SERP. Because there was a prefetch for 1 result, that TLS connection can be reused for the other SXG results. (It’s as if Google Search had done a <link rel=preconnect> to the result site.)
  • Connection resilience. Maybe the user has a spotty connection, and was able to prefetch the SXG, and can now still click the link even though their internet is down (with a degraded experience due to missing subresources).

I think, in practice, that first item (prefetch) is the most significant, but I’m eager to see more real-world testing to verify that. It can very much depend on the page – e.g. imagine an app shell where all the content is in non-preloaded subresources. It can also depend on what % of your page views come from Google Search, are on mobile, etc.

I’m just guessing, but it’s possible the Page Download Time increases because it takes Cloudflare time to spin up the worker & generate the signatures. However, this time is only seen by Googlebot, not by users:

  • If Google’s cache doesn’t have a fresh copy of the SXG, it redirects users to the original URL.
  • When users visit the original URL, Cloudflare serves the unsigned HTML, not the SXG.

I encourage people to look at the impact on client-side metrics like the Web Vitals. Those are the ones SXG aims to improve (especially FCP & LCP), more so than server-side metrics.

I think the best way to differentiate SXG traffic from non-SXG traffic is by serving slightly different HTML based on the Accept header like I proposed for cache-control. You could use this in Google Analytics to drive a custom dimension, for instance. I’ve proposed a few mechanisms for doing that without any server logic, which I hope to see Cloudflare implement in the future.
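A minimal sketch of that Accept-header differentiation (the variant names and simple substring check are my own; real logic should honor q-values):

```python
def sxg_variant(accept_header: str) -> str:
    """Return which HTML variant to serve: "sxg" for crawler fetches that
    ask for signed exchanges (so that copy can carry e.g. a GA custom
    dimension marker), "plain" for everything else."""
    if "application/signed-exchange" in accept_header:
        return "sxg"
    return "plain"

# Googlebot's SXG crawl sends an Accept header like:
print(sxg_variant("application/signed-exchange;v=b3;q=0.9,*/*;q=0.8"))  # prints sxg
print(sxg_variant("text/html,application/xhtml+xml"))                   # prints plain
```

If you go this route, remember to also emit Vary: Accept so intermediate caches keep the two variants separate.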

Sorry for my verbosity!


Thanks for the details on the benefits. I will see if differentiation can be done at the Cloudflare Transform Rule level later.

I’ve disabled SXG for this month to see what the differences are. Only the 1st-ranked search result being prefetched seems rather limiting. Hopefully that is expanded to also include featured snippets :slight_smile: