I recently switched the origin of one of my sites from Heroku to Firebase hosting and noticed the daily cache hit rate drop dramatically, from about 96% to about 85%. You can see the change in this snapshot from the Cloudflare analytics dashboard:
Besides a different server model (dynos vs. cloud functions), the main difference I can think of is that all sites on Firebase hosting run behind Fastly, so Cloudflare is actually making requests to Fastly on the new site.
I’m trying to figure out why the responses from Firebase hosting aren’t being cached as well as those from Heroku. I fired up an instance of my old Heroku app to compare the response headers.
The vast majority of the requests we get result in 302 redirects, so I pulled a sample response from one of them to analyze.
Here is the response from the Heroku origin (this is the one that Cloudflare caches really well, 96%):
< HTTP/1.1 302 Found
< Server: Cowboy
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Cache-Control: public, s-maxage=60, max-age=3600
< Cache-Tag: redirect, tag-redirect
< Location: /[email protected]
< Vary: Accept
< Content-Type: text/plain; charset=utf-8
< Content-Length: 36
< Date: Fri, 18 Jan 2019 15:50:42 GMT
< Via: 1.1 vegur
Here is the response for the same resource from Firebase hosting (Fastly, this is the one that Cloudflare doesn’t cache as well, 85%):
< HTTP/2 302
< server: nginx
< content-type: text/plain; charset=utf-8
< access-control-allow-origin: *
< cache-control: public, s-maxage=14400, max-age=3600
< cache-tag: redirect, tag-redirect
< function-execution-id: ypqvo2ch78jz
< location: /[email protected]
< x-powered-by: Express
< x-cloud-trace-context: 3d0dcd14ac070a2fd19113d706d23ee0
< accept-ranges: bytes
< date: Fri, 18 Jan 2019 15:55:50 GMT
< via: 1.1 varnish
< x-served-by: cache-lax8625-LAX
< x-cache: HIT
< x-cache-hits: 1
< x-timer: S1547826950.360335,VS0,VE0
< vary: accept, accept-encoding, cookie, authorization
< content-length: 36
The main differences that I can see are:

- Heroku uses HTTP/1.1, Fastly uses HTTP/2
- Fastly sends `accept-ranges: bytes`
- Fastly sends `vary: accept, accept-encoding, cookie, authorization`, but Heroku just uses `Vary: Accept`
- Fastly sends various cache info headers, e.g. `x-cache`, `x-cache-hits`, `x-timer`

I'm also using a longer `s-maxage` on the new site (14400 vs. 60) to try to mitigate the caching issues, but it doesn't seem to be helping.
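To make the Vary difference concrete, here's a quick Python sketch (with the header values copied verbatim from the two responses above) that lists the request headers Fastly varies on that Heroku does not:

```python
# Vary values copied from the two responses above.
heroku_vary = "Accept"
fastly_vary = "accept, accept-encoding, cookie, authorization"

def parse_vary(value):
    """Split a Vary header value into a normalized set of header names."""
    return {name.strip().lower() for name in value.split(",")}

# Request headers Fastly adds to the cache key that Heroku does not.
extra = sorted(parse_vary(fastly_vary) - parse_vary(heroku_vary))
print(extra)  # ['accept-encoding', 'authorization', 'cookie']
```

Cookie and authorization in particular would fragment the cache per user, so if Cloudflare honors the full Vary list, the same URL could miss the cache for different clients. I don't know whether that's actually what's happening here, which is part of why I'm asking.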
Is there anything that stands out here that makes it more difficult for Cloudflare to cache the responses from Firebase hosting?
As I said, this is a sample of the most common type of request that we get. But I may be looking in the wrong place.
If you care to poke around, here are the URLs for the old and new instances of the app:
Old
- origin https://unpkg.herokuapp.com
- Cloudflare https://unpkg.co (not .com)
New
- origin https://unpkg-gcp.firebaseapp.com
- Cloudflare https://unpkg.com
Thanks in advance for any feedback!!