Cache hit rate suffers on Firebase hosting


#1

I recently switched the origin of one of my sites from Heroku to Firebase hosting and noticed the cache hit rate drop dramatically, from about 96% to about 85% on a given day. You can see the change in this snapshot from the Cloudflare analytics dashboard:

Besides the different server model (dynos vs. cloud functions), the main difference I can think of is that all sites on Firebase hosting run behind Fastly, so Cloudflare is actually making requests to Fastly for the new site.

I’m trying to figure out why the responses from Firebase hosting aren’t being cached as well as those from Heroku. I fired up an instance of my old Heroku app to compare the response headers.

The vast majority of the requests we get end up as 302 redirects, so I pulled a sample response from one of them to analyze.

Here is the response from the Heroku origin (this is the one that Cloudflare caches really well, 96%):

< HTTP/1.1 302 Found
< Server: Cowboy
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Cache-Control: public, s-maxage=60, max-age=3600
< Cache-Tag: redirect, tag-redirect
< Location: /[email protected]
< Vary: Accept
< Content-Type: text/plain; charset=utf-8
< Content-Length: 36
< Date: Fri, 18 Jan 2019 15:50:42 GMT
< Via: 1.1 vegur

Here is the response for the same resource from Firebase hosting, via Fastly (this is the one that Cloudflare doesn’t cache as well, 85%):

< HTTP/2 302
< server: nginx
< content-type: text/plain; charset=utf-8
< access-control-allow-origin: *
< cache-control: public, s-maxage=14400, max-age=3600
< cache-tag: redirect, tag-redirect
< function-execution-id: ypqvo2ch78jz
< location: /[email protected]
< x-powered-by: Express
< x-cloud-trace-context: 3d0dcd14ac070a2fd19113d706d23ee0
< accept-ranges: bytes
< date: Fri, 18 Jan 2019 15:55:50 GMT
< via: 1.1 varnish
< x-served-by: cache-lax8625-LAX
< x-cache: HIT
< x-cache-hits: 1
< x-timer: S1547826950.360335,VS0,VE0
< vary: accept, accept-encoding, cookie, authorization
< content-length: 36

The main differences that I can see are:

  • Heroku uses HTTP/1.1, Fastly uses HTTP/2
  • Fastly sends accept-ranges: bytes
  • Fastly sends vary: accept, accept-encoding, cookie, authorization, but Heroku just sends vary: accept (see the sketch just after this list)
  • Fastly sends various cache info headers, e.g. x-cache, x-cache-hits, x-timer
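
If I understand Vary correctly, every request header it names becomes part of the cache key, so per-user headers like cookie and authorization could split one URL into many separate cache entries. Here is a rough TypeScript sketch of the idea (illustration only, not Cloudflare’s actual implementation; the names and sample headers are made up):

// Rough sketch of how a shared cache derives a variant key from a response's
// Vary header (illustration only, not Cloudflare's actual implementation).
type RequestHeaders = Record<string, string | undefined>;

function variantKey(url: string, headers: RequestHeaders, vary: string): string {
  const parts = vary
    .split(",")
    .map((name) => name.trim().toLowerCase())
    .sort()
    .map((name) => `${name}=${headers[name] ?? ""}`);
  return `${url}|${parts.join("|")}`;
}

// Two hypothetical users requesting the same redirect, differing only by cookie.
const alice: RequestHeaders = { accept: "*/*", cookie: "session=alice" };
const bob: RequestHeaders = { accept: "*/*", cookie: "session=bob" };

// Heroku-style "Vary: Accept": both users map to the same key, so the second
// request can be served from cache.
console.log(variantKey("/[email protected]", alice, "Accept"));
console.log(variantKey("/[email protected]", bob, "Accept"));

// Fastly-style Vary: the cookie difference alone produces two different keys,
// so the second request misses even though URL and response are identical.
console.log(variantKey("/[email protected]", alice, "accept, accept-encoding, cookie, authorization"));
console.log(variantKey("/[email protected]", bob, "accept, accept-encoding, cookie, authorization"));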

I’m also using a longer s-maxage on the new site (14400 vs. 60) to try to mitigate the caching issues, but it doesn’t seem to be helping.
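
For context, this is my understanding of how those directives interact (sketched below with a hypothetical helper, not Cloudflare’s documented behaviour): a shared cache prefers s-maxage over max-age for its TTL, but the TTL only controls how long an entry lives once it’s in the cache, not whether a request finds one there in the first place.

// Hypothetical helper showing which directive a shared cache uses for its TTL:
// s-maxage wins over max-age when both are present.
function sharedCacheTtl(cacheControl: string): number {
  const get = (name: string): number | undefined => {
    const match = cacheControl.match(new RegExp(`${name}=(\\d+)`, "i"));
    return match ? Number(match[1]) : undefined;
  };
  return get("s-maxage") ?? get("max-age") ?? 0;
}

console.log(sharedCacheTtl("public, s-maxage=60, max-age=3600"));    // 60    (old Heroku response)
console.log(sharedCacheTtl("public, s-maxage=14400, max-age=3600")); // 14400 (new Firebase response)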

Is there anything that stands out here that makes it more difficult for Cloudflare to cache the responses from Firebase hosting?

As I said, this is a sample of the most common type of request that we get. But I may be looking in the wrong place.

If you care to poke around, here are the URLs for the old and new instances of the app:

Old: https://unpkg.herokuapp.com

New: https://unpkg-gcp.firebaseapp.com

Thanks in advance for any feedback!!


#2

Here are a few more logs for common types of requests, made directly to the different backends.

Filename redirect request

This is a request that doesn’t specify a filename, so it results in a redirect to the main file in that package.

Old backend:

> curl -v https://unpkg.herokuapp.com/[email protected] > /dev/null

< HTTP/1.1 302 Found
< Server: Cowboy
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Cache-Control: public, max-age=31536000
< Cache-Tag: redirect, filename-redirect
< Location: /[email protected]/index.js
< Vary: Accept
< Content-Type: text/plain; charset=utf-8
< Content-Length: 45
< Date: Fri, 25 Jan 2019 04:31:14 GMT
< Via: 1.1 vegur

New backend:

> curl -v https://unpkg-gcp.firebaseapp.com/[email protected] > /dev/null

< HTTP/2 302
< server: nginx
< content-type: text/plain; charset=utf-8
< access-control-allow-origin: *
< cache-control: public, max-age=31536000
< cache-tag: redirect, filename-redirect
< function-execution-id: kxcsksc578kk
< location: /[email protected]/index.js
< x-powered-by: Express
< x-cloud-trace-context: 34d09ed198c79749628a542624862108
< accept-ranges: bytes
< date: Fri, 25 Jan 2019 04:31:32 GMT
< via: 1.1 varnish
< x-served-by: cache-lax8644-LAX
< x-cache: MISS
< x-cache-hits: 0
< x-timer: S1548390692.850012,VS0,VE171
< vary: accept, accept-encoding, cookie, authorization
< content-length: 45

Normal file request

This is a request for a JavaScript file, also very common.

Old backend:

> curl -v https://unpkg.herokuapp.com/[email protected]/index.js > /dev/null

< HTTP/1.1 200 OK
< Server: Cowboy
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Content-Length: 1581
< Content-Type: application/javascript; charset=utf-8
< Cache-Control: public, max-age=31536000
< Last-Modified: Wed, 23 Aug 2017 08:35:00 GMT
< Etag: "62d-cn3mFeN8T+aEWstyQu47N70TGQs"
< Cache-Tag: file, js-file
< Date: Fri, 25 Jan 2019 04:35:54 GMT
< Via: 1.1 vegur

New backend:

> curl -v https://unpkg-gcp.firebaseapp.com/[email protected]/index.js > /dev/null

< HTTP/2 200
< server: nginx
< content-type: application/javascript; charset=utf-8
< access-control-allow-origin: *
< cache-control: public, max-age=31536000
< cache-tag: file, js-file
< etag: "62d-cn3mFeN8T+aEWstyQu47N70TGQs"
< function-execution-id: kxcstc577z9o
< last-modified: Wed, 23 Aug 2017 08:35:00 GMT
< x-powered-by: Express
< x-cloud-trace-context: 604b25997340fa4ae72d0cebc083dba7
< accept-ranges: bytes
< date: Fri, 25 Jan 2019 04:34:31 GMT
< via: 1.1 varnish
< x-served-by: cache-lax8627-LAX
< x-cache: MISS
< x-cache-hits: 0
< x-timer: S1548390872.706117,VS0,VE203
< vary: Accept-Encoding, accept-encoding, cookie, authorization
< content-length: 1581

?meta info request

This is a request for some metadata about a resource.

Old backend:

> curl -v https://unpkg.herokuapp.com/[email protected]/?meta > /dev/null

< HTTP/1.1 200 OK
< Server: Cowboy
< Connection: keep-alive
< Access-Control-Allow-Origin: *
< Cache-Control: public, max-age=31536000
< Cache-Tag: meta
< Content-Type: application/json; charset=utf-8
< Content-Length: 5115
< Etag: W/"13fb-P41ao1uIzjGg267JHPhgU4VWylo"
< Date: Fri, 25 Jan 2019 04:48:11 GMT
< Via: 1.1 vegur

New backend:

> curl -v https://unpkg-gcp.firebaseapp.com/[email protected]/?meta > /dev/null

< HTTP/2 200
< server: nginx
< content-type: application/json; charset=utf-8
< access-control-allow-origin: *
< cache-control: public, max-age=31536000
< cache-tag: meta
< etag: W/"13fb-P41ao1uIzjGg267JHPhgU4VWylo"
< function-execution-id: t3l54o7ieknz
< x-powered-by: Express
< x-cloud-trace-context: 33064945a1ce3f69d84ae6e7e46bd17b
< accept-ranges: bytes
< date: Fri, 25 Jan 2019 04:48:59 GMT
< via: 1.1 varnish
< x-served-by: cache-lax8650-LAX
< x-cache: MISS
< x-cache-hits: 0
< x-timer: S1548391740.566096,VS0,VE271
< vary: accept-encoding, cookie, authorization
< content-length: 5115

#3

Well, I asked Twitter about this and the consensus seems to be that the Vary header from Firebase’s CDN is negatively affecting the cache hit rate.

EDIT: BTW, I found out that Firebase hosting actually uses Fastly as the CDN, not Google Cloud CDN :sweat_smile:


#4

Yep, it turned out to be the Vary header that was causing all the cache misses. I deployed the site to Google App Engine instead of Firebase hosting so I could control the headers better, and the cache hit rate immediately improved.
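
In case it helps anyone who lands here later: on App Engine nothing sits between Cloudflare and the Express app rewriting the response headers, so whatever the app sets is what Cloudflare sees. Roughly like this (a simplified sketch, not the exact production code; the route is just a stand-in for the real redirect logic):

// Simplified sketch, not the exact production code: on App Engine the Express
// app's own headers reach Cloudflare untouched, so Vary stays pinned to Accept
// and nothing appends accept-encoding/cookie/authorization to it.
import express from "express";

const app = express();

// Hypothetical route standing in for the real filename-redirect logic.
app.get("/:package", (req, res) => {
  res.set({
    "Cache-Control": "public, max-age=31536000",
    "Cache-Tag": "redirect, filename-redirect",
    Vary: "Accept",
  });
  res.redirect(302, `/${req.params.package}/index.js`);
});

const port = Number(process.env.PORT) || 8080;
app.listen(port);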

We can close this one! :metal:t2:


closed #5

This topic was automatically closed after 30 days. New replies are no longer allowed.