Cloudflare Pages caches 404 responses in the browser

Cloudflare Pages appears to serve 404 responses with browser-caching headers when “Browser Cache TTL” is set to a specific duration (e.g. “1 year”). Steps to reproduce:

  1. Add a new Cloudflare Pages application
  2. Add a custom domain
  3. Go to Caching > Configuration settings for that website
  4. Set “Browser Cache TTL” to 1 year

Expected: 404 responses should not have a “Cache-Control: max-age=…” response header
Actual: 404 responses have a “Cache-Control: max-age=…” response header

You can see an example of this on Cloudflare Pages’ own site by visiting an asset that doesn’t exist, then opening DevTools and checking the response headers:

Notice that the asset doesn’t exist (404 status code), yet Cloudflare still sends back a “Cache-Control: max-age=…” response header.

Is there any way to make Cloudflare not set this Cache-Control header for 404 responses?

This isn’t Pages doing it; it’s just the zone-level setting being configured that way.

We recommend not doing zone-level caching when using Pages, for exactly this kind of reason. Let Pages handle the cache.
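For what it’s worth, Pages can set per-path headers itself via a `_headers` file in the build output, which is one way to “let Pages handle the cache”. A minimal sketch, assuming your content-hashed assets live under a hypothetical /assets/ path:

```text
# _headers file at the root of the Pages build output
/assets/*
  Cache-Control: public, max-age=31536000, immutable
```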


It appears that the site doesn’t even follow this advice, as its JavaScript responses have a Cache-Control: public, max-age=14400 header in the response.

When I change the “Browser Cache TTL” setting to “Respect Existing Headers”, the Cache-Control header becomes public, max-age=0, must-revalidate. IMO that’s undesirable behavior for assets that already have a content hash in the filename, because the browser still has to make a conditional request with the If-None-Match header. Ideally, I’d like to cache those assets in the browser for a year (as they are immutable) and not make the browser do a conditional request at all.
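To make the difference concrete, here’s a rough sketch of the freshness check a browser performs, heavily simplified from RFC 9111 (`is_fresh` is a hypothetical helper, not a real API; it ignores directives like must-revalidate, which only affect stale responses):

```python
def is_fresh(cache_control: str, age_seconds: float) -> bool:
    """Return True if a cached response is still fresh, i.e. the browser
    can reuse it without a conditional (If-None-Match) request."""
    directives = {}
    for part in cache_control.split(","):
        part = part.strip().lower()
        name, _, value = part.partition("=")
        directives[name] = value
    max_age = int(directives.get("max-age") or 0)
    return age_seconds < max_age

# With “Respect Existing Headers” the asset must be revalidated immediately:
print(is_fresh("public, max-age=0, must-revalidate", age_seconds=1))       # False
# A content-hashed asset cached for a year stays fresh for that whole year:
print(is_fresh("public, max-age=31536000, immutable", age_seconds=86400))  # True
```

So with max-age=0 every page load pays a round trip just to get a 304 back, whereas a long max-age (plus immutable) lets the browser skip the request entirely.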