How do I ensure users get my updated assets (.js, .css, etc.) following a Pages deployment?

When you deploy your assets, the client pulls them down into the browser cache. This is good, up until the assets (especially .js modules) are updated and redeployed to the same endpoint: the client continues to get the old, cached code.

My static site is deployed via Cloudflare Pages. When I visit the site at _____.pages.dev (the origin server), everything just works (updated assets are refreshed), but if I hit the custom domain directly (I have it set to Proxy, not DNS Only), they are not refreshed. How should I configure things so that whenever I deploy updates, all users can expect to get them?

I believe I understand what’s happening now. With the Proxy setting, the Cloudflare edge cache is retained even after a Pages deployment, so purging the Cloudflare cache (Purge Everything) after every deployment is necessary to solve the issue.
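If you go this route, the purge can be scripted into the deploy step. A minimal sketch against Cloudflare’s purge endpoint (Node 18+; ZONE_ID and CF_API_TOKEN are placeholder environment variables you’d supply yourself):

```ts
// Post-deploy purge sketch: asks Cloudflare to drop everything it has
// cached for the zone, so the next requests hit the fresh deployment.
const zoneId = process.env.ZONE_ID;     // placeholder: your zone's ID
const token = process.env.CF_API_TOKEN; // placeholder: token with cache purge permission

const res = await fetch(
  `https://api.cloudflare.com/client/v4/zones/${zoneId}/purge_cache`,
  {
    method: "POST",
    headers: {
      Authorization: `Bearer ${token}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ purge_everything: true }),
  }
);
if (!res.ok) throw new Error(`Purge failed: HTTP ${res.status}`);
console.log("Cloudflare cache purged");
```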

Alternatively, one can eliminate the Proxy and use DNS Only, on the presumption that Pages already enjoys all the benefits Cloudflare provides. That way, there is no need to clear the cache on every deployment.

Pages has an internal cache that isn’t reflected in the cache headers (it’s just KV). You can create a Cache Rule to stop caching on your custom domain, under Caching → Cache Rules → Create Rule.
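For example (with example.com standing in for your custom domain), the rule’s filter expression can be as simple as:

```
http.host eq "example.com"
```

with the rule’s cache eligibility set to Bypass cache.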

This way you don’t need to purge manually, and you still get all the features of proxying (your domain’s WAF, Page Rules, other rules, etc.).

Most frameworks these days use asset names with content hashes in them, which also avoids the cache issue entirely. The Cloudflare cache by default only covers js/css/images and a few other extensions, not html.
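As a sketch of what that looks like in a bundler, webpack can be told to emit content-hashed filenames (other bundlers have equivalents; this assumes a TypeScript webpack config):

```ts
// webpack.config.ts sketch: [contenthash] changes whenever a file's
// content changes, so every deploy produces new asset URLs and browsers
// can never serve a stale cached copy under the old name.
import type { Configuration } from "webpack";

const config: Configuration = {
  output: {
    filename: "[name].[contenthash].js",
  },
};

export default config;
```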


I actually tried that path already, and you’re right, bypassing the cache does work, but the trade-off is obvious: no caching.

I believe I found the best of both worlds in the solution I proposed. I will do more testing over the coming week and report back if my conclusion proves otherwise.

Thank you for the response.

Pages already has an internal cache, so putting your own caching on top of it is redundant. Although the Pages internal cache (KV) can be a bit slower than the normal cache, the difference is marginal, and at worst you’re fragmenting things by having two caches stacked on top of each other.

If purging the cache works for you and you always remember to do it, that’s great; I think just disabling the normal cache works better for most people, though. If you were to disable the Proxy, you’d also be back to just the normal Pages internal cache (which is also what you get on the pages.dev domain directly).

I see. I figured that with DNS Only, since Pages is built on Cloudflare’s platform, I’d still have all of the benefits Cloudflare provides inherent to the Proxy, with just Pages serving requests. But, rereading what you wrote, maybe you’re saying that’s not the case: the Proxy would provide something (even with a rule to bypass caching) that Pages by itself would not.

Most frameworks these days use asset names with content hashes in them, which also avoids the cache issue entirely. The Cloudflare cache by default only covers js/css/images and a few other extensions, not html.

Right. I see that practice, but I don’t like having to add the query string/hash to all imports, even programmatically. When I need to cache-bust myself, I put my cacheable assets in a release folder (e.g. r20240503) and use relative imports. Fortunately, with the way Pages handles caching, none of this is necessary.
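For anyone who wants to replicate that release-folder scheme, here’s a hypothetical post-build step (the directory names are placeholders):

```ts
// Copies built assets into a dated release folder (e.g. public/r20240503)
// so each deploy is served from fresh URLs; relative imports between the
// copied modules keep resolving inside the same folder.
import { cpSync, mkdirSync } from "node:fs";
import { join } from "node:path";

const stamp = new Date().toISOString().slice(0, 10).replaceAll("-", ""); // e.g. "20240503"
const releaseDir = join("public", `r${stamp}`);

mkdirSync(releaseDir, { recursive: true });
cpSync("dist", releaseDir, { recursive: true }); // "dist" is a placeholder build output dir
console.log(`Assets staged in ${releaseDir}`);
```

The HTML entry point then references the new folder, and since HTML isn’t cached by default, it picks up the fresh paths on the next visit.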
