Cache all pages after deploy

#1

Hello!
We’ve got a static website with ~100 pages.
It’s a flat-file CMS, but we decided to disable framework-level caching and rely on Cloudflare for caching instead.
We created a page rule in Cloudflare to cache everything.
I’d like to force Cloudflare to cache all pages right after each deploy, because currently Googlebot is often the first to visit a page after a deploy and gets a slow loading time.
Any idea how I could do that? Set up a small script that will curl all URLs after each deploy?
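Something like this is what I have in mind — a rough sketch in Python, assuming the site publishes a sitemap.xml (example.com and the sitemap path are placeholders):

```python
# Warm the Cloudflare cache after a deploy by requesting every URL
# listed in the site's sitemap. The domain below is a placeholder.
import re
import urllib.request


def extract_urls(sitemap_xml: str) -> list[str]:
    """Pull all <loc> entries out of a sitemap.xml document."""
    return re.findall(r"<loc>\s*(.*?)\s*</loc>", sitemap_xml)


def warm_cache(sitemap_url: str) -> None:
    """Fetch the sitemap, then GET every page it lists."""
    with urllib.request.urlopen(sitemap_url) as resp:
        urls = extract_urls(resp.read().decode("utf-8"))
    for url in urls:
        # A plain GET is enough to populate the edge cache for the
        # node that serves this request.
        urllib.request.urlopen(url).read()


if __name__ == "__main__":
    warm_cache("https://example.com/sitemap.xml")  # placeholder domain
```

Running it from the deploy pipeline (e.g. as the last deploy step) would hit every page once — though only for the edge location the script happens to connect to.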
Thanks!
Félix


#2

There’s really no benefit to this (plus Cloudflare doesn’t provide a way to preload the cache).

Every edge node is its own cache, and even within edge nodes, the node servers don’t necessarily share the cache. So it’s not going to take any stress off your server…it may even add to it at the moment all the edge nodes load the cache.

If your server is quick, the first visitor from that region will load the Cloudflare cache for that edge node, then all other visitors from that region get a faster cached version. You could enable Argo routing so edge nodes will pull cached data from central nodes if it’s available.


#3

While there is no native way via Cloudflare, you can get creative and do it other ways: Warming Up Cache Geographically Using Webpagetest :slight_smile:


#4

Thanks! Very interesting!


#5

Thanks, Sdayman, although I disagree with the “There’s really no benefit to this” :stuck_out_tongue:


#6

Ok…there’s some benefit to this. When I look at a particular site of mine (for a restaurant), I see a daily spike around lunch time. So I know the first person that day preloads the cache. They’re not getting the full speed enhancement. They might not even notice if they’re on a slower connection (I have a very fast server). Everybody else gets full cache performance.

I was on my way to posting a link to that Warming The Cache script. And then I realized that they want 100 pages cached. I did the math, then gave up: it would take half a bazillion calls to page testers to hit every page from every geographical region.


#7

Yeah, it might not scale with a script and API calls to WebPageTest. For that, it’s best to optimise caching for the common page elements that make up your site’s framework, i.e. CSS, JS, site logo, background images, etc. Then a single pre-cache call to a page on your site will populate caches for those page elements, which will essentially speed up page loads (perceived visual render time) across pages. You can look at your Google Analytics stats for your top 10/20 most-visited pages and use those as part of scripted calls to WebPageTest, so you cache those pages fully and directly cache the shared page elements for the rest of your pages.

Usually the top 10 or 20 most-visited pages make up 50+% of all traffic to a site.
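A rough sketch of what that script could look like, assuming the classic WebPageTest HTTP API (the runtest.php endpoint with an API key); the API key, page list, and test locations below are all placeholders:

```python
# Submit WebPageTest runs for the site's top pages from several test
# locations, so each run warms the nearest Cloudflare edge cache.
# API key, page list, and location names are placeholders.
from urllib.parse import urlencode
import urllib.request

WPT_ENDPOINT = "https://www.webpagetest.org/runtest.php"


def build_run_urls(pages, locations, api_key):
    """Build one WebPageTest submission URL per (page, location) pair."""
    runs = []
    for page in pages:
        for loc in locations:
            params = urlencode(
                {"url": page, "location": loc, "k": api_key, "f": "json"}
            )
            runs.append(f"{WPT_ENDPOINT}?{params}")
    return runs


if __name__ == "__main__":
    # Top pages from your analytics would go here (placeholders).
    pages = ["https://example.com/", "https://example.com/menu"]
    locations = ["Dulles:Chrome", "London_EC2:Chrome"]  # example locations
    for run in build_run_urls(pages, locations, "YOUR_API_KEY"):
        urllib.request.urlopen(run)  # submit each warm-up test
```

With 10–20 pages and a handful of locations that’s only a few dozen runs per deploy, instead of hitting all 100 pages from every region.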


#8

FYI, Googlebot crawl speed doesn’t have much bearing on SEO. You’d want to optimise for real visitor page load times - hence optimise caching for page elements, i.e. the Google Chrome User Experience Report (CrUX) and Google PageSpeed Insights origin stats.


#9

Thanks!