For the past 10 years, I have powered my computer and other office equipment using a pedal-powered generator. I had a website on a shared hosting account that described the project.
I’ve found I’m currently generating enough surplus electricity each day to power our modem and a single-board computer 24 hours a day. (The excess electricity I generate is stored in a storage battery.) So I thought it would be a fun project to host my website on a web server running on the single-board computer instead to create a fully human-powered web site :-).
To reduce hits on my little server and filter out DDoS attacks, I set up Cloudflare as a CDN for the site. I’ve moved the site to my pedal-powered server, and it’s working well except for one problem.
I was hoping that if I was unable to pedal on some days because I was out of town or ill, Cloudflare would serve the entire site from cache. But that does not seem to be the case.
If I open a private browser window and request the landing page from my site (https://www.pedalpc.com), it appears from my server logs that at least one call is always made back to my server. The landing page’s cf-cache-status is returned as DYNAMIC, while all the other content on the page is a HIT.
I have one page rule, for pedalpc.com/*; its cache setting is “Cache Everything”.
The caching level under cache settings is set to “standard”, and browser cache expiration is “respect existing headers”. My server currently sets a max-age of 1 day on text/html content, which I’ve confirmed in the network section of my browser’s developer tools.
Am I missing something? Will my site always return a 522 error whenever I’m unable to pedal for a day, even if I increase the max-age header on all content well beforehand?
Your site uses the www subdomain as the canonical host for all URLs, so the page rule for caching everything must either spell that out or use a wildcard that covers subdomains:
www.pedalpc.com/* >> will cache everything for www.pedalpc.com
*.pedalpc.com/* >> will also cache everything for other subdomains you may have
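As a rough sketch of which URLs each pattern selects (Python’s fnmatch globbing is only an approximation of Cloudflare’s page-rule matching, but it’s close enough to illustrate the difference):

```python
from fnmatch import fnmatch

# Sample URLs (no scheme, as in Cloudflare page-rule patterns).
urls = [
    "pedalpc.com/index.html",
    "www.pedalpc.com/index.html",
    "mail.pedalpc.com/",
]

# Note the bare pattern pedalpc.com/* does NOT match the www host.
for pattern in ("pedalpc.com/*", "www.pedalpc.com/*", "*.pedalpc.com/*"):
    matched = [u for u in urls if fnmatch(u, pattern)]
    print(f"{pattern!r} matches {matched}")
```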
Also, keep in mind that Always Online has its limitations. For instance, you need to set the Edge Cache TTL to at least the number of days it takes the AO crawler to revisit your site; otherwise the cached page will expire before a new crawl is performed. You also need to avoid using “Purge Everything”, purging individual files instead.
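To make that rule of thumb concrete, here’s a small sketch (the function and constant names are mine; the 14-day figure is the free-plan crawl interval discussed in this thread — check the interval for your own plan):

```python
# Assumed Always Online crawl interval, in days (free plan; verify for your plan).
AO_CRAWL_INTERVAL_DAYS = 14

def ttl_covers_crawl(edge_cache_ttl_seconds: int,
                     crawl_interval_days: int = AO_CRAWL_INTERVAL_DAYS) -> bool:
    """True if the Edge Cache TTL outlives the crawl interval."""
    return edge_cache_ttl_seconds >= crawl_interval_days * 86400

print(ttl_covers_crawl(7 * 86400))   # one week: too short
print(ttl_covers_crawl(30 * 86400))  # one month: long enough
```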
So, if I understand correctly, if I’m on the free plan, I want to make sure the Edge Cache TTL >= 14 days. Is that correct?
If I have both Browser Cache TTL and Edge Cache TTL unset in the page rule, but Origin Cache Control turned “on”, are both values then determined from the resource’s Cache-Control header values? The documentation on Origin Cache Control doesn’t explicitly state this — it only says the Browser and Edge Cache TTL settings will override Cache-Control values.
I assume our mail server at mail.pedalpc.com won’t be affected by a wildcard-subdomain page rule, since it’s listed in the MX record for our domain and serves no content over ports 80 or 443?
Yes, on a free plan you’d have to set the Edge Cache TTL above 14 days.
As for the interaction between Edge/Browser Cache TTL and Origin Cache Control, please see the section Interaction with Other Cloudflare Features in the article below:
The original Cache-Control header is passed downstream from our edge even if Edge Cache TTL overrides are present.