If I use the free Cloudflare Workers plan for caching, it has a limit of 100,000 requests per day. How many visitors can it handle before the limit is hit?
This depends on how you code the worker, what sort of route you put it on, how many requests each page load uses, etc.
For cache specifically, you'd probably be running it with a route triggering on every request (`example.com/*`), so you could get a rough idea of how many requests that is per day by looking at the Cloudflare analytics for your website - the "requests" metric is likely to be very close to the total number of requests the worker will receive daily.
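To make the "depends on requests per page load" point concrete, here is a rough back-of-the-envelope calculation. The requests-per-pageview figures are assumptions for illustration - check your own analytics for real numbers:

```javascript
// Rough capacity estimate for the free Workers tier.
const DAILY_LIMIT = 100_000;

function maxPageviewsPerDay(requestsPerPageview) {
  return Math.floor(DAILY_LIMIT / requestsPerPageview);
}

// If the Worker fires once per pageview (HTML only):
console.log(maxPageviewsPerDay(1));  // 100000 pageviews/day

// If it fires for every asset too (assuming ~25 requests per page load):
console.log(maxPageviewsPerDay(25)); // 4000 pageviews/day
```

So the same daily limit can mean anywhere from a few thousand to a hundred thousand pageviews, depending entirely on which requests hit the Worker.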
What if I just want to cache the HTML of WordPress pages? Any ideas?
Unless your HTML pages have a `.html` extension at the end of the URL, I believe you are going to have to match every request (`example.com/*`) and then check the `content-type` of each response inside the Worker logic.
That would still count as a Worker request, so they'd pretty much be stuck processing every single request. Though they could add earlier routes matching `*.js`, `*.css`, `*.jpg`, etc., so no Workers fire on static resources.
Unfortunately the route logic doesn’t allow for infix wildcards. You can only put a wildcard at the start and end of a route.
The solution for WordPress in this case is to have a route that matches the `wp-content` folder, which is where most images etc. live. So you'd have one route for the whole domain (i.e. `yourdomain.com/*`) which fires the worker, and another route for `yourdomain.com/wp-content/*` where Workers are disabled.
Worker routes match on the most specific pattern, so the `wp-content` route overrides the domain-wide route.
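In the Cloudflare dashboard that setup would look something like this (the domain name is illustrative):

```
Route                          Worker
yourdomain.com/*               html-cache-worker
yourdomain.com/wp-content/*    None (Workers disabled)
```

Requests under `/wp-content/` match the second, more specific route and never invoke a Worker, so they don't count against the daily request limit.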
I ran my workers for ages before I realised how this route matching worked, and it saved me millions of requests a month once I had it set up, as previously my worker was needlessly firing for every image request.
Darn. At least nulling a major directory such as wp-content will save many requests, as you pointed out.