Feature Request: Improve Rate Limiting

I have been using Rate Limiting for a couple of years now, and I still haven’t found the happy balance that lets my legitimate readers browse the site freely while preventing bots from pounding it. I’ve erred on the side of blocking bots, but sometimes my legitimate readers get rate limited too.

I would like to see Cloudflare offer the option to set rate limits based on PAGE view requests. Some pages on my site have as many as 50–60 images while others have none, so it’s hard to estimate how many requests a single page view will actually generate.

Thank you.


Maybe even simpler is to Rate Limit DYNAMIC URLs.

  1. That generally eliminates the image count problem.
  2. We generally don’t care about rate limiting cacheable content.
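One way to approximate this idea with today’s tooling might be a rate-limiting rule whose expression excludes requests for static assets, so only dynamic URLs are counted. A sketch in Cloudflare’s rule expression language follows; the extension list is illustrative, and regex matching with `matches` is only available on certain plans:

```
(http.request.method eq "GET"
 and not http.request.uri.path matches "\.(css|js|png|jpe?g|gif|svg|ico|woff2?)$")
```

With an expression like this as the rule’s filter, the 50 image requests a page triggers would not count against the limit, only the page request itself.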

I don’t fully understand the process or the correct terminology, but basically, the request is to do whatever it takes so that loading three pages in under 10 seconds counts as 3 every time, not anywhere from 3 to 150 as it does now, depending on how many images each page happens to load.

That would be much easier to gauge and regulate: humans could use the site freely without interruption, while overactive bots would be blocked for an hour every time they exceeded the threshold.
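The counting behavior described above can be sketched as a sliding-window limiter that simply ignores asset requests. This is a minimal illustration, not how Cloudflare implements anything; the `STATIC_EXTS` list and the `PageViewLimiter` class are hypothetical names for the purpose of the example:

```python
import time
from collections import defaultdict, deque

# Illustrative list of extensions treated as static assets (not counted).
STATIC_EXTS = (".css", ".js", ".png", ".jpg", ".jpeg", ".gif",
               ".svg", ".ico", ".woff", ".woff2")


class PageViewLimiter:
    """Sliding-window limiter that counts only page (non-asset) requests."""

    def __init__(self, max_pages: int, window_seconds: float):
        self.max_pages = max_pages
        self.window = window_seconds
        self.hits = defaultdict(deque)  # client IP -> timestamps of page hits

    def allow(self, client_ip: str, path: str, now: float = None) -> bool:
        now = time.monotonic() if now is None else now
        # Asset requests pass through without touching the counter,
        # so a page with 50 images still counts as one page view.
        if path.lower().endswith(STATIC_EXTS):
            return True
        q = self.hits[client_ip]
        # Discard hits that have fallen out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.max_pages:
            return False  # over the page-view limit
        q.append(now)
        return True
```

Under this scheme, three page loads in 10 seconds register as exactly 3 hits regardless of how many images each page pulls in, which is the predictable behavior the request is after.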

Hopefully this suggestion catches the attention of the powers that be at Cloudflare.

I ran into the same problem you’re describing. On one of my sites, most pages load maybe 30 resources at most, but one loads about 50 because of a batch of pictures. That single page would trip the rate limiter if I applied a wildcard rule to the entire site.

But when it comes to bots, I use Firewall Rules to block the most invasive of the bunch.
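For reference, a Firewall Rule for that purpose is just a filter expression with a block action. A sketch in Cloudflare’s expression language follows; the bot names here are purely illustrative, not a recommendation of which crawlers to block:

```
(http.user_agent contains "MJ12bot") or (http.user_agent contains "AhrefsBot")
```

Pairing a targeted rule like this with a looser rate limit lets the worst offenders be blocked outright while ordinary readers are never throttled.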
