Can anyone tell me what a normal request count over a 10-second period is to put in a Cloudflare rate limiting rule?
I'm having a lot of trouble using it. I have to whitelist my servers, known bots and myself so I don't get blocked/limited.
But the problem is that someone (or a bot, or whatever it is) can get blocked without me knowing, because I don't know what the best and safest number of requests is.
So my question is: how many requests do real visitors (not bots or crawlers) make in 10 seconds?
What if I put 50 in the field (50 requests in 10 seconds)? Is that good enough? How about 30, 100, or 70?
Let's say the visitor doesn't only read the article, but also takes simple actions like downloading files, sorting a table and visiting other pages.
Anyway, I'm also worried about what I should whitelist. Right now I whitelist the server IP, the WordPress user agent, known bots and myself. Is anything missing?
Please help, I've been completely stressed out by the WAF.
There's no simple or "normal" answer to this question - it completely depends on your website's use case and application.
If you're running a high-traffic API, then it might be completely feasible for users to hit it 100 times a second. Or if you've got an endpoint that does something computationally expensive, you may only want users to hit it once every minute. What kind of request numbers are you seeing right now?
I would start conservatively, maybe 30 requests in 10 seconds, and monitor the results for a few days/weeks. Are you seeing a lot of blocks? None at all? Then tweak it from there once you have some real-world numbers to look at.
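If it helps to picture what "30 requests in 10 seconds" means, here's a rough sliding-window sketch. This is just to illustrate the counting - Cloudflare's internals will differ, and the 30/10s numbers are only my suggested starting point, not anything official:

```python
from collections import deque

class SlidingWindowLimiter:
    """Toy per-client limiter: allow at most `limit` requests per `period` seconds."""

    def __init__(self, limit=30, period=10.0):
        self.limit = limit
        self.period = period
        self.hits = deque()  # timestamps of recent allowed requests

    def allow(self, now):
        # Drop timestamps that have aged out of the window.
        while self.hits and now - self.hits[0] >= self.period:
            self.hits.popleft()
        if len(self.hits) < self.limit:
            self.hits.append(now)
            return True
        return False  # over the limit: block or challenge

limiter = SlidingWindowLimiter(limit=30, period=10.0)
# A burst of 31 requests at the same instant: 30 pass, the 31st is refused.
results = [limiter.allow(now=0.5) for _ in range(31)]
# Once the 10-second window has passed, requests are allowed again.
later = limiter.allow(now=11.0)
```

The point being: a legit reader clicking around rarely sustains 3 requests/second for a full 10 seconds, but a single page load can burst many subrequests at once, which is why the threshold shouldn't be too low.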
Yes, I understand and totally agree. It's hard to generalize the number of requests.
But let's say my website is focused on serving content to read (no login, no API, just a simple website with content).
I assume there's a possibility that a user will make more requests than normal (when they're searching, downloading or something like that) compared to a normal user who only reads articles and then switches pages.
Okay, when you say 30 requests in 10 seconds I start to get a bit of an idea. Can you explain more about why 30?
Yes, of course, I've been monitoring all day long. And the problem is: yesterday there was no serious problem, but earlier today I saw that a WordPress user agent with a server IP was blocked by the limit rules. So this made me ask, "why didn't that happen yesterday, even though I fiddle with the blog frequently?"
Sorry for asking so many questions; I don't know who else to consult.
It was a completely arbitrary number I plucked out of the air as a starting point. It felt like a reasonable number of requests for a page load by a normal reader, but I'm making a bunch of assumptions.
If your site really is just content to read, then perhaps you'd be better off generating it statically as HTML/CSS/JS and serving it from something like Cloudflare Pages? Then you'd never really have to worry about rate limiting.
Realistically, just try it and monitor the results. If you see it catching legit users, loosen the threshold. If it's not catching enough, tighten it up.
You can always use a "Managed Challenge" as the resulting action in your rate limiting rule too - that way, in the worst case a real user gets a challenge page and is then straight back to reading your site, whereas bots likely won't get past it.
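Configured via the Rulesets API, such a rule would look roughly like this - I'm writing the field names from memory, so double-check them against the Cloudflare docs before using it:

```json
{
  "action": "managed_challenge",
  "description": "Soft rate limit for unverified clients",
  "expression": "not cf.client.bot",
  "ratelimit": {
    "characteristics": ["ip.src", "cf.colo.id"],
    "period": 10,
    "requests_per_period": 30,
    "mitigation_timeout": 60
  }
}
```

The same thing is expressible in the dashboard UI; the key part is `action: managed_challenge` instead of `block`.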
No, I mean it's not as simple as that; I can't just use Cloudflare Pages. My site still contains complex elements (WordPress, an nginx web server, databases), though not as complex as WooCommerce.
Actually, my website ran smoothly for several years, until there was a DDoS attack that was quite troublesome. Even with a limit of 20 requests per 10 seconds, CPU usage was still above 80%.
But without the limit, the CPU can go up to 102% (lol, why can it go past 100%? This is very ridiculous).
So I created backup rules with some pattern detection, and it works. The limit rule keeps the website accessible while I'm away from the keyboard, so I can carry out a backup plan.
Now I want to keep that limit rule. It helps when I'm away from the monitor (I can't sit there forever checking CPU usage), and the rule has also stopped some annoying web scraping.
But given the story above, I ran into some confusion. If you have any better suggestions for dealing with DDoS, please tell me.
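In case it helps anyone reading later: an origin-side backstop with nginx's `limit_req` module, sitting behind the Cloudflare rule, looks roughly like this. The zone name, rates and upstream address are illustrative only, and behind Cloudflare you would also need `ngx_http_realip_module` configured so the limit keys on the visitor's IP rather than Cloudflare's edge IPs:

```nginx
# Illustrative origin-side backstop; zone name and rates are made up.
# Behind Cloudflare, configure ngx_http_realip_module first so
# $binary_remote_addr holds the visitor IP, not Cloudflare's edge IP.
limit_req_zone $binary_remote_addr zone=perip:10m rate=3r/s;

server {
    listen 80;
    server_name example.com;  # placeholder

    location / {
        limit_req zone=perip burst=20 nodelay;
        limit_req_status 429;
        proxy_pass http://127.0.0.1:8080;  # placeholder upstream
    }
}
```

This won't stop a volumetric DDoS by itself (Cloudflare still has to absorb that), but it caps how much CPU any single client can burn on the origin even when nobody is watching.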
This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.