In the link below I show you a log of one of these attacks.
In the span of 50 seconds, a page was refreshed more than 200 times.
All it takes is holding down the F5 key.
These attacks are extremely common on my site; they happen dozens of times every single day, from completely random IPs.
Probably one user with a VPN and an extreme obsession on their mind.
Can Cloudflare protect against this type of attack?
Can you clarify exactly how rate limiting can protect against a guy who holds down the F5 key on a random URL, from a random IP? I can see how to protect one specific URL, but I see no solution when the attacks can hit any URL.
None of the links above provide a clear solution.
If a solution truly exists, why don’t you provide the exact rule set?
It’s not a single-click solution.
May I ask whether you’ve already tried something yourself or not?
Are you using a Free or a paid plan for your zone?
The article about manual DDoS mitigation mentioned in my post above contains quite a lot of useful things to configure and try out.
If that’s the case, I’d rather cache that particular HTML document at the origin host and set the HTTP Cache-Control headers there.
Or even cache that resource type (HTML) at Cloudflare for at least 5-10 minutes if it doesn’t change frequently, or even longer: 1 hour or a day.
For the other resources, I’d set the cache to a month.
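As a rough sketch of what the origin could send (the lifetimes below are assumptions; tune them to how often your content actually changes):

```
# Hypothetical Cache-Control response headers set at the origin:

# HTML documents: cache for 10 minutes
Cache-Control: public, max-age=600

# Static assets (CSS, JS, images, fonts): cache for 30 days
Cache-Control: public, max-age=2592000
```

With headers like these in place, Cloudflare (and browsers) can serve repeats of the same page without hitting the origin each time.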
I’d also consider using Transform Rule(s) or a Page Rule to redirect those kinds of requests to the cached resource and strip out parts of the URL, so you don’t get “cache busted” if they add a query string with parameters to the URL for particular resource(s).
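A minimal sketch of such a rule, assuming a Rewrite URL Transform Rule and that you can safely drop the query string on those paths (the expression below is an example, not a ready-made rule):

```
# When incoming requests match (example expression):
(http.request.uri.query ne "")

# Then rewrite:
#   Query → Rewrite to… → Static → (leave the value empty)
```

That way a request like example.com/?cachebuster=12345 collapses back to example.com/ and hits the same cached copy.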
If you’re on a Free plan, count how many requests a visitor needs to download from your website, using Developer Tools (F12) → Network while loading your domain.com.
E.g. a threshold of 120-150 requests: if a client exceeds that within 10 seconds, block.
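Sketched as a single rate limiting rule (the threshold and the matching expression are examples; adjust both to the numbers you counted above):

```
# Rate limiting rule (conceptual layout):

If incoming requests match:
    (http.request.uri.path contains "/")   # every path contains "/", so this covers the whole site

With the same characteristics:
    IP address

When rate exceeds:
    150 requests per 10 seconds

Then take action:
    Block
```

Because the expression matches every URL, it doesn’t matter which page the attacker holds F5 on.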
Consider blocking known abusive ASNs via IP Access Rules, blocking Tor users, HTTP/1.0 requests, empty user agents, etc., and configure the other settings to work for you, not against you, while using Cloudflare. Examples of these are available on these forums.
Test, test, test, and you’ll get a solution for your case. There’s no single click, no silver bullet that works for everyone.
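For example, a single WAF custom rule along these lines (field names are from Cloudflare’s Rules language; “T1” is the country code Cloudflare assigns to Tor exit traffic — verify that each condition fits your audience before blocking):

```
# Example WAF custom rule expression, action: Block (or Managed Challenge)
(http.user_agent eq "")
or (http.request.version eq "HTTP/1.0")
or (ip.src.country eq "T1")
```

A Managed Challenge instead of a hard Block is often safer while you’re still testing.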
ok friend, I think I’m getting the gist of it, except for the root page: I’m not sure how to add the root page to the list of protected URLs. For example, I know how to add exampledotcom/forum/* but not exampledotcom or exampledotcom/
A URI path match on / would cover all your cases, including resources like CSS, JS, images, fonts, etc.
You might not see the trailing slash / in the URL address bar of your web browser for example.com (which is example.com/ in reality), or in example.com/?param=value, example.com/abc/def, or example.com/abc.html.
example.com = example.com/: these URLs are treated exactly the same, and it doesn’t matter which version you use.
example.com/page ≠ example.com/page/ in general, since depending on how the web server treats those requests they might be a different file vs. a directory (otherwise the web server would add the trailing slash / and they’d be the same). Either way, the URI path contains the slash / in both cases, /page and /page/, so it would still match.
A file such as example.com/abc.html or .php, or example.com/folder/script.js, .css, .pdf, .jpg, etc., would match as well.
It’s an example of what I’ve shared above to match everything everywhere on the Free plan. Not an ideal approach for the Free plan, especially for news portals, etc., since they might have too many resources and requests per reader, including ads, additional related news, pictures, load-more-on-scroll, etc.
The Free plan is limited to only 1 rule, so we’d have to combine multiple OR or AND conditions to get what we need in the end.
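A combined expression for that one rule could look something like this (the paths and functions are examples from Cloudflare’s Rules language; swap in your own sections):

```
# One combined expression for the single available rule:
(http.request.uri.path eq "/")                      # the root page
or (starts_with(http.request.uri.path, "/forum/"))  # the forum section
or (ends_with(http.request.uri.path, ".html"))      # standalone HTML files
```

The first condition answers the root-page question: example.com and example.com/ both arrive with the URI path /, so `http.request.uri.path eq "/"` matches them.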