Cloudflare simply does not help my website. Without forcing every user through a captcha, it still gets DDoS'd, my web host null-routes the IP, and the site goes down. My site gets DDoS'd 1-3 times per day, and Cloudflare does almost nothing to prevent it. My IP is hidden, yet people can still brute-force it offline.
There are a number of other things you can do to protect your site against a DDoS attack. In addition to the common items listed here:
You might also consider rate limiting rules and Cache Everything rules to cache HTML content (possibly with Bypass Cache on Cookie, depending on the type of site). You might also look for a host that isn't so quick to null-route your site; it depends on what level of traffic they consider an unacceptable DDoS, and some hosts have ridiculously low thresholds.
I don’t see how this will be good enough to actually stop the attacks. Changing caching options won’t matter when requests are already being cached. I got 14 million requests in 1 hour.
There is no IP leak in the DNS settings. Period.
Increasing the types of content cached and reducing the requests to origin won’t help?
If the site is all dynamic content, tools like rate limiting, the Web Application Firewall, and the others in the article listed by @OliverGrant still apply.
I’m already using the WAF, I’m already caching content for 4 hours, and I’m already using “Under Attack Mode” — then I get 14 million requests in 1 hour and the site goes offline. The only thing that helps is forcing a captcha on every single user; no other feature seems to work. My mail service hides the real IP, and the real IP is not in any of the DNS records.
Do the 14 million requests show up in the CF analytics dashboard?
If you have caching set to “cache everything”, it will drastically reduce the amount of requests to your origin. Just note that this isn’t a solution for dynamic content.
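For reference, a “cache everything” setup is usually a Page Rule along these lines (the URL pattern and TTL here are placeholders; set them to match your own site):

```
If the URL matches:   example.com/*
Then the settings are:
  Cache Level:     Cache Everything
  Edge Cache TTL:  4 hours
```

With that in place, repeated hits to the same HTML pages are answered from Cloudflare's edge instead of your origin.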
If you have dynamic content, you should consider:
- bypass cache on cookie (business plan)
- rate limiting ($)
- using a separate rate limiting solution
Of course, it might just be time to scale up. If you’re big enough to be getting 14 million (legitimate) hits in an hour, it would be a good idea to move to a real hosting platform that specializes in auto-scaling, like AWS, Google Cloud, or Azure.
Bypass Cache On Cookie is a business plan feature. If you’re running a WordPress site, there’s a Workers setup that does this, but at 14 million hits per hour, that’s going to get expensive.
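For what it’s worth, a minimal sketch of that kind of Worker might look like the following. This is an assumption-laden illustration, not Cloudflare’s official setup: the cookie names assume a stock WordPress install, and the TTLs are arbitrary — verify both against your own site before deploying.

```javascript
// Hypothetical "bypass cache on cookie" Worker sketch.
// Cookie name prefixes below are the usual WordPress session cookies (assumption).
const BYPASS_COOKIES = ["wordpress_logged_in", "wp-", "comment_"];

// True when the Cookie header names a session cookie, meaning the
// request belongs to a logged-in user and should skip the cache.
function shouldBypassCache(cookieHeader) {
  if (!cookieHeader) return false;
  return BYPASS_COOKIES.some((name) => cookieHeader.includes(name));
}

// The fetch event only exists in the Workers runtime, so guard the registration.
if (typeof addEventListener === "function") {
  addEventListener("fetch", (event) => {
    const cookie = event.request.headers.get("Cookie");
    const cf = shouldBypassCache(cookie)
      ? { cacheTtl: 0 } // logged-in user: always go to origin
      : { cacheEverything: true, cacheTtl: 14400 }; // anonymous: cache for 4 hours
    event.respondWith(fetch(event.request, { cf }));
  });
}
```

Anonymous traffic — which is what a DDoS mostly is — then gets served from cache, while logged-in users still reach the origin.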
I’m paying $20 a month, is that not enough for the business plan?
My business can’t afford $200 per month, yet we’re in a niche which gets targeted often. If you have any suggestions for what a Pro plan can do, I would appreciate it. Otherwise our users will have to endure captchas most of the time.
Are they hitting URLs all over your site?
My strategy would be to craft some Firewall Rules that blacklist the worst offenders, or whitelist only the most likely legitimate visitors, by country or by User Agent string. You get 20 firewall rules on the Pro plan, and 10 Rate Limiting rules. My final firewall rule would be the CAPTCHA, applied to anybody who isn’t whitelisted.
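As a sketch, such rules might look like this in the Firewall Rules expression language (the country codes and user agent string here are placeholders — pick the real ones from your own analytics):

```
# Rule 1: block an abusive user agent seen in your logs
(http.user_agent contains "python-requests")        -> Block

# Rule 2: block a country you get no legitimate traffic from
(ip.geoip.country eq "XX")                          -> Block

# Final rule: challenge everyone not whitelisted above
(not ip.geoip.country in {"US" "CA"})               -> Challenge (Captcha)
```

Rules are evaluated in order, so the broad CAPTCHA catch-all goes last.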
It’s going to take some time to whittle down the attacks. There’s no magic bullet to wipe out DDoS attacks.
As for the $200/month business plan, you might look at how many hours you’re putting in on defending your site and decide it’s worth it. Or just try it for one month. I agree that amount of money makes me cringe, but I don’t run any hugely popular websites, and it’s not my day job.
The bots on the DDoS attack just refresh all different types of resources on the site. Mainly heavy resources like images, scripts etc.
Can you confirm these are purely static assets and that they’re being cached? All resources should have a Cf-Cache-Status header if they’re eligible for caching.
How can I check if a resource is cached with that header? By looking in the recent visitors log?
You can use your browser’s Dev Tools (F12 in Chrome) and go through the items in the Network tab to look for the cache status. Or via the command line:
curl -I https://example.com/images/file.jpg
I was able to add two rate limiting rules which prevent bots from refreshing and accessing the site’s resources:
70 requests per 1 minute, Challenge
50 requests per 10 seconds, Block for 1 hour
In addition to extending the cache duration to 24 hours or longer. This cut the flood of bot requests and lets my server run smoothly — the issue seems to be fixed.
I would second the other suggestions here for more tailored firewall rules. Cloudflare allows both very broad and very granular filters — you can block a single IP address or a whole country, for example.
Spamhaus has a list of the worst bot sources by ASN. Just blocking the top ten or so would cut down bad hits dramatically.
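An ASN-based block is a single firewall expression. A hypothetical example (64496-64498 are documentation-reserved ASNs; substitute the real ASNs from the Spamhaus list):

```
(ip.geoip.asnum in {64496 64497 64498}) -> Block
```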