Cloudflare is useless! My website has been down for 7 days - Paid Plan Customer

No, it's not shared hosting. It's a $120 dedicated server with an E3-1270 v6 processor (8 CPUs @ 3.8 GHz), 32,768 MB of memory, and 5,000 GB of bandwidth, but the problem is that the company is not very good at DDoS protection, which is why I am planning to change hosts.

I have secured my server very well, and the attacker is not able to attack my IP directly.

Most of the attack traffic is generated from the USA, Russia, and China, so it is useless to block it by country.

Understood.
For now I am getting a different error (before, the site was timing out); now I get this.
So I guess the firewall is working properly now?

Because I blocked all countries except a very small country named Palau, so anything outside Palau will not be able to access this site.

Ok. Good luck.

Hello,

I do agree that Cloudflare can be bypassed in many cases; I think, however, that this is addressed by the higher plans they offer. I would assume that anything below the Business level is at risk of being “forgiven” when an attack goes through their rules.
At the same time, you mentioned you have a good server, right? If you're getting 100k requests in 5 minutes, that's less than 400 requests per second (100,000 / 300 s ≈ 333), which should be easily handled by your current server.

There are many competitors to Cloudflare, and the protection is pretty much the same concept: an obfuscated cookie challenge that, if passed, allows you to visit the page. All of them can be bypassed with some effort, which is why bigger businesses require an SLA to ensure that even if an attack makes it through the firewall, you have a set of engineers ready to help you no matter the size or duration of the attack.

1 Like

Yeah, layer 7 protection is not the best, and you need some technical knowledge to stop these attacks.

But unless one of these conditions applies to you, there is no real way to protect any site with any service (it's sad, but it's true):

  1. 100% of your site can be cached
  2. your servers can be horizontally scaled

You can ask why? Because of one simple fact: there is no reliable way to know whether a person or a “bot” is visiting your site. Now, even if your server is really great and your heaviest route can sustain 300 requests per second (which in most cases you will not be able to achieve), a botnet of around 10,000 machines is enough to put your site down; at that size, even one request per bot every 30 seconds already exceeds 300 requests per second.

(On most sites there are routes that can't even handle 10-30 requests per second.)

So the sad reality is that unless you can scale your servers to handle 10,000+ requests per second, or make all your routes cacheable, your site can easily be taken down by a layer 7 DDoS attack, and I don't believe any service can really protect against it.

Now to the optimistic side:
Unless your site is really high value and some shark is willing to invest a lot of money to take it down, it's probably just some script kiddie who bought a cheap service to attack you. In that case you will need some technical knowledge and a good logging service to analyse the attack and block it.

So the first step to block it is to get a logging service. You can try the Logflare app, or if you want to run it yourself you can see this worker (which I use on my sites).
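For reference, here is a minimal sketch of what such a request-logging Worker could look like. This is not the exact worker mentioned above; `LOG_ENDPOINT` and the logged fields are assumptions you would adapt to whatever logging service you use (types come from `@cloudflare/workers-types`):

```ts
// Minimal request-logging Worker sketch (Cloudflare Workers, module syntax).
// LOG_ENDPOINT is a hypothetical HTTP log collector (e.g. a Logflare source URL).

interface Env {
  LOG_ENDPOINT: string; // set via wrangler.toml [vars] or a secret
}

export default {
  async fetch(request: Request, env: Env, ctx: ExecutionContext): Promise<Response> {
    // Pass the request through to the origin first.
    const response = await fetch(request);

    // Collect the fields that are usually enough to spot an attack pattern.
    const entry = {
      time: new Date().toISOString(),
      method: request.method,
      url: request.url,
      status: response.status,
      ip: request.headers.get("CF-Connecting-IP"),
      country: request.headers.get("CF-IPCountry"),
      userAgent: request.headers.get("User-Agent"),
      referer: request.headers.get("Referer"),
    };

    // Ship the log entry without delaying the response to the visitor.
    ctx.waitUntil(
      fetch(env.LOG_ENDPOINT, {
        method: "POST",
        headers: { "Content-Type": "application/json" },
        body: JSON.stringify(entry),
      })
    );

    return response;
  },
};
```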

After you have the logs, you just need to find the common patterns and block them.
The next step is to start looking into scaling your servers and full HTML caching.
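To illustrate the blocking step: once a pattern shows up in the logs (say, one User-Agent hammering one route), you can block it with a Cloudflare firewall rule, or with a check at the top of the same Worker. A hypothetical sketch of the latter, where the pattern values are placeholders for whatever your logs actually show:

```ts
// Hypothetical in-Worker block for a pattern found in the logs.
// The User-Agent and path values are placeholders.
function isKnownAttackPattern(request: Request): boolean {
  const ua = request.headers.get("User-Agent") ?? "";
  const path = new URL(request.url).pathname;
  return ua.includes("python-requests") && path.startsWith("/search");
}

// Inside the fetch handler, before contacting the origin:
// if (isKnownAttackPattern(request)) {
//   return new Response("Forbidden", { status: 403 });
// }
```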

3 Likes

Hi Farooq.

I'm interested in this issue, since we are using Cloudflare for several of our sites, and are currently looking at how to tighten the DDoS protection even more.

We experienced a DDoS attack recently, but it wasn't through Cloudflare; the attack was targeting the IP address of our servers.

After we mitigated the attack, we figured this would be the best solution:

  • make all the websites on the server use Cloudflare
  • in the firewall on our server, block HTTP and HTTPS access from all IP addresses except Cloudflare's
  • make CF cache everything, not just the static content
  • implement a smart cache clearing mechanism which clears the CF cache through the CF API only when some content is published on the website (a rough sketch follows below)
  • set the CF cache expiration to several hours

We are currently implementing this solution, and will see how it works over the next few weeks.
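For the cache-clearing bullet above, the purge itself is a single Cloudflare API call. Here is a rough sketch of what the CMS publish hook could call; the zone ID, token, and URLs are placeholders:

```ts
// Purge specific URLs from the Cloudflare cache via the API.
// ZONE_ID and API_TOKEN are placeholders; the token needs the
// "Zone > Cache Purge" permission. Call this from a publish/update hook in the CMS.
async function purgeCloudflareCache(urls: string[]): Promise<void> {
  const ZONE_ID = "your-zone-id";
  const API_TOKEN = "your-api-token";

  const res = await fetch(
    `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/purge_cache`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${API_TOKEN}`,
        "Content-Type": "application/json",
      },
      // Use { purge_everything: true } instead to flush the whole zone.
      body: JSON.stringify({ files: urls }),
    }
  );

  if (!res.ok) {
    throw new Error(`Cache purge failed: ${res.status} ${await res.text()}`);
  }
}

// Example: purge the pages affected by a newly published article.
// await purgeCloudflareCache(["https://example.com/", "https://example.com/news/my-article"]);
```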

Regarding your issue, I have a few questions to help me clarify what happened in your situation:

What was the target of the attack you experienced… was it your server (via the server’s IP address, or a domain which exposed your server’s IP address) or a domain proxied through Cloudflare?

Did you block (not on CF, but on your server’s firewall) http and https access (ports 80 and 443) from all IP addresses except Cloudflare’s?

Did you make CF cache everything, or was it caching just the static content?

Can you send a screenshot of the CF analytics showing the cached and non-cached requests at the time of the attack? I would like to figure out how much caching helps in preventing DDoS attacks.

Regards.

2 Likes

Hi Farooq,

You could also use IP intelligence, for instance Ipregistry (https://ipregistry.co), to detect the IP type of your incoming requests. It is possible to know with high probability whether an IP belongs to a hosting provider (which is often used by attackers) or is an already known source of attacks.
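If you want to try it, the lookup is a single HTTP call per IP. A rough sketch is below; the API key is a placeholder, and the response field names are from memory, so double-check them against the Ipregistry documentation:

```ts
// Rough sketch of an Ipregistry lookup to flag hosting-provider or known-abuser IPs.
// IPREGISTRY_KEY is a placeholder; the connection/security field names below are
// assumptions based on Ipregistry's documented response and should be verified.
const IPREGISTRY_KEY = "your-api-key";

async function looksLikeAttackerIp(ip: string): Promise<boolean> {
  const res = await fetch(`https://api.ipregistry.co/${ip}?key=${IPREGISTRY_KEY}`);
  if (!res.ok) return false; // fail open rather than blocking legitimate visitors

  const data: any = await res.json();
  const fromHostingProvider = data?.connection?.type === "hosting";
  const knownBadReputation =
    data?.security?.is_abuser === true || data?.security?.is_threat === true;

  return fromHostingProvider || knownBadReputation;
}

// Example: challenge or block the request when the check is positive.
// if (await looksLikeAttackerIp(clientIp)) { /* return a 403 or a challenge page */ }
```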

If you give it a try, I would be interested in the results.

You simply can't cache non-static content, come on…

Yes you can.

We are using it successfully on a few of our sites. All pages are cached with an expiration time of a few hours, and we have integrated the CF API into the CMS to purge the cache whenever an editor publishes or modifies some content.

1 Like

Then in your case it may work; dynamic is dynamic for a reason, and it can vary as much as you can imagine. If you have a blog, then it's more suitable, but it's not for most pages.

I use Workers to add full-page caching for my WordPress site, with a plugin that purges the cache whenever a page or post is published/updated. It really helps improve performance, better than local WordPress caching plugins, which tend to store generated content as files.

I've modified it a bit for my needs, but I started from Cloudflare's example Worker on GitHub…
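For anyone curious, the core of such a Worker is just the Workers Cache API. Here is a stripped-down sketch of the idea, not the actual Cloudflare example and without the WordPress-specific bypass rules (logged-in cookies, /wp-admin, POST requests, and so on) that a real setup needs:

```ts
// Stripped-down edge full-page-cache sketch using the Workers Cache API.

export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    // Only cache plain GET page views.
    if (request.method !== "GET") {
      return fetch(request);
    }

    const cache = caches.default;
    const cacheKey = new Request(request.url, request);

    // Serve from the edge cache when possible.
    const cached = await cache.match(cacheKey);
    if (cached) {
      return cached;
    }

    // Otherwise fetch from the origin and store a copy for a few hours.
    const originResponse = await fetch(request);
    const response = new Response(originResponse.body, originResponse);
    response.headers.set("Cache-Control", "public, max-age=14400"); // 4 hours

    ctx.waitUntil(cache.put(cacheKey, response.clone()));
    return response;
  },
};
```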

The plugin, for those who use WordPress and want to try this, is here…

But you're right that if it's fully dynamic (for example, a page that displays the current time generated on the server using PHP), then it can't be cached, as the time would be the time the cache was created, not the current time. I have had this problem before and resorted to using JavaScript instead of PHP to get the current time client-side, so that I could still cache the page without losing the functionality.
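As a tiny illustration of that workaround (the element ID is hypothetical): the cached HTML ships a placeholder such as `<span id="current-time"></span>`, and the browser fills it in on load:

```ts
// Client-side fill-in for a value that must stay live on a cached page.
const el = document.getElementById("current-time");
if (el) {
  el.textContent = new Date().toLocaleTimeString();
}
```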

Yes of course, it depends on the site usage…

You can use this principle even if you have user-generated content on the site; for example, purge the cache when a user posts a comment.

We are also currently implementing this principle on another website which works as an aggregator: it parses other websites and generates content automatically without human editing. Through the CF API we'll add cache-clearing functionality - the CF cache will be purged after the parsing cronjob script has finished parsing all the sites.

Generally, whether caching dynamic content is a good idea for a site depends on these two questions:

  • How often is your content updated?
  • How soon do you need to display the content? (Is real time required, or is a delay acceptable?)

For example, if your content is updated every second and you have to display it right away, then caching of dynamic pages won’t be a viable solution.

It all boils down to understanding how your site works and making the appropriate decision.

1 Like

There has been a rash of attacks coming from the 35.245/ IP range, which is part of Google's network (Google Cloud Platform has been, and is still being, abused for various attacks), including XML-RPC attacks as well as extremely excessive POSTing of ACME challenges. And that is not the only range under Google's “control”.

Also, be aware of an infected machine in the Netherlands that has been responsible for a rash of attempted hacking, port scans, and web application attacks. These seem to occur in ~24h spurts, then subside, then begin again after 2-3 days. Most are POST requests, though a small few use GET.

There has also been port scanning and attempted targeted attacks from an IP address located in China whenever they find port 2083 open, which is used for both Secure RADIUS services and cPanel logins (though :80 can also be used for cPanel). They also use POST requests.

Prior to being hit by any of these, I luckily had a standard firewall rule in place that prohibits most request methods, including POSTs, except in the specific places where a POST is expected. That one rule effectively blocked, and is still blocking, each and every one of these attacks. My WAF catches the GET requests, as they are rightly interpreted as attacks.

I should note that I know of these attacks not only from recent personal experience but also from information submitted by other members of https://www.abuseipdb.com, an excellent resource for all site owners to contribute to.
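The rule described above lives in the Cloudflare firewall, but the same "deny unusual methods except where a POST is expected" idea can be sketched as Worker logic for illustration. The allowed-POST paths here are placeholders, not the actual rule:

```ts
// Illustration only: the same method-restriction idea expressed as Worker logic.
// The allowed-POST paths are placeholders for wherever your site legitimately accepts POSTs.
const SAFE_METHODS = new Set(["GET", "HEAD"]);
const POST_ALLOWED_PATHS = new Set(["/wp-comments-post.php", "/contact"]);

function shouldBlock(request: Request): boolean {
  const path = new URL(request.url).pathname;
  if (SAFE_METHODS.has(request.method)) return false;
  if (request.method === "POST" && POST_ALLOWED_PATHS.has(path)) return false;
  return true; // stray POSTs, XML-RPC abuse, and other unexpected methods get rejected
}

// In the fetch handler:
// if (shouldBlock(request)) return new Response("Method not allowed", { status: 405 });
```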

Also of note for your specific domain: I see you're getting hit from Tor. That is not only unfortunate but also odd, as it's an inefficient network to use for DDoS attacks because of the very factor that makes DDoS attacks work: time. Your domain itself, though, will obviously draw a lot of traffic from Tor, so I'd be cautious about blocking that network. Another option would be to have a separate, Tor-specific domain and site, which would draw that traffic away from your regular domain/site. The Tor Project has excellent documentation at https://2019.www.torproject.org/docs/tor-onion-service.html.en which will lead you through the process step by step.

Good luck!

2 Likes

This topic was automatically closed after 30 days. New replies are no longer allowed.