My sites were being excessively crawled by bots

My server's memory usage was spiking, so I asked my hosting company to look into it.
They told me this:

* We noticed that your sites were being excessively crawled by bots:

[domlogs]# grep -Pc "[Bb]ot" $(find . -maxdepth 1 -type f -size +300k -print) ./ ./ ./ ./ ./ ./ ./ ./ ./ ./ ./

How can I block these bot attacks? Please help.
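Before blocking anything, it helps to see *which* bots are doing the crawling. A sketch of that, using made-up sample log lines in place of a real Apache access log (on cPanel servers the real files live under the domlogs directory your host grepped above):

```shell
#!/bin/sh
# Sketch: rank user agents by request count in an access log.
# The log below is a fabricated sample in Apache "combined" format,
# standing in for one of your real domlogs files.
cat > /tmp/sample_access.log <<'EOF'
1.2.3.4 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; SemrushBot/7~bl)"
5.6.7.8 - - [01/Jan/2024:00:00:02 +0000] "GET /page HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; AhrefsBot/7.0)"
1.2.3.4 - - [01/Jan/2024:00:00:03 +0000] "GET /feed HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; SemrushBot/7~bl)"
9.9.9.9 - - [01/Jan/2024:00:00:04 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0 (Windows NT 10.0) Firefox/120.0"
EOF

# In combined log format the user agent is the 6th double-quoted field,
# so split on '"' and count occurrences, most frequent first.
awk -F'"' '{print $6}' /tmp/sample_access.log | sort | uniq -c | sort -rn
```

The top few lines of the output tell you whether you are dealing with one aggressive crawler (block its user agent or IP range) or generic scraping (rate limit or challenge).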

Bots tend to do this at times. You can exclude all known bots from crawling your site with a Firewall Rule:


Or block the IP with an IP Access Rule if it's one bad actor or IP range.


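For reference, a Firewall Rule expression along these lines can do it. The field names are from the Cloudflare Rules language; the `"bot"` substring match is just an example pattern, so adjust it to whatever you actually see in your logs:

```
# Match requests whose user agent looks like a bot but is NOT a
# Cloudflare-verified bot (so Googlebot, Bingbot etc. still get through):
(http.user_agent contains "bot" and not cf.client.bot)

# Suggested action: Block, or Managed Challenge if you want to be gentler.
```

`cf.client.bot` is true only for crawlers Cloudflare has verified as legitimate, which is what makes the "exclude all known bad bots" idea workable without breaking search indexing.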
Sometimes bots just scan the IPv4 space and find your host. In addition to what was already suggested, I also recommend using Authenticated Origin Pulls (but only if you expect all HTTP traffic to come from Cloudflare). Actually, never mind: you probably don't have access to the necessary configuration files if you are on a shared hosting provider.

I read it as Access Rifle; I guess that could also be used to defend against bots… :wink:


If you can afford to pay, the only long-term solution I know of is to enable rate limiting.
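If you do control the web server itself (usually not the case on shared hosting), the same idea can also be sketched at the origin with nginx's built-in rate limiting. The zone name and the limits here are arbitrary example values, not a recommendation:

```
# nginx sketch: at most 10 requests/second per client IP,
# with a burst allowance of 20 extra queued requests.
limit_req_zone $binary_remote_addr zone=perip:10m rate=10r/s;

server {
    listen 80;

    location / {
        limit_req zone=perip burst=20 nodelay;
    }
}
```

Requests beyond the limit get a 503 by default, which stops a runaway crawler from exhausting PHP-FPM workers and memory, the symptom described at the top of this thread.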

I think I fixed it, though I am not sure it was actually a bot attack. Anyway, I set up a challenge for traffic from some countries (Cambodia, China, Russia, Ukraine, etc.) for all my accounts. That seems to work, and it is easy to do.
However, in WHM I noticed that the apache_php_fpm service was the one using most of the memory, so I downgraded those domains from PHP 7.2 to 7.1.
I also noticed that on some domains the Google Search Console plugin was disconnected, or a domain was registered as http in Google's webmaster tools while my server served it over https, so the crawler was being redirected every time it tried to crawl.
My server has had very good performance for almost a day now.
I am not sure which change did the job, but I am happy now :)))
Thanks to all of you who tried to help me.
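For anyone wanting to reproduce the country challenge as a single Firewall Rule rather than clicking through each country, an expression like this should be equivalent. The ISO codes are my mapping of the countries mentioned above, and this is a sketch, not tested against a live zone:

```
# Challenge traffic from Cambodia, China, Russia and Ukraine:
(ip.geoip.country in {"KH" "CN" "RU" "UA"})

# Suggested action: Managed Challenge
```

A challenge (rather than a block) lets real visitors from those countries through after a quick check, while most automated crawlers fail it.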


This topic was automatically closed after 30 days. New replies are no longer allowed.