Unable to prevent bots and website scanning

Hi,
For the past couple of weeks I have been seeing a lot of traffic on my homepage, much of it showing an undefined browser or using curl and similar tools. When I analyzed the IP addresses, I found that many of them are not trustworthy. And if I let the traffic reach the homepage, after a couple of hours my web server slows down and becomes almost inaccessible. I kept the homepage in “I’m Under Attack” mode, which effectively blocks such traffic, but it hurts the user experience. I kept that mode enabled for a couple of weeks, but as soon as I turn it off the malicious traffic starts again. Today I set the home page to security level “High”, but I can still see a few visits with an undefined browser and from disreputable IPs.

I cannot keep the home page in “I’m Under Attack” mode for long, as it does not give a good user experience. My question is: how should I prevent this attack? Any help will be much appreciated. Thanks.

You may benefit from using Firewall Rules to filter unwanted traffic.

If you find any common denominator in the unwanted traffic (user agent, IP ranges, AS numbers, etc.), you can set those up as an expression and serve a JS challenge or block it, as in the sketch below.
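A minimal example expression, assuming you have identified a suspicious AS number and a user-agent substring from your logs (both values here are placeholders; AS64496 is a documentation-reserved ASN, so replace them with whatever actually shows up for you):

    (ip.geoip.asnum eq 64496) or (http.user_agent contains "curl")

You would then pick JS Challenge (or Block) as the rule’s action in the Firewall Rules dashboard.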


Thanks, martin2, for the response. I will explore this option.

Hi,
I have another related query. Many of the malicious requests show the browser as “undefined”, for example:

Browser: undefined
WinHTTP

or

Browser: undefined
Typhoeus (a Ruby library that wraps libcurl)

If I want to block such requests, what should the firewall rule be? Should it be something like “user agent contains ‘undefined’”? And in that case, should I add another rule to allow good bots like Googlebot?

Thanks,

If you match the whole User-Agent string, there is no issue with Googlebot, as it has its own specific string, different from the ones showing up as undefined.
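For example, a rule built from the samples you posted might look like this. The substrings come from your two log entries; “undefined” is usually just the dashboard’s label for a browser it cannot classify, so you match the raw User-Agent values instead, and the empty-string check is an extra guess to catch clients that send no user agent at all:

    (http.user_agent contains "WinHTTP") or (http.user_agent contains "Typhoeus") or (http.user_agent eq "")

Set the action to Block or JS Challenge as you prefer.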

Is the malicious traffic targeting your server directly, or is it coming via Cloudflare?

All traffic to my website is routed through Cloudflare only. The problem is that even though I keep the security level on the home page at High and challenge all IPs that do not have a threat score of 0, some of the traffic still gets through, though in much smaller numbers, for example 8–10 visits per day. For many of them the browser is undefined, and the user-agent string varies. They visit the home page and leave. But if I set the security level to Medium and let the traffic in, my server slows down a lot and becomes almost inaccessible.

When I do a little research on the IP addresses, I see they are reported on various abuse-tracking sites with a confidence of abuse of 10–30%. Keeping the security level at High definitely helps, but it does not stop them completely.
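For reference, what I described above is roughly equivalent to a firewall rule with a JS Challenge action on an expression like this (the path assumes the home page is at “/”, and “score not 0” translates to greater than 0, since threat scores are non-negative):

    (http.request.uri.path eq "/") and (cf.threat_score gt 0)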

