I am using custom rules to block requests for info.php, filemanager.php, etc., but when I check my website's own firewall, I see it blocking these requests there (which means Cloudflare is letting them through). I suspect I am using the wrong rules or not configuring them correctly.
What steps have you taken to resolve the issue?
I have tried several different rule configurations. It seems that http.request.uri.path contains "file-manager" should block /wp-content/themes/cay-van-phong/filemanager.php and also /wp-content/uploads/filemanager.php, but instead of Cloudflare blocking these requests, I see visitors blocked on my website itself for racking up 404 errors while trying to access these files (which do not exist). Again, it probably means I am not using the right rule or not configuring it properly.
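For context: contains is a literal, case-sensitive substring match, so a clause on the hyphenated "file-manager" by itself would never match /filemanager.php. Covering both spellings would take two clauses, something like:

(http.request.uri.path contains "file-manager") or (http.request.uri.path contains "filemanager")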
Was the site working with SSL prior to adding it to Cloudflare?
Yes
What is the current SSL/TLS setting?
Full (strict)
What are the steps to reproduce the issue?
Use http.request.uri.path contains "file-manager" as a block rule for your website, then try to access [the website url]/wp-content/themes/anytheme/filemanager.php. Cloudflare lets visitors through, and they reach a 404. Again, it probably means I am not using the right rule or not configuring it properly.
Sorry… that description is wrong; my rules do contain "filemanager" without the hyphen as well.
(http.request.uri.path contains "wp-config") or
(http.request.uri.query contains "wp-config") or
(http.request.uri.path contains "file-manager") or
(http.request.uri.query contains "wp-file-manager") or
(http.request.uri.path contains "php.ini") or
(http.request.uri.path contains "cgi-sys") or
(http.request.uri.path contains "cgi-bin") or
(http.request.uri.path contains ".shtml") or
(http.user_agent contains "ahrefs") or
(http.request.uri.path contains "cdn-cgi") or
(http.user_agent contains "semrush") or
(http.request.uri.path contains ".zip") or
(http.request.uri.path contains ".exe") or
(http.request.uri.path contains ".tar.gz") or
(http.request.uri.path contains "phpinfo") or
(http.request.uri.path contains "filemanager") or
(http.request.uri.path contains ".html") or
(http.request.uri.path contains "about.php") or
(http.request.uri.path contains ".ini") or
(http.request.uri.path contains "backup") or
(http.request.uri.path contains "ini.php") or
(http.request.uri.path contains ".php" and http.request.uri.path contains "/wp-content/")
We've had a lot of bots hitting the site. It used to be that blocking Russia, China, Belarus, Nigeria, etc. took care of most of them; now most are coming from bad/compromised hosts like DreamHost, GoDaddy, DigitalOcean, etc., or from compromised computers. We block many of the bad hosts by ASN, since servers on other web hosts shouldn't be crawling sites looking for vulnerabilities… but there is one horrible ASN we can't block, 8075, because that ASN carries both servers and end users, and blocking it would block a large portion of potential customers (as well as the website owners themselves). So I am using these rules to block the paths/filenames that the majority of bad bots are probing for.
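For reference, the ASN blocking itself is just a separate block rule over a list of network numbers, something like this sketch (using the same ASNs excluded in the skip rule below; 8075 is deliberately absent for the reason above):

(ip.geoip.asnum in {22612 26347 14555 14618})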
If you're not seeing any reason the rules above wouldn't block those paths, then there is one skip rule in the chain that could be the culprit… It allows known bots to skip all other rules, but only if their threat score is below 5 and they are not part of a handful of abusive services or blocked countries:
cf.client.bot and
cf.threat_score lt 5 and
ip.geoip.country ne "RU" and
ip.geoip.country ne "CN" and
ip.geoip.country ne "BY" and
ip.geoip.country ne "KP" and
ip.geoip.country ne "UA" and
ip.geoip.country ne "SA" and
not http.host contains "binance" and
ip.geoip.country ne "SG" and
ip.geoip.asnum ne 22612 and
not http.user_agent contains "ahrefs" and
not http.user_agent contains "semrush" and
ip.geoip.asnum ne 26347 and
ip.geoip.asnum ne 14555 and
ip.geoip.asnum ne 14618
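If that skip is what's letting the bad bots through, one option (a sketch, not something I've applied yet) would be to carve the probe paths out of the skip expression by appending clauses along these lines:

and not http.request.uri.path contains "filemanager" and
not http.request.uri.path contains "phpinfo" and
not http.request.uri.path contains "wp-config"

Alternatively, the Skip action can be scoped to skip only selected rules instead of all remaining custom rules, so the path blocks would still run for known bots.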
Darn it… I see the flaw in that one. I was not blocking info.php, only paths with phpinfo in them. After changing that, the blocking worked. I went back and tried /wp-content/themes/cay-van-phong/filemanager.php and Cloudflare is now blocking me, but not these bad bots. I guess the rule is right, but something in my skip expression is the issue.
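For anyone hitting the same thing: contains "phpinfo" matches /phpinfo.php but not /info.php. The fix amounts to a clause like the one below; since "info.php" is a substring of both /info.php and /phpinfo.php, it catches both:

(http.request.uri.path contains "info.php")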