Every day I have people/bots probing my domain, looking for files with common names (backup.zip, credentials.json, export.php, a.php all the way through z.php, wp_old, etc.). Since the files don't exist, they are redirected to my customized 404 page, and they keep going, sometimes hundreds of hits in succession until they get tired.
Would it make sense to create a Page Rule that triggers a Managed Challenge when someone lands on my 404 page? My goal is to slow down/discourage the probing.
Once in a while a bona fide user may land on the 404 page too (an outdated incoming link, a request for an icon in an unusual size, etc.). Would this bother them?
I do have a rule whitelisting the “good” bots (Google updating its index, etc.), so I believe those crawlers wouldn't be bothered.
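(For reference, that rule is essentially just an allow/skip on Cloudflare's Known Bots flag. A minimal sketch of such an expression, which may not be exactly what I have configured, is:

```
(cf.client.bot)
```

with the action set to Skip, or Allow in the older Firewall Rules UI.)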
Would it make sense to put a Managed Challenge on my 403 page too?
It seems like someone is using one of the GitHub repos of known “web shell paths”, or running a website/web-app vulnerability scanner against your domain. Maybe WPScan patterns could be found in some of those requests too.
However, I saw those coming from different IPs and ASNs.
If the request ends up reaching your origin host and only gets a 404 or 403 back from it, then it's not useful at all, since the probe has already hit your server.
I'd rather track them and tighten my Firewall Rules to block any request to those files, if possible. I'd even track the offending IPs or ASNs and block them too.
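For example, a Firewall Rule (now a WAF custom rule) with the action set to Block or Managed Challenge could use an expression along these lines; this is only a sketch, and the paths are just the examples mentioned in this thread:

```
(http.request.uri.path in {"/backup.zip" "/credentials.json" "/export.php" "/wp_old"})
```

A separate rule matching on ip.geoip.asnum or ip.src can then cover the ASNs and IPs you decide to block.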
If they hit the challenge at the end of their FIRST poke, wouldn't that stop them from continuing to pound my site? I'd consider that very useful.
I tried that, but they are looking for files that don't exist, and their imagination is always expanding. Some try a, aa, aaa, … and so on, plus upload, uploader, edit, editor, index_old, index2, login, remote_login, private, private_old, etc…
I get hundreds of those a day. Trying to block specific path names would be a never-ending game of whack-a-mole.
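About the closest I could get to a generic rule is matching on extensions my site never serves. Purely as a sketch (and assuming .zip, .sql and .bak really never appear in legitimate URLs here), something like:

```
(lower(http.request.uri.path) contains ".zip")
or (lower(http.request.uri.path) contains ".sql")
or (lower(http.request.uri.path) contains ".bak")
```

but even that only covers a fraction of the names they invent.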