Thanks. Why is it normal that Bing Bots and Digital Ocean try to access my WP Admin?
As you can see in the last screenshot in #1, I have a firewall rule that blocks Digital Ocean from doing so. Should I adjust the rule?
Thanks. I didn’t block Digital Ocean in IP Access Rules; I blocked it in this rule. I turned the Known Bots exclusion off after I saw your first response. Is it still normal to see Digital Ocean trying to access my site?
It isn’t. It’s kind of easy to get confused when dealing with the Known Bots exclusion in Firewall Rules.
Known or malicious, a bot has no business probing your website backend, or crawling your login page.
You should definitely use the Known Bots exclusion to avoid blocking/challenging search engine crawlers when a rule is based on general, wide criteria, such as country or ASN. Since their crawlers come from the cloud, you never know which countries/ASNs/IPs they will use to crawl your pages.
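As a sketch of what that looks like (the country code is a placeholder; field names follow Cloudflare's rule-expression language, where `cf.client.bot` matches verified Known Bots):

```
# Action: Managed Challenge
# Wide country-based rule that still lets verified crawlers through
(ip.geoip.country eq "XX" and not cf.client.bot)
```

The `not cf.client.bot` clause is what the Known Bots exclusion toggle adds under the hood, so verified Googlebot/Bingbot traffic from that country is not challenged.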
But in my view, you should not exclude Known Bots from rules targeting bad behavior, such as crawling /wp-admin/, /wp-login.php, and any other sensitive areas of your site. These are not pages meant to be indexed anyway.
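A minimal sketch of such a rule, with no Known Bots exclusion (the admin-ajax.php carve-out is a common addition I'm assuming here, since many WordPress themes and plugins call it from the front end for logged-out visitors):

```
# Action: Block -- no Known Bots exclusion; verified crawlers have no business here
(http.request.uri.path contains "/wp-login.php") or
(http.request.uri.path contains "/wp-admin/"
 and not http.request.uri.path contains "/wp-admin/admin-ajax.php")
```

If your site never serves logged-out AJAX requests, you can drop the carve-out and block /wp-admin/ outright.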
As for the Bing user agent showing up in Firewall Events log: for rules where you do have the Known Bots exclusion enabled, you’ll see occasional log entries with Bing or Google user agents. These may be requests from bots pretending to be the real search engines.
With IP Access Rules, you cannot set exclusions such as Known Bots, paths, etc. You should always prefer to incorporate these wide restrictions into your Firewall Rules. A hosting provider ASN may be home to many good bots (think of online services you depend on, such as page speed testers, header checkers, etc.), even some VPNs, as well as malicious bots. For that reason I’d prefer Managed Challenge instead of Block, and add exclusions for Known Bots and some paths (robots.txt, ads.txt, etc.).
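Put together, such a Firewall Rule might look like the sketch below (14061 is the ASN commonly associated with DigitalOcean; verify the ASN and the path list against your own setup before using it):

```
# Action: Managed Challenge
# Challenge hosting-provider traffic, but skip verified bots and public-by-design paths
(ip.geoip.asnum eq 14061
 and not cf.client.bot
 and not http.request.uri.path in {"/robots.txt" "/ads.txt" "/app-ads.txt" "/favicon.ico"})
```

Managed Challenge lets the legitimate automated services in that ASN fail gracefully (or pass, for real browsers behind a VPN), whereas a hard Block would cut them all off.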
The first two paths are the home page and the main page of the site. I want them to be cached and edge-cached as often as possible, so I let bots visit them. Robots.txt should be open to all bots. Most bots we think of as malicious are just crawlers for various web services, and will respect robots.txt directives (check the link @michael posted for instructions on how to set yours). Ads.txt and app-ads.txt should optionally be allowed so that malicious ad vendors don’t use your domain to sell ads without your consent. I also allow favicon.ico just to avoid clutter in my Firewall logs.
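For reference, a typical WordPress robots.txt that keeps well-behaved crawlers out of the back end looks like this (the admin-ajax.php Allow line is the usual WordPress convention, since front-end features may depend on it):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-login.php
```

Note that robots.txt only deters compliant crawlers; the Firewall Rules above are what actually stop the rest.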
This rule is of course combined with other rules that are stricter about bad behavior (blocking remote access to PHP files, for instance).