After changing the Cloudflare Known Bots rule from Block to Skip, we are seeing many more attempts to probe and break into our web apps. The Known Bots rule, when set to Skip, also lets through some bad bots. When it was set to Block, it eliminated almost all of that. At this point the only bots we care about are the good Google bots, so that we don't lose organic search. Is there an easy way to block everything except the Google bots?

We've tried adding exceptions under Managed Rules for all of the Google IPv4 and IPv6 ranges posted on Google's developer website, but when we check from Google Search Console whether the URL is reachable, it returns a 403 error. The only time we don't get a 403 is when Known Bots is set to Skip, but as noted above, that creates a new set of issues. Please advise how we could allow only the good Google bots until we can work out another solution. Thanks.
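For reference, this is roughly the kind of rule expression we have been experimenting with. This is only a sketch of what we are aiming for, not something we have confirmed fixes the 403: cf.client.bot, http.user_agent, and ip.src are standard Cloudflare Rules language fields, the `#` lines are just annotations (not valid expression syntax), and the IP range is a placeholder for the actual lists from Google's developer site.

```
# Skip/allow expression we are aiming for: let only verified Google crawlers through,
# then block the remaining verified bots with a lower-priority rule.
(cf.client.bot and http.user_agent contains "Googlebot")

# Managed Rules exception we actually tried (placeholder CIDR shown here;
# in practice we pasted Google's published IPv4 and IPv6 ranges):
(ip.src in {203.0.113.0/24})
```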