I created a firewall rule that is meant to say: if the request is for robots.txt and the client is not Google or Bing, then block it. But it's blocking all requests instead.
The rule looks like this: (ip.geoip.asnum ne 8075) or (ip.geoip.asnum ne 15169 and http.request.uri.path eq "/robots.txt")
I already have another mechanism in place, a script on the site that verifies search engines via rDNS, but I wanted to add this rule to cut down on the log entries I have to pull down over FTP.
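For reference, the rDNS verification the script performs can be sketched roughly like this. This is a minimal, hypothetical sketch (not the actual script), using the double-lookup method Google and Bing document for verifying their crawlers: reverse-resolve the IP, check the hostname suffix, then forward-resolve the hostname and confirm it maps back to the same IP.

```python
import socket

# Suffixes published by Google and Bing for crawler verification.
ALLOWED_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def is_verified_crawler(ip: str) -> bool:
    """Verify a claimed search-engine crawler via reverse + forward DNS."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse DNS: IP -> hostname
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith(ALLOWED_SUFFIXES):
        return False
    try:
        forward_ips = socket.gethostbyname_ex(host)[2]  # forward DNS: hostname -> IPs
    except socket.gaierror:
        return False
    return ip in forward_ips  # forward record must point back at the original IP
```

A random or spoofed IP fails either the suffix check or the forward-confirmation step, so a fake "Googlebot" user agent alone never passes.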