Was the site working with SSL prior to adding it to Cloudflare?
Yes
What is the current SSL/TLS setting?
Full
What are the steps to reproduce the issue?
There are fake Google bots that access my website for various purposes (screenshot attached). An IP address that has nothing to do with Google is impersonating Googlebot. How can I allow only real Google bots?
Use the expression below to allow only the real Googlebot to reach your sitemap.xml and robots.txt files; make sure the action of the rule is set to "Block":
(http.request.uri.path contains "robots.txt" or http.request.uri.path contains "sitemap.xml") and not (ip.src.asnum in {15169} and http.user_agent contains "Googlebot")
You could also use the wildcard operator, since it matches case-insensitively ("Googlebot" and "googlebot" alike); note the `*` anchors, which are needed for a substring match, and make sure the action of the rule is set to "Block": (http.user_agent wildcard r"*google*" and not ip.src.asnum in {15169})
Helpful article on how to create a Custom WAF Rule:
Make sure to block the IP addresses below (from Google), since as far as I am aware they have been used for HTTP DDoS attacks:
35.208.148.101
35.209.81.23
Add them to the IP Access Rules with action Block for your Website.
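For a server-side double check, Google's own documented way of verifying Googlebot is a reverse DNS lookup followed by a forward confirmation. A minimal sketch in Python (the function names are my own; the .googlebot.com / .google.com suffixes are the ones Google publishes for its crawlers):

```python
import socket

# Hostname suffixes Google publishes for its crawlers.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_google_host(hostname: str) -> bool:
    """Pure string check: does the reverse-DNS name end in a Google domain?"""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def is_real_googlebot(ip: str) -> bool:
    """Reverse-DNS the IP, check the domain, then forward-confirm that the
    hostname resolves back to the same IP (Google's documented method)."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except OSError:
        return False
    if not looks_like_google_host(hostname):
        return False
    try:
        # Forward lookup must include the original IP, or the PTR is spoofed.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except OSError:
        return False
```

The leading dot in the suffixes matters: it rejects lookalike domains such as "evil-googlebot.com" while still accepting real crawler names like "crawl-66-249-66-1.googlebot.com".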
Actually, my aim is to block all bots except Google's from accessing my website, and I am looking for a solution for that. I have no use for any bots other than Google's. Many bots crawl sites to scrape content and images, run analysis, and so on.
Many of them spoof well-known user agents, so you cannot block them all with a predefined list; they behave convincingly like real visitors and even connect from local ISP addresses as part of a botnet.
You'd have to combine it with Bot Fight Mode, Block AI Bots, Browser Integrity Check, and other security options such as blocking ASNs with IP Access Rules.
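As one starting point for that combination, a custom-rule sketch (my own suggestion, not a tested rule) would be to block traffic that Cloudflare has verified as a bot but that does not come from Google's network; with the action set to "Block", this catches verified crawlers like Bingbot while leaving Googlebot alone. Unverified bots will not match cf.client.bot, so they still need Bot Fight Mode or the other options above to catch:

```
(cf.client.bot and not ip.src.asnum in {15169})
```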