Hi, I am trying to prevent hotlinking of images from my websites by creating a Firewall Rule, but it needs to allow search engines like Google and Yahoo to index my photos. Searching the web, I haven't found a Firewall Rule that is confirmed to work.
I have come up with the solution below. (I am not a programmer; I can manage copy and paste.)
(http.request.method eq "GET" and http.referer ne ".example.com" and not http.user_agent contains "(googlebot|cf.client.bot|yahoobot)" and http.request.uri.path eq "/content/jpg/")
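For reference, here is a cleaned-up sketch of that expression (assuming `example.com` is your domain and `/content/jpg/` is your image directory; adjust both to match your site). Note that `contains` does plain substring matching rather than regex, and `cf.client.bot` is a standalone boolean field for verified bots, so neither belongs inside the user-agent string:

```
(http.request.uri.path contains "/content/jpg/"
 and http.referer ne ""
 and not http.referer contains "example.com"
 and not cf.client.bot)
```

With the action set to Block, this stops image requests whose referer is some other site, lets verified crawlers such as Googlebot through via `cf.client.bot`, and the `http.referer ne ""` clause avoids blocking direct requests, which send no referer at all.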
Yes, try the built-in setting: "Hotlink Protection - Protect your images from off-site linking".
But then my photos will not be indexed by search engines like Google and Yahoo, if I have got it right.
If you click on Help, it says: "Hotlink protection has no impact on crawling, but it will prevent the images from being displayed on sites such as Google Images, Pinterest, etc."
I tried this solution, as well as others posted here, but it doesn't seem to block anything. All of the scraper sites hotlinking my images are still showing them, without exception.
In general, you can't actually stop scrapers from downloading and re-hosting your content. The firewall rule above and Cloudflare's built-in Hotlink Protection only prevent unsophisticated, general-purpose scrapers from seeing your content, and prevent other websites from using your bandwidth without permission (via `<img src>` embeds).
You can instead block the scrapers by IP address, ASN, or user-agent (most non-malicious scrapers respect robots.txt and keep their user-agent consistent).
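As a sketch of that approach, a blocking rule might look like the following. The ASN and the user-agent string here are made-up placeholders (AS64496 is a reserved documentation ASN): substitute the values you actually see in your server logs or Cloudflare's Firewall Events:

```
(ip.geoip.asnum eq 64496)
or (http.user_agent contains "BadScraperBot")
```

Blocking by ASN cuts off an entire hosting provider at once, which is useful when a scraper rotates IPs within one network, but be aware it can also catch legitimate traffic from that provider.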