Hi, I am trying to prevent hotlinking of images from my websites by creating a firewall rule, but it needs to allow search engines like Google and Yahoo to index my photos. Searching the web, I haven't found a firewall rule that is confirmed to work.
I have come up with the solution below. (I am not a programmer; I can manage copy and paste.)
(http.request.method eq "GET" and http.referer ne ".example.com" and not http.user_agent contains "(googlebot|cf.client.bot|yahoobot)" and http.request.uri.path eq "/content/jpg/")
Any help appreciated.
There is a dedicated hotlink switch, which will do exactly that.
Thanks for your prompt reply,
What do you mean by that?
Did you click the link and check out what it shows?
Yes, "Hotlink Protection: Protect your images from off-site linking"
But then my photos will not be indexed by search engines like Google and Yahoo, if I have got it right.
I believe that won't apply to search engines, but you might want to double-check this with support.
If you click on Help, it says: "Hotlink protection has no impact on crawling, but it will prevent the images from being displayed on sites such as Google Images, Pinterest, etc."
Of course it will, because you would be disabling hotlinking, and sites like Google Images rely on hotlinking to display your photos.
I don't have that hotlink switch on!
I'm looking for a way to stop other sites from hotlinking to my photos, but still allow search engines like Google and Yahoo to index them.
In that case you actually do want hotlinking (for the search engines) and hence can't disable it outright.
Your best bet would be a firewall rule of this type
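For reference, a minimal sketch of that kind of rule — the path `/content/jpg/` and domain `example.com` are placeholders, adjust both to your setup:

```
(http.request.uri.path contains "/content/jpg/"
 and not http.referer contains "example.com"
 and not cf.client.bot)
```

With the action set to Block, this stops cross-site image requests, while `cf.client.bot` exempts Cloudflare's list of known good bots (Googlebot, Bingbot, etc.) so search engines can still fetch the images.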
Thank you very much, I will try it out and see what happens.
I tried this solution, as well as others posted here, but it doesn't seem to block anything. All of the scraper sites hotlinking to my images are still showing my images, without exception.
In general, you can't actually stop scrapers from downloading and re-hosting your content - the above firewall rule and Cloudflare's built-in "Hotlink Protection" only prevent unsophisticated, general-purpose scrapers from seeing your content, and prevent other websites from using your bandwidth without permission (via hotlinking).
Beyond that, you can block the scrapers by IP address, ASN, or user agent (most non-malicious scrapers respect robots.txt and keep their user agent consistent).
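A sketch of such a blocking rule - the ASN here is a documentation-reserved placeholder and "BadScraperBot" is a made-up user agent, so substitute the actual values you see in your Firewall Events log:

```
(ip.geoip.asnum eq 64496)
or (http.user_agent contains "BadScraperBot")
```

Set the action to Block; you can identify the offending ASNs and user agents from the Firewall Events log on the Cloudflare dashboard.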
This topic was automatically closed 24 hours after the last reply. New replies are no longer allowed.