Is tricky to impossible. You'd have to work with the referrer, which can be faked or missing. Either way you'd still let some crawlers through or challenge legitimate visitors.
You'd need to extend your existing rule to check whether the referrer contains any of the allowlisted domain names. But again, that can easily be faked, and legitimate requests can have it missing, in which case you will challenge them too.
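As a rough sketch, such a rule could look like the expression below in Cloudflare's firewall rule language, paired with a Managed Challenge action. The `example.com` / `example.org` domains are placeholders for whatever you'd allowlist:

```
(not cf.client.bot
 and not http.referer contains "example.com"
 and not http.referer contains "example.org")
```

Keep in mind this only skips the challenge when the referrer happens to be present and matches, which, as mentioned, anyone can spoof.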
Thanks for the fast response, sandro,
but why, every time after I apply the (not cf.client.bot) rule,
does all traffic from search engines disappear?
As you can see in the following image,
there is only one user from Google.
Once I remove the rule I set in the firewall settings (not cf.client.bot),
it comes back to normal, which is 40-50 users.
Traffic is not gone, but requests that are not from well-known crawlers will be challenged. That essentially means all your regular visitors get the challenge.
It probably is better to find a pattern in the requests you want to block and block them explicitly.
I’d start by analysing the requests you’d like to block and try to find a pattern. That could be the country, the user agent, the IP address block, etc. Once you’ve found a pattern, you can implement a block either on your end or on Cloudflare.
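For illustration, a pattern-based block on Cloudflare might combine those signals in a firewall expression like the one below, with the action set to Block. The country code, user agent string, and IP range here are all made-up placeholders you'd replace with whatever your analysis turns up:

```
(ip.geoip.country eq "XX")
or (http.user_agent contains "BadBot")
or (ip.src in {192.0.2.0/24})
```

The advantage over `not cf.client.bot` is that this only matches the traffic you've identified as unwanted, instead of challenging everyone who isn't a verified crawler.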
That topic is very broad, and blocking such requests is not completely impossible but often very tricky. I would suggest you start with the basics: what HTTP is, what data you can get from an HTTP request, etc. It is not something you can cover in five minutes, however.