I’ve set up the following configuration: two basic firewall rules. The first rule allows all visitors who arrive from the Yandex or Google search engines (based on the referer), regardless of the visitor’s IP. The second rule sends visitors whose IP is on a blacklist to a challenge (captcha).
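For reference, the two rules described above can be sketched in Cloudflare’s rule expression language. This is a minimal sketch, not my exact rules; the list name `$blacklist` is a placeholder for whatever IP list is actually configured:

```
# Rule 1 — action: Allow
(http.referer contains "yandex.") or (http.referer contains "google.")

# Rule 2 — action: Managed Challenge (captcha)
(ip.src in $blacklist) and (http.referer eq "")
```

Rule order matters here: the Allow rule is evaluated first, so any request carrying a matching referer skips the challenge entirely.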
On a site with low traffic, everything works correctly and all bots are blocked.
On a site with high traffic, this firewall configuration fails and lets some of the bots through the trap. In the firewall log you can see that all bots with the matching referer and a blacklisted IP hit the challenge, but some of them are let through by the firewall even though they should be blocked. Apparently Cloudflare cannot cope with this configuration under heavy load. How can this be fixed so that bots can no longer bypass the trap?
The configuration contains two basic rules. The first rule lets through all visitors whose request carries a Yandex or Google referer. The second rule filters bots that are on the IP blacklist and have an empty referer, i.e. that try to access the site directly. The same configuration behaves differently on different sites. Where the connection load is small, all bots are stopped at the challenge. Where the load is high (about 40 thousand requests per day), some bots get past the challenge captcha: the firewall log shows that they hit the trap, but then somehow bypassed it.
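To quantify how many blacklisted IPs actually slipped past the challenge, the exported firewall events can be tallied offline. A minimal sketch, assuming the events are exported as JSON lines; the field names `clientIP` and `action` are assumptions here and should be adjusted to match the actual export format:

```python
import json
from collections import Counter

def summarize_events(lines, blacklist):
    """Count which firewall actions blacklisted IPs actually received.

    `lines` is an iterable of JSON strings, one firewall event each;
    `blacklist` is a set of IP strings. Field names are assumed, not
    taken from any specific Cloudflare export schema.
    """
    actions = Counter()
    for line in lines:
        event = json.loads(line)
        if event.get("clientIP") in blacklist:
            actions[event.get("action")] += 1
    return actions

# Hypothetical sample data illustrating the symptom: the same
# blacklisted IP is challenged once but allowed another time.
sample = [
    '{"clientIP": "203.0.113.7", "action": "challenge"}',
    '{"clientIP": "203.0.113.7", "action": "allow"}',
    '{"clientIP": "198.51.100.2", "action": "challenge"}',
]
print(summarize_events(sample, {"203.0.113.7"}))
```

A non-zero `allow` count for blacklisted IPs would confirm that the requests matched the allow rule (e.g. via a spoofable referer) rather than being dropped by the challenge itself.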
I suspect that with this configuration Cloudflare does not manage to process such a large number of simultaneous bot requests.
Is it possible to fix this? And is there a way to analyze this situation together with Cloudflare specialists, and what would be needed for that?