BOTS Causing Performance Issues

We are facing performance issues with respect to bots. If we mark the bots as “Blocked”, we get a proper response from the site. However, if we allow the bots, traffic and CPU usage on the site increase dramatically, and after some time the site becomes difficult to use. We do want bots to visit the site, but are there proper ways we can regulate them?

Use robots.txt to only allow selected bots and block the rest. Also, if you’re using WordPress, you can use the Blackhole for Bad Bots plugin to block those bots that don’t respect your robots.txt.
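As a sketch, a robots.txt like the following allows a couple of well-known crawlers and disallows everything else (adjust the allowed list to your needs; this only works for bots that actually honor robots.txt):

User-agent: Googlebot
Disallow:

User-agent: Bingbot
Disallow:

User-agent: *
Disallow: /

An empty Disallow line means "nothing is disallowed" for that agent, while Disallow: / blocks the whole site for everyone not matched by a more specific User-agent group.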

You can also specify a crawl delay for all agents or only certain ones:

User-agent: *
Crawl-delay: 3

Not all bots honor the Crawl-delay directive (Googlebot, notably, ignores it). For Google, you can use Search Console > Site Settings to control the crawl rate instead.

Need help with setting up strong security on our website

What are the recommendations?