Essentially, there are five methods for blocking and/or redirecting bad bots:
(1) Via .htaccess file
(2) Via plugin (e.g., SG Optimizer, Wordfence, Blackhole for Bad Bots)
(3) Via CDN (e.g., Cloudflare, KeyCDN)
(4) Via robots.txt file
(5) Via host server (built-in code)
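For context on method (1), a typical .htaccess approach matches known bad bots by User-Agent and returns a 403. This is only a sketch; the bot names below are common examples, not a vetted list, and you'd substitute your own:

```apache
# Deny requests whose User-Agent matches a known bad bot (example names only)
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{HTTP_USER_AGENT} (MJ12bot|AhrefsBot|SemrushBot) [NC]
  RewriteRule .* - [F,L]
</IfModule>
```

The downside is the same as with robots.txt: the list has to be maintained by hand, and every request pays the cost of the regex match on the server itself.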
To protect server resources and achieve the highest level of protection, which method would you recommend?
At a glance, the robots.txt file method seems the most logical way to go, but robots.txt is purely advisory: well-behaved crawlers honor it, while bad bots typically ignore it entirely. And maintaining tens if not hundreds of entries in it doesn't seem efficient either.
Currently, we are blocking known bad bots via Cloudflare's Firewall Rules, and it seems to be working pretty well.
Note: We're using CF's Free Plan, so we don't have access to the bot mitigation features offered on CF Pro and up.
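For reference, the kind of Firewall Rule we mean uses Cloudflare's expression language to match the User-Agent and apply a Block action. A simplified sketch (the bot names are placeholders for whatever list you maintain):

```
(http.user_agent contains "MJ12bot") or (http.user_agent contains "AhrefsBot")
```

The advantage over .htaccess or robots.txt is that matching requests are dropped at Cloudflare's edge, so they never consume origin server resources at all.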
Thoughts on this appreciated.