I’ve been using either Medium or High security for a while now.
But looking at the WordPress Wordfence security plugin, there were many bots that somehow got through Cloudflare’s initial security defenses and were subsequently blocked by Wordfence.
I decided to enable ‘Under Attack’ mode on my site for the past few days, and saw a substantial reduction in ‘bad bots’, but a continued stream of search engine bots (I presume bots that were whitelisted by Cloudflare).
I personally don’t see any downside to having this on from a human perspective, since visitors simply get a 5-second interstitial declaring that the site is being checked for security.
Any thoughts on this?
I’ll keep testing this as always-on across all of my properties (currently 6-7).
I’ve seen no drop-off in real human traffic in Google Analytics so far.
Why would you lose visitors?
If anything, it’s filtering out fake ‘human’ traffic to your site.
I know that Microsoft Ads has a lot of fake traffic, as does Pinterest Ads, which they charge for as real traffic. That type of traffic would be filtered out by Cloudflare.
Hi Leo, sdayman is giving good advice (he almost always does). Users will hate you, hate your site, curse your name, and take their traffic elsewhere. A challenge is forgivable on something like an e-commerce page or even a contact form, but not on every page. Chalk it up to modern impatience, but it is what it is.
A good alternative is using firewall rules to block bad bots.
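As a sketch of what such a rule could look like (the bot names here are illustrative examples, not a vetted blocklist), a Cloudflare firewall rule expression with the action set to Block might be:

```
(http.user_agent contains "MJ12bot") or
(http.user_agent contains "AhrefsBot") or
(http.user_agent contains "SemrushBot")
```

You would add this in the dashboard under Firewall > Firewall Rules and choose Block as the action. Keep in mind that user-agent strings are trivially spoofed, so treat this as a first-pass filter rather than a guarantee.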
Hey Kenny, thanks for the feedback, but once again, I would love to see some actual benchmarks with this.
Also, the challenge only shows up every 30 minutes (the Challenge Passage setting).
Here are my WordPress analytics (unique users) for my internet marketing site Premium23.com. As you can see, there was no drop-off in users or user interaction.
Finally, while a LOT of bots have been blocked by Cloudflare, a few ARE still getting through and being blocked by Wordfence and the All in One SEO bad-bot blocker:
Bots blocked by Wordfence (even WITH Cloudflare Under Attack mode / TLS 1.2 / all protections enabled)
I wouldn’t worry about it, I would worry a lot more about the delay than the bots…
I guess you’re probably using a known framework like WordPress or whatever; as long as you keep everything updated, you’re good to go. If it has security vulnerabilities, it’s not just your problem, it’s a problem for half the internet, and there’s nothing you can do about it because the “big guys” will be able to break your defenses anyway.
You can’t block all bots, so just live with it. Even if you put a CAPTCHA on 100% of your site, there are still services like https://anti-captcha.com/
Yes, they’ll be using humans in low-paid countries.
Some of the clients of anti-captcha and similar services will be crawlers building graphs to help bigger businesses understand the web; hence the value-add.
Also be aware that blocking all bots means blocking Googlebot etc., so after a while you’ll get no traffic from Google or other search engines.
“wp_remote_post() test back to this server failed! Response was: 503 Service Unavailable
This additional info may help you diagnose the issue. The response headers we received were:”
I had to disable ‘Under Attack’ mode on all of my sites unfortunately.
If those are your server’s IP addresses it should be fine; just set them to Allow. If I’m thinking correctly, it amounts to the same thing, and Allow shouldn’t block anything.
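For example (203.0.113.10 is a placeholder; substitute your server’s actual outbound IP), an Allow rule so the server’s own wp_remote_post() loopback requests skip the challenge could be as simple as this expression with the action set to Allow:

```
(ip.src eq 203.0.113.10)
```

This should let WordPress talk to itself without hitting the 5-second interstitial, while leaving Under Attack mode on for everyone else.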
Edit: If you’re worried about good bots being blocked, add an Allow firewall rule for the user agents of the common bots. You can find them by googling “Search Engine Bot User Agents”. To help other people who might have the same issue, mark my answer as the solution.
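If I recall correctly, Cloudflare’s rules language also exposes a cf.client.bot field that matches its verified “known bots” list (Googlebot, Bingbot, and so on), which is harder to spoof than a plain user-agent match. If that field is available on your plan, a single Allow rule could cover the common search engine bots:

```
(cf.client.bot)
```

User-agent matching still works as a fallback, but anyone can send a fake “Googlebot” user agent, so the verified-bot field is the safer choice where it’s offered.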