“Good” bots from reputable websites getting 403 errors
What steps have you taken to resolve the issue?
Set Security > Bots > “Definitely automated” to “Allow”
This appears to be the only way to resolve the issue, as some important tools, e.g. schema.org and https://httpstatus.io, receive a 403 error (rather than the expected 200) when trying to access the site.
Was the site working with SSL prior to adding it to Cloudflare?
Yes
What is the current SSL/TLS setting?
Full (strict)
What are the steps to reproduce the issue?
Set Security > Bots > “Definitely automated” to “Block” or “Managed Challenge”
I would have thought the bot in question would be classified as manually triggered (by me) rather than as “Definitely automated”, and therefore not blocked?
I use a lot of web-based tools. The two I mentioned are just the ones I noticed in the past two days; I’m quite sure many other legitimate tools have been blocked over the past several months as well (and I’m a little worried that our search rank has dropped considerably if legitimate crawlers were blocked simply because they never filled out a verification form).
Surely the bigger recognized tools (schema.org, for example) should automatically be placed on at least a grey list, where they are allowed unless blocked by the user or by Cloudflare’s AI, or reported as bad by a significant number of people.
“A bot manager product allows good bots to access a web property while blocking bad bots. Cloudflare Bot Management uses machine learning and behavioral analysis of traffic across their entire network to detect bad bots while automatically and continually allowlisting good bots. Similar functionality is available for smaller organizations with Super Bot Fight Mode, now included in Cloudflare Pro and Business plans.”
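In the meantime, one partial workaround on plans with WAF custom rules is a rule that matches Cloudflare’s verified-bot signal and allows (or skips protection for) that traffic. This is only a sketch, not a guaranteed fix: `cf.client.bot` matches only bots on Cloudflare’s verified list, so tools that never registered with Cloudflare will still be treated as “Definitely automated”, and on some plans Super Bot Fight Mode may act regardless of custom rules.

```
(cf.client.bot)
```

Used as a WAF custom rule expression with the action set to Skip or Allow, this lets through requests Cloudflare has already verified as coming from a known good bot.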