Cloudflare firewall or security rules may be blocking Googlebot

What is the name of the domain?

What is the issue you’re encountering?

Google Search Console is reporting a high fail rate for server connectivity on my domain

What steps have you taken to resolve the issue?

Google Search Console is reporting a high fail rate for server connectivity on my domain, despite the website appearing consistently accessible to regular visitors. According to the Search Console data, Googlebot is encountering frequent 5xx errors or timeouts, suggesting there may be an intermittent issue at the hosting or network level. We have also verified that there were no intentional blocks or downtime on our server end.
We suspect one of our Cloudflare firewall or security rules is configured in a way that denies access to what it perceives as suspicious or automated traffic, even though it is coming from Googlebot.

What are the steps to reproduce the issue?

In Google Search Console, the server connectivity report shows a high fail rate of 25%.

Screenshot of the error

Hello, you can either view the zone traffic analytics to find the error code the Google crawler received, or look it up in Google Search Console to see the actual code. Once you identify the error code, you can refer to the 5xx error code documentation for more details.
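
If you want to probe this yourself, the minimal sketch below (Python standard library only; example.com is a placeholder for your domain) fetches a page with a Googlebot-style User-Agent and prints the status code. Keep in mind this only spoofs the User-Agent header, so the result can differ from what the real crawler sees: bot detection also looks at the source IP and other signals.

```python
import urllib.error
import urllib.request

URL = "https://example.com/"  # placeholder: use a page Googlebot reports as failing
GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(URL, headers={"User-Agent": GOOGLEBOT_UA})
try:
    with urllib.request.urlopen(req, timeout=10) as resp:
        print(resp.status)  # e.g. 200 if the request passes the security rules
except urllib.error.HTTPError as e:
    print(e.code)  # e.g. 403 from a firewall rule, or a 5xx from the origin
except urllib.error.URLError as e:
    print("connection failed:", e.reason)  # timeouts show up here
```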

Hello all,

We recently tightened our security settings and discovered that the DDoS protection rule, when set to medium or high, was causing the Google Search Console connectivity issues.

How did we get to that conclusion?
It was relatively simple, as we have 3 websites:

Initially we set the OWASP paranoia level to 3 in the managed rules, the OWASP security level to high, and DDoS protection to high (it was previously disabled).

Then we started reverting only some settings back to the previous configuration, and it turned out to be the DDoS protection. As such, we can confirm with a high level of confidence that DDoS protection was causing this issue.

Security consideration
We could leave DDoS protection off or lower it to low, but that would reduce security to a level that we, as a Managed Service Provider, consider simply insufficient. It also didn’t go down well with our internal CISO…

Final Solution
We decided to exclude Googlebot from the DDoS rules, which we can do with a custom rule. We did consider the risk of a potential attacker impersonating Googlebot and thereby being excluded from DDoS protection, but noted that disabling DDoS protection altogether would certainly reduce the level of security far more.

In our case we used an “or” condition for the three user agents we could see in the logs (we checked three months of logs), so the matching looks like:
Field: User-Agent
Operator: contains
Value: Googlebot
or
Field: User-Agent
Operator: contains
Value: Googlebot-Image
or
Field: User-Agent
Operator: contains
Value: Google-InspectionTool
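
The three clauses above combine into a single expression in the rule editor. The minimal Python sketch below mirrors the same “contains, joined with or” semantics, which can be handy for checking which User-Agent strings from your own logs the rule would exempt:

```python
# Equivalent dashboard expression (for reference):
#   (http.user_agent contains "Googlebot")
#   or (http.user_agent contains "Googlebot-Image")
#   or (http.user_agent contains "Google-InspectionTool")
GOOGLE_UA_TOKENS = ("Googlebot", "Googlebot-Image", "Google-InspectionTool")

def rule_would_match(user_agent: str) -> bool:
    """True if any token is a substring, mirroring the rule's contains/or logic."""
    return any(token in user_agent for token in GOOGLE_UA_TOKENS)

# Example against the real Googlebot desktop User-Agent string:
print(rule_would_match(
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
))  # True
```

Note that “Googlebot-Image/1.0” also contains the substring “Googlebot”, so the first clause alone would already match it; keeping all three clauses just makes the intent explicit.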

Downsides and issues:
There is no specific setting for a DDoS exclusion, so in the custom rule we used the skip option for the managed rules and also excluded rate limiting from being checked. Please make sure you place it as the last rule and don’t tick “all other custom rules”.
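
On the impersonation risk mentioned above: Google documents a way to verify that a request genuinely came from Googlebot, namely a reverse DNS lookup on the source IP that must resolve to a googlebot.com or google.com hostname and forward-confirm back to the same IP. Below is a minimal Python sketch of that check (the sample IP is illustrative only):

```python
import socket

def is_real_googlebot(ip: str) -> bool:
    """Forward-confirmed reverse DNS check, per Google's published guidance."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)  # reverse lookup
    except (socket.herror, socket.gaierror):
        return False
    if not host.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        # Forward-confirm: the hostname must resolve back to the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except socket.gaierror:
        return False

print(is_real_googlebot("66.249.66.1"))  # illustrative IP from Google's crawler range
```

Within Cloudflare itself, matching on verified bots (the “Known Bots” / cf.client.bot field, if available on your plan) rather than on the raw User-Agent header is another way to narrow the exclusion, since the User-Agent alone is trivially spoofable.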

Hope this helps!