Google bots cannot crawl website over HTTP/2


One month ago I received a message that Google systems would start crawling my website using HTTP/2. Since then I have been receiving errors about pages being unreachable, and two weeks ago the error rate increased drastically. In the Cloudflare logs I can see Google bots using HTTP/2, and on my server the requests return 200, yet Google reports that the bot cannot access the pages. After I disabled HTTP/2 in Cloudflare, the bots started reporting the site as accessible again.

Has anyone else had this issue?


I haven’t experienced that issue so far. Alternatively, Google says it is possible to opt out of HTTP/2 crawling: the server must be configured to respond with a 421 (Misdirected Request) status code when Googlebot attempts to crawl over HTTP/2.
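For reference, here is a minimal sketch of that opt-out for nginx, assuming nginx terminates TLS for the site directly (the user-agent pattern, variable name, and server name are illustrative, not taken from Google’s or Cloudflare’s documentation):

```nginx
# In the http{} context: combine protocol and user agent so a single
# map decides whether this is a Googlebot request arriving over HTTP/2.
map "$server_protocol:$http_user_agent" $reject_h2_googlebot {
    default                    0;
    "~^HTTP/2\.0:.*Googlebot"  1;  # Googlebot over HTTP/2
}

server {
    listen 443 ssl http2;
    server_name example.com;  # placeholder

    if ($reject_h2_googlebot) {
        return 421;  # Misdirected Request: Googlebot falls back to HTTP/1.1
    }
}
```

Behind Cloudflare this would not work as-is, since Cloudflare terminates the client connection; there the HTTP/2 toggle in the dashboard (as the original poster did) is the practical switch.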

Maybe Cloudflare needs to update its “known bots” list so that the new Googlebot HTTP/2 crawler is treated as a “good” one: confirm and verify its IP addresses, and allow its requests through?

Have you tried contacting Cloudflare support regarding this issue?

As stated in Cloudflare’s crawl error troubleshooting documentation:

If the above troubleshooting steps do not resolve your crawl errors, follow the steps below to export crawler errors as a .csv file from your Google Webmaster Tools Dashboard. Include this .csv file when contacting Cloudflare Support.

  1. Log in to your Google Webmaster Tools account and navigate to the Health section of the affected domain.
  2. Click Crawl Errors in the left hand navigation.
  3. Click Download to export the list of errors as a .csv file.
  4. Provide the downloaded .csv file to Cloudflare support.

Thanks for the response. In the meantime I have also contacted support :slight_smile:


You should read this.

This topic was automatically closed 5 days after the last reply. New replies are no longer allowed.