Googlebot cannot crawl

I submitted my website to Google Search Console, and the first couple of attempts failed because of an invalid robots.txt. After I fixed that issue, the page still didn’t work because of a strange redirect error. I cannot figure out why, or what to do about it.

Invalid in what sense? A syntax error in your robots.txt file, a misspelled URL, or the file cannot be fetched due to the selected security options for your domain in the Cloudflare dashboard?
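For reference, a minimal, syntactically valid robots.txt looks like this (the paths and the Sitemap URL are just placeholders, adjust for your site):

```
User-agent: *
Disallow: /admin/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

Each `User-agent` group is followed by `Allow`/`Disallow` rules; a stray character or a directive outside a group can be enough for Search Console to flag the file as invalid.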

Like? Can you provide an example?
HTTP to HTTPS, non-www to www, etc.?
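For context, Search Console’s “redirect error” usually means Googlebot gave up on a redirect chain or loop (e.g. HTTP → HTTPS → HTTP again, often caused by one redirect rule at Cloudflare and a conflicting one at the origin). A minimal sketch of such a loop, using only the Python standard library and a local test server (the path and port are arbitrary, not your site):

```python
import http.server
import threading
import urllib.error
import urllib.request

class LoopRedirectHandler(http.server.BaseHTTPRequestHandler):
    """Redirects every request back to itself -- an infinite redirect loop."""

    def do_GET(self):
        self.send_response(301)
        self.send_header("Location", self.path)  # redirect to the same path
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), LoopRedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

try:
    urllib.request.urlopen(
        f"http://127.0.0.1:{server.server_address[1]}/robots.txt", timeout=5
    )
    outcome = "fetched"
except urllib.error.HTTPError:
    # urllib gives up after revisiting the same URL, just as Googlebot does
    outcome = "redirect loop"
finally:
    server.shutdown()

print(outcome)
```

Running `curl -IL` against your robots.txt URL will show the same thing for a real site: if the responses bounce between http/https or www/non-www variants, that loop is what Googlebot is reporting.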

Can you open the URL of your robots.txt file in your web browser?
Do you get an error?
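Independent of the browser check, you can also sanity-check the rules themselves offline with Python’s standard-library `urllib.robotparser` (the rules and URLs below are illustrative placeholders, not your actual file):

```python
from urllib import robotparser

# Hypothetical robots.txt contents -- substitute your real file here
rules = """
User-agent: *
Disallow: /private/
""".strip().splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Check whether Googlebot may fetch specific URLs under these rules
print(rp.can_fetch("Googlebot", "https://example.com/"))           # allowed
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))  # blocked
```

If the parser disagrees with what you expect, the file’s syntax (not the hosting or the redirects) is likely the problem.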

Do you see any “blocked” events for Googlebot on the Firewall Events tab of the Cloudflare dashboard for your domain?

Is the robots.txt file virtually created or generated (for example, by WordPress with Yoast SEO), or is it an actual physical file stored at that exact path on the origin host/server?
