Robots.txt fetch failed error in Google Search Console

Hi, I am getting an error in Google Search Console - “robots.txt fetch failed. You have a robots.txt file that we are currently unable to fetch. In such cases, we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file. Find out more.”

But I can access the file myself from here -

I think it has something to do with Cloudflare. Can you please help?

I have also searched the Cloudflare community for similar issues but could not find a working solution.

Hi, it seems unlikely, but your security settings may be making Cloudflare issue a challenge or a block for bots (which Google’s search crawler is). You could set a Page Rule for only the robots.txt path that disables security. You could also try a fetch as Google test on that URL and see what error it reports; that would help narrow it down.
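To see why a challenge would produce exactly this symptom, here is a minimal local sketch (not your actual Cloudflare setup): a toy HTTP server that applies a hypothetical security rule blocking any user agent containing “bot”, so a browser fetches /robots.txt fine while a Googlebot-style request gets a 403. The rule and user-agent strings are illustrative assumptions, not Cloudflare’s real logic.

```python
# Toy demo: a server that challenges bot user agents, simulating
# why a browser can read /robots.txt while Googlebot cannot.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen
from urllib.error import HTTPError

ROBOTS = b"User-agent: *\nDisallow:\n"

class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        ua = self.headers.get("User-Agent", "")
        # Hypothetical security rule: block anything identifying as a bot.
        if "bot" in ua.lower():
            self.send_response(403)
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(ROBOTS)

    def log_message(self, *args):
        pass  # keep the demo output quiet

def fetch_status(url, user_agent):
    """Return the HTTP status seen by a client with the given user agent."""
    try:
        with urlopen(Request(url, headers={"User-Agent": user_agent})) as resp:
            return resp.getcode()
    except HTTPError as err:
        return err.code

server = HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/robots.txt"

browser_status = fetch_status(url, "Mozilla/5.0")
googlebot_status = fetch_status(url, "Googlebot/2.1 (+http://www.google.com/bot.html)")
print(browser_status, googlebot_status)  # prints "200 403"
server.shutdown()
```

This is the pattern to look for: if you see the file in your browser but Search Console reports a fetch failure, test the URL with a bot-like user agent and check whether Cloudflare returns a challenge or a 403 instead of the file.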

Thanks a lot @matteo, let me check if the suggestions work.


Take note of this delay that is currently going on…

This topic was automatically closed after 14 days. New replies are no longer allowed.