Hi, I am getting an error in Google Search Console - “robots.txt fetch failed. You have a robots.txt file that we are currently unable to fetch. In such cases, we stop crawling your site until we get hold of a robots.txt, or fall back to the last known good robots.txt file. Find out more.”
Hi, it seems unlikely, but it is possible that your security settings are causing Cloudflare to issue a challenge or a block to bots (which Google’s crawler is). You could set a Page Rule for only the robots.txt path that disables security. You could also try running a “Fetch as Google” test on that URL and seeing what error it reports; that would help narrow it down.
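As an illustration only (the exact dashboard labels may vary, and `example.com` is a placeholder for your own domain), a Page Rule scoped to just the robots.txt path might look something like this:

```
URL pattern: *example.com/robots.txt

Settings:
  Security Level:          Essentially Off
  Browser Integrity Check: Off
```

The idea is that this rule applies only to the one path Googlebot needs to fetch, so the rest of your site keeps its normal security settings.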