Valid Robots File, Can't Crawl Pages

I just deployed my site with the following robots.txt file:

User-agent: *
Disallow: /

After deployment, I attempted to test my SEO settings for deployment yyyyy.xxxxx.pages.dev and site xxxxx.pages.dev with Google PageSpeed Insights. The relevant error is as follows:

Title: “Page is blocked from indexing”
Description: Search engines are unable to include your pages in search results if they don’t have permission to crawl them.
Blocking directive source: xxxxx.pages.dev/robots.txt:2:0

I referred to Cloudflare’s support page on troubleshooting crawl errors (Troubleshooting crawl errors · Cloudflare Support docs). Cloudflare states that it allows search engine crawlers and bots. I am on the free tier with default anti-bot settings.

Where does the problem lie: with Cloudflare, with PageSpeed, or with my robots.txt file? Thanks!

Your robots.txt has disallowed access to your whole site.

Use Allow: /
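
For example, a robots.txt that permits all crawlers to fetch every page could look like either of the following (an empty Disallow is the conventional equivalent of Allow: /):

User-agent: *
Allow: /

or

User-agent: *
Disallow: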

