Cloudflare DNS proxy loads a different robots.txt

My robots.txt file, hosted at the root of my server, is set as follows:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

However, every time I activate the DNS proxy for my website (www/mydomain) on Cloudflare, mydomain/robots.txt shows the following content when I do a hard refresh (Ctrl+Shift+R):
User-agent: *
Disallow: /

As soon as I switch the record back to DNS Only (not proxied) and do a hard refresh (Ctrl+Shift+R), mydomain/robots.txt shows the correct configuration again.

I cleared Cloudflare’s cache and created a cache rule for robots.txt, but the issue does not go away. It has hurt my site’s SEO: Google started hiding my pages’ titles, meta descriptions, favicon, and images. And it happens every single time I enable the DNS proxy.
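As a side note for anyone trying to monitor for this: the two robots.txt bodies differ in a way that is easy to flag automatically. Here is a minimal sketch (the `blocks_everything` helper is my own, not anything Cloudflare or WordPress provides, and it only handles the simple one-group case from this thread) that detects a blanket `Disallow: /` under the wildcard user-agent:

```python
def blocks_everything(robots_txt: str) -> bool:
    """Return True if a 'User-agent: *' group contains a bare 'Disallow: /'."""
    in_star_group = False
    for raw in robots_txt.splitlines():
        # Strip comments and surrounding whitespace.
        line = raw.split("#", 1)[0].strip()
        if not line:
            continue
        field, _, value = line.partition(":")
        field, value = field.strip().lower(), value.strip()
        if field == "user-agent":
            in_star_group = value == "*"
        elif field == "disallow" and in_star_group and value == "/":
            return True
    return False

# The body served while the Cloudflare proxy is on:
proxied = "User-agent: *\nDisallow: /\n"
# The body served straight from my origin:
expected = ("User-agent: *\n"
            "Disallow: /wp-admin/\n"
            "Allow: /wp-admin/admin-ajax.php\n")

print(blocks_everything(proxied))   # True
print(blocks_everything(expected))  # False
```

Pointing something like this at mydomain/robots.txt on a schedule would at least alert you the moment the blanket disallow comes back, before Google re-crawls.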

Has anyone been in the same situation?

OK, I reached out to my hosting provider, and they said: “We don’t recommend enabling Cloudflare proxy as this can cause issues like robots.txt. The only fix is to use the Cloudflare DNS using DNS Only.”
Too bad that my server’s configuration does not allow using Cloudflare’s proxy :frowning:

What is the domain?

The domain is (hosted on ). I’d really like to be able to proxy my DNS through Cloudflare.

This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.