Prevent sites from showing up in search engines

What is the name of the domain?

NA

What is the error number?

NA

What is the error message?

NA

What is the issue you’re encountering?

NA

What steps have you taken to resolve the issue?

We have a bunch of dev sites that we want to prevent from showing up in search engines. We can’t use the robots.txt approach since it’s a shared server.

Is there an easy way in Cloudflare to block them with either a FW rule or maybe another type of rule?
Thanks

What is the current SSL/TLS setting?

Full

What are the steps to reproduce the issue?

NA

If your main domain is using Cloudflare and the dev.example.com sub-domains are proxied :orange:, you could put them behind Cloudflare Access in your Zero Trust team and define an Access policy so that only you and your developers can reach them.
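If you prefer to script that instead of clicking through the dashboard, a rough sketch of creating the Access application and policy via the Cloudflare API is below. The endpoint paths, payload fields, and the ACCOUNT_ID / API_TOKEN placeholders are from memory and should be treated as assumptions to verify against the current API docs.

// Hypothetical sketch (Node 18+ ES module): create an Access app for
// dev.example.com and an allow-policy for developer emails.
// Endpoint paths and payload shapes are assumptions -- verify against
// the current Cloudflare API documentation before use.
const API = "https://api.cloudflare.com/client/v4";
const ACCOUNT_ID = "your-account-id"; // placeholder
const API_TOKEN = "your-api-token";   // placeholder

const headers = {
  Authorization: `Bearer ${API_TOKEN}`,
  "Content-Type": "application/json",
};

// 1. Create the Access application covering the dev hostname.
const appRes = await fetch(`${API}/accounts/${ACCOUNT_ID}/access/apps`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    name: "Dev sites",
    domain: "dev.example.com",
    type: "self_hosted",
    session_duration: "24h",
  }),
});
const app = (await appRes.json()).result;

// 2. Attach a policy so only the listed developer emails are allowed in.
await fetch(`${API}/accounts/${ACCOUNT_ID}/access/apps/${app.id}/policies`, {
  method: "POST",
  headers,
  body: JSON.stringify({
    name: "Developers only",
    decision: "allow",
    precedence: 1,
    include: [{ email: { email: "developer@example.com" } }],
  }),
});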

You cannot prevent them from being added to and shown in the “DNS history” of public services, which expose DNS records for your domain (and others) and scan on a regular, often daily, basis.

Adding the HTML meta tag below could also help, since many bots don’t respect robots.txt nowadays.

<meta name="robots" content="noindex, nofollow, noodp, noydir, noarchive, nosnippet, notranslate, noimageindex, nocache">

In my experience, the best method is to restrict access with Zero Trust Access and also send an X-Robots-Tag HTTP header via Cloudflare’s Response Header Transform Rules.
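If you would rather set that header from a Worker than from a Transform Rule, a minimal sketch could look like the following; the header value shown is just one reasonable choice, and routing the Worker to the right hostnames is up to you.

export default {
  async fetch(request) {
    // Fetch the origin response, then copy it so its headers are mutable.
    const upstream = await fetch(request);
    const response = new Response(upstream.body, upstream);

    // Tell crawlers not to index or follow anything served here.
    response.headers.set("X-Robots-Tag", "noindex, nofollow");
    return response;
  },
};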


Thanks. I was actually able to get this to work using a Worker:

export default {
  async fetch(request) {
    const url = new URL(request.url);

    // Serve a robots.txt that disallows all crawlers
    if (url.pathname === "/robots.txt") {
      return new Response(`User-agent: *\nDisallow: /`, {
        headers: {
          "Content-Type": "text/plain",
        },
      });
    }

    // Pass through all other requests
    return fetch(request);
  },
};