Hi there, I am currently using a custom domain for my Cloudflare Pages website. I would like only my custom domain to be indexed by search engines, not the default .pages.dev subdomain. How can I add an X-Robots-Tag: noindex header to my .pages.dev subdomain so that it is not indexed?
Is this not something that should happen by default once a custom domain has been connected?
I have recently shifted from Netlify. Here is how Netlify handles this: Improved SEO with canonical link headers - Features - Netlify Support Forums.
Pretty sure the HTML head tag <link rel="canonical" ... suffices. There's no need to worry about that header as long as you build your site correctly.
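For reference, that tag looks something like this in the head of each page (example.com and the path are just placeholders for your custom domain and page):

<link rel="canonical" href="https://www.example.com/some-page/">

If both the .pages.dev copy and the custom-domain copy of a page point at the custom-domain URL, search engines should treat the custom domain as the primary one.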
Create a headers file called _headers with this in it and push to trigger a build:

https://yourpagesname.pages.dev/*
  X-Robots-Tag: noindex
Full details can be found here:
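Once the new deployment is live, a quick way to check whether the header is actually being served (yourpagesname is a placeholder, as above):

curl -I https://yourpagesname.pages.dev/

and look for X-Robots-Tag: noindex in the response headers.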
Hi @whistles, thanks for pitching in. I tried that, but the header is not appearing.
I created the _headers file at the root. That should be fine, right?
Why does the documentation say it like this? Where should the file be, then?
The _headers file should not always be in the root directory of the repository. It should be in the output directory of the build, which usually runs off the main branch.
I am using Hugo as my SSG. Currently the _headers file is at the root of the main branch.
Just realized that the documentation here suggests using the colon syntax to do exactly this:

https://:project.pages.dev/*
  X-Robots-Tag: noindex

Tried this as well, but still not seeing the header.
In the same documentation here, it is mentioned that:

Create a _headers plain text file in the output folder of your project. It is usually the folder that contains the deploy-ready HTML files and assets generated by the build, such as favicons.

Thus for Hugo the _headers file should go in the static folder, which Hugo copies as-is into the output directory (public), and not at the root of the repo (like Netlify's _headers).
This solved the problem.
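For reference, the layout that works now looks roughly like this (public/ is Hugo's default output directory; file names may differ in your setup):

my-hugo-site/
├── config.toml
├── content/
├── static/
│   └── _headers       <- kept in the repo; copied verbatim at build time
└── public/            <- generated by the hugo build; this is what Pages deploys
    └── _headers

Since Hugo copies everything under static/ straight into public/, the _headers file ends up next to the generated HTML, which is where Pages looks for it.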