For SEO reasons, direct access to domain.pages.dev should not be possible; the site should only be reachable through the custom domain (e.g. domain.com). This is serious: search engines will see the two hostnames as duplicate content and penalize the website. I'm surprised this hasn't been addressed.
To prevent direct access to your domain.pages.dev subdomain and only allow access through your custom domain, you can set up Bulk Redirects. Follow the guide at Bulk Redirects, but replace www.example.com with your pages.dev subdomain in the setup process. This will prevent search engines from treating the two hostnames as duplicate content.
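For anyone setting this up, the rule you end up with should look roughly like the sketch below. The project and domain names are placeholders, and the exact option labels may differ slightly in the dashboard, so treat this as a sketch rather than the literal setup screen:

```
# Bulk Redirects — single redirect list entry (sketch, placeholder names)
Source URL:            yourproject.pages.dev
Target URL:            https://domain.com
Status code:           301
Preserve query string: enabled
Subpath matching:      enabled
Preserve path suffix:  enabled
```

With subpath matching and path/query preservation enabled, a request to yourproject.pages.dev/some/page?x=1 is forwarded to the same path and query on the custom domain.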
Thanks for the question @andrei2 and thanks for the answer @WalshyMVP. I wasn't aware this was possible. Can I ask Walshy why this information doesn't appear anywhere (that I've found) in the Pages documentation? The only thing close is Prevent your pages.dev deployments showing in search results, which uses the X-Robots-Tag header in the _headers file (which is still relevant).
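For anyone landing here later, the _headers approach from that doc page looks roughly like this (":project" is the placeholder Pages matches against your project's pages.dev subdomain; check the linked doc for the exact snippet):

```
# _headers — ask search engines not to index the *.pages.dev hostname
https://:project.pages.dev/*
  X-Robots-Tag: noindex
```

Because the rule is scoped to the pages.dev URL, pages served from the custom domain are unaffected and remain indexable.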
Please make a docs issue and we can add it somewhere