robots.txt not served on Cloudflare Pages

Hi all!

I have a site deployed through Cloudflare Pages (https://www.chesscoordinates.com). I have a robots.txt in the root, but the site does not serve it. Oddly enough, I also have a sitemap.xml in the root, and that file is served perfectly fine.

I must be missing something but I cannot figure out what it is.

Some extra info: there is no 404.html. This is a framework-less SPA; I’m using Parcel to minify the CSS and JavaScript assets.

Could anybody point me in the right direction towards a solution?

Kind regards

I’ve spent hours today trying to fix it. I can’t even get the sitemap.xml to work like yours does. I’m using Vite and adding the files to the public directory, which should work, but sadly the public directory never makes it to Cloudflare.

You need to copy your robots.txt to the build output for it to be available.

I solved the problem by adding “cp -r public/* dist/” to my build command, making it “npm run build && cp -r public/* dist/”

If you do that, the contents of the folder containing your static robots.txt will be copied into dist, which is what Cloudflare serves from. You can even copy just the robots.txt file to the build output if you want.
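To make the copy step concrete, here is a minimal sketch (assuming a Vite-style layout where static files live in public/ and the build output goes to dist/; the robots.txt content is just a placeholder):

```shell
# Simulate the project layout: public/ holds static files, dist/ is
# what the build step would produce and what Cloudflare Pages serves.
mkdir -p public dist
printf 'User-agent: *\nAllow: /\n' > public/robots.txt

# In the Pages dashboard the real command would be:
#   npm run build && cp -r public/* dist/
# Here we run just the copy step:
cp -r public/* dist/

# dist/robots.txt now exists alongside the built assets.
cat dist/robots.txt
```

To copy only the two files instead of the whole folder, the command becomes `npm run build && cp public/robots.txt public/sitemap.xml dist/`.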

Sorry for the late reply, fufferpish. And thanks for taking the time to share your experience and provide a solution.

In the end I added robots.txt and sitemap.xml to the dist folder and added

!/dist/robots.txt
!/dist/sitemap.xml

to my .gitignore. That did the trick.
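For anyone copying this approach, the relevant .gitignore section would look something like the sketch below. One caveat worth hedging: git cannot re-include a file whose parent directory is itself excluded, so the ignore rule has to be `dist/*` (ignore the directory’s contents) rather than `dist/` (ignore the directory) for the `!` rules to take effect:

```gitignore
# Ignore everything the build produces...
dist/*
# ...except the hand-placed static files.
!/dist/robots.txt
!/dist/sitemap.xml
```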

I also tried it your way (making “npm run build && cp robots.txt dist && cp sitemap.xml dist” my build command in the Cloudflare dashboard), but that somehow crashed the entire build: it would just shut down after 4 minutes, while a build usually takes about 20 seconds.

In any case your suggestions put me on the right path, so thanks again for taking the time to type this out.

Best regards

This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.