Adding robots.txt to a Worker?

I have a Worker that sets our security headers, which has allowed us to pass our internal compliance scan. Now that scan is also looking for a robots.txt file. How can I add this to my Worker? What is the easiest way to implement this with Cloudflare? I'm a novice, so examples or a link would be great.

Since I’m lazy, I’d create a separate Worker just for robots.txt:

const robotsTxt = `
User-agent: *
Disallow:
`

async function handleRequest(request) {
  const init = {
    headers: {
      // robots.txt should be served as plain text, not HTML
      'content-type': 'text/plain;charset=UTF-8',
      'cache-control': 'max-age=31536000',
      // security headers kept so the scan sees them on this response too
      'X-Frame-Options': 'SAMEORIGIN',
      'Referrer-Policy': 'no-referrer',
      'content-security-policy': 'upgrade-insecure-requests',
      'X-XSS-Protection': '1; mode=block',
    },
  }
  return new Response(robotsTxt, init)
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})
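If you go the dedicated-Worker route, you would then add a Workers route such as example.com/robots.txt (example.com standing in for your own hostname) pointing at this Worker, so it only handles requests for that path.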

We have an external compliance scan tool hosted internally. It scans for headers, robots.txt, CSP, etc. We have two identical applications that returned exactly the same results before we added one of them to Access. We have a robots.txt file set at the application level for both hostnames. The hostname using Access is now failing the robots.txt check and is not presenting the headers we set in a Worker, while the same Worker shows the proper values for the hostname not using Access. Ideally we'd like to use the Worker to resolve the robots.txt issue.

Hi Sdayman,

You are soooooo lazy… Thanks very much, I appreciate it! I have another issue with this same Access URL and the Worker. Since we added this hostname to Access, our standard security headers are blocked in our scan. The same Worker presents the proper values for URLs not added to Access. Can this issue be resolved using a Worker?

Thanks

Are you talking about Cloudflare's “Access” app for password-protecting URLs? If so, you can add a “Bypass” policy for the IP address of your scanner.

But where would I find this robots.txt file in Cloudflare?

Cloudflare does not host files. Robots.txt should be in the root directory of your website. But if you’re coding in Workers, you can use Workers to respond to requests for robots.txt.
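For example, a Worker already running on the zone could intercept just the /robots.txt path and pass everything else through to the origin. This is only a sketch in the same service-worker style as the example above; the ROBOTS_TXT contents and the pass-through behaviour are assumptions you'd adapt to your site:

const ROBOTS_TXT = `User-agent: *
Disallow:
`

async function handleRequest(request) {
  const url = new URL(request.url)

  // Serve robots.txt directly from the Worker
  if (url.pathname === '/robots.txt') {
    return new Response(ROBOTS_TXT, {
      headers: { 'content-type': 'text/plain;charset=UTF-8' },
    })
  }

  // Everything else goes to the origin unchanged
  return fetch(request)
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})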

Hello Everyone!
I need help! My website is built with ClickFunnels and runs behind Cloudflare, and I am having trouble finding or modifying the robots.txt file in Cloudflare. Could you help me with the steps to add or edit the robots.txt file?

Thank you for the reply!