No sitemap.xml to optimize interaction with bots

I am getting this error and don't know how to fix it… can someone help? :slight_smile:
I am using Semrush to test and it comes up with this error. The URL is

This is what I see with the robots.txt Tester on Google:

See for documentation on how to use the robots.txt file

User-Agent: *
Disallow: /?cf_affiliate_id=
Disallow: /for_domain/


How and where would I go about fixing this?

Thanks so much in advance!

A sitemap file can be manually created (for very small websites) or automatically created by your platform/CMS or by third-party tools.
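If you do create one manually, a sitemap is just an XML file listing your page URLs. A minimal sketch, assuming a hypothetical domain (the `<loc>` URLs and date are placeholders, not your real pages):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page; <loc> is the only required child -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2021-01-01</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about</loc>
  </url>
</urlset>
```

The file is conventionally served at `/sitemap.xml` and referenced from robots.txt so bots can find it.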

Is the example listed for your own website? Is your domain?

This is not a Cloudflare-related issue.


Thanks for the reply! So Clickfunnels is our website builder, and there is no place to add the sitemap or robots.txt code there… I use to generate the maps; they told me to simply add their sitemap XML link to my robots.txt file, but I don't know where to find this.
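For reference, the line a sitemap generator usually asks you to add is a single `Sitemap:` directive in robots.txt. A minimal sketch, assuming a hypothetical domain and sitemap URL (substitute the actual link your generator gave you):

```text
User-agent: *
Disallow: /?cf_affiliate_id=
Disallow: /for_domain/

# Hypothetical URL - replace with the sitemap link from your generator
Sitemap: https://www.example.com/sitemap.xml
```

The `Sitemap:` line can appear anywhere in the file and takes a full absolute URL.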

Are the DNS records for your site set to :orange: Proxied?

yes they are…thank you so much for replying!

My CNAME record is set to :orange: Proxied; the rest are not.

If the hostname of your site is :orange: Proxied, you can create a Worker that triggers on the route of and returns your XML. Here’s my robots.txt Worker:

// The body of the response. The rest of your robots.txt rules
// (and your Sitemap: line) go inside this template literal.
const someHTML = `User-agent: *
`

async function handleRequest(request) {
  const init = {
    headers: {
      'content-type': 'text/plain;charset=UTF-8',
    },
  }
  return new Response(someHTML, init)
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.