No sitemap.xml to optimize interaction with bots

I am getting this error and don't know how to fix it… can someone help? 🙂
I am using Semrush to test, and it flags this error. The URL is hostgroup.us.

This is what I see with the robots.txt tester on Google:

# See http://www.robotstxt.org/wc/norobots.html for documentation on how to use the robots.txt file

User-Agent: *
Disallow: /?cf_affiliate_id=
Disallow: /for_domain/

Sitemap: https://www.clickfunnels.com/sitemap.xml

How and where would I go about fixing this?

Thanks so much in advance!
Dee

A sitemap file can be manually created (for very small websites) or automatically created by your platform/CMS or by third-party tools.
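
For a very small site, such a file is just a short XML document listing each URL. A minimal sketch (example.com and the lastmod date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
  </url>
</urlset>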

Is the example listed for your own website? Is clickfunnels.com your domain?

This is not a Cloudflare-related issue.

Thanks for the reply! Clickfunnels is our website builder, and there is no place to add the sitemap line to robots.txt there. I use pro-sitemaps.com to generate the maps; they told me to simply add their sitemap XML link to my robots.txt file, but I don't know where to find that file.
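
(For reference, adding the link just means appending a Sitemap: line to the robots.txt shown above; the URL here is a placeholder for whatever address pro-sitemaps.com gave you:)

Sitemap: https://example.com/your-generated-sitemap.xml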

Are the DNS records for your site set to Proxied (orange cloud)?

Yes, they are… thank you so much for replying!

My CNAME record is set to Proxied; the rest are not.

If the hostname of your site is Proxied (orange cloud), you can create a Worker that triggers on the route example.com/sitemap.xml and returns your XML. Here's my robots.txt Worker:

// Serves a static robots.txt from a Cloudflare Worker.
const robotsTxt = `User-agent: *
Disallow:
`

async function handleRequest(request) {
  const init = {
    headers: {
      // robots.txt is served as plain text
      'content-type': 'text/plain;charset=UTF-8',
    },
  }
  return new Response(robotsTxt, init)
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})
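
Following the same pattern, a minimal sketch of a sitemap.xml Worker could look like this. The XML body is a placeholder; substitute the sitemap your generator produces, and attach the Worker to a route such as hostgroup.us/sitemap.xml:

// Minimal sketch: serve a static sitemap.xml from a Worker.
// The <loc> URL below is a placeholder, not a real value.
const sitemapXml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://hostgroup.us/</loc>
  </url>
</urlset>
`

async function handleRequest(request) {
  return new Response(sitemapXml, {
    headers: {
      // Sitemaps are XML, so use an XML content type.
      'content-type': 'application/xml;charset=UTF-8',
    },
  })
}

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})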