Super Bot Fight Mode blocking Bingbot from reading sitemaps

After several days of investigation, I discovered that Super Bot Fight Mode (when either blocking or challenging) was preventing my sitemap.xml file(s) from being read by Bingbot (as well as the DuckDuckGo and Yahoo bots). Various online tests that expose the response headers (including Bing Webmaster Tools' Site Scan) returned a 403 (or another 4xx) error, meaning the sitemap (along with other pages) was forbidden to Bingbot.
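You can reproduce this check locally without an online tool: fetch the sitemap with a search-engine user-agent string and look at the status code. A minimal sketch, assuming the crawlers' published UA strings (the URL in a real run would be your own sitemap):

```python
# Request a URL while presenting a search-engine crawler's user agent,
# and report the HTTP status code seen. A 403 here, when a normal browser
# UA gets a 200, points at bot management blocking the crawler.
import urllib.error
import urllib.request

# Documented user-agent strings for the crawlers in question.
BOT_UAS = {
    "bingbot": "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
    "duckduckbot": "DuckDuckBot/1.0; (+http://duckduckgo.com/duckduckbot.html)",
}

def status_as_bot(url: str, ua: str) -> int:
    """Return the HTTP status seen when fetching `url` with user agent `ua`."""
    req = urllib.request.Request(url, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code  # 4xx/5xx responses land here
```

Calling `status_as_bot("https://yourdomain.example/sitemap.xml", BOT_UAS["bingbot"])` before and after toggling Super Bot Fight Mode should show the 403 flip to 200 if this is your problem. (Note this only mimics the UA string, not the crawler's IP ranges, so results can differ slightly from a real Bingbot fetch.)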

Once I turned off Super Bot Fight Mode in CF, the 4xx errors disappeared and my sitemap.xml file(s) were read with no issue. This immediately caused a two- to four-fold jump in traffic! (I ran this test several times to confirm.)

I can confirm this because, before testing Super Bot Fight Mode, I toggled and tested the WAF and all OWASP protections one at a time. It would be nice to have CF permit these additional good bots through the firewall. HTH someone who is having the same issue.

Ninja tip: use a dynamic sitemap generator (e.g. AIOSEO if you're on WordPress) as opposed to a static one. Your traffic will double overnight. In short, a static sitemap does not remove deleted images/files when your site changes, and search engines will ding you for the dead entries. A dynamic sitemap is generated on the fly and is never outdated, so you avoid that penalty and your traffic will skyrocket.

Yep. That’s why I (and many other people) don’t use Super Bot Fight mode. It’s too difficult to work around.


Feature request: I'd like to be able to create a snapshot, or download a file with all CF settings, so restoration would be simple. Is anything like this available?

The only two options are a painfully long API script you'd have to craft yourself, or Terraform (slightly less painful):

The only easy settings to download are DNS:
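For a partial snapshot via the API, two documented Cloudflare v4 endpoints cover the zone-level toggles and the DNS records. A minimal sketch, assuming an API token with read access (token and zone ID are placeholders, and this does not capture everything; firewall rules, page rules, Workers, etc. each need their own calls):

```python
# Save a zone's settings (JSON) and its DNS records (BIND zone file) using
# GET /zones/{id}/settings and GET /zones/{id}/dns_records/export from the
# Cloudflare v4 API.
import json
import urllib.request

API = "https://api.cloudflare.com/client/v4"

def cf_get(path: str, token: str) -> bytes:
    """GET an API path with a bearer token and return the raw response body."""
    req = urllib.request.Request(
        API + path, headers={"Authorization": f"Bearer {token}"}
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()

def snapshot_zone(zone_id: str, token: str) -> None:
    """Write zone settings and a BIND-format DNS export to local files."""
    settings = json.loads(cf_get(f"/zones/{zone_id}/settings", token))
    with open(f"{zone_id}-settings.json", "w") as f:
        json.dump(settings, f, indent=2)
    with open(f"{zone_id}-dns.txt", "wb") as f:
        f.write(cf_get(f"/zones/{zone_id}/dns_records/export", token))
```

The settings JSON is read-only documentation of the zone's state; restoring from it still means replaying each setting with individual PATCH calls, which is the "painfully long script" part.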

Meh, I’ll just take screenshots. :blush:


This topic was automatically closed 15 days after the last reply. New replies are no longer allowed.