In SEMrush, I’m getting the following error when I try to load a page in the “On Page SEO Tracker” section: “SEMrushBot-Desktop couldn’t crawl the page because it was blocked by robots.txt. Nevertheless, we were able to collect a few general ideas for this page. Please ensure that your page can be accessed by search engine crawlers, and then start optimizing it using our ideas.”
The SEMrush support team let me know that my crawl delay is too high (3600 seconds) and that the maximum crawl delay they support is 1 second.
I’ve checked with my host (Dreamhost), and when they looked into it, they discovered that the Crawl-delay directive only appears in the robots.txt (as fetched with cURL) when Cloudflare is active. Is there a way to remove this restriction within Cloudflare?
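For reference, this is roughly the kind of check Dreamhost ran, sketched with example.com standing in for my actual domain (the live curl command needs network access, so the second part just illustrates the same grep against a robots.txt like the one Cloudflare appears to be serving):

```shell
# Live check (placeholder domain): fetch robots.txt the way a crawler would
# and look for a Crawl-delay directive:
#   curl -s -A "SemrushBot" https://example.com/robots.txt | grep -i crawl-delay

# Offline illustration against a robots.txt resembling what Cloudflare serves:
printf 'User-agent: *\nCrawl-delay: 3600\n' | grep -i 'crawl-delay'
# prints: Crawl-delay: 3600
```

If the Crawl-delay line shows up only while Cloudflare is proxying the site (and disappears when DNS points straight at the origin), that would confirm Cloudflare is injecting or rewriting the robots.txt rather than the origin server.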