Our SEO agency is complaining that the Bing crawler is refusing to fetch pages.
I have checked the firewall settings and there are no rules that could affect Bingbot.
It appears to work when caching is turned off. To quote the agency: “when caching was turned off pages jumped into the index”.
When caching was turned back on, the pages dropped out again.
The problem is that when caching was turned off, it had an impact on the SSL setup across our subdomains.
Any ideas how to resolve this?
Does the agency/Bingbot mention any errors that are stopping the crawl?
Which caching option are you turning off?
The error they are receiving is along the lines of: “The DNS resolution for the host of the URL (or the redirected URL) could not be resolved.”
They turned off caching by turning on Development Mode (which I believe turns off caching for a maximum of 3 hours).
Later that day the site was experiencing SSL errors.
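Since the reported error is about DNS resolution of the URL (or its redirect target), it may be worth checking whether each hostname in the redirect chain actually resolves from outside your network. A minimal sketch of that check (the URLs here are placeholders, not your real hostnames):

```python
# Check whether a URL's hostname has at least one DNS record --
# roughly the lookup a crawler performs before fetching the page.
import socket
from urllib.parse import urlparse

def host_resolves(url: str) -> bool:
    """Return True if the URL's hostname resolves via DNS."""
    host = urlparse(url).hostname
    if not host:
        return False
    try:
        socket.getaddrinfo(host, None)
        return True
    except socket.gaierror:
        return False

# Run this against both the original page URL and any URL it
# redirects to, to see which hop the crawler might be failing on.
```

If the original URL resolves but the redirect target does not (for example, a subdomain on a new page that was never added to DNS), that would match the error the agency is seeing.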
Those seem pretty random and unrelated. “Seem” is the operative word. Is this a recently added site?
I am a bit skeptical myself. We have been using Cloudflare for several years now, and during that time we have been through various SEO agencies, and this is the first one that has had issues with our site's indexing.
They did create some new pages and delete old pages, and I think they are having trouble getting the crawlers to read the headers, disregard the deleted pages, and index the new ones.
What is the best way to disable caching for periods longer than 3 hours?
Surely temporarily disabling caching should have no impact on SSL.
Use a Page Rule: match `*example.com/*` and set Caching Level to Bypass. Unlike Development Mode, a Page Rule stays in effect until you disable or delete it, so there is no 3-hour limit.
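If you manage the zone via the Cloudflare API rather than the dashboard, the same rule can be expressed roughly as the request body below (a sketch: the domain is a placeholder, and you would POST it to the zone's `pagerules` endpoint with your own zone ID and credentials):

```
{
  "targets": [
    {
      "target": "url",
      "constraint": { "operator": "matches", "value": "*example.com/*" }
    }
  ],
  "actions": [
    { "id": "cache_level", "value": "bypass" }
  ],
  "status": "active"
}
```

When the agency has finished, flip `"status"` to `"disabled"` (or delete the rule) to restore normal caching.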