Crawl tool getting 520 error

Hi there, I have a WordPress website hosted on SiteGround and proxied through Cloudflare. When I check the site with the browseo.net tool, I get “Error 520”. Browseo is a crawler tool that uses Googlebot as its user agent.

I checked the server logs through SiteGround’s cPanel and didn’t see any error messages there.

I also tried Googlebot and Bingbot through botsimulator.com: Cloudflare returns a 520 error for both, while the Yahoo agent works.

Do you have any idea why I get a 520 error for Googlebot and Bingbot?
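
For reference, this is roughly how I reproduce the check myself without a third-party tool (a minimal sketch; the exact User-Agent strings the crawl tools send are my assumption):

```python
# Minimal sketch: fetch the homepage with different User-Agent strings and
# compare the HTTP status codes Cloudflare returns. The UA strings below are
# assumed to match what the crawl tools send.
import urllib.request
import urllib.error

URL = "https://andrejgajdos.com/"

USER_AGENTS = {
    "Googlebot": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Bingbot": "Mozilla/5.0 (compatible; bingbot/2.0; +http://www.bing.com/bingbot.htm)",
    "Yahoo": "Mozilla/5.0 (compatible; Yahoo! Slurp; http://help.yahoo.com/help/us/ysearch/slurp)",
}

for name, ua in USER_AGENTS.items():
    req = urllib.request.Request(URL, headers={"User-Agent": ua})
    try:
        with urllib.request.urlopen(req, timeout=15) as resp:
            print(f"{name}: HTTP {resp.status}")
    except urllib.error.HTTPError as e:
        # A Cloudflare 520 ("web server returned an unknown error") lands here.
        print(f"{name}: HTTP {e.code}")
    except urllib.error.URLError as e:
        print(f"{name}: request failed ({e.reason})")
```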

What is the domain?
Can you share the robots.txt content?

Thanks for the reply. It’s my personal website, andrejgajdos.com.

I haven’t changed robots.txt; this is its content:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
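
As a sanity check, those rules can be evaluated with Python’s standard robots.txt parser (a sketch; robots.txt is advisory only, so it could not produce a 520 by itself):

```python
# Sketch: evaluate the robots.txt above with Python's stdlib parser, just to
# confirm the rules themselves don't block Googlebot from crawling the site.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Googlebot", "/"))           # True: the homepage is crawlable
print(rp.can_fetch("Googlebot", "/wp-admin/"))  # False: the admin area is disallowed
# Googlebot itself uses longest-match rules, so the Allow line keeps
# /wp-admin/admin-ajax.php fetchable even though /wp-admin/ is disallowed;
# Python's simpler parser applies rules in file order instead.
```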

Try this one. Why do you need an Allow rule?


This one works well. Do you know why Browseo and Botsimulator don’t work, but the tool you mentioned does?

No idea; I think the robots.txt was automatically generated by SiteGround.

They’re probably being blocked by Cloudflare.
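
Tools that merely spoof the Googlebot or Bingbot User-Agent don’t send requests from Google’s or Microsoft’s networks, and Cloudflare commonly challenges or blocks that mismatch. The usual way to tell a genuine Googlebot visit from a spoofed one is a reverse DNS lookup followed by a forward lookup, roughly like this (a sketch; the IPs in the example calls are placeholders, substitute addresses from your own logs):

```python
# Sketch: reverse-then-forward DNS check for verifying that a request claiming
# to be Googlebot really originates from Google.
import socket

def is_real_googlebot(ip: str) -> bool:
    try:
        # Reverse lookup: genuine Googlebot IPs resolve to *.googlebot.com
        # or *.google.com hostnames.
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the hostname must resolve back to the same IP.
        return ip in {info[4][0] for info in socket.getaddrinfo(host, None)}
    except socket.error:
        return False

print(is_real_googlebot("66.249.66.1"))   # example; use an IP from your access logs
print(is_real_googlebot("203.0.113.7"))   # documentation-range address -> False
```

Requests from crawl simulators fail this kind of check, which is why they look suspicious to a proxy, while real Googlebot traffic passes.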
