Robots.txt file blocked

Hi all,

As per GSC:

I opened a thread in the Google Community support forum and they advised “They have some sort of firewall built in (not sure the specifics) - see if that is blocking Google somehow!”

How can I check that?
I can provide more info if needed.
Thank you!

Check if it shows any blocked requests and adjust your security settings accordingly to make sure they are not blocked.
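If you export the firewall events, the same check can be sketched programmatically: filter for blocked requests to /robots.txt. The field names below ("uri", "action", "user_agent") are illustrative assumptions, not Cloudflare's exact export schema, so adjust them to match your export.

```python
# Illustrative filter over exported firewall events.
# Field names ("uri", "action", "user_agent") are assumptions,
# not Cloudflare's exact schema -- adjust to your export format.
events = [
    {"uri": "/robots.txt", "action": "block", "user_agent": "Googlebot/2.1"},
    {"uri": "/wp-login.php", "action": "block", "user_agent": "curl/7.88"},
    {"uri": "/robots.txt", "action": "allow", "user_agent": "Googlebot/2.1"},
]

blocked_robots = [
    e for e in events
    if e["uri"] == "/robots.txt" and e["action"] == "block"
]
print(len(blocked_robots))  # number of blocked robots.txt requests
```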

When I click on the URL you sent, I see "No firewall events found matching your filters".
Does that mean it’s all clear and nothing is blocked?
Thank you.

Yes, at least in the time period you selected. You should also check your webserver log files.
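On the webserver side, the access log shows whether Googlebot's robots.txt requests actually arrived and what status they got. A rough sketch, assuming a common/combined log format (the sample lines below are made up for illustration):

```python
import re

# Sample access-log lines in combined format (made up for illustration).
log_lines = [
    '66.249.66.1 - - [01/Jan/2024:10:00:00 +0000] "GET /robots.txt HTTP/1.1" 200 120 "-" "Googlebot/2.1"',
    '203.0.113.5 - - [01/Jan/2024:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
    '66.249.66.1 - - [01/Jan/2024:10:02:00 +0000] "GET /robots.txt HTTP/1.1" 403 0 "-" "Googlebot/2.1"',
]

# Capture the HTTP status of each robots.txt request made by Googlebot.
pattern = re.compile(r'"GET /robots\.txt [^"]*" (\d{3})')
statuses = []
for line in log_lines:
    m = pattern.search(line)
    if m and "Googlebot" in line:
        statuses.append(m.group(1))

print(statuses)  # ['200', '403'] -- the 403 would explain a GSC error
```

A 403 (or an HTML challenge page) on those requests would point at a security rule rather than a problem with the file itself.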

What’s the domain?

I checked last 24 hours.
I contacted Siteground and they said that on their end everything is OK.
Domain is

Then the question is when did they check it the last time. If that was more than 24 hours ago it could have been blocked by Cloudflare but that’s difficult to say now. You could only try to select other periods to check if there were any blocks there, otherwise it shouldn’t be Cloudflare.

Yeah, I saw there was some block 2 days ago. Not sure what that means…

That shouldn’t be Google though. You are referring to Google here, right?


Yes, I see the error described in Google Search console.

Yesterday there was a block too.

What do those blocks mean exactly?

That was from a Google address, but not Google itself, just someone using their hosting services. So far these blocks all seem okay.
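To tell real Googlebot apart from someone merely renting Google's cloud IPs, Google recommends a reverse DNS lookup followed by a forward confirmation: the PTR record should end in googlebot.com or google.com, and resolving that hostname should return the original IP. The suffix check can be sketched as follows (the DNS lookups themselves are left out; use e.g. `socket.gethostbyaddr(ip)[0]` to get the hostname):

```python
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def looks_like_googlebot(hostname: str) -> bool:
    """Check whether a reverse-DNS hostname matches Google's crawler domains.

    A full verification would also forward-resolve the hostname and confirm
    it maps back to the original IP; this only checks the domain suffix.
    """
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

# Real crawler IPs reverse-resolve to crawl-*.googlebot.com;
# Google Cloud customer VMs typically resolve to *.googleusercontent.com.
print(looks_like_googlebot("crawl-66-249-66-1.googlebot.com"))  # True
print(looks_like_googlebot("some-vm.googleusercontent.com"))    # False
```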

You could always configure a page rule to disable security for /robots.txt.

Thank you.
And is it safe (recommended) to do so?

It probably shouldn’t be too much of an issue for this particular path, but generally I’d probably rather check why and where it is blocked and then adjust that. The two blocks you posted so far should not have blocked Google.

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.