403 Forbidden under googlebot - GSC unable to crawl my sitemap


My site works fine in a web browser, but Googlebot cannot crawl it, not even my sitemap, so part of my site cannot be indexed by the Google search engine.

So I did some research and ran a test on a third-party tool:
when the user agent is set to “googlebot”, a 403 error occurs. If I pause the Cloudflare service and run the test again, the error goes away.
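The test described above can be reproduced locally. This is a minimal sketch, assuming the public sitemap URL on the domain mentioned later in the thread; the user-agent string is Googlebot's published desktop UA:

```python
# Send a request with Googlebot's User-Agent and report the HTTP status.
import urllib.request
import urllib.error

GOOGLEBOT_UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

def fetch_status(url: str, user_agent: str) -> int:
    """Request the URL with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 403 when the bot UA is being blocked

# Compare the two results: a browser-like UA vs. the Googlebot UA.
# fetch_status("https://daydayplay.hk/sitemap.xml", "Mozilla/5.0")
# fetch_status("https://daydayplay.hk/sitemap.xml", GOOGLEBOT_UA)
```

If the second call returns 403 while the first returns 200, something in the request path is blocking on the user-agent string.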

Is this the reason why GSC cannot crawl my site or my sitemap?
Why does the 403 happen? Did I set up something wrong in Cloudflare?

Any help would be appreciated.

What does your firewall event log say regarding these requests?

Hi Sandro,

Thanks for your reply.
Did you mean the Activity log under Firewall Events?
I cannot see any related entries; the last one is from 26 Sep 2019 09:54:37.
The action is “Challenge”, and the IP is not from Google.

Thanks a lot.

In that case it is unlikely to have been blocked by Cloudflare.

Is daydayplay.hk the domain?

The 403 response from your example seems to be from your server and not a Cloudflare response.

Hi, Sandro,

Yes, the domain is daydayplay.hk.
In the meantime I have contacted our hosting company, but they have not found anything wrong yet.

I just created a Firewall rule; is that OK?

Thanks for your reply.

The firewall rule won't do much unless you have other rules that do block.

The issue here, however, seems to be that this 403 comes from your server and not from Cloudflare, so you will need to check your server logs.
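One way to check the server logs is to filter for 403 responses served to Googlebot-like user agents. A rough sketch, assuming the common "combined" access-log format (the sample line below is made up for illustration):

```python
# Find 403 responses to Googlebot user agents in a combined-format access log.
import re

LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<request>[^"]*)" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

def googlebot_403s(lines):
    """Yield (timestamp, request line) for each 403 served to a Googlebot UA."""
    for line in lines:
        m = LOG_RE.match(line)
        if m and m.group("status") == "403" and "Googlebot" in m.group("agent"):
            yield m.group("time"), m.group("request")

# Made-up sample log line for demonstration:
sample = ('66.249.66.1 - - [26/Sep/2019:09:54:37 +0800] '
          '"GET /sitemap.xml HTTP/1.1" 403 199 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
print(list(googlebot_403s([sample])))
```

Run it over the real access log (e.g. `googlebot_403s(open("/path/to/access.log"))`) to see exactly which requests the server is rejecting.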

From the test page you linked to:

Note: It looks like your site has returned a 403 Forbidden. In some cases a firewall or bad-bot utility will block the use of this tool because it is a “fake Googlebot”. With a 403 response you should use the Fetch as Googlebot utility in Webmaster Tools to verify your site is returning a 403.
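This is why many bot-protection setups reject such tools: a genuine Googlebot can be verified, while a spoofed user agent cannot. Google's documented verification method is a reverse DNS lookup on the requesting IP (the hostname must end in googlebot.com or google.com) followed by a forward lookup that must return the same IP. A minimal sketch:

```python
# Verify whether an IP really belongs to Googlebot via reverse + forward DNS.
import socket

def looks_like_googlebot(hostname: str) -> bool:
    """String check on the reverse-DNS hostname suffix."""
    host = hostname.rstrip(".")
    return host.endswith(".googlebot.com") or host.endswith(".google.com")

def verify_googlebot(ip: str) -> bool:
    """Reverse-resolve the IP, check the suffix, then forward-resolve to confirm."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not looks_like_googlebot(hostname):
            return False
        # The forward lookup must resolve back to the original IP.
        return ip in socket.gethostbyname_ex(hostname)[2]
    except socket.error:
        return False
```

A bot-protection rule that skips this verification and blocks on the user-agent string alone will also block the real Googlebot, which would produce exactly the 403s described in this thread.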

How did you determine this prior to running the tool, which is being blocked for the reason above?

Hi cscharff,

Thanks for your reply,

At the beginning I received a missing ads.txt warning from Google AdSense, and then I found that my sitemap was returning HTTP 403 errors. So I did some research, and I think the issue is that Googlebot cannot crawl my site.

Then I found a thread where someone was facing the same issue. He was using Cloudflare too; he ran a test through the above link and got an HTTP 403 error.

So I started to wonder: is the HTTP 403 error caused by Cloudflare?

Hi Sandro,

OK, I'm waiting for the hosting company.

Thanks again

This topic was automatically closed after 14 days. New replies are no longer allowed.