My site works fine in a normal browser, but Googlebot cannot crawl it, not even my sitemap, so part of my site cannot be indexed by Google.
So I did some research and ran a test: when the user agent is set to "Googlebot", the request returns a 403 error. If I pause Cloudflare and run the test again, the error goes away.
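For anyone who wants to reproduce this, a quick sketch of the test with curl (example.com is a placeholder for the actual domain; the user-agent string is Googlebot's published desktop UA):

```shell
# Spoof Googlebot's user agent and print only the HTTP status code.
# With Cloudflare misconfigured, this prints 403; a normal UA returns 200.
UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
curl -s -o /dev/null -w "%{http_code}\n" -A "$UA" "https://example.com/sitemap.xml"
```

Note that a real Googlebot request also comes from Google's IP ranges, so a spoofed UA test can trigger rules (e.g. fake-bot blocking) that genuine Googlebot would not; pausing Cloudflare, as above, isolates whether Cloudflare is the source of the 403.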
Is this why Google Search Console cannot crawl my site or my sitemap?
Why does the 403 happen? Did I set up something wrong in Cloudflare?
Any help would be appreciated.