Is your site public yet? If it is the site in the picture you sent, it seems you have not pointed that domain name to the correct IP address. Once the site can be reached normally from the public internet, Google can crawl and index it.
The firewall event you saw is totally unrelated to Googlebot. You need to go to GSC, make a fresh request, wait 5 minutes, then check the Security Events section for the specific URL you are testing. Make sure it's really Googlebot: check the UA, and from a command-line interface run `host x.x.x.x` for the IP in the event.
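A minimal sketch of that check, runnable in any terminal. The IP 66.249.66.1 is a published Googlebot address used here purely as an example; substitute the IP from your own Security Event. The two manual commands are shown as comments, and the small helper below classifies the PTR hostname the way Google documents it (genuine crawlers reverse-resolve to googlebot.com or google.com):

```shell
# Step 1: reverse DNS on the IP from the event
#   host 66.249.66.1
#   # -> ... pointer crawl-66-249-66-1.googlebot.com.
# Step 2: forward DNS on that hostname; it must return the same IP,
# otherwise the reverse record could be spoofed
#   host crawl-66-249-66-1.googlebot.com
#   # -> ... has address 66.249.66.1

# Helper: does a PTR hostname belong to Google's crawler domains?
is_googlebot_host() {
  case "$1" in
    *.googlebot.com|*.googlebot.com.|*.google.com|*.google.com.) return 0 ;;
    *) return 1 ;;
  esac
}

is_googlebot_host "crawl-66-249-66-1.googlebot.com." && echo "genuine"
is_googlebot_host "evil.example.com." || echo "spoofed or other"
```

If both lookups agree and the hostname ends in googlebot.com or google.com, the request was genuine; anything else is an impersonator.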
You should NOT allow any IP address you don't have full control of. Remove the two IP Access Rules you created, as they open your site to potential attacks. Even if these were Googlebot's IPs, you should not allow-list them, as hackers always find creative ways to make Google do certain things for them.
Instead, you should edit the rule that is blocking Googlebot and make sure it only blocks/challenges when the visitor is not a Known Bot.
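As a sketch, the rule's expression would end up looking something like this (the path condition is a made-up placeholder; keep whatever conditions your real rule already has, and just add the Known Bots exemption via Cloudflare's `cf.client.bot` field):

```
(http.request.uri.path contains "/wp-login.php") and not cf.client.bot
```

In the dashboard's rule builder this is the same as adding a "Known Bots equals off" condition alongside your existing ones.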
And BTW, it's not a given that Googlebot is being blocked by Cloudflare. It could just as well be a firewall at the origin.
Sir, regarding the IP addresses, thanks for your advice; I've removed the two IP Access Rules.
I want to report: yesterday I deleted all DNS records in Cloudflare, then removed the site from Cloudflare. After that, I re-added the site to Cloudflare (prediksitoto188 .com). This worked temporarily; my website could be crawled by Google Search Console for one day. I left it alone and didn't change anything, but strangely, today when I tried a Test Live URL in GSC, the error reappeared: "URL is not available to Google / Page cannot be indexed: Blocked due to access forbidden (403)".
Just for your information, here are some of the things I tried before:
I disabled the plugins, then ran a Test Live URL in GSC, but the 403 error still appears.
I checked file permissions in cPanel; everything looks normal (files: 644 or 640, directories: 755 or 750).
I deleted .htaccess and created a new one, but the 403 error persists.
I deleted robots.txt and created a new one with Yoast SEO, but the 403 error persists.
I tried the Wordfence Security plugin, but it didn't help, so I deactivated it.
I tried uninstalling WordPress for my site in cPanel and reinstalling it, but it didn't help.
For all those reasons, I thought maybe the problem was in Cloudflare, sir.
Regarding "make a fresh request, wait 5 minutes, then check the Security Events section for the specific URL you are testing": can you help me check the UA / run the command-line check? I'm not familiar with using it (see the result image below).
I see in your last screenshot that all events were performed by "Custom rules". That means you need to edit the Custom Rule that is blocking Googlebot. Above the "Request details" section in your screenshot there should be a "Matched service" section, which identifies the specific rule that is blocking the requests. You need to edit this rule so that it does not block Known Bots.
Again, skipping/allowing all requests from all Known Bots opens your site to vulnerabilities. You should remove that rule. Then run the GSC requests again and check whether any Cloudflare product is blocking them. If no blocks are recorded in the Events panel, Cloudflare is not blocking Googlebot, and you need to check your origin configuration.
It means Cloudflare is not blocking Googlebot; something else is. You need to talk to your developers, check your WordPress installation for possible issues, or ask your hosting provider for help.
Sir, when I don't use Cloudflare, GSC succeeds with the Test Live URL and the index request, but when I use Cloudflare again, the 403 error reappears. Are you sure this problem is not due to Cloudflare? I mean, is it possible that my Cloudflare settings are wrong and cause 403 errors whenever I use Cloudflare?
In your previous post, you said you were able to use GSC, there was a block, but the block did not appear anywhere in the Security Events panel. To the best of my knowledge, this means something else (not Cloudflare) is performing the blocking. However, Cloudflare is a suite with many, many tools, and yes…
It is possible your Cloudflare settings are sending a signal that your local firewall detects, causing it to block the requests. But this is too hypothetical a scenario; I cannot speculate on what might be wrong, only suggest troubleshooting steps based on my limited understanding of what you presented.
To the best of my knowledge, if any Cloudflare product blocks a Googlebot request, an Event should be logged within a few minutes of the request being made.
My advice is to focus on your origin server: check its logs, ask your hosting provider's support team to help you set up better logging, and so on.
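A minimal sketch of what to look for in the origin logs, assuming an Apache/cPanel setup with the common "combined" log format (the real log path varies by host, e.g. something like /var/log/apache2/access.log or ~/access-logs/ in cPanel; ask your provider). Here the pipeline is demonstrated on two sample lines:

```shell
# Two sample access-log lines in Apache combined format:
# one Googlebot hit that got a 403, and one unrelated 200.
cat > /tmp/sample_access.log <<'EOF'
66.249.66.1 - - [01/Jan/2024:00:00:01 +0000] "GET / HTTP/1.1" 403 199 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.9 - - [01/Jan/2024:00:00:02 +0000] "GET / HTTP/1.1" 200 512 "-" "Mozilla/5.0"
EOF

# Keep only lines with a Googlebot UA, then print IP, path, and status
# for the ones the origin answered with 403 (field 9 is the status code).
grep -i "googlebot" /tmp/sample_access.log | awk '$9 == 403 {print $1, $7, $9}'
```

If the same pipeline run against your real access log shows 403s for verified Googlebot IPs, the block is happening at the origin (mod_security, a security plugin, host-level firewall), not at Cloudflare.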