Allow All Google Bot Crawlers

Hi guys, I’m struggling with security while trying to add all of Google’s crawler IP addresses to my firewall rules.

First, I found a small problem in Lighthouse: "Failed to load resource: the server responded with a status of 403 (Forbidden)".
[Screenshot: Lighthouse report showing the 403 error]

I guess Googlebot is being blocked by Cloudflare when it tries to render images. Maybe it’s because I activated Server-Side Excludes in the Scrape Shield menu.

But I’m not going to deactivate that feature, because it’s very useful to me.

Previously, I chose to allow all Known Bots to keep things simple.
[Screenshot: firewall rule allowing all Known Bots]

After that, I found someone whose server was overloaded because of that rule (https://community.cloudflare.com/t/known-bots/91359), so I deleted the rule again.

Instead, I’d prefer to add all of Google’s crawler IPs to my firewall rules. I found all of Google’s IP ranges here: https://www.gstatic.com/ipranges/goog.json (source: https://developers.google.com/search/docs/advanced/crawling/verifying-googlebot).
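In case it helps anyone reading this, here is a minimal sketch of pulling the CIDR prefixes out of a document with the same shape as goog.json (a top-level `"prefixes"` array whose entries carry an `"ipv4Prefix"` or `"ipv6Prefix"` key). The sample document and the prefixes in it are made up for illustration; in practice you would fetch the real file from the URL above. Note also that goog.json covers all Google ranges, not only crawlers, so the list may be broader than you need.

```python
import json

# Hypothetical sample mirroring the structure of goog.json;
# the real file would be fetched from https://www.gstatic.com/ipranges/goog.json
sample = """
{
  "prefixes": [
    {"ipv4Prefix": "8.8.4.0/24"},
    {"ipv6Prefix": "2001:4860::/32"}
  ]
}
"""

def extract_prefixes(doc: str) -> list:
    """Return every IPv4/IPv6 CIDR listed in a goog.json-style document."""
    data = json.loads(doc)
    prefixes = []
    for entry in data.get("prefixes", []):
        for key in ("ipv4Prefix", "ipv6Prefix"):
            if key in entry:
                prefixes.append(entry[key])
    return prefixes

print(extract_prefixes(sample))
```

The output of a run like this is what you would paste (or upload as a list) into whatever rule mechanism you end up using.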

But I’m a little confused about two things:

  1. If I add all of Google’s IPs to the whitelist, can anyone take advantage of the security gap? Because someone said this:

  2. How do I add all of the IP addresses into one rule? Do I need to upload the whole IP address list to my server? Which rule field should I choose?
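On question 2, as far as I understand it, a single Cloudflare firewall rule can match a whole set of CIDRs: in the rule builder, pick the "IP Source Address" field with the "is in" operator, or write the expression directly. The expression language uses a space-separated set, so a rule covering several ranges looks something like this (the prefixes below are just placeholders, not a verified Googlebot list):

```
(ip.src in {66.249.64.0/19 66.249.80.0/20})
```

So there should be no need to upload anything to your own server; the whole list lives inside the one rule expression.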

Or is there any other effective solution? Please help, many thanks. The problem is on https://rajatips.com.
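One alternative the verifying-googlebot page above describes is verifying crawlers by DNS instead of maintaining an IP list: do a reverse lookup on the visiting IP, check the hostname ends in googlebot.com or google.com, then forward-resolve the hostname and confirm it maps back to the same IP. A rough sketch of that check (the example hostname is illustrative, and the network lookups obviously depend on live DNS):

```python
import socket

GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")

def is_google_hostname(hostname: str) -> bool:
    """True if a reverse-DNS name falls under Google's crawler domains."""
    return hostname.rstrip(".").endswith(GOOGLE_SUFFIXES)

def verify_googlebot(ip: str) -> bool:
    """Reverse lookup, suffix check, then forward-confirm the IP."""
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)  # e.g. crawl-66-249-66-1.googlebot.com
        if not is_google_hostname(hostname):
            return False
        # Forward-confirm: the hostname must resolve back to the same IP,
        # otherwise anyone controlling their own reverse DNS could spoof it.
        return ip in {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except socket.error:
        return False

print(is_google_hostname("crawl-66-249-66-1.googlebot.com"))
```

This is heavier than a static IP list (two DNS lookups per check), so it is usually done with caching or only for suspicious traffic rather than on every request.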
