AmazonBot caused our website to go offline

Twice during the past week (Tue and Sat) the website went down because AmazonBot activity overloaded the server CPU. Even after we blocked the IPs that caused the CPU overload we still experienced problems, although the website came back online. The website only returned to normal operation after we turned on Known Bots Blocking.
The questions are:

  1. Does this ‘Known Bots Blocking’ also block the Google Search crawlers?
  2. If we can assume that those AmazonBots are used by AI learning systems, how can we limit (but not block) those bots so their impact on the website’s functioning is minimal?

Many thanks for your suggestions.

Hi there,

You can check this link to understand which bots are categorized as known bots:

We would recommend specifying the bot name in a rule rather than using Known Bots, since the Known Bots option also blocks other good bots.
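As a sketch of that approach (assuming Cloudflare-style custom rules; the exact field names and actions depend on your provider's rules language), a rule that targets only Amazonbot by its User-Agent might look like:

```
# Sketch of a custom firewall rule expression matching only Amazonbot.
# Pair this with a rate-limiting or "Managed Challenge" action rather than
# a plain "Block" if the goal is to throttle the crawler, not shut it out.
(http.user_agent contains "Amazonbot")
```

Well-behaved crawlers can also be asked to slow down or to skip expensive paths via robots.txt, though whether a given bot honors directives such as Crawl-delay varies, so it is worth checking the bot operator's own documentation.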
