We have a rate-limiting rule for our website, but we would like to exclude bots (more specifically, SEO bots) from this rate-limiting rule. How can we achieve this?
I just answered this on your other thread which you deleted.
Many thanks, and sorry for that. In our case it is the Screaming Frog SEO bot that is being blocked and gets a 429 error, and this is certainly because of the rate-limiting rule.
I see. That’s not a regular crawler; it’s an audit tool that you download and run locally, right? If that’s the case, you can add the spider’s IP address as an Allow entry under Firewall → Tools.
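If you'd rather script the allow-listing than click through the dashboard, the same Allow entry can be created via Cloudflare's IP Access Rules API. A minimal sketch below builds the request body; the IP address, zone ID, and API token are placeholders I've made up for illustration, not values from this thread:

```python
import json

def build_allow_rule(ip: str, note: str = "Allow Screaming Frog SEO Spider") -> dict:
    """Build the JSON body for a Cloudflare IP Access Rule that allows an IP.

    In the IP Access Rules API, "whitelist" is the mode name for an Allow rule.
    """
    return {
        "mode": "whitelist",
        "configuration": {"target": "ip", "value": ip},
        "notes": note,
    }

# 203.0.113.10 is a placeholder; use the IP the spider crawls from.
payload = build_allow_rule("203.0.113.10")
print(json.dumps(payload, indent=2))

# To apply it, POST this body (with your API token in the Authorization
# header) to the zone-level endpoint, roughly:
#   https://api.cloudflare.com/client/v4/zones/{zone_id}/firewall/access_rules/rules
```

Note that an IP-based Allow only helps for a tool like Screaming Frog, which runs from a machine you control and therefore has a stable, known source IP; it wouldn't be a reliable way to exempt third-party crawlers whose IPs rotate.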