Problem with BadBot Seekport Crawler

I’m receiving thousands of requests from this user agent:
“compatible; Seekport Crawler;”
I tried to enable User-Agent (UA) Rules as described here:

This is the rule:

  • Name Description: seekport
  • Action: Block
  • User Agent: compatible; Seekport Crawler;
When I enabled it, it blocked the requests for about 2 or 3 minutes, but now I can see the requests again even though the rule is still in place.
Any suggestion for this problem?
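One possible explanation worth checking: if the rule is applied as an exact match against the full `User-Agent` header, the partial value “compatible; Seekport Crawler;” would never equal the complete string the crawler sends, so nothing gets blocked. A minimal sketch of the difference (the full UA string below is an assumed example, not taken from your logs):

```python
# Hypothetical full User-Agent header sent by the crawler (assumption).
full_ua = "Mozilla/5.0 (compatible; Seekport Crawler; http://seekport.com/)"

# The value configured in the UA rule, as posted above.
rule_value = "compatible; Seekport Crawler;"

# Exact comparison fails because the rule value is only a fragment.
exact_match = (full_ua == rule_value)     # False

# A substring check succeeds, because the fragment appears inside the UA.
substring_match = rule_value in full_ua   # True

print(exact_match, substring_match)
```

If exact matching is the cause, a rule that does substring matching on the header should hold up regardless of what surrounds the crawler name.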



I was able to block it using Rate Limiting, but I think the best approach would be to use User-Agent (UA) Rules. What do you think?

In my opinion, you should keep rate limiting on, as it’s a good idea to always have it in place against other bad bots and layer 7 DDoS attacks.

For Seekport, I would create a firewall rule like:
user agent contains “seekport” → challenge (or block)

I would try challenge first; if you see it keep hitting you, change it to block.
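For reference, the rule above can be written as a filter expression like the following. This is a hedged sketch: `http.user_agent` is the header field exposed in Cloudflare firewall rule expressions, and the lowercase conversion is optional, just to catch casing variants:

```
lower(http.user_agent) contains "seekport"
```

With the action set to Challenge, legitimate traffic that happens to match gets a chance to pass, while the crawler, which cannot solve the challenge, is effectively stopped; switching the action to Block simply drops the requests outright.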

