The rule is pretty simple, but it's just one of the rules I use.
On a daily basis we see anywhere between 20k and 50k blocked page requests (including SQL injection attempts, undesirable bots, blocked ASNs/IP addresses, and blocked access to protected pages).
I also have quite a few rules blocking undesirable bots (I posted one of these rules in another topic in this community, although I constantly update it with new bots); see the sketch below for the general idea.
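For illustration only, here's a minimal sketch of the same idea written as a Cloudflare Worker that rejects requests whose User-Agent matches a blocklist. My actual setup uses WAF/firewall rules rather than a Worker, and the bot names in `BLOCKED_BOTS` are placeholders, not my real list:

```ts
// Hypothetical Cloudflare Worker: block requests from undesirable bots
// by User-Agent substring match. The bot list here is a placeholder;
// maintain and update your own list as new bots appear.
const BLOCKED_BOTS: string[] = [
  "MJ12bot",
  "AhrefsBot",
  "SemrushBot",
  "DotBot",
];

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = (request.headers.get("User-Agent") ?? "").toLowerCase();
    if (BLOCKED_BOTS.some((bot) => ua.includes(bot.toLowerCase()))) {
      // 403 keeps the block visible in your analytics; some people
      // prefer a 404 or a managed challenge instead.
      return new Response("Forbidden", { status: 403 });
    }
    // Pass everything else through to the origin.
    return fetch(request);
  },
};
```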
As I mentioned, by loading reCaptcha v3 on all pages you can get an idea of how traffic on your website is distributed from "suspicious" to "clean". Note that this may need to run for a month or more so you can establish a baseline. This is not directly related to Google AdSense "invalid clicks", but if you see a very high level of "suspicious" traffic in reCaptcha v3, Google will see it too. Also remember that most crawlers (indexing, SEO bots, etc.) don't trigger reCaptcha v3, since most don't actually execute scripts.
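If you're wiring this up yourself, the score comes from verifying the client-side token against Google's siteverify endpoint on your server. A minimal sketch in TypeScript (Node 18+ with the built-in fetch; `RECAPTCHA_SECRET` is a placeholder for your own secret key):

```ts
// Minimal reCaptcha v3 server-side verification sketch.
// RECAPTCHA_SECRET is a placeholder; set it to your own secret key.
interface SiteVerifyResponse {
  success: boolean;
  score?: number;       // 0.0 (likely bot) to 1.0 (likely human)
  action?: string;      // the action name sent from the client
  hostname?: string;
  "error-codes"?: string[];
}

async function verifyRecaptcha(token: string, remoteIp?: string): Promise<SiteVerifyResponse> {
  const params = new URLSearchParams({
    secret: process.env.RECAPTCHA_SECRET ?? "",
    response: token,
  });
  if (remoteIp) params.set("remoteip", remoteIp);

  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    body: params,
  });
  return (await res.json()) as SiteVerifyResponse;
}

// Example: log the per-action score so you can build a baseline over time.
// const result = await verifyRecaptcha(tokenFromClient);
// console.log(result.action, result.score);
```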
When a page is blocked on my site, Cloudflare serves a custom error page. I have Google reCaptcha v3 on that page too, with its own action name. What I've noticed is that, compared to other "actions" on my site, "cferror" has a very high rate of suspicious traffic (96%), meaning the requests blocked by those rules sit well up in the suspicious range. For comparison, my main "forums" action shows only 0.63% suspicious traffic after all filtering is done.
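Tagging each page type with its own action name is just the standard reCaptcha v3 client call. A sketch of what the error page might load ("cferror" is the action name mentioned above; the site key and the logging endpoint are placeholders):

```ts
// Client-side sketch: tag the custom error page with its own reCaptcha v3
// action ("cferror") so its scores can be compared against other pages.
// SITE_KEY is a placeholder for your reCaptcha v3 site key.
declare const grecaptcha: {
  ready(cb: () => void): void;
  execute(siteKey: string, opts: { action: string }): Promise<string>;
};

const SITE_KEY = "your-recaptcha-v3-site-key";

grecaptcha.ready(() => {
  grecaptcha.execute(SITE_KEY, { action: "cferror" }).then((token) => {
    // Send the token to your backend for verification (see the server-side
    // sketch above); "/api/recaptcha-log" is a hypothetical endpoint name.
    navigator.sendBeacon("/api/recaptcha-log", token);
  });
});
```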

These rules, plus the server-side code that averages reCaptcha scores per browser and blocks ads for low-scoring ones (sketched below), have helped reduce the Google AdSense clawback.
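A simplified sketch of that averaging idea (the 0.5 threshold, the minimum sample size, and the `parseBrowserFamily` helper are all illustrative, not my exact implementation):

```ts
// Simplified sketch: keep a running average reCaptcha score per browser
// family and suppress ads when the average falls below a threshold.
// The 0.5 threshold and 100-sample minimum are illustrative values.
interface ScoreStats { sum: number; count: number; }

const statsByBrowser = new Map<string, ScoreStats>();

function recordScore(browserFamily: string, score: number): void {
  const s = statsByBrowser.get(browserFamily) ?? { sum: 0, count: 0 };
  s.sum += score;
  s.count += 1;
  statsByBrowser.set(browserFamily, s);
}

function shouldServeAds(browserFamily: string): boolean {
  const s = statsByBrowser.get(browserFamily);
  if (!s || s.count < 100) return true; // not enough data yet, serve normally
  return s.sum / s.count >= 0.5;        // suppress ads for low-average browsers
}

// Usage: after verifying a token server-side,
//   recordScore(parseBrowserFamily(userAgent), result.score ?? 0);
// (parseBrowserFamily is a hypothetical User-Agent parser), then render
// the page with ads only if shouldServeAds(...) returns true.
```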
I hope this helps. It's more a collection of ideas than specifics, since each website is different: audience, country, topic, and technology stack all influence how to deploy these ideas.