IP in list still being challenged

I’m trying to set up a firewall rule that challenges any traffic not from India, Malaysia, or the IPs in my list. It looks like this at the moment:

(ip.geoip.country ne "IN") or (ip.geoip.country ne "MY") or (not ip.src in $my_list)

Except I’ve found that IPs from my list are still being challenged: there are some from the US, the UK, etc.

Are countries prioritized over IP addresses?

Or, am I not understanding the rule order correctly?

Maybe it’s due to some bad IP reputation from before?
Or maybe Cloudflare is blocking some bot?

It should be “not from India” or “not from Malaysia” or “not in my IP list”, right?
Is $my_list written correctly? (It should use an underscore _ between words.)

From my point of view it should do exactly what you described.
You can also try:
(not ip.geoip.country in {"IN" "MY"}) or (not ip.src in $my_list)

Is this rule the only one you have in the Firewall dashboard for your domain at Cloudflare?

I do not believe so. The rule above should first check whether the traffic is from India, then whether it is from Malaysia, and then whether the IP is in your list.

That could be true if you are using IP Access Rules and have listed the IPs to be allowed or blocked there. In that case, due to order prioritization, those are checked first, then Firewall Rules, and later other features, as shown in the picture below.

Maybe I am wrong about the rule because you have two countries; you could try “not 1st country and not IP list” or “not 2nd country and not IP list”?

I am afraid you will need to wait patiently for a reply from a more experienced user on this one.

When you combine several negative conditions, you must link them with AND. With OR, every visitor matches at least one of the negatives (an IP from India is still “not MY”), so the rule matches everyone. OR makes sense with positive conditions for the action Allow, for instance.

If country is IN or country is MY or IP is in my_list, then Allow


If country is not IN and country is not MY and IP is not in my_list, then challenge
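You can see why the OR version always fires with a quick truth-table check (a plain-Python sketch with hypothetical helper names; `in_list` stands in for `ip.src in $my_list`):

```python
from itertools import product

def or_rule(country, in_list):
    # Original rule: negative conditions joined with OR
    return (country != "IN") or (country != "MY") or (not in_list)

def and_rule(country, in_list):
    # Corrected rule: negative conditions joined with AND
    return (country != "IN") and (country != "MY") and (not in_list)

# The OR version matches every combination (everyone gets challenged),
# while the AND version only matches traffic that is outside both
# countries AND outside the IP list.
for country, in_list in product(["IN", "MY", "US"], [True, False]):
    print(f"{country} in_list={in_list}: "
          f"OR={or_rule(country, in_list)} AND={and_rule(country, in_list)}")
```

No matter which country or list membership you pick, `or_rule` returns True, which is exactly why listed IPs were still being challenged.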

Besides changing ORs to ANDs, I’d also add another rule to exclude known bots (bots Cloudflare deems legitimate, such as search engines) and perhaps some URLs, such as robots.txt and ads.txt.
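Putting that together, the challenge rule with those exclusions might look something like this (a sketch in Cloudflare’s rule-expression syntax; `cf.client.bot` is Cloudflare’s known-bots field, and the two paths are just example URLs, so adjust to taste):

```
(ip.geoip.country ne "IN"
 and ip.geoip.country ne "MY"
 and not ip.src in $my_list
 and not cf.client.bot
 and not http.request.uri.path in {"/robots.txt" "/ads.txt"})
```

With the action set to Challenge, this only fires for traffic that is outside both countries, outside your list, not a verified bot, and not requesting those paths.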