JavaScript Detections without Bot Management Upgrades

What is the name of the domain?

What is the issue you’re encountering?

Unable to use JavaScript Detections without Bot Management Upgrade?

What are the steps to reproduce the issue?

I set up a Worker script using cf.bot_management.js_detection.passed and other cf.bot_management fields… but when I go to use them in a real environment it all fails.
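Roughly what the Worker looks like (trimmed down; going by the Workers docs, the fields show up camelCased on request.cf rather than the cf.bot_management.* form the rule expressions use):

```js
export default {
  async fetch(request) {
    // In a real deployment this object comes back undefined for me, which I
    // assume is the “needs the Bot Management upgrade” issue described above.
    const bm = request.cf && request.cf.botManagement;

    if (bm && bm.jsDetection && bm.jsDetection.passed === false) {
      // Visitor has not passed JavaScript Detections yet.
      return new Response('Please enable JavaScript.', { status: 403 });
    }

    // Everyone else goes through to the origin.
    return fetch(request);
  },
};
```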

So I then try to set things up in a WAF firewall rule (cf.bot_management.js_detection.passed), and it tells me that I need a Bot Management upgrade to use this?

When I turn on “JavaScript Detections” (“Use lightweight, invisible JavaScript detections to improve Bot Management. Learn more about JavaScript Detections.”), what exactly does it do for my account if I can’t access anything without the Bot Management upgrade?

The pages make no mention of this either: the JavaScript detections · Cloudflare bot solutions docs page that is referenced from my bot configuration page (/security/bots/configure).

The Pro account info also says I should be able to use a WAF custom rule, but I don’t seem to be able to: Get started with Super Bot Fight Mode (Pro) · Cloudflare bot solutions docs

Please help?

I see that it creates the ‘cf_clearance’ cookie. Is there a way to validate this? Shouldn’t the cookie be slightly different to avoid mixing it up with the other challenges?

What prevents a bot from just adding ‘cf_clearance=2784972848274’?

If I use ‘JavaScript Detections’… I can’t use the other challenge, can I?

That is correct. It’s documented here:
https://developers.cloudflare.com/ruleset-engine/rules-language/fields/dynamic-fields/#cfbot_managementjs_detectionpassed

Yes, for skipping Bot Fight Mode.

I see that it creates the ‘cf_clearance’ cookie. Is there a way to validate this? Shouldn’t the cookie be slightly different to avoid mixing it up with the other challenges?

What prevents a bot from just adding ‘cf_clearance=2784972848274’?

Or is that what Bot Fight Mode is? Do I turn Bot Fight Mode on for the pages where I want to validate the cf_clearance cookie?

Can I not say “if (cf_clearance cookie != ‘validated’) then challenge”?
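The closest I can see to that in a custom rule would be something like the following, but as far as I can tell it only checks that the cookie exists, not that its value is actually valid:

(not http.cookie contains "cf_clearance") then take action
Managed Challenge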

If I change my “cookie”, it just makes the JavaScript Detections come back.
If I use ‘JavaScript Detections’… I can’t use the other challenge, can I?

“Yes, for skipping Bot Fight Mode.”

Oh well, dang. I wish that was labeled more clearly.

What is it you’re trying to accomplish with your Worker? If you’re trying to evaluate the bot-ness of a request, you’ll need an Enterprise plan with the Bot Management add-on to get more data out of bot management.

“If you’re trying to evaluate the bot-ness of a request, you’ll need an Enterprise plan with the Bot Management add-on to get more data out of bot management.”

I filled out the ‘Bot Management’ form… but I fear it’s super expensive…

I’m basically using the JavaScript Challenge now to kind of restrict some access to the website from bots.

Question:
If a user passes the JavaScript Challenge from Bot Management… and my own rules send them to a WAF rule that says:

(http.request.uri.query contains “captcha”) then take action
Managed Challenge

Will the cf_clearance cookie allow the user to pass the Managed Challenge?

Basically, what I’m asking is: if I turn on JavaScript Detections, am I essentially making the “Managed Challenge” easier to “crack”, because now the user only has to pass the JavaScript Detection instead of the Managed Challenge?

I hope this makes sense!!!

Nobody should have to pass multiple tests to get into a site. “You might be a bot…please pass this challenge.” followed by “You’re from another country…please pass this challenge.” and then “You want to access this hostname…please pass this challenge.”

JavaScript Detections don’t block users. It’s only a signal that helps a rule assess the request’s legitimacy. If there are no rules, then JavaScript Detections won’t interfere with a visit.
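On plans where that field is exposed, the usual pattern is a custom rule that consumes the signal, for example something like:

(not cf.bot_management.js_detection.passed) then take action
Managed Challenge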

P.S. It would be sweet to have more rules for these bot things…

What I mean is, it would be awesome to run the JavaScript Detections only on certain traffic.

Like, is it running for Google? Bing? All bots? Some bots? Who knows ;x

I think my setup is weird, I’m not sure. I’m trying to do it so nobody has anything to pass… or at least nothing that they have to “see”.

The thing with the default “challenge pages” is that they don’t carry the page title and other things… and I think because I’m blocking some legitimate bots, it’s having an effect on organic reach.

“JavaScript Detections don’t block users. It’s only a signal that helps a rule assess the request’s legitimacy. If there are no rules, then JavaScript Detections won’t interfere with a visit.”

Ya, I’m just trying to have more control over them… I think I’m using them right; I have my own JavaScript detections that I made. I’m just trying to add one more hoop for these people harassing me with a botnet to jump through!

That doesn’t really matter. Let it run on all traffic, and then your other settings can do what they want with it. Googlebot and Bingbot are easily identifiable; that’s what Verified/Known Bots and User-Agent strings can be used for.
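For example, a custom rule along these lines (with a Skip action) would let verified crawlers through before any of your challenges run:

(cf.client.bot) then take action
Skip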

Sorry, ya, I’m probably just overthinking it… I just want to figure out a “setup” that works for me.
I think it’s mainly just the WAF thing (because I can only skip rules, you said, or something); I’ll have to look at it more later. Tired. Thanks for your help!!

