I ran two tests (a simple, light proof-of-concept page), one with Bot Fight Mode on and one with it off,
with these results: 0 ms vs. 1341 ms Total Blocking Time.
Breakdown:
- the page's own script: 89 ms
- CF Insights script: 36 ms
- the bot script: ~1300 ms
1300 ms… isn't that huge? Google penalizes the entire domain for a delay like that.
I'm attaching an image that puts the full blame on the Bot Fight script (/cdn-cgi/challenge-platform/scripts/jsd/main.js)…
Can someone correct my reasoning, please? Because if CF deploys scripts this heavy and this detrimental to SEO on my sites, it could be food for thought for completely reconsidering CF as a solution altogether…
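For anyone who wants to reproduce this: I used the online tools, but the same measurement can be scripted. A minimal sketch using the Lighthouse Node API (the URL is a placeholder for the test page):

```typescript
// Reproduce the Total Blocking Time measurement with the Lighthouse Node API.
import lighthouse from 'lighthouse';
import { launch } from 'chrome-launcher';

const chrome = await launch({ chromeFlags: ['--headless'] });
const result = await lighthouse('https://example.com/', {
  port: chrome.port,
  onlyCategories: ['performance'],
});

// 'total-blocking-time' is the audit that reported 0 ms vs ~1341 ms above.
console.log(result?.lhr.audits['total-blocking-time'].displayValue);
await chrome.kill();
```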
Either keep Bot Fight Mode enabled, or disable the feature if it's hurting your page-load score.
However, be careful: some scrapers might come after you once it's disabled, potentially causing more unneeded and unwanted traffic to your server, and more.
Feel free to experiment a bit: test and see which options work best for your case, then tune them and decide whether to use it.
If you do, I'd suggest testing with multiple online tools rather than basing conclusions on measurements from a single one.
Is 1.3 s huge? With today's 4G, 5G, and FTTH connections, and with images and other resources already taking seconds to load on websites, I'm afraid it's not.
It takes time because the CF script analyzed the request, which came from the tool itself, and the tool then had to execute the script it loaded as part of its test.
A normal website visitor wouldn't encounter that experience at all.
The script takes extra CPU time, likely due to a proof-of-work or signal collection; it's not slowing down your site at all.
If it actually impacted the time to render, that would matter, but that's not the case: both tests load in the same amount of time.
Whether a tool says a thread is being blocked is irrelevant, since it doesn't affect the visitor's experience in the slightest.
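If you want to verify that on a real visit instead of trusting the lab tool, you can read the render metrics straight out of the browser. A minimal sketch using the standard PerformanceObserver API (run it in the DevTools console on your page, with Bot Fight Mode on and then off, and compare):

```typescript
// Log when the largest element was painted (LCP).
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(entry.entryType, Math.round(entry.startTime), 'ms');
  }
}).observe({ type: 'largest-contentful-paint', buffered: true });

// Log first-paint and first-contentful-paint.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    console.log(entry.name, Math.round(entry.startTime), 'ms');
  }
}).observe({ type: 'paint', buffered: true });
```

If those numbers stay the same with the bot script on and off, the blocked-thread warning isn't costing your visitors anything.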
I completely agree with you here, but I assumed that time would be smaller, not that much…
AFAIK the inclusion of this bot script on the page can't be toggled conditionally per viewer (is the client a human viewing the page, or Googlebot, etc.)… so maybe CF could identify that and not include the script for Googlebot or other similar clients (known IP ranges, right?).
Unfortunately, I have no clue how exactly it works, or what is and isn't being calculated by this script, since Cloudflare has multiple different ways and criteria to detect a malicious request or attacker.
I'd say it does. We can tune this kind of setting a bit on paid plans with Super Bot Fight Mode, as needed.
You can always bypass or allowlist the whole ASN of Google or Bing by adding it to the IP Access Rules, which isn't always the best thing to do, since a lot of bad bots and requests come from the Microsoft ASN (Bing), along with unneeded gigabytes of crawling traffic.
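For reference, the same allowlist can be created over the API instead of the dashboard. A sketch (the zone ID and token are placeholders; AS15169 is Google's ASN, but double-check whatever value you use before allowing it):

```typescript
// Allowlist an ASN via Cloudflare's IP Access Rules API.
const ZONE_ID = 'your-zone-id';     // placeholder
const API_TOKEN = 'your-api-token'; // placeholder

const resp = await fetch(
  `https://api.cloudflare.com/client/v4/zones/${ZONE_ID}/firewall/access_rules/rules`,
  {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${API_TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      mode: 'whitelist', // the API's name for "allow"
      configuration: { target: 'asn', value: 'AS15169' }, // Google
      notes: 'Allow Google ASN (very broad, see caveats above)',
    }),
  },
);
console.log(await resp.json());
```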
If they're coming from a suspicious IP address, such as Tor or a VPN service, they'd get the challenge page; once it's passed (1-2 seconds), they'd see the webpage. So yes, it does affect them, but not visitors on normal local ISPs (unless they're sending strange requests via Python or Go, or running some vulnerability scan from their host).
Back to the "score" and "SEO", here's what I experimented with:
- Have (Super) Bot Fight Mode enabled
- Googlebot (determined by the ASN, part of the user-agent string, and more) is allowed to access robots.txt and sitemap.xml (see the rule sketch after this list)
- Verified bots are enabled in the WAF
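The second item can be expressed with something along these lines in a WAF custom rule (a sketch in Cloudflare's rule expression language; cf.client.bot matches verified bots, and whether the bot-fight products honor such a rule depends on your plan, so test it):

```
(cf.client.bot and http.request.uri.path in {"/robots.txt" "/sitemap.xml"})
```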
Google's PageSpeed tool runs from a different ASN (Google Cloud Platform) than Googlebot does (at least as far as I remember). You can therefore even "trick the tool" by blocking Google Cloud Platform: that way you'd always get "green" and "fast" results (90-100) for your URLs, fast LCP/CLS/TTFB/FCP, etc., even though those requests are the ones getting a 403 error (not the expected page), while the URLs themselves stay functional, crawled and indexed in Google Search Console, and displayed in Google search results.