Bug: Improve robot detection to avoid persistent false positives

Visit, e.g., hddguru.com and perform casual human-user actions: browse pages, download files, etc. Almost every action is gated by a CAPTCHA human-verification challenge, and you have to solve it for every request you initiate. Extremely annoying.

The robot-blocker must block only robots, never human users. Not 0% accuracy, not 50%, not even 99.9999%, but exactly 100.000%, because any nonzero per-request false-positive rate compounds into constant challenges for an active user (see the sketch below). For human users the robot-blocker must be absolutely transparent.
The way it is implemented now suggests there is moody, moron-developed software on the site side which suffers a deadly panic attack, as though the site crashed on every robotic request, and which therefore attempts to prevent robotic connections AT ALL COSTS.
As it stands, it completely lacks both systems analysis and usability testing. It is a pre-pre-alpha piece of raw handicraft, not worth a penny.
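
To make the accuracy demand concrete, here is a minimal sketch (all numbers hypothetical, not measured from hddguru.com) of how even a tiny per-request false-positive probability p turns into near-certain CAPTCHA interruptions over a normal browsing session:

```python
# Minimal sketch with hypothetical numbers: why any nonzero per-request
# false-positive rate is not "transparent" for an active human user.

def p_at_least_one_challenge(p: float, n_requests: int) -> float:
    """Probability a human sees at least one CAPTCHA across n requests,
    assuming each request is misclassified independently with probability p."""
    return 1.0 - (1.0 - p) ** n_requests

# p = per-request false-positive rate; n = requests in a session or month.
for p in (0.5, 0.01, 1e-6):
    for n in (10, 1_000, 100_000):
        print(f"p={p:<8g} n={n:<7} P(challenged) = "
              f"{p_at_least_one_challenge(p, n):.4%}")
```

Even at 99.9999% accuracy (p = 1e-6), a user generating 100,000 requests has roughly a 10% chance of being challenged at least once; at 1% a challenge is effectively guaranteed, which matches the "CAPTCHA on almost every action" experience described above.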
