This post is part of a series on Cloudflare’s firewall engine and discusses rules which might make your site just a tad less welcoming to automated bots and crawlers.
The HTTP version
HTTP was introduced in 1996 with version 1.0 and superseded relatively quickly by version 1.1 in 1999. 1.1 then remained the de facto standard for about sixteen years, until it got a successor of its own with version 2.0 in 2015. Version 3.0 followed considerably faster, but its support is still somewhat shaky.
One thing we can safely assume at this point, though, is that we won’t see many legitimate requests using HTTP/1.0. In particular because name-based virtual hosting (one of the fundamental principles Cloudflare is built on) relies on the Host header, which was only standardised with 1.1, so a strict 1.0 client technically cannot address a name-based virtual host. Still, some bots and crawlers (presumably rather basic implementations) send such requests, and that is how we can block them.
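To make the difference tangible, here is a small sketch of what a minimal request looks like in each protocol version. The Host header, mandatory only since 1.1, is what makes name-based hosting possible; example.com is merely a placeholder here.

```shell
#!/usr/bin/env bash
# A strict HTTP/1.0 client sends no Host header, so the server cannot
# tell which of the sites behind a shared IP address is being addressed.
printf -v request_v10 'GET / HTTP/1.0\r\n\r\n'

# HTTP/1.1 made the Host header mandatory; it selects the virtual host.
# ("example.com" is just a placeholder.)
printf -v request_v11 'GET / HTTP/1.1\r\nHost: example.com\r\n\r\n'

printf '%s' "$request_v10"
printf '%s' "$request_v11"
```

A bot sending the first kind of request announces itself rather clearly, which is exactly what the firewall rule below takes advantage of.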
A firewall rule with the following expression will match such requests.
(http.request.version eq "HTTP/1.0")
Whether you block or challenge these requests is up to you, of course, but in this case an outright block should not cause much trouble.
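Should you have a legitimate client that is stuck on HTTP/1.0 (an old monitoring probe, say), the expression can be narrowed rather than abandoned. The following sketch exempts a single path; /health is purely a hypothetical example and should be replaced with whatever your client actually requests.

(http.request.version eq "HTTP/1.0" and http.request.uri.path ne "/health")

Everything else about the rule stays the same; only requests for the exempted path fall through to your other rules.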
As always, don’t just copy and paste: first evaluate whether a new rule fits your site’s setup, and be careful when making such changes, as they could break your site if not implemented with care. Also, pay attention to the order of your firewall rules, as they are evaluated in sequence.
Ceterum censeo, Flexible mode is insecure and should be deprecated for the sake of the security of the Internet.