Yeah, layer 7 protection is not great, and you need some technical knowledge to stop these attacks. But unless one of these conditions applies to you, there is no way to really protect a site with any service (it's sad, but it's true):
- 100% of your site can be cached
- your servers can be horizontally scaled
You may ask why? Because of one simple fact: there is no reliable way to tell whether a person or a bot is visiting your site. Now, even if your server is really great and your heaviest route can sustain 300 requests per second (which in most cases you won't actually achieve),
you only need a botnet of around 10,000 machines to take your site down.
(On most sites there are routes that can't even handle 10-30 requests per second.)
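To make the back-of-envelope math explicit, here is a tiny sketch. The per-bot request rate is an assumption I picked for illustration; the other numbers are the ones from above:

```ts
// Rough capacity math: how many requests per second does the botnet generate?
// All numbers are illustrative; the per-bot rate is an assumed example.
const routeCapacityRps = 300;         // what the heaviest route can sustain
const botCount = 10_000;              // machines in the botnet
const requestsPerBotPerSecond = 0.05; // one request every ~20 seconds per bot

const attackRps = botCount * requestsPerBotPerSecond; // 500 rps
console.log(`attack: ${attackRps} rps vs capacity: ${routeCapacityRps} rps`);
console.log(attackRps > routeCapacityRps ? "route saturated" : "route survives");
```

Even at one request every 20 seconds per machine, which looks like perfectly normal browsing, the aggregate already exceeds the route's capacity.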
So the sad reality is: unless you can scale your servers to handle 10,000+ requests per second, or make all your routes cacheable, your site can easily be taken down by a layer 7 DDoS attack, and I don't believe any service can really protect against that.
Now to the optimistic side:
Unless your site is really high value and some shark is willing to invest a lot of money to take it down, it's probably just some script kiddie who bought some cheap service to attack you. In that case you will need some technical knowledge and a good logging service to analyse the attack and block it.
So the first step is to get a logging service. You can try the Logflare app, or if you want to run it yourself you can look at this worker (which I use on my sites).
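I can't reproduce that exact worker here, but a minimal sketch of the idea looks like this, assuming Cloudflare's module worker syntax (types from @cloudflare/workers-types) and a placeholder LOG_ENDPOINT pointing at whatever log sink you use, e.g. Logflare's ingest API:

```ts
// Minimal logging-worker sketch: ship request metadata to a log sink,
// then pass the request through to the origin. LOG_ENDPOINT is a placeholder.
export default {
  async fetch(request: Request, env: { LOG_ENDPOINT: string }, ctx: ExecutionContext): Promise<Response> {
    const url = new URL(request.url);
    const entry = {
      ts: Date.now(),
      method: request.method,
      path: url.pathname,
      ip: request.headers.get("cf-connecting-ip"),
      userAgent: request.headers.get("user-agent"),
      country: (request as any).cf?.country ?? null, // Cloudflare-specific metadata
    };

    // Send the log line without blocking the response to the visitor.
    ctx.waitUntil(
      fetch(env.LOG_ENDPOINT, {
        method: "POST",
        headers: { "content-type": "application/json" },
        body: JSON.stringify(entry),
      })
    );

    return fetch(request); // proxy to the origin unchanged
  },
};
```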
Once you have the logs, you just need to find the common patterns in the attack traffic and block them.
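The block itself can be very simple once a pattern shows up (say most attack traffic shares a user-agent or hammers one expensive path). A worker-style sketch; the specific patterns below are made-up examples, yours have to come from your own logs:

```ts
// Block requests matching patterns found in the attack logs.
// These patterns are hypothetical examples, not real attack signatures.
const BLOCKED_USER_AGENTS = [/python-requests/i, /curl\/7\./i];
const HAMMERED_PATHS = new Set(["/search", "/api/expensive-report"]);

export default {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    const path = new URL(request.url).pathname;

    if (BLOCKED_USER_AGENTS.some((re) => re.test(ua))) {
      return new Response("Forbidden", { status: 403 });
    }
    // Example heuristic: refuse requests to hammered paths with no user-agent at all.
    if (HAMMERED_PATHS.has(path) && ua === "") {
      return new Response("Forbidden", { status: 403 });
    }

    return fetch(request); // everything else goes through to the origin
  },
};
```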
The next step is to start looking into scaling your servers and full HTML caching.
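For the full-HTML-caching part, the idea is that the edge serves a cached copy of the page so attack traffic never reaches the origin at all. A minimal sketch using the Workers Cache API (again assuming Cloudflare's runtime types; the 60-second TTL is an arbitrary example, tune it to what your content can tolerate):

```ts
// Serve cached HTML from the edge; only cache misses hit the origin.
// The 60-second TTL is an example value, not a recommendation.
export default {
  async fetch(request: Request, env: unknown, ctx: ExecutionContext): Promise<Response> {
    if (request.method !== "GET") return fetch(request); // never cache mutations

    const cache = caches.default;
    const cached = await cache.match(request);
    if (cached) return cached;

    // Cache miss: fetch from the origin, then keep a copy at the edge.
    const origin = await fetch(request);
    const response = new Response(origin.body, origin);
    response.headers.set("cache-control", "public, max-age=60");
    ctx.waitUntil(cache.put(request, response.clone()));
    return response;
  },
};
```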