Simple log collector worker

recipe-exchange

#1

I created a simple worker that logs request data and sends it to logdna.com (but of course you can easily send it to any service you want).

My motive behind it is to help me better protect against layer 7 attacks: during an attack I will simply send all the request data to the service (because my own servers will probably be down) and collect all the info I need about the attack, which I can then use to create rules in the CF firewall and to tune the rate limiting.

The only other way to get logs that I know of is to upgrade to the Enterprise plan. The worker collects:

  • user agent
  • referer
  • ip
  • countryCode
  • url
  • method
  • x_forwarded_for
  • asn

I field-tested it for 1 day with a 100% success rate.

Would love to get feedback.
You can see the code and simple instructions about the parameters here:


#2

:1st_place_medal: Nice
I ran your code and had to correct some minor issues.
My previous implementation didn’t support batching.

I will open a git PR with my changes.

Also, there is a strange phenomenon with the chronological order of the logs, but I haven’t figured out a solution.


#3

Would love to get the PR. I also updated the code slightly, as I forgot to include the user agent in the logs.

About the chronological order: the only solution is to remove the batching, because it looks like LogDNA does not reorder the logs based on time (I will need to dig into the API docs, maybe there is a way), and CF Workers batching is per worker instance, not global.
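One possible mitigation (an assumption on my part, not a verified fix): stamp each entry at capture time and sort the batch before flushing, so entries from the same worker instance at least arrive oldest-first. Whether LogDNA honors a per-line timestamp field would still need checking against its API docs. A minimal sketch:

```javascript
// In-memory buffer for one worker instance.
const batch = [];

// Record the capture time alongside the log line.
function capture(line) {
  batch.push({ timestamp: Date.now(), line });
}

// Empty the buffer and return entries sorted oldest-first;
// in the worker this array would be POSTed to the ingestion endpoint.
function flushBatch() {
  return batch.splice(0).sort((a, b) => a.timestamp - b.timestamp);
}
```

This does nothing about interleaving across worker instances, which is the harder part of the problem.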

I tried flagging each worker instance and sending the flag with the logs to see how many workers they are running, and it seems like they even spawn multiple workers in each datacenter (probably depending on website load).
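The flagging idea above can be sketched like this: generate one random ID when the worker isolate starts, attach it to every log line, and count distinct IDs on the LogDNA side. The ID scheme here is my own assumption, not necessarily what the author used.

```javascript
// Generated once per isolate startup, so every request handled by the
// same worker instance carries the same ID.
const WORKER_ID = Math.random().toString(36).slice(2, 10);

// Attach the instance ID to a log entry.
function tagLine(entry) {
  return { worker_id: WORKER_ID, ...entry };
}
```

Counting the distinct `worker_id` values seen in LogDNA then gives a rough lower bound on how many instances were spawned.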