I need an API proxy that listens for a huge number of requests and can respond quickly that a message was received (traffic arrives in peaks). On the outgoing side, it doesn't really matter how fast messages are processed — the total volume isn't big, it just arrives in bursts. The solution in my head is to use some queue mechanism like RabbitMQ, though there could be other ways to solve it. Any ideas?
Cloudflare doesn’t have any sort of queuing atm, and KV is not suitable for this either.
You can use a hosted queue service like AWS SQS or GCP Pub/Sub, or spin up your own RabbitMQ / Redis / Kafka box.
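Whichever broker you pick, the pattern is the same: the ingest side acks immediately and drops the message onto a queue, and a consumer drains it at its own pace. Here's a minimal in-process sketch of that decoupling, using Python's stdlib `queue.Queue` as a stand-in for SQS/RabbitMQ (the `ingest`/`consumer` names are illustrative, not from any library):

```python
import queue
import threading
import time

inbox: queue.Queue = queue.Queue()
processed = []

def ingest(message: str) -> dict:
    """Fast path: enqueue and ack immediately, no processing here."""
    inbox.put(message)
    return {"status": "accepted"}

def consumer() -> None:
    """Slow path: drain the queue at whatever pace downstream allows."""
    while True:
        msg = inbox.get()
        if msg is None:  # sentinel to stop the worker
            inbox.task_done()
            break
        time.sleep(0.01)  # simulate slow downstream work
        processed.append(msg)
        inbox.task_done()

worker = threading.Thread(target=consumer, daemon=True)
worker.start()

# A burst of requests: each ack returns immediately,
# regardless of how far behind the consumer is.
acks = [ingest(f"msg-{i}") for i in range(100)]

inbox.put(None)  # signal shutdown
inbox.join()     # wait for the consumer to catch up
print(len(acks), len(processed))
```

With a real broker you'd replace `inbox.put` with an SQS `send_message` (or an AMQP publish) and run the consumer as a separate process, but the shape — ack fast, process later — stays the same.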
Happy to help you use Logflare for this also if you’re interested.