Ideas around a Cloudflare Workers architecture for sending bulk personalised emails


We’re investigating the use of Cloudflare Workers for parts of our stack and were wondering if anyone had thoughts on handling bulk personalised email sendouts. We don’t have any concerns about the Email API as that’s handled by SendGrid, but we’re looking for ways to improve our User Subscription List API and Newsletter Content API. Some context:

We send about 500k personalised emails per week. Each email is customised based on what the user is following on our website. Our initial thoughts were to have a few Cloudflare Workers that handle different stages of the email prep process. For example, getting the list of subscribed users and getting the content for a particular user’s personalised email.

What are the recommendations in the community for dealing with something like this with Cloudflare Workers? We currently use a messaging queue and publish a list of IDs from the User List function. The Newsletter Content function, which is also a subscriber, grabs the IDs and fetches the relevant content, generating the email payload in the process. We then fire that off to SendGrid as a batched request.
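For concreteness, here's a rough sketch of what our Newsletter Content function does today: take a batch of user IDs off the queue, look up each user's personalised content, and assemble one batched SendGrid request. `fetchContentFor` is a hypothetical stand-in for our Newsletter Content API lookup, and the sender/template values are placeholders.

```javascript
// Sketch of the current Newsletter Content step, assuming a SendGrid v3
// Mail Send-style payload. `fetchContentFor` is a hypothetical stand-in
// for our content lookup; sender and template ID are placeholders.
function buildSendGridBatch(userIds, fetchContentFor) {
  // SendGrid's v3 Mail Send API accepts up to 1,000 personalizations
  // per request, so batches need to be capped at that size.
  const personalizations = userIds.map((id) => {
    const content = fetchContentFor(id);
    return {
      to: [{ email: content.email }],
      dynamic_template_data: { articles: content.articles },
    };
  });
  return {
    from: { email: "newsletter@example.com" }, // placeholder sender
    template_id: "d-0000000000",               // placeholder template ID
    personalizations,
  };
}
```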

How would Cloudflare Workers handle something similar? Could it handle a large number of requests simultaneously if our messaging service pushes these messages to its endpoint?

We’d like to process these emails as quickly and efficiently as possible. From our research, I think we could make some substantial cost savings using Cloudflare Workers, but I’m not sure whether that brings limitations on what we can achieve.

Hoping for some insight, thanks!

Each Worker invocation has a limit of 50 fetch subrequests; whether that's workable depends on whether your email provider supports batching and templates. With a cron trigger firing once a minute, you'd get 24 × 60 × 7 × 50 = 504k subrequests per week, so you could drip the emails out over the week.
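The drip estimate above, spelled out (assuming a cron trigger firing once a minute for a full week, with every invocation using its full subrequest budget):

```javascript
// Arithmetic behind the drip estimate: one cron invocation per minute,
// each making 50 subrequests, running for a week.
const subrequestsPerInvocation = 50;
const invocationsPerWeek = 24 * 60 * 7; // one per minute, all week
const emailsPerWeek = subrequestsPerInvocation * invocationsPerWeek;
// emailsPerWeek === 504000, just above the ~500k weekly volume
```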
I would not use Workers for this, though; a local Node.js script would be simpler.

Will Workers Unbound have this limitation?

You can certainly scale Workers by using two zones: build up requests exponentially by triggering back and forth between the two Worker zones.

For this kind of workload, it seems safer to just stay on a queue-based system though, you’d get more control and don’t have to worry about DOS attacking your own infrastructure.

Assuming I’m using Google Pub/Sub, I can batch messages into one request, but only to an extent. If I run the Cloudflare Worker on a scheduled trigger, say once a week, that single weekly invocation can only make 50 outgoing requests to the Pub/Sub API, is that correct? Mathematically speaking, that would mean batching at least 10k messages into each request. Unfortunately, Pub/Sub has a limit of 1,000 batched messages per request, so this won’t scale. Am I understanding these limitations of Cloudflare Workers correctly?
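Checking that constraint numerically (the 500k figure and 50-subrequest cap are from this thread; the 1,000-message batch cap is Pub/Sub's documented publish limit):

```javascript
// 500k emails through one weekly invocation capped at 50 subrequests,
// against Pub/Sub's 1,000-message-per-publish batch cap.
const emails = 500_000;
const maxSubrequests = 50;
const pubsubBatchCap = 1_000;

const neededBatchSize = emails / maxSubrequests; // 10,000 messages per request
const fits = neededBatchSize <= pubsubBatchCap;  // false: one weekly trigger can't do it
```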

1 Worker can make 50 subrequests max, per request.
But you can use a single Worker to spawn 50 Workers, which each spawn 50 Workers, and so on.
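One way to picture that fan-out: a Worker that, given an ID range, either processes it directly or splits it into up to 50 sub-ranges and re-invokes itself, one subrequest per sub-range, multiplying capacity by 50 at each hop. This is only a sketch of the idea — `FANOUT_URL`, `processBatch`, and the leaf size are hypothetical.

```javascript
// Hedged sketch of exponential fan-out across Worker invocations.
// FANOUT_URL and processBatch are hypothetical stand-ins.
const FANOUT = 50;
const LEAF_SIZE = 1_000; // ranges at or below this are processed directly

function splitRange(start, end, fanout = FANOUT) {
  // Divide [start, end) into at most `fanout` contiguous sub-ranges.
  const size = end - start;
  const chunk = Math.ceil(size / fanout);
  const ranges = [];
  for (let s = start; s < end; s += chunk) {
    ranges.push([s, Math.min(s + chunk, end)]);
  }
  return ranges;
}

async function handleRange(start, end, env) {
  if (end - start <= LEAF_SIZE) {
    // await processBatch(start, end); // hypothetical: fetch content, call SendGrid
    return;
  }
  // One subrequest per sub-range, staying within the 50-subrequest budget.
  await Promise.all(
    splitRange(start, end).map(([s, e]) =>
      fetch(`${env.FANOUT_URL}?start=${s}&end=${e}`)
    )
  );
}
```

With 500k users, a single top-level invocation splits into 50 ranges of 10k, and each of those splits again into leaves of 200 IDs — two hops deep.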


Also keep in mind that a single Worker request can’t parse a JSON payload larger than about 1MB without exhausting its CPU time.

Yeah, so I’m at the point where I probably need to create a queueing system that can scale while staying within the 50-subrequest limit. I don’t think I’d ever need to worry about the JSON payload size. Is there a limit to how many inbound requests / Worker instances you can spawn at once?

Not that I’m aware of, I’ve spawned thousands of requests per second in my load testing.

You can build a queue on KV if you need to: create a new KV key for every item, and there’s effectively no limit, so the queue can be as large as you need. A Worker cron trigger can then work through the queue using KV’s list feature, and the listing can also return per-key metadata.
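A minimal sketch of that pattern, assuming each enqueued job gets a unique key under a common prefix and a cron-triggered Worker lists the prefix and drains it. `kv` here stands in for a real KV namespace binding (the real API has the same `put`/`list`/`delete` shape, and `list()` can return metadata per key); the key scheme and batch size are my own choices, not anything official.

```javascript
// Hedged sketch of a KV-backed queue. `kv` is any object with the
// put/list/delete shape of a Workers KV namespace binding.
const PREFIX = "emailq:";

async function enqueue(kv, userId) {
  // Timestamp + ID keeps keys unique; small payloads can ride in the
  // metadata so the cron pass can read them straight off list().
  const key = `${PREFIX}${Date.now()}:${userId}`;
  await kv.put(key, "", { metadata: { userId } });
}

async function drain(kv, handle, limit = 50) {
  // KV list() pages at up to 1,000 keys; stopping early keeps a single
  // cron invocation within its subrequest budget.
  const { keys } = await kv.list({ prefix: PREFIX, limit });
  for (const { name, metadata } of keys) {
    await handle(metadata.userId);
    await kv.delete(name); // each drained item costs one delete
  }
  return keys.length;
}
```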


Interesting, thanks for the insight, checking that out now.


One important caveat though: you’d pay about $5 per 0.5M queued items (one write and one delete per item, plus listings), so know your workload before going in, because it can get wildly expensive.
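That estimate worked through, assuming KV's paid-tier pricing of roughly $5 per million writes and $5 per million deletes (check current rates before committing; `list()` operations bill separately and aren't counted here):

```javascript
// Cost of queuing 500k items through KV, under the assumed pricing of
// $5 per million writes and $5 per million deletes. list() not included.
const items = 500_000;
const writes = items;            // one put() per enqueued email
const deletes = items;           // one delete() per drained email
const pricePerMillionOps = 5.0;  // USD, assumed same rate for writes and deletes

const cost = ((writes + deletes) / 1_000_000) * pricePerMillionOps;
// ≈ $5 per 500k queued items, before list() charges
```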

Another option has emerged: