Worker upload limit

I want to use a Worker as a “proxy” for uploading large files to a storage provider (because I need to add an API secret in the header).

It works great for smaller files, but I get 413 (Payload Too Large) errors when uploading bigger files.

Now I saw that there are request upload limits (100 MB on the Free plan), but I thought those applied only to the CDN/proxy.
I also saw that Workers can use the Streams API… so my question is: would it be possible to bypass the upload limit by using the Streams API and streaming the file to the storage service while it is still being uploaded? That should limit the memory requirements on the Worker side, since there would be no need to keep the whole file in memory.
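Something like this is what I have in mind; a minimal sketch, where the endpoint URL and the `X-Api-Secret` header name are just placeholders for whatever the storage provider actually expects:

```ts
export default {
  async fetch(request: Request): Promise<Response> {
    // request.body is a ReadableStream, so handing it straight to the
    // subrequest should stream the upload through instead of buffering
    // the whole file in the Worker.
    return fetch("https://storage.example.com/upload", {
      method: "PUT",
      headers: {
        "Content-Type":
          request.headers.get("Content-Type") ?? "application/octet-stream",
        // the secret I don't want to expose to the client
        "X-Api-Secret": "<secret>",
      },
      body: request.body,
    });
  },
};
```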

I’m pretty sure the Streams API only connects to Streams storage. The only workaround I’ve seen is chunking the upload.

I mean this API: https://developers.cloudflare.com/workers/learning/using-streams
In the example they pass the response of a fetch subrequest through to the response body of the main request. In my case I need to do the opposite: pass the main request’s body (binary data) into a subrequest’s body…
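For context, the pattern on that page streams in the other direction, roughly like this (a sketch from memory, with a placeholder URL):

```ts
export default {
  async fetch(request: Request): Promise<Response> {
    // Docs direction: pipe a subrequest's response body back to the client
    // without buffering it in the Worker.
    const upstream = await fetch("https://example.com/large-file");
    return new Response(upstream.body, upstream);
  },
};
```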
Actually, passing the main request’s body into the subrequest kind of works, but the limits still apply (and I’m not sure whether I get any real performance boost / concurrency out of it).

A chunked upload would be great, but the storage provider doesn’t support it…

Upload limits still apply to Workers because requests still enter Cloudflare’s network and hit its proxies; they just never leave CF’s network unless the Worker calls fetch. And you are indeed calling fetch, which uses Cloudflare’s expensive uplinks to the internet. While I don’t think Cloudflare is particularly hurting from the bandwidth people use, the 100 MB upload limit policy is still in place.

With regards to bandwidth: wouldn’t you essentially use the same amount if you upload the “several gigabytes” in chunks?
Anyway, it makes sense if the limit is enforced at the proxy.