I want to use a Worker as a “proxy” for uploading large files to a storage provider (I need to add an API secret to a request header, so the upload can’t go directly from the client).
It works great for smaller files, but I get 413 errors when uploading bigger files.
I saw that there are request body size limits (100 MB for free plans), but I assumed those only applied to CDN/proxy traffic.
I also saw that Workers support the Streams API, so my question is: would it be possible to get around the upload limit by streaming the file to the storage service while it is still being uploaded, instead of buffering it first? That should also keep memory usage on the Worker side low, since the whole file would never need to be held in memory at once.
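For reference, this is roughly the shape of what I have in mind: a pass-through handler that forwards the incoming request body as a stream and attaches the secret header. This is just a sketch; the upload URL, the `Authorization` scheme, and the `API_SECRET` binding name are placeholders for whatever the storage provider actually expects.

```javascript
// Sketch of a streaming pass-through Worker (module syntax).
// Assumptions: the storage provider accepts a PUT with a bearer token,
// and API_SECRET is configured as a Worker secret/environment binding.
const worker = {
  async fetch(request, env) {
    const upstream = await fetch("https://storage.example.com/upload", {
      method: "PUT",
      headers: {
        "Authorization": `Bearer ${env.API_SECRET}`,
        "Content-Type":
          request.headers.get("Content-Type") ?? "application/octet-stream",
      },
      // Pass the body through as a ReadableStream rather than calling
      // request.arrayBuffer(), so the file is never fully buffered in
      // the Worker's memory.
      body: request.body,
    });
    // Stream the provider's response straight back to the client too.
    return new Response(upstream.body, { status: upstream.status });
  },
};

export default worker;
```

The key line is `body: request.body`, which hands the incoming `ReadableStream` directly to the outgoing `fetch` instead of materializing the file. What I’m unsure about is whether the platform’s request size limit is enforced before the Worker even runs, in which case streaming inside the Worker wouldn’t help.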