R2 intermittent 500 errors on PUT

I’m looking at supporting R2 as a storage backend for our platform, and for the most part it’s been a drop-in replacement for S3. Happy with it so far. One of the features we offer is media upload for end users, where we use multipart uploads sent directly from the end user to the bucket. I use Uppy on the browser side, which calls our platform’s API to start/complete/abort the multipart upload and to presign the individual chunk URLs so that the end user can PUT to those.
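
For context, the presigning on our side is just the standard S3 flow pointed at R2. A minimal sketch of what that looks like with the AWS SDK v3 (ACCOUNT_ID and the credential variables are placeholders, not our actual config):

import { S3Client, UploadPartCommand } from "@aws-sdk/client-s3";
import { getSignedUrl } from "@aws-sdk/s3-request-presigner";

// S3-compatible client pointed at the R2 endpoint; ACCOUNT_ID is a placeholder
const s3 = new S3Client({
  region: "auto",
  endpoint: `https://${ACCOUNT_ID}.r2.cloudflarestorage.com`,
  credentials: { accessKeyId: R2_ACCESS_KEY_ID, secretAccessKey: R2_SECRET_ACCESS_KEY },
});

// Presign a PUT URL for a single part of an existing multipart upload
async function presignPart(bucket, key, uploadId, partNumber) {
  const command = new UploadPartCommand({
    Bucket: bucket,
    Key: key,
    UploadId: uploadId,
    PartNumber: partNumber,
  });
  return getSignedUrl(s3, command, { expiresIn: 3600 }); // URL valid for one hour
}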

This works, for the most part, but on larger uploads (I haven’t seen it with 2 GB files, but it’s pretty much a given on a 20 GB file) there’ll be some 503 errors (no problem, those are automatically retried), and eventually actual 500 errors from the R2 side. The 500 errors halt the upload, and there doesn’t seem to be much in the way of resuming it. That may be a current limitation in Uppy, which I’ll look into, but in the meantime I’m somewhat surprised at the 500s from R2. We’ve been trying for a couple of days now, so it’s not a short-term issue. It’s also not constant: I can upload a lot of files and they’ll all go swimmingly; it’s just the big one that fails.
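
In case it helps anyone, treating the 500s the same way as the 503s, i.e. retrying the part PUT with backoff, would be one workaround. A rough sketch of the idea (a hypothetical helper, not Uppy’s actual retry logic; it also assumes the bucket’s CORS config exposes the ETag header):

// Retry a presigned part PUT on any 5xx, with exponential backoff
async function putPartWithRetry(url, blob, maxAttempts = 5) {
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const res = await fetch(url, { method: "PUT", body: blob });
    if (res.ok) return res.headers.get("ETag"); // collected for CompleteMultipartUpload
    if (res.status < 500 || attempt === maxAttempts) {
      throw new Error(`Part upload failed with status ${res.status}`);
    }
    await new Promise((r) => setTimeout(r, 2 ** attempt * 500)); // back off before retrying
  }
}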

I’d like to know what could possibly cause them, and a cursory search here tells me I’m not the only one seeing this. If there’s anything I can do to help troubleshoot, I’m all ears!

As an addition: we consistently get errors on a 20 GB video. Smaller re-encodings all upload successfully; it’s as if there’s an upper limit on the size of individual files. Is that the case?

It looks like there’s a 5 GB limit…

That’s per chunk, not for the entire file. When uploading, Uppy cuts the 20 GB file into chunks that are well under 5 GB, so that’s not the limit we’re running into.

Is Uppy uploading parts of the same size (other than the final part)?

(whoops, butter-fingered my previous post)

I’d have to check, but IIRC, yes, all parts except the final one are the same size.

Alright, I checked. Yes, same size; the chunk size is determined like this:

const MB = 1024 * 1024;
// fileSize is the total upload size; S3-compatible multipart allows at most 10,000 parts
const minChunkSize = Math.max(5 * MB, Math.ceil(fileSize / 10000));
const chunkSize = Math.max(desiredChunkSize, minChunkSize); // zero-sized files upload as one zero-sized chunk

So whatever fits in 10,000 chunks, with a minimum of 5 MB per chunk. In my case, with a 20 GB file, it uses the 5 MB minimum, which works out to about 4,000 chunks.
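
For concreteness, a quick sanity check of those numbers (assuming binary units for the 20 GB file):

const MB = 1024 * 1024;
const GB = 1024 * MB;
const fileSize = 20 * GB;                         // 21,474,836,480 bytes
const implied = Math.ceil(fileSize / 10000);      // ≈ 2.05 MB, below the 5 MB floor
const partCount = Math.ceil(fileSize / (5 * MB)); // exactly 4096 parts of 5 MiB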

It looks like I’m running into this same issue. I ensure that only one file is uploaded at a time and that the chunks are all the same size and no larger than 8 MB.

I get random, intermittent 500 errors: “We encountered an internal error. Please try again.” Retrying usually works, but this is not something I can quickly build into the software we use.

Some cf-rays for the uploads:

  • 7ded2050aae80bae-AMS
  • 7ded3b781828b890-AMS
  • 7dd597dfa99206dc-AMS
  • 7dd59105db81b72a-AMS

Any help would be greatly appreciated!