Hi, does anyone know if it is possible to compute a SHA256 hash of a request body larger than the Workers memory limit of 128MB? My use-case is uploading files to S3 and computing their hash as they flow through the worker.
I see an example of computing such a hash here:
However, from what I can tell, the SubtleCrypto API requires the entire contents to be loaded into an ArrayBuffer before computing a hash, so that approach buffers the whole file in memory and limits uploads to less than the 128MB worker memory cap.
In Node I would do something like `digest.update(chunk)`, incrementally building the hash as the body streams through rather than loading everything into memory up front.
Am I missing something obvious? Is there a workaround to compute hashes on streaming bodies in Workers?