Hi, does anyone know if it's possible to compute a SHA-256 hash of a request body larger than the Workers memory limit of 128 MB? My use case is uploading files to S3 and computing their hash as they flow through the Worker.
However, from what I can tell, the SubtleCrypto API requires the entire contents to be loaded into an ArrayBuffer before it can compute a hash, so that approach buffers the whole file in memory and limits uploads to less than the 128 MB maximum.
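For context, the buffering approach I'm trying to avoid looks roughly like this:

const data = await request.arrayBuffer(); // whole body held in memory
const digest = await crypto.subtle.digest('SHA-256', data);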
In Node I would call something like hash.update(chunk) for each chunk, building the hash incrementally as the body streamed through rather than loading everything into memory up front.
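i.e. roughly this pattern with Node's crypto module (stream here stands in for whatever readable stream the body arrives on):

const { createHash } = require('crypto');

const hash = createHash('sha256');
stream.on('data', (chunk) => hash.update(chunk)); // constant memory per chunk
stream.on('end', () => console.log(hash.digest('hex')));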
Am I missing something obvious? Is there a workaround to compute hashes on streaming bodies in Workers?
Have you considered a third-party library like Forge? It seems like it would solve the issue, but I haven't tried it with Workers, so I can't say how well it works there.
const forge = require('node-forge');

const md = forge.md.sha256.create();
md.update('The quick brown fox jumps over the lazy dog');
console.log(md.digest().toHex());
// d7a8fbb307d7809469ca9abcb0082e4f8d5651e46d3cdb762d02d0bf37c9e592
Looks like they just added crypto.DigestStream in the latest Workers Runtime update to solve this exact problem.
Thanks to the Workers team for addressing this!
crypto.DigestStream is a non-standard extension to the crypto API that supports generating a hash digest from streaming data. The DigestStream itself is a WritableStream that does not retain the data written into it; instead, it generates a digest hash automatically when the flow of data has ended. The same hash algorithms supported by crypto.subtle.digest() are supported by the crypto.DigestStream.
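For anyone landing here later, here's a minimal sketch of how it could be used (the hex conversion is my own glue, and it assumes the request actually has a body, e.g. a PUT or POST):

export default {
  async fetch(request) {
    const digestStream = new crypto.DigestStream('SHA-256');

    // Pipe the body through without buffering it; DigestStream discards
    // the bytes and only accumulates the hash state as they flow in.
    await request.body.pipeTo(digestStream);

    // The digest property resolves to an ArrayBuffer once the stream closes.
    const digest = await digestStream.digest;
    const hex = [...new Uint8Array(digest)]
      .map((b) => b.toString(16).padStart(2, '0'))
      .join('');
    return new Response(hex);
  },
};

For the S3 use case, you could presumably request.body.tee() and pipe one branch into the DigestStream while uploading the other, so the hash is computed as the file flows through.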