Possible to compute SHA256 hash of large request body?

Hi, does anyone know if it is possible to compute the SHA-256 hash of a request body larger than the Workers memory limit of 128 MB? My use case is uploading files to S3 and computing their hash as they flow through the Worker.

I see an example of computing such a hash here:

https://developers.cloudflare.com/workers/examples/cache-post-request

However, from what I can tell, the SubtleCrypto API requires the entire contents to be loaded into an ArrayBuffer before computing a hash, so that approach unnecessarily loads the whole file into memory and thus limits the file size to less than the 128 MB Worker memory limit.

In Node I would do something like hash.update(chunk) on each chunk and could incrementally build the hash while the body streamed through, rather than loading everything into memory up front.

Am I missing something obvious? Is there a workaround to compute hashes on streaming bodies in Workers?

Hi Jacob,

Have you considered using a third-party library like Forge? It seems like it would solve the issue, but I haven’t tried it with Workers so I can’t say how well it works.

var md = forge.md.sha256.create();
md.update('The quick brown fox jumps over the lazy dog');
console.log(md.digest().toHex());
// output: d7a8fbb307d7809469ca9abcb0082e4f8d5651e46d3cdb762d02d0bf37c9e592

Interesting. This is a limitation of SubtleCrypto (tracking bug).

I guess Wasm is your best bet, but it will chew through your CPU runtime quickly. It seems like a big limitation of Workers that this isn’t possible with runtime primitives.


I’ve visited this issue so many times @john.spurlock :sweat_smile: Still can’t grasp how it can be over 5 years old now.