Hi Samuel,
It looks like the response from support is a filtered version of what I told them.
We have many customers using WebAssembly successfully today.
Unfortunately, using Wasm today – whether in Workers, or in the browser – generally requires putting some effort into dependency management to get code size down to a reasonable level. Fundamentally, the problem is that Wasm modules end up like statically-linked binaries – they include not just your program itself, but also your programming language’s entire standard library, and all of your other transitive dependencies. Making matters worse, many programming languages that target Wasm were not historically designed to produce small binaries.
Contrast this with JavaScript. The entire JavaScript standard library is “built in” to V8 and therefore into the Workers Runtime. You do not have to bundle your own copy of the library with your application. Moreover, the Workers Runtime APIs aim to provide built-in support for many higher-level features too – like HTTP, TLS, WebCrypto, etc. – which would normally be provided by additional libraries in other languages.
The same trade-off exists in the browser. When using JavaScript, you get the standard library and all the APIs offered by the browser built-in. When using WebAssembly, you have to ship a Wasm module containing your language’s standard library.
Meanwhile, both browsers and edge compute are environments where small code footprint is important. In the browser, you don’t want the user to have to download a huge module before your web site can load. And in Workers, since we deploy your code to thousands of machines in order to be as close to the user as possible, we need to impose some limits on how big that code can be. And in both environments, since code needs to be loaded on-demand, large modules may lead to a further delay at load time.
Because of all this, as of today, Wasm may not be the best technology for packaging “whole applications”. Instead, it is often best used to target specific tasks that would be hard to do in JavaScript, like running a particular preexisting library, or doing number crunching that would be slow in JS.
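To make that concrete, here is a rough sketch (not an official Workers recipe; the function name and the JavaScript-side glue are my own assumptions) of a Rust crate, compiled to the wasm32-unknown-unknown target, that exports just one hot numeric routine while the rest of the application stays in JavaScript:

```rust
// Hypothetical example: only this hot path is compiled to Wasm; everything
// else in the application remains JavaScript.

/// Sum of squares over a buffer that the JavaScript glue code has written
/// into the module's linear memory.
#[no_mangle]
pub extern "C" fn sum_of_squares(ptr: *const f64, len: usize) -> f64 {
    // SAFETY: assumes the caller passes a valid pointer/length pair referring
    // to `len` f64 values inside this module's linear memory.
    let data = unsafe { core::slice::from_raw_parts(ptr, len) };
    data.iter().map(|x| x * x).sum()
}
```

The JavaScript side would then instantiate the module and call sum_of_squares on data it has copied into the module's memory, keeping the Wasm module itself small and focused.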
Generally, when using Wasm, it’s important to use options like Rust’s no_std, which omits the standard library from your program. This can make binary sizes much smaller, but it does create a bit of a challenge in that you will need to work around missing library features. Again, this is best practice when using Wasm in both browsers and Workers.
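As an illustration of what that looks like in practice, here is a minimal no_std sketch. It assumes a crate built with cargo build --release --target wasm32-unknown-unknown, with crate-type = ["cdylib"] in Cargo.toml and size-oriented release settings such as opt-level = "z" and lto = true:

```rust
// Minimal no_std Wasm crate (sketch). Omitting the standard library keeps the
// module small, but the crate must then provide things std normally supplies.
#![no_std]

use core::panic::PanicInfo;

// Without the standard library, the crate must supply its own panic handler.
#[panic_handler]
fn on_panic(_info: &PanicInfo) -> ! {
    loop {}
}

// An exported function callable by name from the JavaScript glue code.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}
```

Taking on small obligations like the panic handler above is part of the trade-off: you give up library conveniences in exchange for a much smaller module.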
In the future
In order for Wasm to work really well on the edge, we need to come up with a “shared library” standard. We need each major programming language to package its standard library as a shared module, so that one copy of that module can be loaded into all the different Workers written in that language running on the same machine. That way, individual apps can stay small. (This would also help in browsers, if those shared runtimes could be cached and shared across web sites.)
The Wasm standard itself already supports the notion of multiple modules that call each other. However, we also need the compilers for each language to support this concept. Unfortunately, at present, none of them do, as far as I know.
There are some tricks we could do on the Workers end to make large Wasm modules load a bit faster, such as using V8’s code cache features to effectively precompile modules. However, that’s a big project for us, and it’s not clear how much it would really help if the fundamental problems with Wasm dependency management aren’t solved.