vigoo's software development blog

Golem 1.3's new JavaScript engine

Posted on September 19, 2025

As we are rapidly approaching the release date for Golem 1.3, a major update, I'm going to publish a series of small posts talking about some of the technical details of this new release. In this first one, let's talk about the new JavaScript engine.

JavaScript support in previous versions

In previous Golem versions we tried to support JavaScript (and TypeScript) using the "official" way of using these languages in the WASM Component Model: the ComponentizeJS project. This embeds a special version of the SpiderMonkey JS engine, called StarlingMonkey, in a WASM component together with the user's JS code, and generates import and export bindings based on the component model interface definition (WIT). In addition, ComponentizeJS also performs a preinitialization step - basically pre-running and snapshotting parts of the resulting component at compile time to make the component's initialization faster.

Although this all sounds very good, the project is still considered experimental and we ran into serious issues with it, especially around its implementation of fetch and async boundaries. We reported these issues, and also tried to fix some of them ourselves, but working on this project is extremely difficult and we did not reach a point where our users could be guaranteed to be able to build on top of these core JS APIs.

The new engine

Instead of trying to fix ComponentizeJS or waiting for others to do so, we decided to try to replace it for the next Golem release. This worked out so well that we were able to refocus our language support to be primarily TypeScript for the next release.

So what did I do?

The goal was to have a similar solution - take the user's JS and an interface definition (WIT) and get a WebAssembly component implementing this interface by running the user's JavaScript code. But we wanted something that is significantly easier to work with, and easier to extend with more and more "built-in" JS APIs. This is important for us, as we want people to be able to use as many existing libraries in their Golem applications as possible. There must be a trade-off somewhere, of course - and there are two that I'm going to talk about in detail. First, the new engine is expected to have worse performance than ComponentizeJS, although it has not been benchmarked yet; second, it needs a Rust compiler toolchain to convert the JavaScript code to WASM. This, however, does not affect Golem users, due to some other changes we introduced; more about that later.

So with all these constraints, I ended up creating wasm-rquickjs, with the following properties:

- The result is a CLI tool (wasm-rquickjs-cli) and an embeddable Rust library that take a WIT world and a JS file, and generate a standalone Rust crate that, when compiled using cargo-component, emits the WASM component we need.

- It also supports emitting TypeScript module definitions for all the imports and exports of the component.

Details

To understand why I chose to go with generating Rust crates on top of the rquickjs library (Rust bindings for the QuickJS engine), let's take a closer look at how things are done within wasm-rquickjs.

Defining built-in APIs

We wanted to be able to easily grow the set of supported built-in APIs to increase compatibility with the existing JS ecosystem. Some of these APIs can be introduced with pure JS polyfill libraries, but many of them need to be implemented on top of imported WebAssembly System Interface (WASI) APIs. A good example is implementing (a subset of) the node:fs API to work with files and filesystems.

The rquickjs crate really makes this very easy to do - it has a convenient way to bind native Rust functions into the JavaScript context, and it also solves the difficult problem of bridging the world of JS promises with async Rust.

This means we can write Rust functions in which we can use the Rust standard library or any other imported WIT bindings and then call these functions from JS. For example we can define a read_file function that exposes std::fs::read for JavaScript:

use std::path::Path;

use rquickjs::{convert::List, Ctx, TypedArray};

#[rquickjs::function]
pub fn read_file<'js>(
  path: String,
  ctx: Ctx<'js>,
) -> List<(Option<TypedArray<'js, u8>>, Option<String>)> {
  let path = Path::new(&path);
  match std::fs::read(path) {
    Ok(bytes) => {
      let typed_array =
        TypedArray::new_copy(ctx.clone(), &bytes)
          .expect("Failed to create TypedArray");
      List((Some(typed_array), None))
    }
    Err(err) => {
      let error_message = format!("Failed to read file {path:?}: {err}");
      List((None, Some(error_message)))
    }
  }
}

Then the actual JavaScript API can be implemented in JS itself, using these native functions:

export function readFile(path, optionsOrCallback, callback) {
    if (callback === undefined) {
        // called as readFile(path, callback)
        callback = optionsOrCallback;
        // ...
    }
    const [contents, error] = read_file(path);
    if (error === undefined) {
        // node:fs callbacks are error-first: (err, data)
        callback(undefined, Buffer.from(contents));
    } else {
        callback(new Error(error));
    }
}

This makes it really convenient to add support for more and more APIs, and as mentioned earlier, these native functions can be async Rust functions too, which simply translate to async JS functions.

For example, part of the fetch implementation is sending the request body asynchronously:

async function sendBody(bodyWriter, body) {
    const reader = body.getReader();
    while (true) {
        const {done, value} = await reader.read();
        if (done) break;
        await bodyWriter.writeRequestBodyChunk(value);
    }
    bodyWriter.finishBody();
}
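To see how this boundary behaves from the JS side, here is a small TypeScript sketch that drives the same loop against an in-memory reader; MockBodyWriter is hypothetical, standing in for the native Rust writer above:

```typescript
// Hypothetical stand-in for the native Rust body writer: it collects chunks
// instead of streaming them over a WASI HTTP body.
class MockBodyWriter {
  chunks: Uint8Array[] = [];
  finished = false;
  async writeRequestBodyChunk(chunk: Uint8Array): Promise<void> {
    this.chunks.push(chunk);
  }
  finishBody(): void {
    this.finished = true;
  }
}

// Minimal reader interface matching what body.getReader() returns.
interface ChunkReader {
  read(): Promise<{ done: boolean; value?: Uint8Array }>;
}

// The same loop as sendBody above, typed against the mock writer.
async function sendBody(
  bodyWriter: MockBodyWriter,
  body: { getReader(): ChunkReader },
): Promise<void> {
  const reader = body.getReader();
  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    await bodyWriter.writeRequestBodyChunk(value!);
  }
  bodyWriter.finishBody();
}
```

The mock makes it easy to check that every chunk is forwarded and the body is finished exactly once, without involving the engine at all.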

The writeRequestBodyChunk method is a native Rust method defined like this:

#[rquickjs::methods(rename_all = "camelCase")]
impl WrappedRequestBodyWriter {
    #[qjs(constructor)]
    pub fn new() -> Self {
        WrappedRequestBodyWriter { writer: None }
    }

    pub async fn write_request_body_chunk(&mut self, chunk: TypedArray<'_, u8>) {
        // ...
    }
    // ...
}

Implementing imports

With the above technique, we could have a precompiled WASM JS engine that is capable of running user code while providing a fixed set of supported APIs. This is what a similar project, wasmedge-quickjs, does.

But wasm-rquickjs does not stop here - it uses the same method of defining JS modules with native Rust bindings to define a JS module for each imported WIT interface.

So a code generator takes the WIT imports, and emits Rust code in the style of the above examples that exposes these WIT imports to JavaScript by calling the Rust WIT bindings, generated by wit-bindgen-rust (this happens automatically under the hood when using the already mentioned cargo-component build tool).

Every data type WIT supports is mapped to a specific JS construct, and resources are mapped to JS classes. The following example shows the generated function for one of the exported functions of golem:llm from the Golem AI libraries:

#[rquickjs::function]
fn send(
    messages: Vec<crate::bindings::golem::llm::llm::Message>,
    config: crate::bindings::golem::llm::llm::Config,
) -> crate::bindings::golem::llm::llm::ChatEvent {
    let result: crate::bindings::golem::llm::llm::ChatEvent = crate::bindings::golem::llm::llm::send(
        messages.into_iter().map(|v| v).collect::<Vec<_>>().as_slice(),
        &config,
    );
    result
}

This simply uses rquickjs's native binding macro to do the hard work, and calls the generated Rust bindings under the hood.
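To make the data mapping a bit more concrete, here is a hedged TypeScript sketch of how such a record could surface on the JS side; the field names are illustrative, not the actual golem:llm types, and the mapping rules shown (kebab-case fields becoming camelCase properties, option<T> becoming an optional field) are assumptions:

```typescript
// Illustrative only: a WIT record such as
//   record config { model: string, max-tokens: option<u32> }
// would surface in JS as a plain object (assumed mapping):
interface Config {
  model: string;
  maxTokens?: number; // WIT: max-tokens: option<u32>
}

// An omitted optional field simply stays undefined on the JS side.
const config: Config = { model: "example-model", maxTokens: 1024 };
```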

Of course, to make this work, rquickjs also needs to know how to encode these data types, such as the LLM Message, as JS values. So the code generator also emits implementations of the IntoJs and FromJs traits, such as:

impl<'js> rquickjs::IntoJs<'js> for crate::bindings::golem::llm::llm::Message {
  fn into_js(
    self,
    ctx: &rquickjs::Ctx<'js>,
  ) -> rquickjs::Result<rquickjs::Value<'js>> {
    let obj = rquickjs::Object::new(ctx.clone())?;
    let role: crate::bindings::golem::llm::llm::Role = self.role;
    obj.set("role", role)?;
    // ...
    Ok(obj.into_value())
  }
}

impl<'js> rquickjs::FromJs<'js> for crate::bindings::golem::llm::llm::Message {
  fn from_js(
    _ctx: &rquickjs::Ctx<'js>,
    value: rquickjs::Value<'js>,
  ) -> rquickjs::Result<Self> {
    let obj = rquickjs::Object::from_value(value)?;
    let role: crate::bindings::golem::llm::llm::Role = obj.get("role")?;
    // ...
  }
}

The main difficulty was not generating these JS mappings - it was matching the expected signatures of wit-bindgen-rust, as it has some complex rules for deciding when to pass things by value or by reference.

Implementing exports

For all the exported interfaces in a component's WIT definition, wit-bindgen-rust generates a trait to be implemented. We expect JS developers to implement all these exports following some well-defined rules (interfaces becoming exported objects, kebab-case names becoming camelCase, etc.). With the assumption that the user's JS code implements all the exports, wasm-rquickjs can generate implementations for these Rust traits that call into the QuickJS engine, running these functions.
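The naming rule itself is simple; a minimal sketch of the assumed kebab-case to camelCase conversion:

```typescript
// Sketch of the assumed naming rule: WIT's kebab-case names are looked up
// as camelCase properties on the user's exported module.
function kebabToCamel(name: string): string {
  return name.replace(/-([a-z0-9])/g, (_match, ch: string) => ch.toUpperCase());
}
```

So a WIT export named write-request-body-chunk would be looked up as writeRequestBodyChunk on the user's module, while single-word names pass through unchanged.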

Part of the problem is very similar to what we have with imports - converting from the Rust types (coming from the WIT types) to JS types. This is done using the same conversion type classes we already talked about.

When setting up the JS context, we always store a reference to the user's module in a global variable, so the generated export code can easily access it:

let module: Object = ctx.globals().get("userModule")
  .expect("Failed to get userModule");

There are similar global helper tables for tracking the class instances for WIT resource instances.
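One way such a helper table could look, sketched in TypeScript (hypothetical, not the actual wasm-rquickjs internals, where these tables live in the Rust/QuickJS glue): WIT resource handles mapped to their JS class instances, so generated export code can recover the right instance for method calls.

```typescript
// Hypothetical resource table: handle -> JS class instance.
const resourceInstances = new Map<number, object>();
let nextHandle = 0;

function registerResource(instance: object): number {
  const handle = nextHandle++;
  resourceInstances.set(handle, instance);
  return handle;
}

function getResource(handle: number): object {
  const instance = resourceInstances.get(handle);
  if (instance === undefined) {
    throw new Error(`unknown resource handle ${handle}`);
  }
  return instance;
}

// Called when the WIT resource is dropped, so instances can be collected.
function dropResource(handle: number): void {
  resourceInstances.delete(handle);
}
```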

Once we have the module object, we can apply the naming rules and find the function value and call it with rquickjs:

fn call_with_this<'js, A, R>(
    ctx: Ctx<'js>,
    function: Function<'js>,
    this: Object<'js>,
    args: A,
) -> rquickjs::Result<R>
where
    A: IntoArgs<'js>,
    R: FromJs<'js>,
{
    let num = args.num_args();
    let mut accum_args = Args::new(ctx.clone(), num + 1);
    accum_args.this(this)?;
    args.into_args(&mut accum_args)?;
    function.call_arg(accum_args)
}

A nice property we can offer is that we don't have to force the user to always implement the exported functions as async JavaScript functions. Before trying to convert the return value to its Rust equivalent, we can simply check whether it is a Promise or not. And if it is, we can just await it in the Rust code!

if value.is_promise() {
  let promise: Promise = value.into_promise().unwrap();
  let promise_future = promise.into_future::<R>();
  match promise_future.await {
    // ...
  }
}
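The same idea can be sketched in TypeScript (illustrative, not part of wasm-rquickjs): accept either a plain value or a Promise from the user's export, and normalize to an awaited value.

```typescript
// Duck-typed check: anything with a callable `then` is treated as a promise,
// matching how JS itself decides what is awaitable.
function isPromiseLike(value: unknown): boolean {
  return (
    value !== null &&
    (typeof value === "object" || typeof value === "function") &&
    typeof (value as { then?: unknown }).then === "function"
  );
}

// If the user's export returned a Promise, await it; otherwise pass through.
async function awaitIfPromise<T>(value: T | PromiseLike<T>): Promise<T> {
  return isPromiseLike(value) ? await (value as PromiseLike<T>) : (value as T);
}
```

This is why both sync and async exports "just work": the host side never needs to know which style the user chose.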

Async all the way down

This seamless integration of the JS and Rust async worlds is a key component in making wasm-rquickjs easy to work with. But it's not enough that rquickjs implements the boundary between JS and Rust. The end result is a WASM component, which is single-threaded and only provides a very specific set of system APIs to build on; we cannot, for example, just use Tokio as our Rust runtime (at the time of writing). At the bottom of all the Rust and JS async stacks, there is a single small WASI API supporting all this: wasi:io/poll. Yoshua Wuyts has an excellent blog post about the topic. wasm-rquickjs builds on his wasi_async_runtime crate (and will soon be migrated to the newer wstd crate).

Trade-offs

As I mentioned in the introduction, this approach naturally comes with some trade-offs compared to ComponentizeJS.

Performance

We are not doing any precompilation at the moment, so component initialization time for bigger projects is expected to be slower. On the other hand, the engine itself is much smaller than the modified SpiderMonkey in ComponentizeJS, so this may balance out the difference in some cases. I also expect SpiderMonkey to be faster in general than QuickJS, although this is not clear-cut, because SpiderMonkey also has to run in interpreter mode on WASM.

Rust compilation

A more serious trade-off is that by generating a Rust crate, we force JS/TS users to have a Rust toolchain available and to compile these generated crates to WASM.

We've spent a lot of effort in the past year hiding the complexity of these build tools, especially making sure the correct versions of the WASM / component model tools are automatically set up and invoked, by hiding the component creation process behind Golem's own CLI interface.

Still, having to set up Rust just to run a simple JavaScript snippet on Golem is too much to ask. We worked around this issue by no longer allowing users to work directly at the component model level - no WIT, no composition for them. This way we can embed a precompiled WASM binary in our tooling that can be combined with the user's JavaScript code to form the final WASM component. I am going to write a separate post about this decision and its technical details.

Conclusion

wasm-rquickjs turned out to be a very capable alternative to ComponentizeJS that is much easier to iterate on. It is a standalone project, completely usable outside of Golem; if the above two trade-offs are acceptable, it provides a nice experience for writing JavaScript or TypeScript code for the WASM Component Model.