Streaming HTML

Can anyone recommend a templating engine with streaming to use in Workers?

I’ve only found these, which seem to be hobby projects…

Flora

Streaming doT

Edit:

Also found Dust, which hasn’t been updated in 4 years.

Edit 2:

I just realized none of those template engines will work because Workers do not use Node streams… d’oh!


I’ve been looking into responding with a stream to deliver HTML but I’m not sure how this would work with Workers.

Looking at the MDN docs, I should be able to use ReadableStream to enqueue string values. See this working example from MDN.

But when using it with Workers (see this gist), I get this error:

Failed to construct ‘ReadableStream’: the constructor is not implemented
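
For context, the pattern I’m attempting boils down to something like this (a sketch based on the MDN example; the chunk contents are just for illustration):

addEventListener('fetch', (event) => {
	const encoder = new TextEncoder();

	// This constructor call is the part that throws in Workers.
	const stream = new ReadableStream({
		start(controller) {
			controller.enqueue(encoder.encode('first chunk\n'));
			controller.enqueue(encoder.encode('second chunk\n'));
			controller.close();
		}
	});

	event.respondWith(new Response(stream, {
		headers: { 'content-type': 'text/plain' }
	}));
});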

Then I tried to use TransformStream. According to the CF docs:

let { readable, writable } = new TransformStream()

This already instantiates the ReadableStream, so I don’t see how I could get access to the controller to enqueue data.

I also tried following this MDN example to add chunks to a writable stream via writable.getWriter(), but I get this error:

Error: Failed to get the ‘ready’ property on ‘Writer’: the property is not implemented

I confess I’m totally new to this streaming API…

Can anyone comment on what’s going on or share an example on how to stream strings to a response using a Worker?

So this is as far as I could get. I can print a string, but I haven’t been able to send chunks to the browser sequentially:

let { readable, writable } = new TransformStream();

const writer = writable.getWriter();
const encoder = new TextEncoder();

writer.write(encoder.encode('now: ' + Date.now()));

return new Response(readable, {
	status: 200
});

And I still get the error:

Uncaught (in response)
Error: The script will never generate a response.

Anyway, any guidance or help would be appreciated!

You are on the right track; you just need to return the response before writing. It can be done with an async function like this:

let { readable, writable } = new TransformStream();

const writer = writable.getWriter();
const encoder = new TextEncoder();

// Do the writing in an async function so the response can be
// returned immediately, before the writer is closed.
async function asyncWrite() {
	await writer.write(encoder.encode('now: ' + Date.now()));
	return writer.close();
}

// Keep the worker alive until the write completes.
event.waitUntil(asyncWrite());

return new Response(readable, {
	status: 200
});
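
For completeness, here is the same thing with its surrounding handler (a minimal sketch):

addEventListener('fetch', (event) => {
	event.respondWith(handleRequest(event));
});

function handleRequest(event) {
	let { readable, writable } = new TransformStream();

	const writer = writable.getWriter();
	const encoder = new TextEncoder();

	async function asyncWrite() {
		await writer.write(encoder.encode('now: ' + Date.now()));
		return writer.close();
	}

	// The response has already been returned by the time this finishes.
	event.waitUntil(asyncWrite());

	return new Response(readable, {
		status: 200
	});
}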

Thanks for the help, @john.spurlock; that removed the error.

The problem is that if I use event.waitUntil, the Worker waits until I close the writer and then sends the response to the browser with all the chunks at once. What I’m trying to accomplish is sending bits of text to the browser in batches instead of all at once.

Edit:

It worked!

The issue was that I was sending small bits of text and the stream was waiting for larger chunks.
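
Roughly what I ended up with (just a sketch; the chunk size, padding and delays are only there to make the progressive rendering visible):

addEventListener('fetch', (event) => {
	const { readable, writable } = new TransformStream();
	const writer = writable.getWriter();
	const encoder = new TextEncoder();

	async function writeInBatches() {
		for (let i = 0; i < 5; i++) {
			// Tiny writes tended to get buffered along the way,
			// so each chunk here is padded out to a few KB.
			const chunk = ('<p>batch ' + i + ' at ' + Date.now() + '</p>').padEnd(4096, ' ');
			await writer.write(encoder.encode(chunk));
			await new Promise((resolve) => setTimeout(resolve, 500));
		}
		await writer.close();
	}

	event.waitUntil(writeInBatches());
	event.respondWith(new Response(readable, {
		headers: { 'content-type': 'text/html;charset=UTF-8' }
	}));
});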

Edit 2:

So I created this little repo as an example:

It seems the behavior of sending chunks to the browser is very inconsistent. Is there a way to force the stream to flush to the browser at a precise moment or to define the size at which the stream should be flushed?

See this video for the inconsistent behavior I’m talking about:

1 Like

As far as I know, there is no explicit flush control on writer, but you may want to poke around in the streams spec [1].

Note that each writer.write call can also be awaited, but I would not recommend relying on that for message framing, etc. See the comment on MDN [2]:

Note that what “success” means is up to the underlying sink; it might indicate simply that the chunk has been accepted, and not necessarily that it is safely saved to its ultimate destination.

There are several buffers (including TCP buffers) that we don’t have explicit control over in a high-level API like this. You’ll need to do your own framing (e.g. with a length prefix) if you want to use this for an HTTP pull system.
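
Illustrative only, but a length prefix could look something like this (a 4-byte big-endian length followed by the payload, so the reader can reassemble messages however the transport splits them):

function frame(message) {
	const encoder = new TextEncoder();
	const body = encoder.encode(message);
	const framed = new Uint8Array(4 + body.length);
	// DataView writes big-endian by default.
	new DataView(framed.buffer).setUint32(0, body.length);
	framed.set(body, 4);
	return framed;
}

// e.g. await writer.write(frame(JSON.stringify({ type: 'tick', now: Date.now() })));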

For streaming html though, it works quite well.

[1] https://streams.spec.whatwg.org/
[2] https://developer.mozilla.org/en-US/docs/Web/API/WritableStreamDefaultWriter/write


Thanks again, @john.spurlock.

I’ve been reading the spec, and I believe I could control this using ByteLengthQueuingStrategy. Unfortunately, it seems this class is not available in Workers:

ReferenceError: ByteLengthQueuingStrategy is not defined
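
For reference, the constructor usage the spec describes would look something like this (strictly, the high-water mark governs backpressure on the queue rather than an explicit flush size, but it seemed worth a try):

let { readable, writable } = new TransformStream(
	{},
	new ByteLengthQueuingStrategy({ highWaterMark: 1024 }), // writable side
	new ByteLengthQueuingStrategy({ highWaterMark: 1024 })  // readable side
);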


Here’s a streaming tagged template literal for Workers:
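
The idea is a tag function that writes each static part, then each interpolated value (awaited, in case it’s a promise), straight into a TransformStream. A minimal sketch of the approach (names here are hypothetical, not the actual library):

function html(strings, ...values) {
	const { readable, writable } = new TransformStream();
	const writer = writable.getWriter();
	const encoder = new TextEncoder();

	const done = (async () => {
		for (let i = 0; i < strings.length; i++) {
			// Write the static part first...
			await writer.write(encoder.encode(strings[i]));
			// ...then the interpolated value, awaiting it in case it is a promise.
			if (i < values.length) {
				await writer.write(encoder.encode(String(await values[i])));
			}
		}
		await writer.close();
	})();

	return { readable, done };
}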

Usage
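
With the hypothetical html tag sketched above, a handler might look like this (slowGreeting stands in for any async data source):

addEventListener('fetch', (event) => {
	const { readable, done } = html`
		<!doctype html>
		<h1>Streaming template demo</h1>
		<p>${slowGreeting()}</p>
	`;

	// Return the response right away and keep writing afterwards.
	event.waitUntil(done);
	event.respondWith(new Response(readable, {
		headers: { 'content-type': 'text/html;charset=UTF-8' }
	}));
});

async function slowGreeting() {
	await new Promise((resolve) => setTimeout(resolve, 1000));
	return 'hello, eventually';
}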
