We are streaming the head, then server-side rendering the body and streaming that when it is ready. However, once we add streaming, it seems that event.waitUntil(/* log request */) keeps the connection to the browser open. Below is a super-simplified example that shows the issue. Running it, we see the content very quickly, but the network panel shows the connection takes ~2 seconds. Is there a way to get the connection to close after we are done writing, while still letting the function wait for the logging to complete?
addEventListener("fetch", event => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  event.waitUntil(
    new Promise(res => {
      setTimeout(res, 2000);
    })
  );
  let { readable, writable } = new TransformStream();
  let writer = writable.getWriter();
  let encoder = new TextEncoder();
  writer.write(encoder.encode("<html><head></head>"));
  streamMore(writer, encoder);
  return new Response(readable, {
    status: 200,
    headers: new Headers({
      "Content-Type": "text/html"
    })
  });
}

async function streamMore(writer, encoder) {
  writer.write(encoder.encode("<body>Hello world</body></html>"));
  await write.close();
}
And just for comparison, here is the same basic thing without streaming. Here the network tab shows the request takes maybe 50 ms (from where we are).
addEventListener("fetch", event => {
  event.respondWith(handleRequest(event));
});

async function handleRequest(event) {
  event.waitUntil(
    new Promise(res => {
      setTimeout(res, 2000);
    })
  );
  return new Response("<html><head></head><body>Hello world</body></html>", {
    status: 200,
    headers: new Headers({
      "Content-Type": "text/html"
    })
  });
}
Hi @thomas4. Thanks for the reply. Interesting thread, but I don’t think it addresses our issue. We are just trying to send a logging request in the background. The docs mention that using event.waitUntil(fetch()) will ensure that the logging request completes even after the main response has been sent to the client, and this works great. The issue arises when we implement streaming (which also works great on its own). When used with streaming, event.waitUntil() seems to keep the original connection to the client open, even though the logging has nothing to do with the client and should just run in the background. Hoping there is some way to close the main client request when the stream is done and let the logging request complete in the background.
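For reference, the background-logging pattern we use looks roughly like this. This is a minimal sketch: LOG_URL is a hypothetical endpoint, logInBackground is a made-up helper name, and the event shape is only what the sketch needs; the real FetchEvent comes from the Workers runtime.

```javascript
// Hypothetical logging endpoint — stand-in for whatever collector you use.
const LOG_URL = "https://logs.example.com/ingest";

function logInBackground(event) {
  // waitUntil keeps the Worker alive until the logging fetch settles,
  // without delaying the response already being sent to the client.
  event.waitUntil(
    fetch(LOG_URL, {
      method: "POST",
      body: JSON.stringify({ url: event.request.url }),
    }).catch(() => {
      // A failed log write must never affect the client response.
    })
  );
}
```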
I’m afraid the immediate problem is merely a typo: replacing write with writer in streamMore() should make the script behave better.
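For completeness, here is a self-contained sketch of the corrected streaming code (renamed streamBody/buildResponse so it can run outside the FetchEvent handler): with writer.close() actually reachable, the stream terminates as soon as the body is written, so the client connection can close while waitUntil work continues in the background.

```javascript
// Corrected helper: writer.close() (not write.close()) ends the stream.
async function streamBody(writer, encoder) {
  await writer.write(encoder.encode("<body>Hello world</body></html>"));
  await writer.close(); // the original had write.close() — a ReferenceError
}

function buildResponse() {
  const { readable, writable } = new TransformStream();
  const writer = writable.getWriter();
  const encoder = new TextEncoder();
  writer.write(encoder.encode("<html><head></head>"));
  streamBody(writer, encoder); // fire and forget; errors here were the lost ones
  return new Response(readable, {
    status: 200,
    headers: { "Content-Type": "text/html" },
  });
}
```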
The typo of course causes the script to throw a ReferenceError exception, but since it happens in an asynchronous task, the error gets lost. This is a shortcoming on our part: we’re working on capturing such asynchronous exceptions and reporting them via the analytics in the Workers dashboard, which would have given you some clue that there was an unhandled exception after the response was returned.
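Until that reporting exists, one possible stopgap (an assumption on my part, not an official Workers feature) is to funnel fire-and-forget work through a helper that attaches a catch, so an exception like this ReferenceError is at least recorded somewhere instead of vanishing:

```javascript
// Hypothetical helper: run an async task in the background, but attach a
// catch so its failure is observable rather than silently dropped.
function runInBackground(event, task) {
  event.waitUntil(
    task().catch(err => {
      // In a real Worker this might POST the error to a logging endpoint;
      // here it is just printed so the failure is visible.
      console.error("background task failed:", err);
    })
  );
}
```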
There is also a Workers bug at play which probably made it harder to recognize there was a problem: we should only be writing a terminal response chunk once writer.close() is successfully called, but right now we unconditionally, and erroneously, write terminal chunks once the FetchEvent’s lifetime is over, even if the script didn’t call writer.close(). In other words, the HTTP client should have noticed that the response was truncated and alerted you to an error, but due to our bug, it appeared to be a complete response.
Hi @harris. Thanks for the detailed reply; that definitely helps with the mental model of what was going on. Of course that was the problem — it works as expected now, and everything makes sense. I should have caught that typo, and yes, it would have been nice to have gotten an error somewhere. We had been developing with the streaming for a while and it all seemed to be working as expected, even though the typo had existed for some time. Only the unexpected behavior of logging keeping the connection open exposed our issue (the typo!). In any case, happy that it was such an easy fix and everything works now. These Workers are a really awesome product! Feels like this is the future!