Can Workers transform a chunked request to a non-chunked one?

I’m trying to serve a constantly-updating Google Slides image to a device that doesn’t support Transfer-Encoding: chunked responses.

I’ve got a worker with something like:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

// https://developers.cloudflare.com/workers/examples/modify-response/
// strip all headers from the docs.google.com response and just return the body
async function handleRequest(request) {

  const originalResponse = await fetch('https://docs.google.com/presentation/d/1lSFR-zJzr_ecwGRiFzzz7cc3xVwgZzzLlYe-VNsnTzz/export/jpeg');
  // const originalResponse = await fetch('https://loremflickr.com/1200/825');

  const response = new Response(originalResponse.body, { status: 200 });

  return response;

}

Using the Cache API or KV or something else, is there a way to somehow strip the Transfer-Encoding: chunked behavior and just send a normal ol’ Content-Length: xxx HTTP response? Thanks for your thoughts!

const faviconBytes = new Int8Array([-119, 80, 78, 71, 96, -126]); // real file is ~500 bytes

if (pathname === '/favicon.ico') {
  return new Response(faviconBytes, {
    headers: {
      "cache-control": "max-age=691200,no-transform",
      "content-type": "image/vnd.microsoft.icon",
    }
  });
}

Add no-transform to Cache-Control and the CF edge proxy sends the body uncompressed and NOT chunked.

I saved about 70 bytes in Wireshark, IIRC, by removing gzip and chunked encoding from my Zopfli-optimized PNG, since PNG image data is already DEFLATE-compressed (the same algorithm gzip uses), so gzipping it again gains nothing.

Remember, chunked encoding is pretty much unavoidable when the CF edge proxy decompresses/recompresses a stream on its way from origin to destination, because the final length isn’t known until the stream ends. Per spec, Content-Length covers the message body as actually transmitted, i.e. the length AFTER gzip is applied. Also remember that no two gzip encoders (different libraries, different compiler settings) are guaranteed to produce the same compressed byte stream.

So, AFAIK, you have to pass no-transform and/or hand new Response() your body as a string or array (something with a known byte length) to have any chance of eyeballing a Content-Length header.
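To make that concrete, here’s a minimal sketch of that second approach: buffering the origin body into an ArrayBuffer so new Response() receives a fixed-length body. The URL and handler name are placeholders, not anything from the thread above.

async function handleRequest(request) {
  // placeholder origin; swap in your real image URL
  const originResponse = await fetch('https://example.com/image.jpeg');
  // reads the entire body into memory, giving it a known byte length
  const buffer = await originResponse.arrayBuffer();
  return new Response(buffer, {
    status: 200,
    headers: {
      "content-type": "image/jpeg",
      // no-transform asks the edge not to recompress the body,
      // so the known length can survive as Content-Length
      "cache-control": "no-transform",
    },
  });
}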


I thought I was onto something by piping my fetch call’s ReadableStream into a new TransformStream, then deliberately awaiting its completion before returning it…

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

// https://developers.cloudflare.com/workers/examples/modify-response/
// strip all headers from the docs.google.com response and just return the body
async function handleRequest(request) {

  let originalResponse = await fetch('https://docs.google.com/presentation/d/1lSFR-zJzr_ecwGRiFzzz7cc3xVwgZzzLlYe-VNsnTzz/export/jpeg');
  // const originalResponse = await fetch('https://loremflickr.com/1200/825');
  
  // https://developers.cloudflare.com/workers/learning/using-streams/
  const { readable, writable } = new TransformStream();

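  // BUG: nothing is consuming `readable` yet, so once the stream's
  // internal queue fills, this pipe stalls on backpressure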
  await originalResponse.body.pipeTo(writable);
  
  return new Response(readable, { status: 200 });

}

…but this solution just makes the request hang and time out. Makes sense in hindsight: awaiting pipeTo() means nothing ever consumes the readable side, so the pipe stalls on backpressure and the promise never resolves.
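(For what it’s worth, the standard streaming pattern avoids that deadlock by not awaiting the pipe; the readable side is returned right away and the pipe runs in the background. It still streams, though, so the edge still answers chunked. A sketch, with a placeholder URL:)

async function handleRequest(request) {
  const originalResponse = await fetch('https://example.com/image.jpeg');
  const { readable, writable } = new TransformStream();
  // deliberately NOT awaited: the pipe runs while the Response streams out
  originalResponse.body.pipeTo(writable);
  return new Response(readable, { status: 200 });
}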

Turns out, I was way overthinking it! The following works perfectly:

addEventListener('fetch', event => {
  event.respondWith(handleRequest(event.request))
})

// https://developers.cloudflare.com/workers/examples/modify-response/
// strip all headers from the docs.google.com response and just return the body
async function handleRequest(request) {
  const response = await fetch('https://docs.google.com/presentation/d/1lSFR-zJzr_ecwGRiFzzz7cc3xVwgZzzLlYe-VNsnTzz/export/jpeg');
  const blob = await response.blob();
  return new Response(blob, { status: 200, headers: { "content-type": "image/jpeg" } });
}
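For anyone who lands here later: response.blob() reads the whole origin body into the Worker before the Response is returned, so the body has a known size, which is presumably what lets the edge reply with a plain Content-Length instead of Transfer-Encoding: chunked. The trade-off is that the entire image sits in Worker memory before the first byte goes out.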