Is the response of workers cached on the edge?

We have the following setup for image resizing through Workers: * → worker (image resize)

Is there any setting for reducing (caching) the number of requests to ?
In the billable usage, the number of image resize requests is about 25% of the worker requests, so there is probably some caching in the middle, but I would expect the responses from the workers with the resized images to be cached on the edge.

Thank you

I haven’t used Workers resizing much and have not tested it extensively, but I use a set-up with another party for resizing.
My Worker resizes, stores in KV, and caches.

You can create a worker that uses

You store the resize result in KV and in the cache; then, as users make the same request, you check the cache first, then KV, and finally resize if neither is available.

At scale, this will save you money and response time.
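The lookup order described above can be sketched roughly as follows. This is only an illustration: `edgeCache` and `IMAGES_KV` are in-memory Maps standing in for `caches.default` and a real KV namespace binding, and `resize` is a placeholder for the actual resize call.

```javascript
// Stand-ins for the edge cache and a KV namespace binding (placeholders,
// not real Workers APIs — this just demonstrates the lookup order).
const edgeCache = new Map()
const IMAGES_KV = new Map()

// Placeholder for the actual image resize operation.
async function resize(key) {
    return `resized:${key}`
}

// Cache -> KV -> resize, writing back to the faster tiers on a miss.
async function getResized(key) {
    if (edgeCache.has(key)) return edgeCache.get(key) // fastest: edge cache

    let img = IMAGES_KV.get(key)                      // next: KV store
    if (img === undefined) {
        img = await resize(key)                       // slowest: do the resize
        IMAGES_KV.set(key, img)                       // persist in KV
    }
    edgeCache.set(key, img)                           // populate the cache
    return img
}
```

On the second request for the same key, both tiers hit and no resize is performed, which is where the cost and latency savings come from.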

You have to tell your worker to cache the result, see

I have my workers set proper Cache-Control headers, then use the snippet below, which respects those cache-control headers.

addEventListener("fetch", event => {
    event.respondWith(handleRequest(event))
})

async function handleRequest(event) {
    const request = event.request

    if (request.method === "GET") {
        // Hook into the cache and have any cache-control headers respected
        const cache = caches.default
        let resp = await cache.match(request)

        if (!resp) {
            resp = await run(request)
            // Use event.waitUntil so we can send the response while the
            // script keeps executing until the response is saved in the cache
            if (resp.status === 200) {
                event.waitUntil(cache.put(request, resp.clone()))
            }
        }

        return resp
    }

    const resp = await run(request)
    return resp
}

async function run(request) {
    // do resize
    // set response Cache-Control headers, e.g. Cache-Control: public, max-age=31536000
    // return resp
}
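For the `run` step, one possible sketch uses Cloudflare's Image Resizing via the `cf.image` fetch option. This assumes Image Resizing is enabled on the zone; the query-string parsing and the fallback width of 800 are illustrative choices, not anything from the thread.

```javascript
// Build Image Resizing fetch options from the request URL's query string.
// Kept as a pure helper so the parameter handling is easy to test.
function imageOptionsFrom(url) {
    const u = new URL(url)
    const width = parseInt(u.searchParams.get("width"), 10)
    return {
        image: {
            width: Number.isFinite(width) ? width : 800, // illustrative default
            fit: "scale-down",
        },
    }
}

async function run(request) {
    // Ask Cloudflare to resize the origin image (the `cf` option only has
    // an effect inside the Workers runtime)
    const resp = await fetch(request, { cf: imageOptionsFrom(request.url) })

    // Re-wrap the response so we can set cacheable headers on it
    const headers = new Headers(resp.headers)
    headers.set("Cache-Control", "public, max-age=31536000")
    return new Response(resp.body, { status: resp.status, headers })
}
```

With `run` returning a long `max-age`, the snippet above will keep the result in the edge cache and serve repeat requests without resizing again.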

Thanks @adam23, will try it.
But as far as I know this will reduce the number of image resizes, not the number of requests to Workers. Am I right?

This topic was automatically closed 3 days after the last reply. New replies are no longer allowed.