Cloudflare AI Llama function stopped working out of nowhere

I haven’t changed my Llama Cloudflare function in five months, but for some reason it just stopped working.

Here’s the error when I make a POST request to my function via the Cloudflare code editor.

TypeError: Fetch API cannot load: /run
    at InferenceSession.run (vendor/@cloudflare/ai.js:233:36)
    at Ai.run (vendor/@cloudflare/ai.js:1442:34)
    at miniLLM (index.js:8:29)
    at async handleRequest (cors.js:13:19)
    at async jsonError (.internal-33d5e335-4df4-4443-b85c-ec038b457bfd-facade-1.js:12:12)
    at async jsonError (.internal-33d5e335-4df4-4443-b85c-ec038b457bfd-facade-1.js:12:12)
    at async jsonError (.internal-33d5e335-4df4-4443-b85c-ec038b457bfd-facade-1.js:12:12)

The error seems to point to this line in the vendored @cloudflare/ai.js file.

The code is on GitHub: mini-gpt/cloudflare at main · shahzeb1/mini-gpt · GitHub

  1. It includes the vendored vendor/@cloudflare/ai.js file.
  2. My code is very straightforward:
/**
 * The following is the code for my Cloudflare function.
 */
import { Ai } from "./vendor/@cloudflare/ai.js";
import { handleRequest } from "./cors.js";

// Run the chat messages from the request body through Llama 2 via Workers AI.
async function miniLLM(request, env) {
  const body = await request.json();
  const messages = body.messages;
  const ai = new Ai(env.AI);
  const response = await ai.run("@cf/meta/llama-2-7b-chat-int8", {
    messages,
  });
  return response;
}

export default {
  async fetch(request, env) {
    // handleRequest (from cors.js) wraps the call with CORS handling and invokes miniLLM.
    return handleRequest(request, env, miniLLM);
  },
};
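
For anyone trying to reproduce this, the worker just expects a JSON body with a messages array. Something like the following is roughly what I send (the URL below is a placeholder, not my real endpoint):

// Rough sketch of the POST request the worker expects.
// The URL is a placeholder, not the real endpoint.
const res = await fetch("https://example.workers.dev/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "Hello!" },
    ],
  }),
});
console.log(await res.json());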

Update: the fix is to replace your vendor/@cloudflare/ai.js file. I created a new demo Workers AI app with the same model and copied its vendored ai.js over mine. Or you can use my gist: vendor_@cloudflare_ai.js · GitHub
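
Alternatively, I believe Workers AI now exposes the binding directly on env, so you may be able to drop the vendored shim entirely. This is just a sketch assuming your wrangler.toml declares the [ai] binding as "AI"; I haven’t tested it against this exact repo:

// Sketch: call the Workers AI binding directly instead of the vendored Ai class.
// Assumes wrangler.toml declares the binding:
//   [ai]
//   binding = "AI"
async function miniLLM(request, env) {
  const body = await request.json();
  // env.AI.run() talks to Workers AI without needing vendor/@cloudflare/ai.js.
  return env.AI.run("@cf/meta/llama-2-7b-chat-int8", {
    messages: body.messages,
  });
}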