✘ [ERROR] InferenceUpstreamError: undefined: undefined

What is the name of the model you’re running?

@cf/openai/whisper

What is the error number?

✘ [ERROR] InferenceUpstreamError: undefined: undefined

What is the error message?

InferenceUpstreamError: undefined: undefined
      at Ai._parseError (cloudflare-internal:ai-api:81:20)
      at async Ai.run (cloudflare-internal:ai-api:61:23)
      at async Array.<anonymous>
  (file:///C:/projects/transcribe-whisper/transcribe-whisper/src/index.ts:22:20)
      at async jsonError
  (file:///C:/projects/transcribe-whisper/transcribe-whisper/node_modules/wrangler/templates/middleware/middleware-miniflare3-json-error.ts:22:10)
      at async drainBody
  (file:///C:/projects/transcribe-whisper/transcribe-whisper/node_modules/wrangler/templates/middleware/middleware-ensure-req-body-drained.ts:5:10)

What is the issue or error you’re encountering?

I’m trying to run the Cloudflare transcription example using the Hono framework. Here is the example code I’m trying to run: whisper · Cloudflare Workers AI docs
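For context, this is roughly the flow the docs example follows, stripped of the Hono wrapper so the model call is visible. This is a hedged sketch, not the actual example code: the helper name audioToByteArray and the audio URL are illustrative, and env.AI is the Workers AI binding configured in wrangler.toml.

```typescript
// Whisper expects the audio input as a plain array of byte values,
// so the fetched ArrayBuffer has to be converted first.
function audioToByteArray(buf: ArrayBuffer): number[] {
  return [...new Uint8Array(buf)];
}

// Illustrative handler shape (runs only inside the Workers runtime).
// The AI.run call below is the line where InferenceUpstreamError surfaces.
async function transcribe(
  env: { AI: { run: (model: string, input: unknown) => Promise<unknown> } },
  audioUrl: string
): Promise<unknown> {
  const res = await fetch(audioUrl);
  const audio = audioToByteArray(await res.arrayBuffer());
  return env.AI.run("@cf/openai/whisper", { audio });
}
```

Since the error comes from Ai._parseError inside cloudflare-internal:ai-api, the "undefined: undefined" suggests the upstream error response had no code or message fields to parse, rather than a problem in the handler itself.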

What steps have you taken to resolve the issue?

It worked yesterday morning. Since then I have tried different speech recognition models, but without much success.

What are the steps to reproduce the issue?

  1. git clone https://github.com/arekgotfryd/transcribe-whisper-cf
  2. git checkout minimal-repro
  3. npm run dev
  4. Hit localhost:8787/transcribe endpoint
  5. You should see an error

Screenshot of the error


I too am facing the same issue when calling @cf/meta/llama-3.2-11b-vision-instruct with an image parameter.


I am experiencing the same issue with these two text generation models: @cf/meta/llama-3.3-70b-instruct-fp8-fast and @cf/meta/llama-3.1-8b-instruct-fast. The code was functioning correctly but suddenly started crashing without any changes to the implementation. I attempted to force the models to respond by sending the same message multiple times (around three attempts), and while this temporarily resolved the issue, the problem reappeared after a few prompts. It seems to be intermittent and unpredictable.
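Since resending the same message a few times temporarily worked, a retry wrapper around the model call can paper over the intermittent failures while the upstream issue persists. This is a generic sketch, not part of the Workers AI API: the function name withRetry and its parameters are illustrative, and the linear backoff is an arbitrary choice.

```typescript
// Retry an async operation a few times before giving up, with a
// linearly increasing delay between attempts. Intended to wrap calls
// like env.AI.run(...) that fail intermittently with upstream errors.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxAttempts: number = 3,
  delayMs: number = 500
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      if (attempt < maxAttempts) {
        // Wait a bit longer after each failed attempt.
        await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
      }
    }
  }
  // All attempts failed; surface the last error to the caller.
  throw lastError;
}

// Usage sketch: const result = await withRetry(() => env.AI.run(model, input));
```

This only masks the symptom: if the upstream model keeps returning malformed errors ("undefined: undefined"), the retries just lower the user-visible failure rate.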

Terminal logs:

[wrangler:inf] POST /webhook 200 OK (8036ms) // Done
[wrangler:inf] POST /webhook 200 OK (6935ms) // Done
[wrangler:inf] POST /webhook 200 OK (8039ms) // Done
[wrangler:inf] POST /webhook 500 Internal Server Error (1488ms) // Error
[wrangler:inf] POST /webhook 500 Internal Server Error (1361ms) // Error After Repeat
[wrangler:inf] POST /webhook 200 OK (7981ms) // Done After Repeat
[wrangler:inf] POST /webhook 200 OK (6951ms) // Done
[wrangler:inf] POST /webhook 500 Internal Server Error (1285ms) // Error Again

The error message:

✘ [ERROR] err InferenceUpstreamError: undefined: undefined

      at Ai._parseError (cloudflare-internal:ai-api:76:20)
      at async Ai.run (cloudflare-internal:ai-api:57:23)
      at async Array.<anonymous>
  (file:///media/ahmed/Data/Repositories/telegram-tracker-bot/src/index.ts:26:42)
      at async jsonError
  (file:///media/ahmed/Data/Repositories/telegram-tracker-bot/node_modules/wrangler/templates/middleware/middleware-miniflare3-json-error.ts:22:10)
      at async drainBody
  (file:///media/ahmed/Data/Repositories/telegram-tracker-bot/node_modules/wrangler/templates/middleware/middleware-ensure-req-body-drained.ts:5:10)