Workers AI - cannot pass more than 6144 characters in prompt for models with bigger context sizes

Hey, is it possible to pass more than the hard-capped 6144 prompt characters to a model like mistral-7b-instruct-v0.2, which should support up to 32k tokens? The limit is also documented in the API docs, but doesn't this limitation make models with larger context sizes pointless?
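For reference, this is roughly how I'm calling the model from a Worker - a minimal sketch, assuming an `AI` binding configured in wrangler.toml and the `@cf/mistral/mistral-7b-instruct-v0.2` model id; the 6144 figure is just the cap discussed here, not a constant exported by the platform:

```ts
// Minimal Worker sketch: reject anything over the character cap before calling the model.
export interface Env {
  AI: Ai; // Workers AI binding type from @cloudflare/workers-types
}

const PROMPT_CHAR_LIMIT = 6144; // input cap discussed in this thread (assumption)

export default {
  async fetch(request: Request, env: Env): Promise<Response> {
    const prompt = await request.text();

    if (prompt.length > PROMPT_CHAR_LIMIT) {
      // Anything past the cap is rejected by the API,
      // even though the model itself supports a much larger context.
      return new Response("Prompt exceeds the 6144-character input cap", { status: 413 });
    }

    const result = await env.AI.run("@cf/mistral/mistral-7b-instruct-v0.2", {
      messages: [{ role: "user", content: prompt }],
    });

    return Response.json(result);
  },
};
```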


I am also struggling with this.
I am trying to send my database schema to sqlcoder but can't because of this cap.
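One possible workaround is to send only the tables relevant to the question instead of the full schema. A rough sketch, assuming an `AI` binding, the `@cf/defog/sqlcoder-7b-2` model id, and naive table matching (all of that is illustrative, not verified):

```ts
// Sketch: filter the DDL down to tables mentioned in the question so the
// prompt stays under the character cap. Table extraction here is deliberately naive.
const CHAR_LIMIT = 6144; // cap discussed in this thread (assumption)

function relevantDdl(fullSchema: string, question: string): string {
  // Split the schema into per-table CREATE TABLE statements.
  const statements = fullSchema.split(/;\s*/).filter((s) => /create\s+table/i.test(s));

  // Keep only statements whose table name appears in the question.
  const kept = statements.filter((stmt) => {
    const match = stmt.match(/create\s+table\s+"?(\w+)"?/i);
    return match !== null && question.toLowerCase().includes(match[1].toLowerCase());
  });

  return (kept.length > 0 ? kept : statements).join(";\n").slice(0, CHAR_LIMIT - 512);
}

async function generateSql(env: { AI: Ai }, fullSchema: string, question: string) {
  const prompt =
    `### Task\nGenerate a SQL query to answer: ${question}\n\n` +
    `### Database Schema\n${relevantDdl(fullSchema, question)}\n\n### SQL\n`;
  return env.AI.run("@cf/defog/sqlcoder-7b-2", { prompt });
}
```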

Same issue for me. Can the Cloudflare team fix this? Otherwise the long-context advantage of some models can't be used at all.

Even worse, I tried splitting the input into 3 API calls and then merging the results, but it's slow and the merged output is messy.
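What I mean by split-and-merge is roughly the map-reduce sketch below; the model id, the `response` field on the result, and the prompt wording are assumptions, and as noted it is slow and lossy:

```ts
// Map-reduce workaround: summarize each chunk separately, then merge the partial answers.
const CHAR_LIMIT = 6144; // cap discussed in this thread (assumption)
const MODEL = "@cf/mistral/mistral-7b-instruct-v0.2"; // assumed model id

function chunk(text: string, size: number): string[] {
  const parts: string[] = [];
  for (let i = 0; i < text.length; i += size) {
    parts.push(text.slice(i, i + size));
  }
  return parts;
}

async function summarizeLongInput(env: { AI: Ai }, longText: string): Promise<string> {
  // Leave headroom for the instruction text inside each prompt.
  const pieces = chunk(longText, CHAR_LIMIT - 500);

  // Map: summarize each piece independently.
  const partials: string[] = [];
  for (const piece of pieces) {
    const res = (await env.AI.run(MODEL, {
      messages: [{ role: "user", content: `Summarize this excerpt:\n\n${piece}` }],
    })) as { response?: string };
    partials.push(res.response ?? "");
  }

  // Reduce: merge the partial summaries in one final call.
  const merged = (await env.AI.run(MODEL, {
    messages: [
      {
        role: "user",
        content: `Combine these partial summaries into one coherent summary:\n\n${partials.join("\n---\n")}`,
      },
    ],
  })) as { response?: string };

  return merged.response ?? "";
}
```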

Same here.

Is there no other way to do this?