Mistral embedding model excessive tokens error
Opened this issue · 2 comments
jamescalam commented
For MistralAI, when creating embeddings with mistral-embed:
MistralAPIException: Status: 400. Message: {"object":"error","message":"Too many tokens in batch. Max is 16384 got 17555","type":"invalid_request_error","param":null,"code":null}
We should add handling for this error.
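One way to handle this is to split the documents into batches that stay under the 16384-token limit reported in the error before calling the embeddings endpoint. The sketch below is a minimal, generic batching helper, not the library's actual fix; the `count_tokens` function is a naive whitespace-split stand-in, and a real implementation should use the actual mistral-embed tokenizer to count tokens.

```python
from typing import Callable, List

# Per-batch limit reported in the 400 error above.
MAX_BATCH_TOKENS = 16384


def count_tokens(text: str) -> int:
    # Naive stand-in token counter (whitespace split); replace with
    # the real mistral-embed tokenizer for accurate counts.
    return len(text.split())


def split_into_batches(
    docs: List[str],
    max_tokens: int = MAX_BATCH_TOKENS,
    counter: Callable[[str], int] = count_tokens,
) -> List[List[str]]:
    """Greedily pack docs into batches whose token totals stay <= max_tokens."""
    batches: List[List[str]] = []
    current: List[str] = []
    current_tokens = 0
    for doc in docs:
        n = counter(doc)
        # Flush the current batch if adding this doc would exceed the limit.
        if current and current_tokens + n > max_tokens:
            batches.append(current)
            current, current_tokens = [], 0
        current.append(doc)
        current_tokens += n
    if current:
        batches.append(current)
    return batches
```

Each resulting batch can then be sent as a separate embeddings request. Note this greedy packing assumes no single document alone exceeds the limit; oversized documents would need truncation or chunking on top of this.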
Gallisatyricon commented
I have the same bug; following this issue.
Siraj-Aizlewood commented
I've started looking into this. Close to some sort of solution.