Bug Report: No response on tokens overflow
Closed this issue · 0 comments
gdassori commented
When max_tokens is set to N and prompt + N > 4001 (the model limit), the request to OpenAI fails with an error, but the chatbox doesn't show any notification.
How to reproduce:
Set the token limit to 4000 and start a new chat with any prompt.
Suggestion:
Use tiktoken (https://www.npmjs.com/package/@dqbd/tiktoken) to count the input tokens and derive a safe max_tokens value: model_max_tokens - input_message_token_count.