patrikzudel/PatrikZeros-ChatGPT-API-UI

Token length becomes too long after some conversations.

Closed this issue · 6 comments

After some conversations, OpenAI consistently returned the following error message.

"This model's maximum context length is 4096 tokens. However, your messages resulted in 4358 tokens. Please reduce the length of the messages."

This may be caused by the conversation history. The history length needs to be restricted.
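
For illustration, here is a minimal sketch of one way the history could be restricted: drop the oldest messages until the remaining ones fit under a token budget. The names and the rough chars/4 estimate below are assumptions made for the sketch, not code from this repo.

```ts
// Hypothetical sketch: keep only the most recent messages that fit under a
// token budget. Names and the chars/4 estimate are assumptions.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Very rough estimate: ~4 characters per token for English text.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Walk backwards from the newest message and stop once the budget is spent,
// so the oldest messages are the ones that get dropped.
function trimHistory(history: ChatMessage[], budget = 3000): ChatMessage[] {
  const kept: ChatMessage[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = estimateTokens(history[i].content);
    if (used + cost > budget) break;
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```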

I will work on this today; I'm planning on reworking the token counter.
I'll also probably be adding a live token counter under the "Send" button, so users will have an idea of how many tokens they're going to send.

At ~3500 tokens I'll warn the user that the history will start getting cut off after 4096 tokens, and I'll recommend summarizing.
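
A minimal sketch of what the warning logic described above could look like (the constants and function name are assumptions, not the project's actual code):

```ts
// Hypothetical sketch of the warning behaviour: no hard limit, just a
// heads-up near ~3500 tokens and a note that the model caps out at 4096.
const CONTEXT_LIMIT = 4096;
const WARN_AT = 3500;

function tokenWarning(currentTokens: number): string | null {
  if (currentTokens >= CONTEXT_LIMIT) {
    return `You are at ${currentTokens} tokens; older history will be cut off.`;
  }
  if (currentTokens >= WARN_AT) {
    return `You are at ${currentTokens}/${CONTEXT_LIMIT} tokens; consider summarizing the conversation.`;
  }
  return null; // nothing to show under the Send button
}
```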

I faced the same problem. It would be nice if we could decide how to use the tokens ourselves; no limits, please.

Alright, so today I successfully reworked the token counter, and it now uses OpenAI's tokenizer. It should be way more accurate now, but not perfect: it differs from the actual token counts of GPT-3.5-turbo in the same way https://platform.openai.com/tokenizer/ does.
Tomorrow I'll do the rest.
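
For reference, counting tokens with a GPT BPE tokenizer could look roughly like the sketch below. It assumes the gpt-3-encoder npm package; the repo may use a different tokenizer, and the fixed per-message overhead is an approximation, which is why counts can still drift slightly from what GPT-3.5-turbo actually reports.

```ts
// Minimal sketch of counting chat tokens with a GPT BPE tokenizer.
// Assumes the gpt-3-encoder package; the project may use a different one.
import { encode } from "gpt-3-encoder";

interface ChatMessage {
  role: string;
  content: string;
}

// Counts per-message tokens and adds a small fixed overhead per message,
// since the chat format wraps each message in extra tokens. The overhead
// value is an approximation.
function countChatTokens(messages: ChatMessage[], perMessageOverhead = 4): number {
  return messages.reduce(
    (total, m) => total + encode(m.content).length + perMessageOverhead,
    0
  );
}
```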

Don't worry, no limits are going to be added, only warnings.

Thanks for the quick fix. However, I still get token errors in rare cases.
But by pressing Summarize it seems I can continue chatting.
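
For context, here is a hedged sketch of what a Summarize action could do: ask the chat completions API to compress the history into one short message and continue the conversation from that. The endpoint and payload follow OpenAI's public chat API; the function and its parameters are hypothetical, not the project's actual implementation.

```ts
// Hypothetical sketch: replace a long history with a single summary message.
async function summarizeHistory(
  apiKey: string,
  history: { role: string; content: string }[]
): Promise<string> {
  const response = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "gpt-3.5-turbo",
      messages: [
        ...history,
        {
          role: "user",
          content: "Summarize the conversation so far in a few sentences.",
        },
      ],
    }),
  });
  const data = await response.json();
  // The returned summary can then stand in for the old history.
  return data.choices[0].message.content;
}
```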

Yep, it's not supposed to be fixed yet. As I've said, I only reworked the token counter, i.e. laid the foundation for the fix; I haven't actually touched the important bits yet. I'll do the rest today.

Fixed in the latest commit.