joschan21/quill

Token issue


To avoid the error below, does it mean we need to connect to GPT-4 instead of GPT-3.5 for messages that require the AI to do a more difficult task?

error: message: "This model's maximum context length is 4097 tokens. However, your messages resulted in 4589 tokens. Please reduce the length of the messages."
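
For context, the error is about the total token count of the request rather than task difficulty, so I assume the alternatives are either trimming the prompt or using a model with a larger context window (e.g. gpt-3.5-turbo-16k or gpt-4). Something like the sketch below is what I have in mind for the trimming approach — assuming the official `openai` Node SDK; the ~4-characters-per-token estimate and the `trimMessages` helper are my own illustration, not quill's actual code:

```ts
import OpenAI from "openai";
import type { ChatCompletionMessageParam } from "openai/resources/chat/completions";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Rough estimate of ~4 characters per token (an assumption; a real
// implementation would use a tokenizer such as tiktoken).
const estimateTokens = (text: string) => Math.ceil(text.length / 4);

// Drop the oldest non-system messages until the estimated prompt size
// fits under the budget, keeping the system prompt and the latest turns.
function trimMessages(
  messages: ChatCompletionMessageParam[],
  maxPromptTokens: number
): ChatCompletionMessageParam[] {
  const [system, ...rest] = messages;
  const total = (msgs: ChatCompletionMessageParam[]) =>
    msgs.reduce((sum, m) => sum + estimateTokens(String(m.content ?? "")), 0);
  while (rest.length > 1 && total([system, ...rest]) > maxPromptTokens) {
    rest.shift(); // remove the oldest conversation turn
  }
  return [system, ...rest];
}

const messages: ChatCompletionMessageParam[] = [
  { role: "system", content: "Answer using the provided PDF context." },
  // ...earlier turns and retrieved document chunks would go here
  { role: "user", content: "Summarize section 3 of the document." },
];

// Budget: the 4097-token context window minus room reserved for the reply.
const completion = await openai.chat.completions.create({
  model: "gpt-3.5-turbo", // or "gpt-3.5-turbo-16k" / "gpt-4" for a larger window
  max_tokens: 512,
  messages: trimMessages(messages, 4097 - 512),
});

console.log(completion.choices[0].message.content);
```

Either way, reserving headroom for `max_tokens` seems necessary, since the 4097-token limit covers the prompt and the completion combined.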