Ulov888/chatpdflike

[bug] Multi-turn conversations overload the token limit

Closed this issue · 1 comments

openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens, however you requested 4658 tokens (3158 in your prompt; 1500 for the completion). Please reduce your prompt; or completion length.

The reason for this error is that gpt-3.5-turbo limits the total context length to 4096 tokens. You can reduce max_tokens in the response function, but that may limit the length of the generated text.
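Since lowering max_tokens shortens the reply, another option is to trim the oldest turns of the conversation before each request so the prompt plus the completion budget fits in the context window. A minimal sketch (hypothetical helper, not the repo's actual code; the ~4 characters-per-token estimate is a rough assumption, use tiktoken for exact counts):

```python
MODEL_LIMIT = 4096       # gpt-3.5-turbo context window, per the error message
MAX_COMPLETION = 1500    # tokens reserved for the completion, as in the error

def rough_tokens(text: str) -> int:
    # Crude estimate: roughly 4 characters per token for English text.
    return max(1, len(text) // 4)

def trim_history(messages, limit=MODEL_LIMIT - MAX_COMPLETION):
    """Drop the oldest messages until the estimated prompt size fits."""
    trimmed = list(messages)
    while len(trimmed) > 1 and sum(rough_tokens(m["content"]) for m in trimmed) > limit:
        trimmed.pop(0)  # discard the oldest turn first
    return trimmed
```

Passing `trim_history(history)` instead of the full history to the chat completion call keeps long sessions under the limit without shrinking max_tokens.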