It is very easy to hit the maximum context length error
gazedreamily opened this issue · 1 comment
gazedreamily commented
openai.error.InvalidRequestError: This model's maximum context length is 4097 tokens. However, your messages resulted in 4688 tokens. Please reduce the length of the messages.
gazedreamily commented
Oh, I should slice the files into smaller pieces!
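For anyone hitting the same error: the fix above amounts to splitting the input so each request stays under the model's token limit (4097 tokens here, shared between the prompt and the completion). Below is a minimal sketch of such a chunker. It uses a rough heuristic of ~4 characters per English token rather than a real tokenizer; for exact counts you would count tokens with a tokenizer library (e.g. tiktoken) instead. The function name and parameters are illustrative, not from any library.

```python
def chunk_text(text, max_tokens=3000, chars_per_token=4):
    """Split text into pieces small enough to fit a model's context window.

    Uses a rough heuristic (~4 characters per English token). For exact
    token counts, swap this heuristic for a real tokenizer.
    """
    max_chars = max_tokens * chars_per_token
    chunks = []
    while text:
        piece = text[:max_chars]
        # Prefer to break at the last newline or space so words stay whole.
        if len(text) > max_chars:
            cut = max(piece.rfind("\n"), piece.rfind(" "))
            if cut > 0:
                piece = piece[:cut]
        chunks.append(piece)
        text = text[len(piece):].lstrip()
    return chunks
```

Each chunk can then be sent as its own request, leaving headroom below the limit for the model's reply.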