API Error: Request too large
Opened this issue · 2 comments
huachuman commented
... Limit 20000, Requested 119668. The input or output tokens must be reduced in order to run successfully. Visit https://platform.openai.com/account/rate-limits to learn more.
I never had this in v1. Why doesn't it just use what it can?
huachuman commented
With another model: "your messages resulted in 142422 tokens (142210 in the messages, 212 in the functions). Please reduce the length of the messages or functions."
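In the meantime, a workaround that seems to help is counting tokens client-side and dropping the oldest messages until the request fits under the limit. A minimal sketch using tiktoken (the 20000-token budget and the "gpt-4" model name are just example values, not what the tool actually uses):

```python
# Rough workaround: trim the oldest messages until the request fits a token budget.
# The budget and model name are example values only.
import tiktoken

def trim_messages(messages, model="gpt-4", budget=20000):
    """Drop the oldest non-system messages until the total fits the budget."""
    enc = tiktoken.encoding_for_model(model)

    def count(msgs):
        # Approximate count: each message also adds a few tokens of overhead
        # that this sketch ignores.
        return sum(len(enc.encode(m.get("content") or "")) for m in msgs)

    trimmed = list(messages)
    # Keep the first (system) message and the most recent turns; drop from the front.
    while count(trimmed) > budget and len(trimmed) > 2:
        trimmed.pop(1)
    return trimmed
```

This keeps the system prompt and the latest turns. It doesn't address why v2 sends so much more context than v1, but it should at least keep requests under the per-request limit.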
micaelarealign commented
I've had the same problem since v2.