This model's maximum context length is 4097 tokens. However, your messages resulted in 4828 tokens. Please reduce the length of the messages.
mcu13321 opened this issue · 2 comments
mcu13321 commented
When using GPT-3.5 Turbo:
I just input "scroll to bottom" on the current homepage.
aristide1997 commented
It probably means that the page (DOM) you are trying to automate is too large to fit into a single request for the 3.5 model. GPT-4 should help, but I can't get it to work.
maxbaluev commented
Does anybody have ideas for algorithms to split or optimize the DOM structure so it works with GPT-3.5?
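One common approach (a sketch, not this project's actual code) is to prune the DOM before sending it: drop tags that rarely matter for automation (`script`, `style`, `svg`, ...), keep only the attributes useful for locating elements, and then split the result into chunks that fit the model's context window. The ~4 characters-per-token ratio below is a rough heuristic, not an exact tokenizer; for real use you would count tokens with something like `tiktoken`.

```python
from html.parser import HTMLParser


class DOMPruner(HTMLParser):
    """Strips tags that rarely matter for page automation and
    keeps only locator-style attributes on the rest."""

    SKIP = {"script", "style", "svg", "noscript", "head", "meta", "link"}
    KEEP_ATTRS = {"id", "href", "aria-label", "role", "name", "type"}

    def __init__(self):
        super().__init__()
        self.out = []
        self.skip_depth = 0  # >0 while inside a skipped subtree

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skip_depth += 1
        elif self.skip_depth == 0:
            kept = [(k, v) for k, v in attrs if k in self.KEEP_ATTRS]
            attr_s = "".join(f' {k}="{v}"' for k, v in kept)
            self.out.append(f"<{tag}{attr_s}>")

    def handle_endtag(self, tag):
        if tag in self.SKIP:
            self.skip_depth = max(0, self.skip_depth - 1)
        elif self.skip_depth == 0:
            self.out.append(f"</{tag}>")

    def handle_data(self, data):
        if self.skip_depth == 0 and data.strip():
            self.out.append(data.strip())


def prune_and_chunk(html: str, max_tokens: int = 3000) -> list[str]:
    """Prune the DOM, then split into chunks small enough to prompt with.

    Assumes ~4 characters per token as a crude estimate.
    """
    pruner = DOMPruner()
    pruner.feed(html)
    text = " ".join(pruner.out)
    max_chars = max_tokens * 4
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]
```

You could then prompt the model once per chunk and merge the answers, or only send the chunk containing the element the instruction refers to.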