Kav-K/GPTDiscord

8k context limit for 16k models with .txt attachments

arash-ashra opened this issue · 9 comments

I'm not able to send more than 8k tokens of input to a 16k or 128k gpt4-preview model.

[Screenshot attached: 2023-11-26 at 3:29:37 PM]

Looks like turbo-16k only has an 8k token limit for output, so you'll need to use a gpt-4 model like 32k or 1106-preview.
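For context, a model's context window has to hold the prompt and the completion together, so a cap on output tokens eats into the input budget. A quick arithmetic sketch (the numbers are illustrative, not taken from any official model spec):

```python
# Context-window arithmetic: the window must fit prompt + completion.
# Illustrative numbers only; actual per-model limits vary.
context_window = 16_384   # e.g. a "16k" model
max_output = 8_192        # assumed cap on completion tokens
max_input = context_window - max_output
print(max_input)          # tokens left for the input prompt
```

This is why a "16k" model can behave like it has only ~8k of usable input when a large output reservation is in play.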

Did you try it with the bot on the server?

The issue is the input token limit. I also tried it with the server bot using gpt-32k and hit the same issue.
[Screenshot attached: 2023-11-27 at 11:44:40 AM]

The file is just too big; I don't understand your issue here @ashra-academy. Past a certain token count it simply won't work.

Yes, but this file is not over the token limit; it's about 12k tokens. And gpt4 (32k limit) in ChatGPT can handle the same input.
So this is clearly a bug in the bot.
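A quick way to sanity-check a claim like "this file is about 12k tokens" is to estimate the token count before attaching it. A minimal sketch, assuming the common ~4 characters per token heuristic (for exact counts you'd run the file through a real tokenizer such as tiktoken, not shown here):

```python
def estimate_tokens(text: str) -> int:
    """Very rough token estimate using the ~4 characters/token heuristic.

    This is an approximation only; actual counts depend on the model's
    tokenizer and the text's language and structure.
    """
    return max(1, len(text) // 4)

# Example: a ~60k-character text comes out around 15k estimated tokens.
sample = "word " * 12_000
print(estimate_tokens(sample))
```

An estimate like this is enough to tell "about 12k" apart from "well over 100k" before the bot rejects the attachment.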

[Image attached]

@ashra-academy It is certainly not a bug in the bot; the bot's summarization threshold is set to 5k. Your file is bigger than that, so it won't work. I've set the threshold to 100k; you can try again now with the bot on the server.

Oh, I didn't know about such a threshold parameter. It should probably be above 128k, since the latest gpt4-preview supports that much. Also, how can I set this parameter on my own bot?

Above 128k doesn't make sense: the request would hit the token limit before summarization even begins, and it leaves no room for the summary afterwards. Something like 110k is more reasonable. You can set it with /system settings summarize_threshold
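The reasoning above (threshold must sit below the context window, leaving headroom for the summary/output) can be sketched as a simple pre-flight check. The names `summarize_threshold`, `reserve_for_output`, and the return values are illustrative, not the bot's actual code:

```python
def prepare_prompt(text_tokens: int,
                   summarize_threshold: int = 110_000,
                   model_context: int = 128_000,
                   reserve_for_output: int = 4_000) -> str:
    """Decide whether input should be summarized before sending.

    Illustrative sketch only: the threshold must leave room inside the
    model's context window for the output, otherwise summarization would
    hit the token limit before it could even run.
    """
    if summarize_threshold + reserve_for_output > model_context:
        raise ValueError("threshold leaves no room for output in the context window")
    if text_tokens > summarize_threshold:
        return "summarize-first"
    return "send-directly"

print(prepare_prompt(12_000))    # a ~12k-token file fits directly
print(prepare_prompt(115_000))   # over the threshold: summarize first
```

Setting the threshold to 128k would trip the first check: 128k + any output reserve exceeds the 128k window, which is exactly why a value like 110k is the more reasonable choice.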

Ooh, I see. Thanks!