KeyError encountered when using gpt-3.5-turbo-16k
michilu opened this issue · 3 comments
I attempted to use the gpt-3.5-turbo-16k model, as GPT-4 is not yet available. However, I encountered a KeyError when trying to access self.model_max_tokens[self.model]. It seems that gpt-3.5-turbo-16k is not included in the model_max_tokens dictionary. Could you please update the list to include this model?
Error Details:
File "chatgpt_prompt_wrapper/chatgpt/chatgpt.py", line 118, in set_model
self.tokens_limit = self.model_max_tokens[self.model] - 1
~~~~~~~~~~~~~~~~~~~~~^^^^^^^^^^^^
KeyError: 'gpt-3.5-turbo-16k'
chatgpt-prompt-wrapper/src/chatgpt_prompt_wrapper/chatgpt/chatgpt.py, lines 88 to 107 in 41b2e8c
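For context, the failure can be sketched as follows. This is a minimal, illustrative reproduction, not the package's actual code: the dictionary contents and the helper name are assumptions, and the point is only that direct indexing into model_max_tokens raises KeyError for any model name not in the mapping.

```python
# Illustrative sketch of the failure mode in set_model
# (dictionary contents and function name are assumptions).
model_max_tokens = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
}

def set_tokens_limit(model: str) -> int:
    # Direct indexing (model_max_tokens[model] - 1) raises a bare
    # KeyError for an unknown model; re-raising with a hint makes
    # the missing-configuration cause explicit.
    try:
        return model_max_tokens[model] - 1
    except KeyError:
        raise KeyError(
            f"Unknown model {model!r}; add it to model_max_tokens "
            "in the configuration file."
        ) from None
```

With the real v0.0.12 dictionary, calling this with "gpt-3.5-turbo-16k" fails exactly as in the traceback above.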
Thanks for making the issue.
In v0.0.13, gpt-4-0613, gpt-4-32k-0613, gpt-3.5-turbo-0613, gpt-3.5-turbo-16k, and gpt-3.5-turbo-16k-0613 were added.
In addition, from v0.0.13 you can add (or change) models by adding model_max_tokens and prices definitions to the configuration file:
[global.model_max_tokens]
"gpt-3.5-turbo-16k" = 16384
[global.prices]
"gpt-3.5-turbo-16k" = [0.003, 0.004]
See README for details.
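The effect of those config entries can be sketched as a merge of user-supplied values over built-in defaults. This is a hedged illustration, assuming a dict-based config; the names DEFAULT_MODEL_MAX_TOKENS and merged_max_tokens are hypothetical, not the package's actual API:

```python
# Hypothetical sketch: user [global.model_max_tokens] entries
# extend or override the built-in defaults (names are illustrative).
DEFAULT_MODEL_MAX_TOKENS = {
    "gpt-3.5-turbo": 4096,
    "gpt-4": 8192,
}

def merged_max_tokens(user_config: dict) -> dict:
    # Start from the defaults, then overlay the user's entries,
    # so new models can be added without a package update.
    merged = dict(DEFAULT_MODEL_MAX_TOKENS)
    merged.update(user_config.get("model_max_tokens", {}))
    return merged

# Equivalent of the TOML snippet above, parsed into a dict:
config = {"model_max_tokens": {"gpt-3.5-turbo-16k": 16384}}
tokens = merged_max_tokens(config)
```

After the merge, tokens contains both the defaults and the new gpt-3.5-turbo-16k entry.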
In any case, I'd appreciate it if you'd open an issue whenever you find a new model missing from the default configuration.
That's great! Your quick response and continuous improvement of the system are greatly appreciated.
However, it seems that v0.0.13 has not been published to PyPI yet (https://pypi.org/project/chatgpt-prompt-wrapper/#history). Because of this, I get an error when trying to install it using brew install.
Ah, sorry about that. The package has now been published correctly, and you should be able to install it.