clarify: which models are currently possible to use with gp.nvim?
divramod opened this issue · 2 comments
divramod commented
In my OpenAI dashboard I can see a list of possible text models. Could you clarify which models we can use in general with gp.nvim?

I watched the Dev Day talk last week and Sam Altman spoke about GPT-4 Turbo, but I can't see it in the list. He said that GPT-4 Turbo would be able to handle a lot more user input tokens. Could it be that `gpt-4-1106-preview` is GPT-4 Turbo? Do you know what the 1106 means?

Edit: I just switched to `gpt-4-1106-preview` and it responds ten times faster, whooop!
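For anyone else wanting to try the switch, here is a minimal sketch of what it could look like in the setup call (the `chat_model`/`command_model` option names are assumptions based on the plugin's default config and may differ in your gp.nvim version, so check the README):

```lua
-- Minimal gp.nvim setup pinning chat and inline commands to gpt-4-1106-preview.
-- Option names are assumptions from the plugin's default config at the time;
-- verify against your installed version's README/defaults.
require("gp").setup({
  openai_api_key = os.getenv("OPENAI_API_KEY"),
  chat_model = { model = "gpt-4-1106-preview", temperature = 1.1, top_p = 1 },
  command_model = { model = "gpt-4-1106-preview", temperature = 1.1, top_p = 1 },
})
```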
Robitx commented
@divramod hey, yes, I'm running on `gpt-4-1106-preview` also 🙂 The number is the month and day it was publicly released (MMDD).

- `gpt-3.5-turbo` and `gpt-3.5-turbo-16k` are aliases which currently point to `gpt-3.5-turbo-0613` / `gpt-3.5-turbo-16k-0613`; they plan to switch them to `gpt-3.5-turbo-1106` during December => the best and cheapest model fresh users can currently use is `gpt-3.5-turbo-1106` with a 16K token window
- access to the `gpt-4*` API is limited to paying users - those who paid $1 or more for their API usage (or historically got access via the now discontinued waiting list) - given the pricing, token window and speed of `gpt-4-1106-preview`, there is no point in using `gpt-4`, which is an alias for `gpt-4-0613`
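If you're not sure whether your key already has `gpt-4*` access, the models endpoint lists what the key can actually use. A quick sketch (assumes `curl` is installed and `OPENAI_API_KEY` is exported in your environment):

```lua
-- Print the GPT models available to the current API key by querying
-- OpenAI's /v1/models endpoint via curl and scraping the "id" fields.
local handle = assert(io.popen(
  'curl -s https://api.openai.com/v1/models -H "Authorization: Bearer $OPENAI_API_KEY"'
))
local body = handle:read("*a")
handle:close()

for id in body:gmatch('"id"%s*:%s*"([^"]+)"') do
  if id:find("^gpt") then
    print(id)
  end
end
```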
divramod commented
thx for the explanations!