Robitx/gp.nvim

clarify: which models can currently be used with gp.nvim?

divramod opened this issue · 2 comments

in my OpenAI dashboard i can see a list of available text models.

could you clarify which models we can use with gp.nvim in general?
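
for reference, the dashboard list mirrors what the API reports for your key, so you can pull it yourself — a quick sketch, assuming the OPENAI_API_KEY environment variable is set and using the public GET /v1/models endpoint:

```python
# list the models this API key can access (the same set the dashboard shows)
import os
import requests

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
for model_id in sorted(m["id"] for m in resp.json()["data"]):
    print(model_id)
```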

i watched the dev day talk last week and sam altman spoke about gpt-4 turbo, but i can't see it in the list.
he said that gpt-4 turbo would be able to handle a lot more user input tokens.

could it be that gpt-4-1106-preview is gpt-4 turbo?

do you know what the 1106 means?

edit: i just switched to gpt-4-1106-preview and it responds ten times faster, whooop
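
for anyone who wants to poke at it outside the plugin first, here's a minimal sketch of a chat completion call against gpt-4-1106-preview (assuming OPENAI_API_KEY is set; the request shape follows the public POST /v1/chat/completions API):

```python
# minimal chat completion request against gpt-4-1106-preview
import os
import requests

payload = {
    "model": "gpt-4-1106-preview",
    "messages": [{"role": "user", "content": "say hello in one sentence"}],
}
resp = requests.post(
    "https://api.openai.com/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```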

[screenshot: OpenAI dashboard model list, 2023-11-13]

Robitx commented

@divramod hey, yes, I'm running on gpt-4-1106-preview too 🙂 the number is the month and day it was publicly released (MMDD), so 1106 means November 6

  • gpt-3.5-turbo and gpt-3.5-turbo-16k are aliases which currently point to gpt-3.5-turbo-0613 and gpt-3.5-turbo-16k-0613; they plan to switch them to gpt-3.5-turbo-1106 during December => the best and cheapest model fresh users can currently use is gpt-3.5-turbo-1106 with a 16K token window
  • access to the gpt-4* API is limited to paying users - those who have paid $1 or more for API usage (or historically got access via the now-discontinued waiting list); see the sketch after this list for a quick way to check
  • given the pricing, token window and speed of gpt-4-1106-preview, there is no point in using gpt-4, which is an alias for gpt-4-0613
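
a quick sketch of the access check mentioned above — it lists the models your key can see and falls back to gpt-3.5-turbo-1106 when gpt-4-1106-preview isn't available (assumes OPENAI_API_KEY is set; where the resulting model name goes in your gp.nvim config depends on the plugin version, so check the README):

```python
# pick the best model this API key actually has access to
import os
import requests

resp = requests.get(
    "https://api.openai.com/v1/models",
    headers={"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
    timeout=30,
)
resp.raise_for_status()
available = {m["id"] for m in resp.json()["data"]}

# prefer gpt-4-1106-preview, otherwise the cheapest option for fresh users
model = "gpt-4-1106-preview" if "gpt-4-1106-preview" in available else "gpt-3.5-turbo-1106"
print(f"model to use in gp.nvim: {model}")
```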

thx for the explanations!