nomnivore/ollama.nvim

setup codestral, packer nvim

nanvenomous opened this issue · 1 comment

I could use a little help with the setup

The Issue

  • codestral is not set as the default model
  • :OllamaModel only shows my llama3 model as an option
    (screenshot: :OllamaModel selection list showing only llama3)

My config

neovim version: v0.10.0

return require('packer').startup(function(use)
  use 'nomnivore/ollama.nvim'
  use 'nvim-lua/plenary.nvim'
end)

require('ollama').setup({
  cmd = { "Ollama", "OllamaModel", "OllamaServe", "OllamaServeStop" },
  opts = {
    model = "codestral",
    url = "http://127.0.0.1:4005",
  }
})
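For context, the `cmd` and `opts` keys in the config above are lazy.nvim plugin-spec fields, not options that `require('ollama').setup()` understands; with packer, `setup()` receives that whole table verbatim, so the plugin never sees `model` at the top level. A sketch of where those keys would normally live, assuming lazy.nvim's standard spec format:

```lua
-- lazy.nvim plugin spec (sketch): cmd/opts are spec fields here,
-- and lazy.nvim calls require('ollama').setup(opts) itself.
{
  "nomnivore/ollama.nvim",
  dependencies = { "nvim-lua/plenary.nvim" },
  cmd = { "Ollama", "OllamaModel", "OllamaServe", "OllamaServeStop" },
  opts = {
    model = "codestral",
    url = "http://127.0.0.1:4005",
  },
}
```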

I have run

export OLLAMA_HOST=127.0.0.1:4005
ollama serve
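To confirm the server is actually listening on the custom port, Ollama's HTTP API can be queried directly. A quick check, assuming the standard REST endpoint and a running server:

```shell
# List locally available models over the HTTP API;
# the JSON response should include codestral and llama3.
curl http://127.0.0.1:4005/api/tags
```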

the output of ollama list:

NAME            	ID          	SIZE  	MODIFIED      
codestral:latest	fcc0019dcee9	12 GB 	4 minutes ago	
llama3:latest   	365c0bd3c000	4.7 GB	3 weeks ago  	

I have confirmed that

ollama run codestral

works great

Was able to resolve this. Since I'm not using lazy.nvim, my setup call needed to look like this:

require('ollama').setup({
  model = "codestral",
  url = "http://127.0.0.1:4005",
  serve = {
    on_start = false,
    command = "ollama",
    args = { "serve" },
    stop_command = "pkill",
    stop_args = { "-SIGTERM", "ollama" },
  },
  -- View the actual default prompts in ./lua/ollama/prompts.lua
  prompts = {
    Sample_Prompt = {
      prompt = "This is a sample prompt that receives $input and $sel(ection), among others.",
      input_label = "> ",
      model = "mistral",
      action = "display",
    }
  }
})
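With that in place, a prompt can be triggered from a keymap. The binding below is a hypothetical example (`<leader>oo` is not defined by the plugin, just an illustration of calling its prompt picker):

```lua
-- Hypothetical keymap: open the ollama.nvim prompt picker
-- in normal and visual mode ($sel comes from the visual selection).
vim.keymap.set({ "n", "v" }, "<leader>oo", ":<c-u>lua require('ollama').prompt()<cr>", {
  desc = "ollama prompt",
})
```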