Ollama Copilot

A proxy that allows you to use Ollama as a coding copilot, like GitHub Copilot.

Video presentation

Installation

Ollama

Ensure Ollama is installed:

curl -fsSL https://ollama.com/install.sh | sh

Or follow the manual installation instructions.
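
To verify the install, check that the CLI is available and that the local Ollama server responds on its default port (11434); the port may differ if you have customized your setup:

# Print the installed CLI version
ollama --version

# The server answers with a short "Ollama is running" message by default
curl http://localhost:11434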

Models

To use the default model expected by ollama-copilot:

ollama pull codellama:code
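
You can confirm the model is available locally with:

ollama list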

ollama-copilot

go install github.com/bernardo-bruning/ollama-copilot@latest
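
Alternatively, a minimal sketch for building from source, assuming a recent Go toolchain and that the main package sits at the repository root:

git clone https://github.com/bernardo-bruning/ollama-copilot.git
cd ollama-copilot
go build -o ollama-copilot .
./ollama-copilot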

Running

Ensure your $PATH includes $HOME/go/bin or $GOPATH/bin. For example, in ~/.bashrc or ~/.zshrc:

export PATH="$HOME/go/bin:$GOPATH/bin:$PATH"
ollama-copilot
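
Once it is running, you can check that the proxy is listening. The sketch below assumes the default ports referenced in the IDE configuration sections (11435 for the proxy, 11437 for the completions API):

# On Linux; use lsof -i :11435 on macOS
ss -tln | grep -E '11435|11437'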

Configure IDE

Neovim

  1. Install copilot.vim
  2. Configure the proxy variables (see the shell sketch after the snippet below):
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false
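
A minimal shell sketch for applying these settings, assuming the default config path ~/.config/nvim/init.vim:

mkdir -p ~/.config/nvim
cat >> ~/.config/nvim/init.vim <<'EOF'
" Route Copilot traffic through the local ollama-copilot proxy
let g:copilot_proxy = 'http://localhost:11435'
let g:copilot_proxy_strict_ssl = v:false
EOF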

VS Code

  1. Install the GitHub Copilot extension
  2. Sign in or sign up with your GitHub account
  3. Open your user settings.json (typical locations are sketched after the snippet below) and insert:
{
    "github.copilot.advanced": {
        "debug.overrideProxyUrl": "http://localhost:11437"
    },
    "http.proxy": "http://localhost:11435",
    "http.proxyStrictSSL": false
}
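
The location of settings.json varies by platform; typical paths are sketched below (you can also open it via the Command Palette entry "Preferences: Open User Settings (JSON)"):

# Linux:   ~/.config/Code/User/settings.json
# macOS:   ~/Library/Application Support/Code/User/settings.json
# Windows: %APPDATA%\Code\User\settings.json
code ~/.config/Code/User/settings.json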

Roadmap

  • Enable completions API usage; fill-in-the-middle.
  • Enable flexible model configuration (currently only codellama:code is supported).
  • Create self-installing functionality.
  • Windows setup
  • Documentation on how to use it.