llamash

RESTful API Bridge for Ollama.

Build and Run

$ go build .
$ ./llamash

Setup

Before starting the bridge server, you need a running Ollama server, which listens on http://127.0.0.1:11434 by default.

$ podman run --network host ollama serve
$ ./llamash -p 11444 -i 'http://127.0.0.1:11434'

Enjoy it!

$ curl 'http://127.0.0.1:11444/generate?model=codellama&prompt=sayhi'

GET Form:

  • generate Generate content.
    • model The LLaMA model you are going to use.
    • prompt The content that will be sent to the model.

Responds with plain text.
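
For reference, here is a minimal Go client sketch that calls the same /generate endpoint as the curl example above. It assumes the bridge is listening on port 11444, as in the Setup example; the model and prompt values are just placeholders.

package main

import (
	"fmt"
	"io"
	"log"
	"net/http"
	"net/url"
)

func main() {
	// Build the query string for the /generate endpoint.
	q := url.Values{}
	q.Set("model", "codellama")
	q.Set("prompt", "say hi")

	// Port 11444 matches the -p flag used when starting llamash in Setup.
	resp, err := http.Get("http://127.0.0.1:11444/generate?" + q.Encode())
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	// The bridge responds with plain text, so read the body and print it as-is.
	body, err := io.ReadAll(resp.Body)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Println(string(body))
}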