# ollama-js-example

Primary language: JavaScript. MIT License.

## Trying the Ollama JS Library

A basic example of interacting with Ollama, using the Llama 3 model, from a Node.js Express app.

## Test it

1. Start the services defined in the `docker-compose.yml` file:

   ```sh
   docker compose up -d
   ```

2. Pull the Llama 3 model inside the `ollama-example` container:

   ```sh
   docker exec -it ollama-example ollama pull llama3
   ```

3. Once the model has been pulled, you can start using the `http://127.0.0.1:3000/chat` POST endpoint locally.