Ollama using JavaScript
Clone this repo:
git clone https://github.com/felixdrp/ollama-js-tutorial.git
cd ollama-js-tutorial
If git is not installed, install it first.
Bun is an all-in-one JavaScript runtime & toolkit designed for speed, complete with a bundler, test runner, and Node.js-compatible package manager.
Install Bun
Install options:
- Linux or Mac
curl -fsSL https://bun.sh/install | bash
- macOS using Homebrew
brew install oven-sh/bun/bun # also works on Linux
If you don't have Homebrew installed:
/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
- Windows
powershell -c "irm bun.sh/install.ps1 | iex"
Install Ollama
Install options:
- Direct install of Ollama
- Container version of Ollama
  - You need a container provider such as Docker or Podman. Install Docker.
  - Container images are available for CPU (the default), NVIDIA GPU, and AMD GPU (ROCm). For GPU support you need Docker.
See the general Ollama docs for more details.
Install Llama3.1:
# Direct install
ollama run llama3.1
# Container version
docker exec -it ollama ollama run llama3.1
List the installed models:
# Direct install
ollama list
# Container version
docker exec -it ollama ollama list
After cloning the repository, install the project packages:
bun install
If you run Ollama in a container, start the container before running the examples.
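To confirm that Ollama is reachable from JavaScript before running the examples, you can query its REST API on the default port (11434). This is a minimal sketch; the file name check-ollama.js is just an example and is not part of the repository.

```js
// check-ollama.js — run with: bun check-ollama.js
// Queries Ollama's REST API (default port 11434) and prints the installed models.
const response = await fetch("http://localhost:11434/api/tags");

if (!response.ok) {
  throw new Error(`Ollama is not reachable: HTTP ${response.status}`);
}

const { models } = await response.json();
console.log("Installed models:", models.map((m) => m.name));
```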
Model prompting
bun src/prompting.js
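The prompting example sends a prompt to the local Llama 3.1 model. A minimal sketch of such a script, using the ollama npm package (install with `bun add ollama`), looks like this; the actual src/prompting.js in the repo may differ.

```js
// Minimal prompting sketch using the `ollama` npm package.
// The real src/prompting.js may differ.
import ollama from "ollama";

const response = await ollama.chat({
  model: "llama3.1",
  messages: [{ role: "user", content: "Why is the sky blue?" }],
});

console.log(response.message.content);
```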
Web prompting using in-memory RAG
bun src/web-retrieval.js
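As a rough idea of what in-memory retrieval-augmented generation (RAG) over a web page looks like with LangChain and Ollama, here is a minimal sketch. It assumes the langchain, @langchain/community, @langchain/textsplitters, @langchain/ollama, and cheerio packages are installed; the URL, chunk sizes, and question are placeholders, and the actual src/web-retrieval.js may differ.

```js
// Minimal in-memory RAG sketch (assumed packages: langchain, @langchain/community,
// @langchain/textsplitters, @langchain/ollama, cheerio).
// The real src/web-retrieval.js may differ.
import { CheerioWebBaseLoader } from "@langchain/community/document_loaders/web/cheerio";
import { RecursiveCharacterTextSplitter } from "@langchain/textsplitters";
import { MemoryVectorStore } from "langchain/vectorstores/memory";
import { OllamaEmbeddings, ChatOllama } from "@langchain/ollama";

// 1. Load a web page and split it into chunks (placeholder URL).
const loader = new CheerioWebBaseLoader("https://example.com/article");
const docs = await loader.load();
const splitter = new RecursiveCharacterTextSplitter({ chunkSize: 1000, chunkOverlap: 200 });
const splits = await splitter.splitDocuments(docs);

// 2. Embed the chunks into an in-memory vector store.
const embeddings = new OllamaEmbeddings({ model: "llama3.1" });
const vectorStore = await MemoryVectorStore.fromDocuments(splits, embeddings);

// 3. Retrieve the chunks most relevant to the question.
const question = "What is the article about?";
const relevantDocs = await vectorStore.similaritySearch(question, 4);
const context = relevantDocs.map((d) => d.pageContent).join("\n\n");

// 4. Ask the model, grounding the answer in the retrieved context.
const llm = new ChatOllama({ model: "llama3.1" });
const answer = await llm.invoke(
  `Answer the question using only this context:\n${context}\n\nQuestion: ${question}`
);
console.log(answer.content);
```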