This repository contains demo code for a NodeJS application that communicates with a locally running Ollama server. It is built with LangchainJS to communicate with the LLM, along with SvelteKit for the API and frontend.
To get started with this application, you will first need to run Ollama locally. If you already have it running, skip to the next section.
- Head to the Download Ollama page.
- Download and install Ollama on your computer.
- Open the Ollama application; it will show you the command to run a model, similar to the one below:

  ```shell
  ollama run llama2
  ```

- Enter this in your terminal to download and run the model locally.
This app requires Node version 18 or later.
- Clone this repo.
- Run:

  ```shell
  npm install
  ```

- Run:

  ```shell
  npm run dev
  ```
The app will then be running at http://localhost:5173.
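When streaming is enabled, Ollama returns its reply as newline-delimited JSON, where each chunk carries a `response` fragment and the final chunk is marked `done: true`. LangchainJS handles this decoding for you; the hand-rolled sketch below shows the idea, using illustrative sample chunks in the shape Ollama emits:

```javascript
// Sketch of decoding Ollama's streaming output: each line of the
// response body is a JSON object, and concatenating the `response`
// fields reassembles the model's full answer.
function assembleStream(ndjson) {
  let text = "";
  for (const line of ndjson.split("\n")) {
    if (!line.trim()) continue; // skip blank lines
    const chunk = JSON.parse(line);
    if (chunk.response) text += chunk.response;
    if (chunk.done) break; // final chunk signals end of stream
  }
  return text;
}

// Illustrative chunks (not captured from a real run):
const sample = [
  '{"response":"The sky ","done":false}',
  '{"response":"is blue.","done":true}',
].join("\n");

console.log(assembleStream(sample)); // → The sky is blue.
```

Streaming chunks like this is what lets the frontend render tokens as they arrive rather than waiting for the full completion.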