
TypeChat-llama-example - TypeChat with Llama

Sentiment example using TypeChat with a self-hosted LLM. No API keys required.

(Terminalizer recording of the example running in a terminal)

✅ Requirements

  • Docker with Docker Compose (to run the llama-cpp-python web server)
  • Node.js with Yarn (to run the TypeChat example)

⚡️ Quick start

Download a model: e.g. llama-2-13b-chat.ggmlv3.q4_0

curl -L "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/llama-2-13b-chat.ggmlv3.q4_0.bin" --create-dirs -o models/llama-2-13b-chat.ggmlv3.q4_0.bin

Start the llama-cpp-python web server:

docker-compose up -d
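
To confirm the server is up before running the example, you can query its OpenAI-compatible API. Below is a minimal TypeScript check, assuming the compose file exposes the llama-cpp-python default port 8000 on localhost and Node.js 18+ (for the global fetch); the helper file name is hypothetical:

// checkServer.ts (hypothetical helper, not part of the repository)
// Queries the llama-cpp-python server's OpenAI-compatible /v1/models route.
// Assumes the compose file maps the server's default port 8000 to localhost.
async function checkServer(): Promise<void> {
  const res = await fetch("http://localhost:8000/v1/models");
  if (!res.ok) {
    throw new Error(`server responded with HTTP ${res.status}`);
  }
  console.log(await res.json()); // lists the model(s) the server has loaded
}

checkServer().catch(console.error);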

Run the example:

yarn install
yarn start
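
Under the hood, the example follows the usual TypeChat pattern: a TypeScript schema describes the JSON shape the model must produce, and a JSON translator sends requests to the local server instead of OpenAI. The sketch below shows that wiring; it assumes the TypeChat 0.0.x API (createLanguageModel / createJsonTranslator), and the endpoint, model name, and dummy API key are illustrative values, not the repository's actual configuration:

// sentiment.ts (illustrative sketch, not the repository's actual source)
import { createJsonTranslator, createLanguageModel } from "typechat";

// TypeScript type the translated JSON is checked against.
interface SentimentResponse {
  sentiment: "negative" | "neutral" | "positive";
}

// TypeChat also needs the schema as source text, which it embeds in the prompt.
const schema = `
export interface SentimentResponse {
  sentiment: "negative" | "neutral" | "positive";
}`;

// Point TypeChat's OpenAI-compatible client at the self-hosted server.
// Endpoint, model name, and placeholder key are assumptions for this sketch.
process.env.OPENAI_ENDPOINT = "http://localhost:8000/v1/chat/completions";
process.env.OPENAI_MODEL = "llama-2-13b-chat";
process.env.OPENAI_API_KEY = "no-key-needed";

const model = createLanguageModel(process.env);
const translator = createJsonTranslator<SentimentResponse>(model, schema, "SentimentResponse");

async function main() {
  const result = await translator.translate("TypeChat talking to a local Llama model works nicely!");
  console.log(result.success ? `Sentiment: ${result.data.sentiment}` : result.message);
}

main();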

Notes

  • llama-node could be used instead of the web server, running the model in-process through Node.js bindings rather than over HTTP

Software components used