Sentiment example using TypeChat with a self-hosted LLM. No API keys are required.
- Node.js version 18 or above
- llama-cpp-python web server
- A model supported by llama.cpp
Download a model, e.g. llama-2-13b-chat.ggmlv3.q4_0:
curl -L "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/llama-2-13b-chat.ggmlv3.q4_0.bin" --create-dirs -o models/llama-2-13b-chat.ggmlv3.q4_0.bin
Start the llama-cpp-python web server:
docker-compose up -d
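Once the server is running, TypeChat only needs to be told where to find it. Below is a minimal sketch, assuming the server's default port 8000 and its OpenAI-compatible /v1/chat/completions route, and assuming the OPENAI_* variables that TypeChat's createLanguageModel reads; the API key is a dummy value since the local server does not check it.

```typescript
import { createLanguageModel } from "typechat";

// Sketch only: point TypeChat at the local llama-cpp-python server instead
// of api.openai.com. Port 8000 and the /v1/chat/completions route are the
// server's defaults; the model name matches the file downloaded above.
const model = createLanguageModel({
  OPENAI_API_KEY: "sk-no-key-needed", // dummy value; the local server ignores it
  OPENAI_MODEL: "llama-2-13b-chat.ggmlv3.q4_0",
  OPENAI_ENDPOINT: "http://localhost:8000/v1/chat/completions",
});
```

In practice these values would normally come from a .env file and be passed in as process.env.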
Run the example:
yarn install
yarn start
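The example's core follows the upstream TypeChat sentiment sample: a TypeScript schema describes the JSON the model must produce, and a JSON translator validates each reply against it. The following is a simplified sketch of that flow, not the exact source of this repo; the inlined schema string and the sample prompt are illustrative.

```typescript
import { createJsonTranslator, createLanguageModel } from "typechat";

// The shape the LLM's JSON reply must conform to.
interface SentimentResponse {
  sentiment: "negative" | "neutral" | "positive";
}

async function main() {
  // Environment variables point TypeChat at the local server (see above).
  const model = createLanguageModel(process.env);

  // TypeChat receives the schema as source text; it is inlined here for
  // brevity, while the real example loads a schema .ts file from disk.
  const schema = `export interface SentimentResponse {
    sentiment: "negative" | "neutral" | "positive";
  }`;
  const translator = createJsonTranslator<SentimentResponse>(model, schema, "SentimentResponse");

  // Ask the model to classify a sample utterance and validate the reply.
  const response = await translator.translate("TypeChat is awesome!");
  if (response.success) {
    console.log(`The sentiment is ${response.data.sentiment}`);
  } else {
    console.error(response.message);
  }
}

main();
```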
- Alternatively, llama-node could be used instead of the web server (a rough sketch of this approach follows these notes)
- This example is a modified version of the TypeChat sentiment example
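For the llama-node route mentioned above, one option (an untested sketch, not code from this repo) is to implement TypeChat's TypeChatLanguageModel interface directly, removing the HTTP hop; runLocalCompletion below is a hypothetical placeholder for whatever llama-node call performs the actual inference.

```typescript
import { success, error, TypeChatLanguageModel } from "typechat";

// Placeholder: replace the body with an actual llama-node (or other
// in-process) inference call that returns the raw completion text.
async function runLocalCompletion(prompt: string): Promise<string> {
  throw new Error(`not implemented: complete "${prompt}" with llama-node`);
}

// TypeChat only requires a complete() method returning a Result<string>,
// so the web server can be swapped for in-process inference.
const localModel: TypeChatLanguageModel = {
  async complete(prompt: string) {
    try {
      return success(await runLocalCompletion(prompt));
    } catch (e) {
      return error(`local completion failed: ${e}`);
    }
  },
};
```

The resulting localModel could then be passed to createJsonTranslator in place of a model created from environment variables.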
TypeChat: Copyright (c) Microsoft Corporation - MIT License
llama-cpp-python: Copyright (c) 2023 Andrei Betlen - MIT License