


Rusty Llama Webapp

A simple webapp showcasing how to build a chatbot using only Rust, TailwindCSS, and an open source language model such as a variant of GPT, LLaMA, etc.

Setup Instructions

You'll need to use the nightly Rust toolchain, and install the wasm32-unknown-unknown target as well as the Trunk and cargo-leptos tools:

rustup toolchain install nightly
rustup target add wasm32-unknown-unknown
cargo install trunk cargo-leptos
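
Installing nightly doesn't make it the active toolchain by itself. If you'd rather not set nightly as your global default, one option is a per-directory override, run from the project root:

rustup override set nightly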

You'll also need to download a model of your choice (in GGML format) that is supported by the rustformers/llm crate.
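
For example, you could fetch a quantized GGML file with curl. The directory, filename, and URL below are placeholders; substitute the actual download link for whichever supported model you choose:

mkdir -p models
curl -L -o models/model.ggml.bin "<url-of-a-ggml-model-file>"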

In the root of the project directory, you'll find a .env file where an environment variable called MODEL_PATH is defined. Replace the value with the full path to the desired model file.
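
For example, the entry might look like this (the path is illustrative; point it at wherever you saved your model):

MODEL_PATH=/full/path/to/models/model.ggml.bin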

To run the project locally, run cargo leptos watch in the project directory, then navigate to http://localhost:3000 in your browser.
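
For example:

cargo leptos watch
# rebuilds on source changes; once compiled, the app is served at http://localhost:3000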

Tested Models