# HTMX + Llama.cpp Server ❤️

A fun little project that provides a chat interface for a llama.cpp server, built with HTMX and Rust.

On the machine running llama.cpp, start the server (the example below offloads up to 100 layers to the GPU with `-ngl 100`; the `.\` prefix is Windows syntax, use `./llama-server` on Linux or macOS):

```
.\llama-server -ngl 100 --port 9090 -m <some.gguf> --host 0.0.0.0
```
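To confirm the server is reachable before starting the frontend, you can query its health endpoint (a quick sanity check, assuming a recent llama.cpp build that exposes `/health`):

```
# Should return a small JSON status payload once the model is loaded
curl http://<llama.cpp_server_IP>:9090/health
```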

Once the llama.cpp server is up, run this project and point it at that server:

```
cargo run -- --llama http://<llama.cpp_server_IP>:9090
```
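If the chat UI doesn't respond, it can help to rule out the backend by sending a request straight to the llama.cpp server, bypassing the UI. The route below is llama.cpp's OpenAI-compatible chat endpoint (an assumption about your build; older builds expose the native `/completion` route instead):

```
# Smoke test: ask llama.cpp for a chat completion directly
curl http://<llama.cpp_server_IP>:9090/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"messages": [{"role": "user", "content": "Hello!"}]}'
```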
*(Screenshot of the running chat interface.)*