wllama

WebAssembly binding for llama.cpp, enabling in-browser LLM inference

Primary language: C++ · License: MIT

Pinned issues

- performance expectations (#4, open)
- How would you implement RAG / Document chat? (#36, closed)
