wllama

WebAssembly binding for llama.cpp, enabling in-browser LLM inference.

Primary language: C++ · License: MIT
