wllama

WebAssembly binding for llama.cpp, enabling in-browser LLM inference

Primary language: C++ · License: MIT

This repository is not active.