WebLLM (React + Vite)

A WebLLM project using React, Vite & TailwindCSS. This project is a simple example of how to use WebLLM inside React components.

About WebLLM

WebLLM is a high-performance in-browser LLM inference engine that brings language model inference directly onto web browsers with hardware acceleration. Everything runs inside the browser with no server support and is accelerated with WebGPU.
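As a sketch of what this project builds on: the WebLLM engine exposes an OpenAI-style chat completion API that runs entirely client-side. The model id and options below are illustrative (check the WebLLM docs for the current model list), and a WebGPU-capable browser is required:

```typescript
// Minimal in-browser inference with WebLLM (runs in the browser, not Node).
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Downloads and compiles the model for WebGPU on first run;
  // the progress callback reports loading status.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC", {
    initProgressCallback: (report) => console.log(report.text),
  });

  // OpenAI-style chat completion, executed locally with no server.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Explain WebGPU in one sentence." }],
  });
  console.log(reply.choices[0].message.content);
}

main();
```

Because the API mirrors the OpenAI client, existing chat UI code can usually be pointed at the local engine with few changes.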

Getting Started

# Clone the repository
git clone https://github.com/sammwyy/webllm.git

# Change directory
cd webllm

# Install dependencies
bun install

# Start the development server
bun dev
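If Bun is not installed, the equivalent npm commands should also work, assuming the repository uses standard Vite scripts:

```shell
# Install dependencies
npm install

# Start the development server
npm run dev
```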

Acknowledgements

Made with ❤️ by Sammwy