towardmay's Stars
ggerganov/llama.cpp - LLM inference in C/C++
meilisearch/meilisearch - A lightning-fast search API that fits effortlessly into your apps, websites, and workflow
Mozilla-Ocho/llamafile - Distribute and run LLMs with a single file.
leejet/stable-diffusion.cpp - Stable Diffusion and Flux in pure C/C++
adamschwartz/chrome-tabs - Chrome-style tabs in HTML/CSS.
todbot/circuitpython-tricks - Some CircuitPython tricks, mostly reminders to myself
technoblogy/attiny10core - For programming the ATtiny10/9/5/4.
HyperMink/inferenceable - Scalable AI inference server for CPU and GPU with Node.js | Utilizes llama.cpp and parts of the llamafile C/C++ core under the hood.