cpu-inference
There are 16 repositories under the cpu-inference topic.
kennethleungty/Llama-2-Open-Source-LLM-CPU-Inference
Running Llama 2 and other open-source LLMs locally on CPU for document Q&A
CoderLSF/fast-llama
Runs LLaMA at extremely high speed
rbitr/llm.f90
LLM inference in Fortran
jozsefszalma/homelab
The bare metal in my basement
yybit/pllm
Portable LLM: a Rust library for LLM inference
laelhalawani/gguf_llama
Wrapper for simplified use of Llama 2 GGUF-quantized models.
codito/arey
Simple large language model playground app
JohnClaw/chatllm.v
V-lang API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.vb
VB.NET API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.cs
C# API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.nim
Nim API wrapper for LLM inference via chatllm.cpp
chinese-soup/cbot-telegram-whisper
Simple bot that transcribes Telegram voice messages, powered by go-telegram-bot-api and the whisper.cpp Go bindings.
JohnClaw/chatllm.d
D API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.kt
Kotlin API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.lua
Lua API wrapper for LLM inference via chatllm.cpp
JohnClaw/chatllm.rs
Rust API wrapper for LLM inference via chatllm.cpp