Pinned Repositories
1Project
test
llama.cpp
LLM inference in C/C++
kernel-memory
RAG architecture: index and query any data using LLM and natural language, track sources, show citations, asynchronous memory patterns.
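A minimal sketch of the index-then-ask flow that description refers to, assuming the Microsoft.KernelMemory NuGet package in serverless mode with OpenAI defaults; the file name, document id, and question are placeholders, not taken from the repository.

```csharp
using Microsoft.KernelMemory;

// Build an in-process (serverless) memory instance backed by OpenAI models.
var memory = new KernelMemoryBuilder()
    .WithOpenAIDefaults(Environment.GetEnvironmentVariable("OPENAI_API_KEY")!)
    .Build<MemoryServerless>();

// Index a document: it is chunked, embedded, and stored for later retrieval.
await memory.ImportDocumentAsync("manual.pdf", documentId: "doc-001");

// Ask in natural language; the answer carries the sources it was grounded on.
var answer = await memory.AskAsync("What safety checks does the manual require?");
Console.WriteLine(answer.Result);

foreach (var source in answer.RelevantSources)
    Console.WriteLine($"  source: {source.SourceName} ({source.Link})");
```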
LLamaSharp
A C#/.NET library to run LLM (🦙LLaMA/LLaVA) on your local device efficiently.
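A minimal sketch of running a local GGUF model with LLamaSharp, assuming the LLamaSharp and LLamaSharp.Backend.Cpu NuGet packages; the model path, prompt, and parameter values are placeholders.

```csharp
using LLama;
using LLama.Common;

var parameters = new ModelParams("models/llama-7b.Q4_K_M.gguf")
{
    ContextSize = 2048,   // prompt + generation budget
    GpuLayerCount = 0     // raise to offload layers when using a GPU backend
};

// Load the weights once, then create an inference context and executor.
using var weights = LLamaWeights.LoadFromFile(parameters);
using var context = weights.CreateContext(parameters);
var executor = new InteractiveExecutor(context);

// Stream tokens as they are generated.
await foreach (var token in executor.InferAsync(
                   "Question: What is a llama?\nAnswer:",
                   new InferenceParams { MaxTokens = 128, AntiPrompts = new[] { "Question:" } }))
{
    Console.Write(token);
}
```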
OpenVINO.NET
High quality .NET wrapper for OpenVINO™ toolkit.