
☸️ Easy, advanced inference platform for large language models on Kubernetes


Llmaz

Llmaz, pronounced /lima:z/, is a building block for users to build their own LLMs from A to Z. It is Kubernetes-native from day one.

The project is mostly driven by a few people with great enthusiasm for AI, working in their spare time. If you're one of them, please join us.

🚀 All kinds of contributions are welcome! Please follow the Contributing guide.

🪂 How to run

Prerequisites

  • A Kubernetes cluster (kind is enough for local testing)
  • kubectl
  • Helm 3
  • git

Install

git clone https://github.com/InftyAI/Llmaz.git   # fetch the source, which includes the Helm chart
kind create cluster   # ignore this if you already have a Kubernetes cluster
helm install llmaz Llmaz/deploy/llmaz --create-namespace --namespace llmaz   # install the chart into the llmaz namespace
kubectl port-forward svc/llmaz 7860:7860 --namespace llmaz   # forward the WebUI port to localhost
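
If the WebUI doesn't come up, a quick sanity check (assuming the chart installs everything into the llmaz namespace, as in the helm command above) is to confirm the pods and the service used by the port-forward are ready:

kubectl get pods --namespace llmaz   # all pods should reach Running/Ready
kubectl get svc --namespace llmaz    # look for the llmaz service exposing port 7860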

Visit http://localhost:7860 to open the WebUI.

[Screenshot: llmaz WebUI]
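
To tear down a local test setup created with the steps above (names assumed from those commands: the llmaz Helm release in the llmaz namespace and the default kind cluster):

helm uninstall llmaz --namespace llmaz   # remove the chart and its resources
kind delete cluster                      # delete the local kind cluster, if you created one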

✨ Features

  • Foundational model management
  • Prompt management
  • Supervised Fine-Tuning
  • Model Serving
  • Self-Instruct
  • Dataset management
  • RLHF
  • Evaluation
  • AI applications (RAG, chatbots, etc.)
  • ...

Still under heavy development. Let's see what happens!

👏 Contributors

Thanks to all these contributors. You're the heroes.