
๐Ÿฝ๏ธ KitchenAI


Simplify AI Development with KitchenAI: Your AI Backend and LLMOps Toolkit



Documentation | KitchenAI Cloud

🚀 What is KitchenAI?

KitchenAI is an open-source toolkit that simplifies AI complexity by acting as your AI backend and LLMOps solution, from experimentation to production.

It empowers developers to focus on delivering results without getting stuck in the weeds of AI infrastructure, observability, or deployment.

Key Goals:

  1. Simplify AI Integration: Easily turn AI experiments into production-ready APIs.
  2. Provide an AI Backend: Handle the entire AI lifecycle, covering experimentation, observability, and scaling.
  3. Empower Developers: Focus on application building, not infrastructure.

[Image: kitchenai-dev]


🛠️ Who is KitchenAI For?

  • Application Developers:

    • Seamlessly integrate AI into your apps using APIs.
    • Experiment and test AI techniques without reinventing the wheel.
  • AI Developers & Data Scientists:

    • Move quickly from Jupyter notebooks to production-ready services.
    • Deploy custom AI techniques with ease (e.g., RAG, embeddings).
  • Platform & Infra Engineers:

    • Customize your AI stack, integrate tools like Sentry, OpenTelemetry, and more.
    • Scale and optimize AI services with a modular, extensible framework.

Say goodbye to boilerplate!

🚀 Go from notebook to app integration in minutes.

Example notebook: kitchenai-community/llama_index_starter

By adding KitchenAI annotations to your notebook, you can go from this:

[Image: kitchenai-dev]

to interacting with the API using the built-in client:

[Image: kitchenai-dev]
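
For a sense of what those annotations look like, here is a minimal, hypothetical sketch. The import path and the @kitchen.query decorator name are assumptions for illustration rather than the exact KitchenAI API; refer to the example notebook above for the real annotations.

    # Hypothetical sketch: import path and decorator name are illustrative, not the exact KitchenAI API.
    from llama_index.core import SimpleDirectoryReader, VectorStoreIndex
    from kitchenai_sdk import KitchenAIApp  # assumed import path

    kitchen = KitchenAIApp()

    @kitchen.query("simple-query")  # assumed annotation that registers this function as a query handler
    def simple_query(query: str) -> str:
        """Answer a question over local documents with a small LlamaIndex index."""
        documents = SimpleDirectoryReader("data").load_data()
        index = VectorStoreIndex.from_documents(documents)
        return str(index.as_query_engine().query(query))

With the handler registered this way, the server in the quickstart below exposes it over HTTP without any hand-written API code.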


💡 Why KitchenAI?

Integrating and scaling AI is too complex today. KitchenAI solves this:

  1. AI Backend Ready to Go:

    • Stop building APIs and infra from scratch. Deploy AI code as production-ready APIs in minutes.
  2. Built-In LLMOps Features:

    • Observability, tracing, and evaluation tools are pre-configured.
  3. Framework-Agnostic & Extensible:

    • Vendor-neutral, open-source, and easy to customize with plugins.
  4. Faster Time-to-Production:

    • Go from experimentation to live deployments seamlessly.

⚡ Quickstart

  1. Set Up Environment

    export OPENAI_API_KEY=<your key>
    export KITCHENAI_DEBUG=True
    python -m venv venv && source venv/bin/activate && pip install kitchenai
  2. Start a Project

    kitchenai cook list && kitchenai cook select llama-index-chat && pip install -r requirements.txt

    [Image: kitchenai-list]

  3. Run the Server

    kitchenai init && kitchenai dev --module app:kitchen

    Alternatively, you can run the server with a Jupyter notebook:

    kitchenai dev --module app:kitchen --jupyter
  4. Test the API (a plain-HTTP Python alternative is sketched after this quickstart)

    kitchenai client health
    kitchenai client labels

    [Image: kitchenai-client]

  5. Build Docker Container

    kitchenai build . app:kitchenai

📖 Full quickstart guide at docs.kitchenai.dev.
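
As a plain-HTTP alternative to the CLI client in step 4, a minimal Python sketch might look like the following. The base URL and the /health path are assumptions; check the dev server output or the docs for the actual host, port, and routes.

    # Minimal sketch; the base URL and route are assumptions, not documented KitchenAI endpoints.
    import requests

    BASE_URL = "http://localhost:8001"  # assumed dev-server address; adjust to your setup

    resp = requests.get(f"{BASE_URL}/health", timeout=5)  # assumed health route
    resp.raise_for_status()
    print(resp.json())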


✨ Features

  • 🚀 Production-Ready Backend: Go from idea to production in minutes.
  • 🛠️ Built-In LLMOps: Observability, tracing, and evaluation out-of-the-box.
  • 🔌 Extensible Framework: Easily add custom plugins and AI techniques.
  • 📦 Modular AI Modules: Deploy and test AI components with ease.
  • 🐳 Docker-First Deployment: Build and scale with confidence.

📊 AI Lifecycle with KitchenAI

  1. Experiment:

    • Start in Jupyter notebooks or existing AI tools.
    • Annotate your notebook to turn it into a deployable AI module.
  2. Build:

    • Use KitchenAI to generate production-ready APIs automatically.
  3. Deploy:

    • Run the module locally or in production with built-in observability and scaling.
  4. Monitor & Improve:

    • Use KitchenAI's observability tools to evaluate performance, trace issues, and iterate.
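
For the Monitor & Improve step, you can also emit your own spans alongside whatever KitchenAI traces for you. The sketch below is plain OpenTelemetry (using the opentelemetry-sdk package) and is not KitchenAI-specific; the tracer, span, and attribute names are made up for illustration.

    # Plain OpenTelemetry sketch (not KitchenAI-specific): wrap an AI call in a custom span.
    from opentelemetry import trace
    from opentelemetry.sdk.trace import TracerProvider
    from opentelemetry.sdk.trace.export import BatchSpanProcessor, ConsoleSpanExporter

    # Print spans to the console for local experiments; swap the exporter for your tracing backend.
    provider = TracerProvider()
    provider.add_span_processor(BatchSpanProcessor(ConsoleSpanExporter()))
    trace.set_tracer_provider(provider)

    tracer = trace.get_tracer("my-ai-module")  # illustrative tracer name

    with tracer.start_as_current_span("rag-query") as span:
        span.set_attribute("query.num_chars", 42)  # record whatever helps later evaluation
        # ... call your retrieval / LLM code here ...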

Developer Experience

[Diagram: Developer Flow]


🔧 Under the Hood

  • Django Ninja: High-performance async APIs.
  • LLMOps Stack: Built-in tracing, observability, and evaluations.
  • Plugin System: Add advanced custom functionality.
  • Docker-Optimized: Seamless deployment with S6 overlays.
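
To give a flavor of the Django Ninja point above, this is the style of async endpoint it supports. The route below is purely illustrative and not one of KitchenAI's actual endpoints; in a real Django project the api object is mounted in urls.py via path("api/", api.urls).

    # Illustrative Django Ninja usage (not a KitchenAI route): a small async endpoint.
    from ninja import NinjaAPI

    api = NinjaAPI()

    @api.get("/ping")
    async def ping(request):
        # Async handlers keep I/O-bound LLM calls from blocking the worker.
        return {"status": "ok"}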

🚀 KitchenAI Cloud

Coming soon: KitchenAI Cloud will offer a fully managed AI backend experience.

Key Benefits:

  • Serverless deployment for AI modules.
  • Fully managed observability, tracing, and scaling.
  • Team collaboration tools for faster iteration.

🔗 Sign Up for Early Access: Register Here


🛠️ Roadmap

  • Expanded SDKs (Python, Go, JS).
  • Enhanced plugin system.
  • Enterprise-grade observability features.
  • KitchenAI Cloud Beta.

🤝 Contribute

KitchenAI is in alpha.

We're building KitchenAI in the open, and we'd love your contributions:

  • ⭐ Star the repo on GitHub!
  • 🛠️ Submit PRs, ideas, or feedback.
  • 🧑‍🍳 Build plugins and AI modules for the community.

🙏 Acknowledgements

KitchenAI is inspired by the open-source community and modern AI development challenges. Let's simplify AI, together.

Notable project: Falco Project. Thanks to the Python community for best practices and tools!


📊 Telemetry

KitchenAI collects anonymous usage data to improve the framework; no PII or sensitive data is collected.

Your feedback and support shape KitchenAI. Let's build the future of AI development together!