llm-ops
There are 12 repositories under the llm-ops topic.
bentoml/OpenLLM
Run any open-source LLM, such as Llama 2 or Mistral, as an OpenAI-compatible API endpoint in the cloud.
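Since the description above promises OpenAI compatibility, a client only needs to build the standard chat-completions request shape. A minimal sketch of that payload, assuming the stock `/v1/chat/completions` schema; the model name here is a placeholder, not a value taken from OpenLLM:

```python
import json

def chat_completion_request(model: str, prompt: str) -> dict:
    """Build a standard OpenAI-style chat-completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Placeholder model name for illustration only.
payload = chat_completion_request("llama-2-7b", "Hello!")
print(json.dumps(payload))
```

Any OpenAI-compatible server should accept this body POSTed to its `/v1/chat/completions` route.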
truefoundry/cognita
RAG (Retrieval-Augmented Generation) framework by TrueFoundry for building modular, open-source applications for production.
lastmile-ai/aiconfig
AIConfig is a config-based framework to build generative AI applications.
julep-ai/julep
Open-source alternative to the Assistants API, with a managed backend for memory, RAG, tools, and tasks. Think of it as a Supabase for building AI agents.
astronomer/ask-astro
An end-to-end LLM reference implementation providing a Q&A interface for Airflow and Astronomer
athina-ai/athina-evals
Python SDK for running evaluations on LLM-generated responses
EmbeddedLLM/JamAIBase
Firebase for AI Agents: Open-source backend platform that puts powerful generative models at the core of your database. With managed memory and RAG capabilities, developers can easily build AI agents, enhance their apps with generative tables, and create magical UI experiences.
friendliai/friendli-client
Friendli: the fastest serving engine for generative AI
YeonwooSung/MLOps
Miscellaneous code and writing for MLOps
AndrMoura/streamlit-chatbot-analytics
Streamlit-based chatbot leveraging Ollama via LangChain and PostHog-LLM for advanced logging and monitoring
langgenius/dify-conversation
[🚧 WIP 🚧] Rework of webapp-conversation
prompt-foundry/typescript-sdk
The prompt engineering, prompt management, and prompt evaluation tool for TypeScript, JavaScript, and Node.js.