Pinned Repositories
ai-chatbot
Reference integration of the Langfuse TypeScript SDK in a chat application
langfuse
🪢 Open source LLM engineering platform: LLM Observability, metrics, evals, prompt management, playground, datasets. Integrates with OpenTelemetry, Langchain, OpenAI SDK, LiteLLM, and more. 🍊YC W23
langfuse-docs
🪢 Langfuse documentation -- Langfuse is the open source LLM Engineering Platform. Observability, evals, prompt management, playground, and metrics to debug and improve LLM apps
langfuse-java
🪢 Auto-generated Java client for the Langfuse API
langfuse-js
🪢 Langfuse JS/TS SDKs - Instrument your LLM app and get detailed tracing/observability. Works with any LLM or framework
langfuse-k8s
Community-maintained Kubernetes config and Helm chart for Langfuse
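A minimal sketch of what a deployment via this chart might look like. The key names below are assumptions for illustration, not the chart's verified schema; consult the chart's own values.yaml before use:

```yaml
# Hypothetical values fragment for the langfuse-k8s Helm chart.
# All key names are assumptions -- check the chart's values.yaml
# for the actual configuration schema.
langfuse:
  nextauth:
    url: https://langfuse.example.com   # public URL of the deployment
    secret: change-me                   # session-signing secret
  salt: change-me                       # salt used when hashing API keys
postgresql:
  deploy: true                          # let the chart run its own Postgres
  auth:
    password: change-me
```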
langfuse-python
🪢 Langfuse Python SDK - Instrument your LLM app with decorators or the low-level SDK and get detailed tracing/observability. Works with any LLM or framework
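To illustrate the decorator style of instrumentation this SDK offers, here is a stdlib-only sketch of the general pattern: wrap a function so each call is recorded as a span. This is not the Langfuse API itself (the real SDK exports its own decorator and ships spans to a backend rather than a local list); names like `observe` and `TRACE` are hypothetical here:

```python
import functools
import time
import uuid

# Hypothetical in-memory sink; a real SDK would send spans to a backend.
TRACE: list[dict] = []

def observe(fn):
    """Record a span (name, timing, status) for every call to fn.

    Stdlib-only sketch of decorator-based instrumentation -- not the
    actual Langfuse SDK implementation.
    """
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        span = {"id": str(uuid.uuid4()), "name": fn.__name__,
                "start": time.time()}
        try:
            result = fn(*args, **kwargs)
            span["status"] = "ok"
            return result
        except Exception:
            span["status"] = "error"
            raise
        finally:
            span["end"] = time.time()
            TRACE.append(span)
    return wrapper

@observe
def answer(question: str) -> str:
    # Stand-in for an LLM call.
    return f"echo: {question}"

print(answer("hi"))  # prints "echo: hi" and records one span in TRACE
```

The appeal of the decorator pattern is that instrumentation stays out of the function body: the traced code is unchanged, and nesting decorated calls naturally yields a call tree.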
langfuse-vercel-ai-nextjs-example
mcp-server-langfuse
Model Context Protocol (MCP) server for Langfuse Prompt Management. It lets you access and manage your Langfuse prompts through the Model Context Protocol.
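A sketch of how such a server might be wired into an MCP client's configuration. The command, build path, and server key below are assumptions for illustration; the env var names follow Langfuse's usual LANGFUSE_* convention, and the truncated key values are placeholders:

```json
{
  "mcpServers": {
    "langfuse-prompts": {
      "command": "node",
      "args": ["build/index.js"],
      "env": {
        "LANGFUSE_PUBLIC_KEY": "pk-lf-...",
        "LANGFUSE_SECRET_KEY": "sk-lf-...",
        "LANGFUSE_BASEURL": "https://cloud.langfuse.com"
      }
    }
  }
}
```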
oss-llmops-stack
Modular, open source LLMOps stack that separates concerns: LiteLLM unifies LLM APIs, manages routing and cost controls, and ensures high availability, while Langfuse focuses on detailed observability, prompt versioning, and performance evaluations.
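A hedged sketch of how the two layers could connect in such a stack: a LiteLLM proxy config that routes model aliases to an upstream provider and forwards call logs to Langfuse via callbacks. Key names follow LiteLLM's config conventions but should be verified against its documentation:

```yaml
# Hypothetical LiteLLM proxy config -- verify key names against
# LiteLLM's documentation before use.
model_list:
  - model_name: gpt-4o            # alias the proxy exposes to clients
    litellm_params:
      model: openai/gpt-4o        # upstream provider/model it routes to
litellm_settings:
  success_callback: ["langfuse"]  # log successful calls to Langfuse
  failure_callback: ["langfuse"]  # log failures as well
# Langfuse credentials are typically read from the environment:
# LANGFUSE_PUBLIC_KEY, LANGFUSE_SECRET_KEY, LANGFUSE_HOST
```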
Langfuse's Repositories
langfuse/langfuse
🪢 Open source LLM engineering platform: LLM Observability, metrics, evals, prompt management, playground, datasets. Integrates with OpenTelemetry, Langchain, OpenAI SDK, LiteLLM, and more. 🍊YC W23
langfuse/langfuse-k8s
Community-maintained Kubernetes config and Helm chart for Langfuse
langfuse/langfuse-python
🪢 Langfuse Python SDK - Instrument your LLM app with decorators or the low-level SDK and get detailed tracing/observability. Works with any LLM or framework
langfuse/mcp-server-langfuse
Model Context Protocol (MCP) server for Langfuse Prompt Management. It lets you access and manage your Langfuse prompts through the Model Context Protocol.
langfuse/oss-llmops-stack
Modular, open source LLMOps stack that separates concerns: LiteLLM unifies LLM APIs, manages routing and cost controls, and ensures high availability, while Langfuse focuses on detailed observability, prompt versioning, and performance evaluations.
langfuse/langfuse-docs
🪢 Langfuse documentation -- Langfuse is the open source LLM Engineering Platform. Observability, evals, prompt management, playground, and metrics to debug and improve LLM apps
langfuse/langfuse-js
🪢 Langfuse JS/TS SDKs - Instrument your LLM app and get detailed tracing/observability. Works with any LLM or framework
langfuse/langfuse-java
🪢 Auto-generated Java client for the Langfuse API
langfuse/ai-chatbot
Reference integration of the Langfuse TypeScript SDK in a chat application
langfuse/langfuse-terraform-aws
🪢 Terraform module to deploy Langfuse on AWS
langfuse/langfuse-vercel-ai-nextjs-example
langfuse/langfuse-api-reference
langfuse/langfuse-examples
Examples on how to deploy and use Langfuse
langfuse/.github