OpenInference is a set of conventions and plugins that is complementary to OpenTelemetry, enabling tracing of AI applications. OpenInference is natively supported by arize-phoenix and can be used with any OpenTelemetry-compatible backend.
The OpenInference specification is edited in markdown files found in the spec directory. It is designed to provide insight into the invocation of LLMs and the surrounding application context, such as retrieval from vector stores and the usage of external tools like search engines or APIs. The specification is transport- and file-format agnostic and is intended to be used in conjunction with serialization formats such as JSON and ProtoBuf as well as tabular representations such as DataFrames.
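To make the conventions concrete, here is a minimal sketch of the flat key-value attributes an OpenInference LLM span might carry. The attribute names below are examples drawn from the semantic conventions; the values are purely illustrative, not output from any real application:

```python
import json

# A sketch of the flat key-value attributes an OpenInference LLM span
# might carry. Attribute names follow the semantic conventions; the
# values here are illustrative only.
llm_span_attributes = {
    "openinference.span.kind": "LLM",          # span kind, e.g. LLM, RETRIEVER, TOOL
    "llm.model_name": "gpt-4",                 # the model that was invoked
    "input.value": "What is OpenInference?",   # the input sent to the model
    "output.value": "OpenInference is ...",    # the output the model returned
    "llm.token_count.prompt": 12,              # token usage, when reported
    "llm.token_count.completion": 34,
}

# Because the attributes are flat and transport-agnostic, the same record
# can be serialized as JSON, ProtoBuf, or a row in a DataFrame.
print(json.dumps(llm_span_attributes, indent=2))
```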
OpenInference provides a set of instrumentations for popular machine learning SDKs and frameworks in a variety of languages.
Package | Description | Version |
---|---|---|
openinference-semantic-conventions | Semantic conventions for tracing of LLM apps. | |
openinference-instrumentation-openai | OpenInference instrumentation for the OpenAI SDK. | |
openinference-instrumentation-llama-index | OpenInference instrumentation for LlamaIndex. | |
openinference-instrumentation-dspy | OpenInference instrumentation for DSPy. | |
openinference-instrumentation-bedrock | OpenInference instrumentation for AWS Bedrock. | |
openinference-instrumentation-langchain | OpenInference instrumentation for LangChain. | |
openinference-instrumentation-mistralai | OpenInference instrumentation for MistralAI. | |
Name | Description | Complexity Level |
---|---|---|
OpenAI SDK | OpenAI Python SDK, including chat completions and embeddings | Beginner |
MistralAI SDK | MistralAI Python SDK | Beginner |
LlamaIndex | LlamaIndex query engines | Beginner |
DSPy | DSPy primitives and custom RAG modules | Beginner |
Boto3 Bedrock Client | Boto3 Bedrock client | Beginner |
LangChain | LangChain primitives and simple chains | Beginner |
LlamaIndex + Next.js Chatbot | A fully functional chatbot using Next.js and a LlamaIndex FastAPI backend | Intermediate |
LangServe | A LangChain application deployed with LangServe using custom metadata on a per-request basis | Intermediate |
Package | Description | Version |
---|---|---|
@arizeai/openinference-semantic-conventions | Semantic conventions for tracing of LLM apps. | |
@arizeai/openinference-instrumentation-openai | OpenInference instrumentation for the OpenAI SDK. | |
Name | Description | Complexity Level |
---|---|---|
OpenAI SDK | OpenAI Node.js client | Beginner |
LlamaIndex Express App | A fully functional LlamaIndex chatbot with a Next.js frontend and a LlamaIndex Express backend, instrumented using openinference-instrumentation-openai | Intermediate |
OpenInference supports the following destinations as span collectors.
- ✅ Arize-Phoenix
- ✅ Arize
- ✅ Any OTEL-compatible collector
Join our community to connect with thousands of machine learning practitioners and LLM observability enthusiasts!
- 🌍 Join our Slack community.
- 💡 Ask questions and provide feedback in the #phoenix-support channel.
- 🌟 Leave a star on our GitHub.
- 🐞 Report bugs with GitHub Issues.
- 𝕏 Follow us on X.
- 🗺️ Check out our roadmap to see where we're heading next.