Chroma is the open-source embedding database. Chroma makes it easy to build LLM apps by making knowledge, facts, and skills pluggable for LLMs.
Language Support
- Python: pip install chromadb
- JavaScript/TypeScript: npm install --save chromadb
For example, the "Chat your data"
use case:
- Add documents to your database. You can pass in your own embeddings, embedding function, or let Chroma embed them for you.
- Query relevant documents with natural language.
- Compose documents into the context window of an LLM like GPT-3 for additional summarization or analysis (see the sketch after this list).
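Putting those three steps together, a minimal sketch might look like the following (the collection name, documents, and prompt template are illustrative, and Chroma's default embedding function is assumed):

```python
import chromadb

client = chromadb.Client()
collection = client.get_or_create_collection("my-notes")

# 1. Add documents; here Chroma embeds them with its default embedding function.
collection.add(
    documents=["Chroma stores embeddings, documents, and metadata.",
               "Collections can be queried with natural language."],
    ids=["note-1", "note-2"],
)

# 2. Query relevant documents with natural language.
question = "How do I search my notes?"
results = collection.query(query_texts=[question], n_results=2)

# 3. Compose the retrieved documents into the context window of an LLM.
context = "\n".join(results["documents"][0])
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# ...send `prompt` to the LLM of your choice for summarization or analysis.
```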
- Simple: Fully-typed, fully-tested, fully-documented == happiness
- Integrations: 🦜️🔗 LangChain (Python and JS), 🦙 gpt-index/LlamaIndex, and more soon
- Dev, Test, Prod: the same API that runs in your Python notebook scales to your cluster (see the sketch after this list)
- Feature-rich: Queries, filtering, density estimation and more
- Free: Apache 2.0 Licensed
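As a sketch of the Dev, Test, Prod point: the collection API stays the same whichever client you construct. The PersistentClient and HttpClient constructors below are from chromadb 0.4+; adjust to your version:

```python
import chromadb

# Same collection API, different backends (constructor names assume chromadb 0.4+).
client = chromadb.Client()                                    # in-memory, for notebooks
# client = chromadb.PersistentClient(path="./chroma-data")    # persisted to local disk
# client = chromadb.HttpClient(host="localhost", port=8000)   # a Chroma server / cluster

collection = client.get_or_create_collection("all-my-documents")
```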
pip install chromadb
import chromadb

# In-memory client for quick prototyping; persistence is available too.
client = chromadb.Client()

# Create a collection (get_collection, get_or_create_collection, delete_collection also exist).
collection = client.create_collection("all-my-documents")

# Add records with precomputed embeddings, metadata to filter on, and unique ids.
collection.add(
    embeddings=[[1.5, 2.9, 3.4], [9.8, 2.3, 2.9]],
    metadatas=[{"source": "notion"}, {"source": "google-docs"}],
    ids=["n/102", "gd/972"],
)

# Query the 2 nearest records; query_embeddings is a list of query vectors.
results = collection.query(
    query_embeddings=[[1.5, 2.9, 3.4]],
    n_results=2,
)
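You can also let Chroma embed documents for you and filter queries on metadata. A small sketch with illustrative documents, using a fresh collection so the default embedding function sets the dimensionality:

```python
import chromadb

client = chromadb.Client()

# A collection whose embeddings come from Chroma's default embedding function.
docs = client.create_collection("my-docs")
docs.add(
    documents=["Meeting notes from the product launch", "Quarterly planning document"],
    metadatas=[{"source": "notion"}, {"source": "google-docs"}],
    ids=["n/103", "gd/973"],
)

results = docs.query(
    query_texts=["launch meeting"],
    n_results=1,
    where={"source": "notion"},  # only consider records whose metadata matches
)
```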
Chroma is a rapidly developing project. We welcome PR contributors and ideas for how to improve the project.
- Join the conversation on Discord
- Review the roadmap and contribute your ideas
- Grab an issue and open a PR
What are embeddings?
- Read the guide from OpenAI
- Literal: Embedding something turns it from image/text/audio into a list of numbers. 🖼️ or 📄 => [1.2, 2.1, ....]. This process makes documents "understandable" to a machine learning model.
- By analogy: An embedding represents the essence of a document. This enables documents and queries with the same essence to be "near" each other and therefore easy to find.
- Technical: An embedding is the latent-space position of a document at a layer of a deep neural network. For models trained specifically to embed data, this is the last layer.
- A small example: If you search your photos for "famous bridge in San Francisco", then by embedding the query and comparing it to the embeddings of your photos and their metadata, the search should return photos of the Golden Gate Bridge (see the toy sketch below).
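To make that photo example concrete, here is a toy nearest-neighbor search over made-up embeddings using cosine similarity; in a real system the vectors would come from an embedding model rather than being hand-written:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

# Made-up embeddings: each photo (and the query) becomes a list of numbers.
photos = {
    "golden_gate.jpg":  [0.9, 0.1, 0.8],
    "cat_on_sofa.jpg":  [0.1, 0.9, 0.2],
    "beach_sunset.jpg": [0.4, 0.3, 0.1],
}
query = [0.85, 0.15, 0.75]  # "famous bridge in San Francisco", embedded

# The photo whose embedding is nearest the query is the best match.
best = max(photos, key=lambda name: cosine_similarity(query, photos[name]))
print(best)  # golden_gate.jpg
```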