bosun-ai/swiftide

LanceDB support


While Qdrant has local support and provides a Docker image, LanceDB, another Rust vector database, runs serverless and in-process, similar to SQLite.
Qdrant does not yet have GPU support for building the index, while LanceDB does:
https://lancedb.github.io/lancedb/ann_indexes/#creating-an-ivf_pq-index
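For reference, creating such an index from Rust looks roughly like this. This is only a sketch based on the lancedb crate's builder-style API; the exact builder and variant names (e.g. `IvfPqIndexBuilder`, `Index::IvfPq`) should be checked against docs.rs, the table/column names are made up, and the GPU-accelerated build is documented for the Python bindings:

```rust
// Sketch only: connect to an in-process LanceDB database (just a directory on
// disk, much like SQLite) and build an IVF_PQ ANN index on a vector column.
// Builder/variant names are taken from the lancedb docs and may need adjusting.
use lancedb::index::vector::IvfPqIndexBuilder;
use lancedb::index::Index;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    // No server process: the "database" is a local path.
    let db = lancedb::connect("data/swiftide-lancedb").execute().await?;

    // Assumes a table named "code_chunks" with a "vector" column already exists.
    let table = db.open_table("code_chunks").execute().await?;

    // Build the IVF_PQ index described in the linked docs.
    table
        .create_index(&["vector"], Index::IvfPq(IvfPqIndexBuilder::default()))
        .execute()
        .await?;

    Ok(())
}
```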

Query-wise, LanceDB appears to be as fast as or faster than other vector databases, and it has lower memory requirements thanks to its serverless, in-process design.
https://blog.lancedb.com/benchmarking-lancedb-92b01032874a/
https://github.com/prrao87/lancedb-study

Together, these properties could potentially speed up RAG pipelines compared to using Qdrant:
https://blog.lancedb.com/accelerating-deep-learning-workflows-with-lance/

Rust API pointers:
https://docs.rs/lancedb/latest/lancedb/
https://lancedb.github.io/lancedb/reranking/
https://towardsdatascience.com/scale-up-your-rag-a-rust-powered-indexing-pipeline-with-lancedb-and-candle-cc681c6162e8
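To get a feel for the crate, a nearest-neighbour query might look roughly like the sketch below. It assumes `query()`, `nearest_to()`, `limit()` and `execute()` behave as documented on docs.rs and that `execute()` returns a stream of Arrow record batches; none of this has been verified here:

```rust
// Sketch of an ANN query via the lancedb crate; method names follow the
// docs.rs pointer above but have not been verified.
use futures::TryStreamExt;

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let db = lancedb::connect("data/swiftide-lancedb").execute().await?;
    let table = db.open_table("code_chunks").execute().await?;

    // Hypothetical query embedding; in practice it comes from the same model
    // that produced the stored vectors.
    let query_vector = vec![0.1_f32; 384];

    let results: Vec<arrow_array::RecordBatch> = table
        .query()
        .nearest_to(query_vector)? // vector search against the "vector" column
        .limit(10)
        .execute()
        .await?
        .try_collect()
        .await?;

    println!("retrieved {} result batch(es)", results.len());
    Ok(())
}
```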

Absolutely! I was actually looking into setting it up yesterday.

Embedding-wise, LanceDB's built-in embedding functions only support OpenAI, while Qdrant appears to support everything:
https://lancedb.github.io/lancedb/embeddings/embedding_functions/
https://qdrant.tech/documentation/embeddings/

I wonder whether Ollama could provide embeddings to LanceDB through an OpenAI-compatible API?
https://ollama.com/blog/embedding-models
ollama/ollama#2416
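For what it's worth, Ollama's native embeddings endpoint (from the blog post above) is easy to call directly, and whether the OpenAI-compatible route also serves embeddings is exactly what ollama#2416 tracks. A rough sketch using reqwest, with the model name just being whatever embedding model is pulled locally:

```rust
// Sketch: fetch an embedding from a locally running Ollama instance via its
// native /api/embeddings endpoint (shown in the Ollama blog post above).
use serde::Deserialize;

#[derive(Deserialize)]
struct EmbeddingResponse {
    embedding: Vec<f32>,
}

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    let client = reqwest::Client::new();
    let resp: EmbeddingResponse = client
        .post("http://localhost:11434/api/embeddings")
        .json(&serde_json::json!({
            "model": "nomic-embed-text", // any locally pulled embedding model
            "prompt": "fn main() { println!(\"hello\"); }",
        }))
        .send()
        .await?
        .json()
        .await?;

    // The resulting vector can be written into LanceDB directly, sidestepping
    // LanceDB's built-in (OpenAI-only) embedding functions.
    println!("embedding dimension: {}", resp.embedding.len());
    Ok(())
}
```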

It looks like you can define a schema and store the vectors directly. I think we can outperform LanceDB's embedding speed if we do the embedding ourselves 👯
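Something like the sketch below would be the "do it ourselves" route: define an Arrow schema with a fixed-size vector column and write pre-computed embeddings straight into a table. The table/column names and dimension are made up, and the `create_table` signature follows the lancedb docs but hasn't been verified here:

```rust
// Sketch: store externally computed embeddings in LanceDB using an explicit
// Arrow schema. The lancedb builder signatures approximate the documented API.
use std::sync::Arc;

use arrow_array::{
    types::Float32Type, FixedSizeListArray, RecordBatch, RecordBatchIterator, StringArray,
};
use arrow_schema::{DataType, Field, Schema};

#[tokio::main]
async fn main() -> anyhow::Result<()> {
    const DIM: i32 = 384; // must match whichever embedding model we run ourselves

    let schema = Arc::new(Schema::new(vec![
        Field::new("chunk", DataType::Utf8, false),
        Field::new(
            "vector",
            DataType::FixedSizeList(Arc::new(Field::new("item", DataType::Float32, true)), DIM),
            true,
        ),
    ]));

    // Two toy rows; in swiftide these would come out of the indexing pipeline.
    let chunks = StringArray::from(vec!["fn foo() {}", "fn bar() {}"]);
    let vectors = FixedSizeListArray::from_iter_primitive::<Float32Type, _, _>(
        vec![
            Some(vec![Some(0.1_f32); DIM as usize]),
            Some(vec![Some(0.2_f32); DIM as usize]),
        ],
        DIM,
    );

    let batch = RecordBatch::try_new(schema.clone(), vec![Arc::new(chunks), Arc::new(vectors)])?;
    let reader = RecordBatchIterator::new(vec![Ok(batch)], schema.clone());

    let db = lancedb::connect("data/swiftide-lancedb").execute().await?;
    db.create_table("code_chunks", Box::new(reader))
        .execute()
        .await?;

    Ok(())
}
```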