aurelio-labs/semantic-router

supported embedding models


fastembed only supports a very limited set of models. Will you change the implementation to support all Hugging Face embedding models?

I've hardcoded huggingface.py to swap in a different embedding model. It's not elegant though; it would be better to expose the model name as a parameter.

@Jeru2023 we recently added a HuggingFaceEncoder in PR #90 to enable support for embedding models hosted on Hugging Face. You can use it as follows:

from semantic_router.encoders import HuggingFaceEncoder

encoder = HuggingFaceEncoder(name="sentence-transformers/all-MiniLM-L6-v2")
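
For context, here is a minimal usage sketch (not from the original thread). It assumes the encoder follows the library's callable encoder interface, i.e. it takes a list of texts and returns a list of embedding vectors; the example documents are made up:

# Embed a couple of hypothetical documents with the encoder defined above.
docs = ["What is semantic routing?", "How do I pick an embedding model?"]
embeddings = encoder(docs)

# One vector per document; all-MiniLM-L6-v2 produces 384-dimensional embeddings.
print(len(embeddings), len(embeddings[0]))

The same encoder instance can then be passed to a RouteLayer via its encoder argument in place of the default remote encoders.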