Missing Dependency `langchain_ollama` in rag-api-dev-lite
Running the latest rag_api Docker image:

```dockerfile
FROM ghcr.io/danny-avila/librechat-rag-api-dev-lite:latest
```

with the following `.env` file:
```
#==================================================#
#                       RAG                        #
#==================================================#
# More info: https://www.librechat.ai/docs/configuration/rag_api
RAG_API_URL=http://127.0.0.1:8000
RAG_PORT=8000
EMBEDDINGS_PROVIDER=ollama
OLLAMA_BASE_URL=http://xyz:11434
EMBEDDINGS_MODEL=nomic-embed-text
```
I get the following error:
```
Traceback (most recent call last):
  File "/app/main.py", line 46, in <module>
    from psql import PSQLDatabase, ensure_custom_id_index_on_embedding, pg_health_check
  File "/app/psql.py", line 3, in <module>
    from config import DSN, logger
  File "/app/config.py", line 252, in <module>
    embeddings = init_embeddings(EMBEDDINGS_PROVIDER, EMBEDDINGS_MODEL)
  File "/app/config.py", line 206, in init_embeddings
    from langchain_ollama import OllamaEmbeddings
ModuleNotFoundError: No module named 'langchain_ollama'
```
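For context, the traceback shows config.py constructing the Ollama embeddings client when `EMBEDDINGS_PROVIDER=ollama`. A minimal sketch of that failing code path, with the values from the `.env` above filled in (the exact wiring inside `init_embeddings` is an assumption, not the actual source):

```python
# Sketch of the code path that fails in the lite image. The constructor
# arguments mirror the .env values above; the surrounding wiring is assumed.
from langchain_ollama import OllamaEmbeddings  # raises ModuleNotFoundError here

embeddings = OllamaEmbeddings(
    model="nomic-embed-text",      # EMBEDDINGS_MODEL
    base_url="http://xyz:11434",   # OLLAMA_BASE_URL
)
print(embeddings.embed_query("hello"))  # works once langchain-ollama is installed
```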
Possibly because the dependency `langchain-ollama==0.2.0` is missing from `requirements.lite.txt` (lines 1 to 32 at fc7b36c).
@ScarFX The missing `langchain-ollama==0.2.0` dependency seems to be the reason. Thanks @sreevatsank1999 for pointing this out. I added it to requirements.lite.txt in #85.
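For reference, the #85 fix presumably amounts to a single pinned line in requirements.lite.txt (an assumption; see the PR for the actual diff):

```
langchain-ollama==0.2.0
```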
Same issue happening here. I think the dependency got deleted from requirements.lite.txt in a commit last week (e5e9dfa).
The lite image is not meant to include the ollama dependency; it was mistakenly added while trying to resolve this issue. Adding ollama no longer makes the image "lite", jumping it up to 7 GB in size. To use ollama, change your container configuration to use the base image, which has ollama included:
https://github.com/danny-avila/rag_api/pkgs/container/librechat-rag-api-dev
More info: https://www.librechat.ai/docs/configuration/rag_api#custom-configuration---ollama
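If you build on top of the image as in the Dockerfile quoted at the top of this issue, the switch is a one-line change (the `:latest` tag is kept here as an assumption; pin whichever tag you actually use):

```dockerfile
# Base image instead of -lite: larger, but ships the ollama dependency.
FROM ghcr.io/danny-avila/librechat-rag-api-dev:latest
```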