danny-avila/rag_api

Missing Dependency `langchain_ollama` in rag-api-dev-lite

Closed this issue · 4 comments

Running the latest rag_api Docker image:

FROM ghcr.io/danny-avila/librechat-rag-api-dev-lite:latest

With the following .env file:

#==================================================#
#                        RAG                       #
#==================================================#
# More info: https://www.librechat.ai/docs/configuration/rag_api

RAG_API_URL=http://127.0.0.1:8000
RAG_PORT=8000 
EMBEDDINGS_PROVIDER=ollama
OLLAMA_BASE_URL=http://xyz:11434
EMBEDDINGS_MODEL=nomic-embed-text

I get the following error:

Traceback (most recent call last):
  File "/app/main.py", line 46, in <module>
    from psql import PSQLDatabase, ensure_custom_id_index_on_embedding, pg_health_check
  File "/app/psql.py", line 3, in <module>
    from config import DSN, logger
  File "/app/config.py", line 252, in <module>
    embeddings = init_embeddings(EMBEDDINGS_PROVIDER, EMBEDDINGS_MODEL)
  File "/app/config.py", line 206, in init_embeddings
    from langchain_ollama import OllamaEmbeddings
ModuleNotFoundError: No module named 'langchain_ollama'
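
For context, here is a minimal sketch of the code path that fails, reconstructed from the traceback above. The exact implementation in /app/config.py, and the default base URL used below, are assumptions.

import os

# Sketch of init_embeddings in /app/config.py (inferred from the traceback;
# the real code may differ).
def init_embeddings(provider: str, model: str):
    if provider == "ollama":
        # This import is what raises ModuleNotFoundError in the lite image,
        # because langchain_ollama is not installed there.
        from langchain_ollama import OllamaEmbeddings

        return OllamaEmbeddings(
            model=model,
            base_url=os.getenv("OLLAMA_BASE_URL", "http://localhost:11434"),
        )
    raise ValueError(f"Unsupported embeddings provider: {provider}")

# With EMBEDDINGS_PROVIDER=ollama and EMBEDDINGS_MODEL=nomic-embed-text,
# config.py effectively runs this at import time, which is why the container
# crashes on startup:
embeddings = init_embeddings("ollama", "nomic-embed-text")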

Possibly because of a missing dependency: langchain-ollama==0.2.0 is not included in the image's requirements, which currently list:

langchain==0.3
langchain_community==0.3
langchain_openai==0.2.0
langchain_core==0.3.5
sqlalchemy==2.0.28
python-dotenv==1.0.1
fastapi==0.110.0
psycopg2-binary==2.9.9
pgvector==0.2.5
uvicorn==0.28.0
pypdf==4.1.0
unstructured==0.15.13
markdown==3.6
networkx==3.2.1
pandas==2.2.1
openpyxl==3.1.2
docx2txt==0.8
pypandoc==1.13
PyJWT==2.8.0
asyncpg==0.29.0
python-multipart==0.0.9
aiofiles==23.2.1
rapidocr-onnxruntime==1.3.24
opencv-python-headless==4.9.0.80
pymongo==4.6.3
langchain-mongodb==0.2.0
cryptography==42.0.7
python-magic==0.4.27
python-pptx==0.6.23
xlrd==2.0.1
langchain-aws==0.2.1
boto3==1.34.144

@ScarFX

The missing dependency langchain-ollama==0.2.0 seems to be the reason. Thanks @sreevatsank1999 for pointing this out.

I added it to requirements.lite.txt #85

Same issue is happening here. I think the dependency got removed from requirements.lite.txt in a commit last week: e5e9dfa

The lite image is not meant to include the ollama dependency; it was mistakenly added in an attempt to resolve this issue.

Adding ollama makes the image no longer "lite", jumping it up to 7 GB in size. To use ollama, you need to change your container configuration to use the base image, which has ollama included:

https://github.com/danny-avila/rag_api/pkgs/container/librechat-rag-api-dev
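
For example, in a LibreChat-style Docker Compose setup you can point the RAG service at the base image via docker-compose.override.yml. This is only a sketch; the service name rag_api is an assumption, so match it to your own compose file.

services:
  rag_api:
    # Base image with the ollama (langchain_ollama) dependency included
    image: ghcr.io/danny-avila/librechat-rag-api-dev:latest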

More info: https://www.librechat.ai/docs/configuration/rag_api#custom-configuration---ollama