langchain-ai/langchain

chat_ollama ResourceWarning: unclosed socket.socket

Closed this issue · 3 comments

Checked other resources

  • This is a bug, not a usage question.
  • I added a clear and descriptive title that summarizes this issue.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).
  • This is not related to the langchain-community package.
  • I read what a minimal reproducible example is (https://stackoverflow.com/help/minimal-reproducible-example).
  • I posted a self-contained, minimal, reproducible example. A maintainer can copy it and run it AS IS.

Example Code

from langchain_ollama import ChatOllama

llm = ChatOllama(
    model="xxx",
    temperature=0,
    base_url="xxx"
    # other params...
)

messages = [
    (
        "system",
        "You are a helpful assistant that translates English to French. Translate the user sentence.",
    ),
    ("human", "I love programming."),
]
ai_msg = llm.invoke(messages)

print(ai_msg.content)
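One note when running this: ResourceWarning is silenced by Python's default warning filters outside of dev mode, so whether the warning shows up can depend on how the script is launched. A minimal, hedged way to make it show up reliably is to add a warnings filter at the top of the script above (standard library only, nothing specific to langchain-ollama):

import warnings

# ResourceWarning is ignored by the default warning filters;
# show every occurrence while reproducing this issue.
warnings.simplefilter("always", ResourceWarning)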

Error Message and Stack Trace (if applicable)

ResourceWarning: unclosed <socket.socket fd=19, family=2, type=1, proto=6, laddr=('127.0.0.1', 41080), raddr=('127.0.0.1', 11434)>
  _chain_future(future, new_future)
ResourceWarning: Enable tracemalloc to get the object allocation traceback
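Following the hint in the second line, enabling tracemalloc before the client is created makes the warning include the traceback of the allocation that opened the socket, which would show whether it comes from the Ollama client's HTTP connection. A minimal sketch (standard library; place it before the code from the Example Code section):

import tracemalloc

# With tracemalloc tracing, the ResourceWarning is followed by the
# allocation traceback of the unclosed socket instead of the
# "Enable tracemalloc ..." hint.
tracemalloc.start()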

Description

Is the underlying HTTP session not being closed?
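A small check that speaks to this question: if the socket is only being reclaimed by the garbage collector rather than closed explicitly, forcing a collection after dropping the client should reproduce the warning on demand. This is only a diagnostic sketch under that assumption, not a fix:

import gc
import warnings

from langchain_ollama import ChatOllama

warnings.simplefilter("always", ResourceWarning)

llm = ChatOllama(model="xxx", temperature=0, base_url="xxx")
llm.invoke([("human", "I love programming.")])

# Drop the last reference and force a collection; if the underlying
# HTTP connection was never closed, the unclosed-socket ResourceWarning
# fires here instead of at interpreter shutdown.
del llm
gc.collect()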

System Info

System Information

OS: Linux
OS Version: #115~20.04.1-Ubuntu SMP Mon Apr 15 17:33:04 UTC 2024
Python Version: 3.11.0 (main, Mar 1 2023, 18:26:19) [GCC 11.2.0]

Unable to reproduce. Can you confirm the versions used (langchain, ollama, etc.)?

langchain==0.3.10
langchain-ollama==0.2.1

mdrxy commented

Please upgrade to the latest versions of each and report back if the issue persists. Will re-open if so.