Getting a "'_thread.RLock'" object error when instantiating a LangChain model as a default object
DmitriyLeybel commented
ALL software version info
param==2.1.0
langchain-openai==0.1.9
Description of expected behavior and the observed behavior
The following errors out when a `ChatOpenAI` instance is set as the default value of a parameter:

```python
import param
from langchain_openai import ChatOpenAI

class TestClass(param.Parameterized):
    model = param.ClassSelector(class_=ChatOpenAI, default=ChatOpenAI())

TestClass()
```
This is likely because ChatOpenAI holds thread locks for concurrency, and param deep-copies a parameter's default value when the class is instantiated; `_thread.RLock` objects cannot be pickled, so the deep copy fails.
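The failure mode can be reproduced with the standard library alone — a minimal sketch of the assumed mechanism, no `ChatOpenAI` required: deep-copying any object that holds a `threading.RLock` raises the same error.

```python
import copy
import threading

class HoldsLock:
    """Stand-in for an object that keeps a thread lock internally,
    as ChatOpenAI's underlying HTTP client is assumed to do."""
    def __init__(self):
        self.lock = threading.RLock()

try:
    # deepcopy falls back to pickling, and RLock objects are not picklable
    copy.deepcopy(HoldsLock())
except TypeError as exc:
    print(exc)  # e.g. cannot pickle '_thread.RLock' object
```

This is why the error surfaces only at `TestClass()` time: that is when param copies the default for the new instance.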
I am able to work around this by instantiating the model in `__init__`:

```python
import param
from langchain_openai import ChatOpenAI

class TestClass(param.Parameterized):
    model = param.ClassSelector(class_=ChatOpenAI, default=None)

    def __init__(self, **params):
        super().__init__(**params)
        self.model = ChatOpenAI()

TestClass()
```
This isn't an ideal solution; it feels like a hacky workaround and adds overhead to the codebase. (Passing `instantiate=False` to the `ClassSelector` may also avoid the copy, but then every `TestClass` instance shares a single model object.)