[Refactor] Refactoring of the OpenAICompatible LLM setting
Closed this issue · 0 comments
valentimarco commented
Right now, the OpenAICompatible setting is broken (see also #735, #723, #713)!
Describe the solution you'd like
Refactor this class with 4 fields:
- url
- temperature
- model_name
- api_key
And use CustomOpenAI as the _pyclass, defined in factory/custom_llm.py.
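As a rough, stdlib-only sketch of the proposed shape (the class name OpenAICompatibleConfig is hypothetical, and the real settings class would presumably derive from the factory's base settings model; only the four fields and the _pyclass convention come from this issue):

```python
from typing import ClassVar, Type


class CustomOpenAI:
    """Stand-in for the real client defined in factory/custom_llm.py."""

    def __init__(self, **kwargs):
        self.kwargs = kwargs


class OpenAICompatibleConfig:
    # The four fields proposed in this issue:
    url: str
    temperature: float
    model_name: str
    api_key: str

    # The factory would instantiate this class with the fields above.
    _pyclass: ClassVar[Type] = CustomOpenAI
```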
The latter should be written like so:
```python
from langchain_openai.chat_models import ChatOpenAI


class CustomOpenAI(ChatOpenAI):
    def __init__(self, **kwargs):
        # Pop our custom keys so they are not forwarded twice via **kwargs.
        base_url = kwargs.pop("base_url")
        super().__init__(
            openai_api_key=kwargs.pop("api_key"),
            model_kwargs={},
            **kwargs,
        )
        self.openai_api_base = base_url
```
This provides a basic connection for most runners that expose an OpenAI-compatible API (which is why model_kwargs is an empty dict: the goal is only a simple connection!).
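To illustrate why the kwargs need care (passing api_key both explicitly and through **kwargs would hand the base class a duplicate keyword), here is a runnable sketch of the same pattern with a stand-in base class, no langchain required; FakeChatOpenAI and the field values are purely illustrative:

```python
class FakeChatOpenAI:
    """Minimal stand-in for ChatOpenAI, just to exercise the kwarg flow."""

    def __init__(self, openai_api_key=None, model_kwargs=None, **kwargs):
        self.openai_api_key = openai_api_key
        self.model_kwargs = model_kwargs or {}
        self.extra = kwargs  # whatever else was forwarded (e.g. model_name)


class CustomOpenAI(FakeChatOpenAI):
    def __init__(self, **kwargs):
        base_url = kwargs.pop("base_url")  # consume before forwarding
        super().__init__(
            openai_api_key=kwargs.pop("api_key"),
            model_kwargs={},  # keep the connection minimal on purpose
            **kwargs,
        )
        self.openai_api_base = base_url


llm = CustomOpenAI(
    api_key="sk-local",
    base_url="http://localhost:11434/v1",
    model_name="llama3",
)
```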
Additional context
If a user needs more customization for a specific provider, they can build a dedicated client!
The community can provide and publish such custom clients on the registry!