Example with OpenAI client and locally deployed LLM returns "Failed to deserialize the JSON body into the target type: tool_choice: data did not match any variant of untagged enum ToolTypeDeserializer"
paguilomanas commented
- This is actually a bug report.
- I am not getting good LLM results.
- I have tried asking for help in the community on Discord or in discussions and have not received a response.
- I have tried searching the documentation and have not found an answer.
What Model are you using?
- gpt-3.5-turbo
- gpt-4-turbo
- gpt-4
- Other -> "Llama-3.1-70B-Instruct" locally deployed with TGI
Describe the bug
I am trying the very first example from the Instructor library, but against a locally deployed model endpoint, and I get the following error: `InstructorRetryException: Failed to deserialize the JSON body into the target type: tool_choice: data did not match any variant of untagged enum ToolTypeDeserializer at line 1 column 161`
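For context, the error message is produced by TGI's Rust server when it fails to deserialize the request body (`ToolTypeDeserializer` is an untagged enum in TGI). In its default TOOLS mode, Instructor attaches an OpenAI-style `tool_choice` object to the request; the snippet below is a hypothetical reconstruction of that field (an assumption on my part, not taken from the actual request), which TGI appears unable to parse:

```python
# Hypothetical reconstruction (assumption): the OpenAI-style tool_choice
# payload that Instructor's default TOOLS mode sends, named after the
# response model. TGI's untagged ToolTypeDeserializer enum seems to
# reject this shape, which would explain the error above.
tool_choice = {"type": "function", "function": {"name": "User"}}
```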
To Reproduce
- Python 3.10
- instructor 1.6.3
```python
import openai
import instructor
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


instructor_client = instructor.from_openai(openai.OpenAI(
    base_url="http://10.10.78.13:8080/v1",
    api_key="unused",
))

user = instructor_client.chat.completions.create(
    model="Llama-3.1-70B-Instruct",
    messages=[
        {"role": "user", "content": "Create a user"},
    ],
    response_model=User,
)

print(user.name)
print(user.age)
```
Expected behavior
I would expect `user` to come back as a validated `User` instance (the model's JSON output parsed into the Pydantic schema), but the `instructor_client.chat.completions.create()` call fails with the error above.
Other details
- My local model endpoint works perfectly when called directly through the OpenAI client like this:
```python
client = openai.OpenAI(
    base_url="http://10.10.78.13:8080/v1",
    api_key="unused",
)

# Plain chat completion against the TGI endpoint, no tools involved.
response = client.chat.completions.create(
    model="Llama-3.1-70B-Instruct",
    messages=[{"role": "user", "content": "Create a user"}],
)
print(response.choices[0].message.content)
```
- If the problem is related to my locally deployed model, can someone show me how to use the Instructor library with a locally deployed model? The end goal is to use it with the deepeval framework.
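One commonly suggested approach, which I have not yet been able to verify against this endpoint (so treat it as an assumption), is to switch Instructor to JSON mode: it steers the model toward JSON via the prompt and parses the plain text output, so no `tool_choice` field is sent at all. A minimal sketch:

```python
import openai
import instructor
from pydantic import BaseModel


class User(BaseModel):
    name: str
    age: int


# Assumption: TGI rejects the OpenAI-style tool_choice payload, so we
# switch Instructor from its default TOOLS mode to JSON mode, which
# requests plain JSON output and validates it against the Pydantic model.
instructor_client = instructor.from_openai(
    openai.OpenAI(base_url="http://10.10.78.13:8080/v1", api_key="unused"),
    mode=instructor.Mode.JSON,
)

user = instructor_client.chat.completions.create(
    model="Llama-3.1-70B-Instruct",
    messages=[{"role": "user", "content": "Create a user"}],
    response_model=User,
)
print(user.name, user.age)
```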
geokanaan commented
Did you manage to solve this issue?