Add support for AzureOpenAI
Closed this issue · 5 comments
samarth6341 commented
Add support for AzureOpenAI
ivanleomk commented
Question - I'm not super familiar with the Azure OpenAI client.
Can you not use the normal OpenAI client and just change the API endpoint and auth params?
How does the Azure OpenAI client differ?
qkxie commented
@ivanleomk This is very simple:
import instructor
from openai import AzureOpenAI

llm = AzureOpenAI(api_key="...", api_version="...", azure_endpoint="...")  # your Azure params
client = instructor.from_openai(llm)
In fact, any LLM provider that is compatible with the OpenAI SDK can be used this way.
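For example, a minimal sketch assuming a hypothetical OpenAI-compatible server (the base_url and api_key below are placeholders, not real values):
import instructor
from openai import OpenAI

# Point the regular OpenAI client at any OpenAI-compatible endpoint,
# then patch it with instructor exactly as above.
llm = OpenAI(base_url="http://localhost:8000/v1", api_key="placeholder")
client = instructor.from_openai(llm)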
Flopsky commented
Here is a more precise example:
import instructor
from openai import AzureOpenAI
from pydantic import BaseModel
# Define your data model
class UserInfo(BaseModel):
    name: str
    age: int
# Initialize the Azure OpenAI client with your credentials
client = AzureOpenAI(
    api_key="your-azure-api-key",
    api_version="2024-02-01",  # the API version enabled for your resource
    azure_endpoint="https://your-resource-name.openai.azure.com",  # your Azure endpoint
    # add your other params here
)
# Patch the client with instructor
client = instructor.from_openai(client)
# Use the patched client to extract structured data
user_info = client.chat.completions.create(
    model="your-deployed-model-name",  # the name you gave to your deployed model
    response_model=UserInfo,
    messages=[
        {"role": "user", "content": "John Doe is 30 years old."}
    ]
)
print(user_info.name) # Output: John Doe
print(user_info.age) # Output: 30
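If you use the async client, the same pattern should work too. A minimal sketch, assuming instructor.from_openai also accepts AsyncAzureOpenAI (credentials, endpoint, and deployment name are placeholders):
import asyncio
import instructor
from openai import AsyncAzureOpenAI
from pydantic import BaseModel

class UserInfo(BaseModel):
    name: str
    age: int

# Placeholders: api_key, api_version, endpoint, and deployment name must match
# your Azure OpenAI resource.
client = instructor.from_openai(
    AsyncAzureOpenAI(
        api_key="your-azure-api-key",
        api_version="2024-02-01",
        azure_endpoint="https://your-resource-name.openai.azure.com",
    )
)

async def main() -> None:
    user_info = await client.chat.completions.create(
        model="your-deployed-model-name",
        response_model=UserInfo,
        messages=[{"role": "user", "content": "John Doe is 30 years old."}],
    )
    print(user_info)

asyncio.run(main())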
samarth6341 commented
Thank you so much.
ivanleomk commented
Closing this issue since Azure OpenAI works with the default OpenAI integration.