Assistant prompt not working when using a proxy server
ayush0x00 opened this issue · 0 comments
The bug
I am using a proxy server that exposes an endpoint and, after processing the request, makes a call to the Azure OpenAI endpoint. The server returns a response that is exactly the same as what the Azure endpoint would return directly. However, the generated response is not shown in my Python notebook. When I call the Azure API directly, the notebook does show the generated response. I have attached a screenshot of the notebook and the response received from the server.
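To clarify what the proxy does: it only swaps the scheme and host and forwards the path, query string, and body unchanged. A rough sketch of the URL rewriting (the Azure resource name `my-resource` here is a placeholder, not my real one):

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder upstream host; the real Azure resource name is omitted here.
AZURE_HOST = "my-resource.openai.azure.com"

def to_upstream_url(proxy_url: str) -> str:
    """Rewrite a URL received by the local proxy into the Azure upstream URL.

    The deployment path (/openai/deployments/.../chat/completions) and the
    api-version query string are kept as-is; only scheme and host change.
    """
    parts = urlsplit(proxy_url)
    return urlunsplit(("https", AZURE_HOST, parts.path, parts.query, parts.fragment))

print(to_upstream_url(
    "http://localhost:7777/openai/deployments/gpt-3.5-turbo/chat/completions?api-version=2023-05-15"
))
```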
```python
from guidance import models, system, user, assistant, gen

gpt_azure = models.AzureOpenAI(
    azure_endpoint="http://localhost:7777/openai/deployments/gpt-3.5-turbo/chat/completions?api-version=2023-05-15",
    model="gpt-3.5-turbo",
    api_key="",
)

with system():
    lm = gpt_azure + "You are a helpful assistant."
with user():
    lm += "Hello...how are you?"
with assistant():
    lm += gen(name="resp")
print(lm["resp"])
```
The response sent by the proxy server:
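For reference, the proxy returns a standard chat-completions JSON payload, from which the assistant text sits under `choices[0].message.content`. A sketch of the extraction with an illustrative payload (the field values are examples, not my actual response):

```python
import json

# Illustrative payload mirroring the standard chat-completions shape
# the proxy forwards; the content string is an example, not the real reply.
raw = json.dumps({
    "object": "chat.completion",
    "choices": [
        {
            "index": 0,
            "message": {"role": "assistant", "content": "I'm doing well, thanks!"},
            "finish_reason": "stop",
        }
    ],
})

resp = json.loads(raw)
# This is the field the client library needs to read to display the reply.
text = resp["choices"][0]["message"]["content"]
print(text)
```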
System info:
- OS: macOS
- Guidance Version: 0.1.15