wkok/openai-clojure

Azure Chat Completion API availability

Closed this issue · 3 comments

I see that Chat Completions is now available for Azure: https://learn.microsoft.com/en-us/azure/cognitive-services/openai/reference#chat-completions

I see you're loading azure_openai.json to create the requests, but that version doesn't include the chat-completions. Is there a new update to this spec that could be dropped in? Would love to use this feature, appreciate the feedback!

wkok commented

Hello!

I've updated the azure spec, and added support for azure chat in this branch: https://github.com/wkok/openai-clojure/tree/azure-chat-completions

As azure chat is still in preview, I'll keep it in this branch and only merge/release once the azure spec is stable.

As I do not have an azure subscription, I cannot test this myself. I would appreciate it if you could test this branch and let me know of any issues.
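For anyone wanting to try the branch before it's released, pulling it as a git dependency in deps.edn should work. This is a sketch: the library coordinate is an assumption, and the sha is a placeholder you'd replace with the actual head commit of the azure-chat-completions branch.

```clojure
;; deps.edn - pulling the azure-chat-completions branch as a git dep
;; (coordinate and sha are illustrative; use the branch's real commit sha)
{:deps {net.clojars.wkok/openai-clojure
        {:git/url "https://github.com/wkok/openai-clojure"
         :git/sha "<commit-sha-of-azure-chat-completions-branch>"}}}
```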

Basic usage should be as simple as:

(api/create-chat-completion {:model "gpt-3.5-turbo"
                             :messages [{:role "system" :content "You are a helpful assistant."}
                                        {:role "user" :content "Who won the world series in 2020?"}
                                        {:role "assistant" :content "The Los Angeles Dodgers won the World Series in 2020."}
                                        {:role "user" :content "Where was it played?"}]}
                            {:impl :azure})

Can confirm that create-chat-completion works now for {:impl :azure}! Thanks!

One thing to note is that the Azure API names the gpt-3.5 model gpt-35-turbo (no dot).
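So the same call as above, adjusted for Azure's model naming, would look something like this (a sketch; it assumes your Azure endpoint and key are already configured as the library expects):

```clojure
;; Note the Azure model name: gpt-35-turbo, not gpt-3.5-turbo
(api/create-chat-completion {:model "gpt-35-turbo"
                             :messages [{:role "user" :content "Who won the world series in 2020?"}]}
                            {:impl :azure})
```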

One other API available in the Azure REST API that doesn't have a corresponding path in your implementation is listing models: https://learn.microsoft.com/en-us/rest/api/cognitiveservices/azureopenaistable/models/list?tabs=HTTP
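In the meantime, that endpoint can be called directly. Here's a rough sketch using clj-http (not part of openai-clojure); the path shape and the api-version value are assumptions based on the linked docs, so check them against your resource:

```clojure
;; Hypothetical helper - calls the Azure OpenAI models list endpoint directly.
;; `endpoint` is your resource URL (e.g. https://<resource>.openai.azure.com)
;; and `api-key` is the key from the Azure portal.
(require '[clj-http.client :as http])

(defn list-azure-models
  [endpoint api-key]
  (:body
   (http/get (str endpoint "/openai/models")
             {:query-params {"api-version" "2023-05-15"} ; assumed version - verify
              :headers {"api-key" api-key}
              :as :json})))
```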

wkok commented

In anticipation of the stable release of azure chat this month, I've released support for this in v0.6.0 of this library.

The Azure Management APIs are not currently supported, as they are not included in the same OpenAI swagger spec; they are a completely separate thing which does not quite fit the current architecture of this library. I might need to revisit/rethink this someday, but in the meantime it will remain unsupported (unless someone provides a PR 😉).