BCG-X-Official/artkit

Add LLM Connector for Azure OpenAI Models


Justification

The Azure OpenAI API is compatible with OpenAI's API. With Azure OpenAI, users can set up their own deployments of the common GPT and Codex models, and when calling the API they must specify the deployment they want to use.
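To make the difference concrete, here is a minimal sketch of how the deployment name enters the request. Unlike the vanilla OpenAI API, where the model name goes in the request body, Azure routes the call to a named deployment in the URL path. The endpoint, deployment, and api-version values below are placeholders:

```python
def azure_chat_completions_url(endpoint: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI chat-completions URL.

    Azure addresses a specific deployment in the URL path and requires an
    api-version query parameter, whereas the vanilla OpenAI API takes the
    model name in the JSON request body.
    """
    return (
        f"{endpoint.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )

# Example (all values are placeholders):
url = azure_chat_completions_url(
    "https://my-resource.openai.azure.com", "my-gpt4-deployment", "2024-02-01"
)
```

In practice the `openai` SDK's `AsyncAzureOpenAI` client builds this URL for you; the sketch only shows why a deployment name is a required input for the new connector.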

Details

The new class will live in src/artkit/model/llm/azureopenai/

Be sure to update the docs:

Please see the tutorial: Creating New Model Classes

Thank you for raising this issue! We agree this would be a good enhancement

Hi @tezansahu (or anyone else interested), we would love to invite you to contribute here! For reference, we created a similar connector for Amazon's Titan model on Bedrock in this file and would be happy to provide more guidance on this.

I've added additional details about the requirements:

Issue

As an extension to ARTKIT, we would like the package to be able to support OpenAI models deployed on Azure.

Solution

  • Leverage the OpenAIChat implementation located here and implement a similar class called _azureopenai.py located in src/artkit/model/llm/azure
  • Use the existing paradigms within the _openai.py implementation, but import the AsyncAzureOpenAI SDK client instead of AsyncOpenAI. The source for AsyncAzureOpenAI is located here
  • Update the RELEASE_NOTES.rst file:
    • Increment a minor version here
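A rough skeleton of what the steps above might produce, sketched with a stub in place of the SDK client since the real class would wrap `openai.AsyncAzureOpenAI`. The class name, constructor parameters, and environment variable are illustrative, not the final artkit API:

```python
import os


class AzureOpenAIChat:
    """Illustrative connector skeleton for Azure-hosted OpenAI models.

    Mirrors the shape of the existing OpenAIChat connector, but collects the
    extra connection settings AsyncAzureOpenAI needs: an endpoint, an
    api-version, and a deployment name (passed as the model on each call).
    """

    def __init__(
        self,
        deployment: str,
        azure_endpoint: str,
        api_version: str,
        api_key_env: str = "AZURE_OPENAI_API_KEY",  # hypothetical default
    ) -> None:
        self.deployment = deployment
        self.azure_endpoint = azure_endpoint
        self.api_version = api_version
        self.api_key_env = api_key_env

    def _client_kwargs(self) -> dict:
        # In the real class, these kwargs would be forwarded to
        # openai.AsyncAzureOpenAI(...) instead of openai.AsyncOpenAI(...).
        return {
            "azure_endpoint": self.azure_endpoint,
            "api_version": self.api_version,
            "api_key": os.environ.get(self.api_key_env, ""),
        }


chat = AzureOpenAIChat(
    deployment="my-gpt4-deployment",
    azure_endpoint="https://my-resource.openai.azure.com",
    api_version="2024-02-01",
)
```

The design choice worth noting is that the deployment name replaces the `model` argument of the vanilla connector, while the endpoint and api-version move into client construction.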

Notes

There are differences in the inference parameters supported by OpenAI versus Azure OpenAI; some of those differences are described in the Azure SDK documentation located here
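One way the new class could handle such differences is an allow-list filter applied before each call. The parameter set below is purely illustrative; the actual supported set depends on the api-version and the deployed model, so consult the Azure SDK documentation:

```python
# Hypothetical allow-list; the real set varies by api-version and model.
SUPPORTED_PARAMS = {"temperature", "max_tokens", "top_p", "stop", "n", "seed"}


def filter_model_params(params: dict) -> dict:
    """Drop inference parameters the Azure deployment does not accept.

    Surfacing the dropped keys (rather than failing silently) helps users
    understand why a parameter had no effect on Azure.
    """
    unsupported = set(params) - SUPPORTED_PARAMS
    if unsupported:
        print(f"Ignoring unsupported parameters: {sorted(unsupported)}")
    return {k: v for k, v in params.items() if k in SUPPORTED_PARAMS}


clean = filter_model_params({"temperature": 0.2, "not_a_real_param": 1})
```

Whether to warn, raise, or silently drop is a design decision for the PR; the sketch only shows where such handling would slot in.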

Sure, I could look into this. I will be happy to contribute here.

@matthew-wong-bcg @seanggani I have raised the PR with Azure OpenAI integration (including test & docs). Please review.