Azure OpenAI Setup and APIs

Connecting Azure OpenAI with other Azure and Microsoft Services

This GitHub repository serves as a guide for users who want to establish a connection between different Azure services and Azure OpenAI.

The repository provides step-by-step instructions and other resources that help users effectively and efficiently connect Azure services such as Azure Synapse, Microsoft Power Platform, and others to Azure OpenAI.

Whether you are a beginner or an experienced developer, this repository will provide you with the information and tools you need to successfully connect Azure services to Azure OpenAI.

Before jumping into the Azure OpenAI API, this guide shows how to get access to the Azure OpenAI service in your Azure subscription, as well as the step-by-step setup of the service.

Topics that you will work through in this guide

Introduction to Using OpenAI in Azure

Using OpenAI in Azure provides several advantages over using OpenAI standalone. Here are some of the key benefits of using OpenAI in Azure:

  • Integration with other Azure services: By using OpenAI in Azure, you can easily integrate it with other Azure services such as Azure Machine Learning, Azure Databricks, Microsoft Power Platform, and others, which can help you streamline your workflows and achieve your goals faster.

  • Scalability: Azure provides a scalable platform that can help you meet the demands of your applications and workloads, regardless of the size of your deployment. This can be particularly beneficial for organizations that need to process large amounts of data or run computationally intensive workloads.

  • Reliability and security: Azure is a highly secure and reliable platform that provides a number of features and tools to help you protect your data and applications. By using OpenAI in Azure, you can benefit from the security and reliability of the Azure platform.

  • Cost-effectiveness: Azure provides a cost-effective solution for running and managing applications and workloads. This can be particularly important for organizations with limited budgets that need to get the most out of their resources.

  • Access to expert support: By using OpenAI in Azure, you can benefit from the support of a highly skilled and experienced team of experts who can help you with any questions or issues you may have.

In summary, using OpenAI in Azure provides a more integrated, scalable, secure, cost-effective, and supported solution for organizations looking to leverage the power of OpenAI.

Enrolling in OpenAI in Azure

To get started, you first need to request access to Azure OpenAI for your subscription. Use this link as soon as possible to request access. Engineering will work to approve your access within 3-10 business days.

image

Creating an Azure OpenAI instance after receiving your confirmation

  • In the Azure Portal, sign in with the user that has the subscription used in the previous step to register for Azure OpenAI.

  • In the search bar, type Azure OpenAI

Recording 2023-02-10 at 09 19 09

  • Once Azure OpenAI shows up in the search results, create an instance of Azure OpenAI by clicking on Create

  • Create Azure OpenAI:

    • Select the Subscription
    • Resource Group: Create a new Resource Group or select an existing one. In this case, a new Resource Group will be created.
    • Region: In this example, East US was selected; there are two other options: South Central US and West Europe. Feel free to select the one that is aligned with your region.
    • Provide a name for the instance; in this example, OpenAI-Github was used.
    • Pricing Tier: Standard S0
    • Click on Review + Create and create your instance

Recording 2023-02-10 at 09 21 21

  • Now, click on "Go to Resource"

image

Exploring Azure OpenAI and Creating a deployment model

  • Within the Azure OpenAI resource, click "Explore"

image

Once there, you will be able to find ways to get started with Azure OpenAI Service and explore examples for prompt completion.

  • As a next step, click on "Create new deployment"

Recording 2023-02-10 at 09 36 04

On the Deploy Model pane, choose the model name. There are a few options. Davinci is the most capable model family and can perform any task the other models can perform, often with less instruction.

For applications requiring deep understanding of the content, like summarization for a specific audience and creative content generation, Davinci will produce the best results. You can find more details about the other models on Microsoft Learn.

Just an observation: no worries, build your model using text-davinci-003 and the Azure OpenAI Playground will let you know which model you should use for each specific use case ;-)

Recording 2023-02-10 at 09 44 48

Exploring Azure OpenAI Studio and Summarization

  • As soon as your model is created, click on the deployed model
  • And then, click on "Open in Playground"

Recording 2023-02-10 at 09 46 07

Now it's time to work with the model that you deployed in the previous step. Select the deployment model, in this example "text-davinci-003", as well as one of the examples or use cases that you are targeting; in this example, "Summarize Text" will be used.

Recording 2023-02-10 at 09 47 37

You will see a message appear showing which model works better for this example. In the case of summarization, the model text-davinci-002 works better. Click on "Create deployment".

Since a new deployment model was created, named "text-davinci-002", select it, as well as the "Summarize text" example, and click Generate at the bottom of the page to see the results of the example, shown in green.

Recording 2023-02-10 at 09 49 12

Understanding the Max Length (tokens)

The Max length (tokens) setting on Azure OpenAI sets a limit on the number of tokens to generate in a response. The system supports a maximum of 4000 tokens shared between a given prompt and its completion; for example, a 3000-token prompt leaves at most 1000 tokens for the completion. (One token is roughly 4 characters of typical English text.)

Token refers to a unit of text used to represent a word or piece of punctuation in a computational process. Tokens are often used as the basis for processing and generating natural language text using artificial intelligence models such as OpenAI's GPT-3. The number of tokens in a given piece of text can impact the complexity of processing and generating a response, and setting a token limit can help to ensure that the generated response remains concise and manageable.
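As a rough illustration of how the prompt and completion share the 4000-token budget, the sketch below (plain Python, using the approximate 4-characters-per-token rule mentioned above; for exact counts you would use a real tokenizer) caps the requested completion length based on the prompt size:

# Rough sketch: keep prompt + completion under the 4000-token limit they share.
TOKEN_LIMIT = 4000
CHARS_PER_TOKEN = 4  # rough heuristic for typical English text

def estimate_tokens(text: str) -> int:
    # Approximate token count using the ~4 characters per token rule of thumb.
    return max(1, len(text) // CHARS_PER_TOKEN)

def max_completion_tokens(prompt: str, desired: int = 1000) -> int:
    # Cap the requested completion length so prompt + completion <= TOKEN_LIMIT.
    remaining = TOKEN_LIMIT - estimate_tokens(prompt)
    return max(0, min(desired, remaining))

prompt = "Summarize the following text for a technical audience: ..."
print(max_completion_tokens(prompt))  # tokens left over for the completion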

Additional GPT-3 Playground models

  • Azure OpenAI Studio is a platform that provides access to OpenAI's advanced AI technologies, including its cutting-edge language models, to help businesses and developers build advanced applications. Some of the services available in Azure OpenAI Studio include:

    • GPT-3 Language Model: This is the latest and most advanced language model from OpenAI, capable of performing a wide range of natural language processing tasks such as text generation, language translation, and sentiment analysis.
    • Text Generation: A service that allows you to generate new text based on a prompt or sample text, using the GPT-3 language model.
    • Text Classification: A service that allows you to classify text into predefined categories based on its content, using machine learning algorithms.
    • Named Entity Recognition: A service that identifies named entities in text, such as people, organizations, and locations.
    • Sentiment Analysis: A service that analyzes the sentiment or emotional tone of a piece of text.
    • Text Translation: A service that translates text from one language to another.
    • Natural Language to SQL: This service can be used to simplify the process of querying databases, as it enables users to ask questions in plain English and receive results in the form of structured data.

Recording 2023-02-10 at 11 57 37

These are just a few examples of the services available in Azure OpenAI Studio. The platform is continually evolving and expanding its capabilities, so new services and features may become available in the future.

Accessing Azure OpenAI APIs

To access Azure OpenAI APIs, you will need to have an Azure account and a subscription to the Azure OpenAI service. Once you have access to the Azure OpenAI platform, you can use the API by following these steps:

Obtain API credentials: To access the API, you will need to obtain an API key or access token that can be used to authenticate your API requests. You can obtain API credentials from the Azure portal.

To connect the Azure OpenAI API to other Azure services, you will need to use the API to retrieve data and then store that data in an Azure database or storage service. You can also use the API to trigger other Azure services, such as Azure Functions, to perform additional processing on the data.

Note that access to the Azure OpenAI API is subject to usage limits and other restrictions, as specified in the Azure OpenAI service agreement. To avoid hitting usage limits, you should carefully manage your API usage and implement rate limiting in your application.
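Before walking through the portal steps, here is a minimal, hedged sketch of what a raw REST call to the completions endpoint looks like in Python. The resource name, deployment name, key, and api-version are placeholders; use the exact values shown in the View Code dialog (Step 1 below) and in Keys and Endpoint (Step 3 below).

# Minimal sketch of calling the Azure OpenAI completions REST endpoint.
# All angle-bracket values are placeholders from your own resource.
import requests

RESOURCE_NAME = "<your-resource-name>"      # e.g. openai-github
DEPLOYMENT_NAME = "<your-deployment-name>"  # e.g. text-davinci-003
API_KEY = "<your-api-key>"                  # from Keys and Endpoint in the portal
API_VERSION = "2022-12-01"                  # assumption: use the version shown in View Code

url = (
    f"https://{RESOURCE_NAME}.openai.azure.com/openai/deployments/"
    f"{DEPLOYMENT_NAME}/completions?api-version={API_VERSION}"
)
headers = {"Content-Type": "application/json", "api-key": API_KEY}
body = {
    "prompt": "Summarize the following text: ...",
    "max_tokens": 100,
    "temperature": 1,
}

response = requests.post(url, headers=headers, json=body)
response.raise_for_status()
print(response.json()["choices"][0]["text"])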

  • Step 1: Access the API call as a reference for your application. Go back to the Summarization use case and click on View Code at the top right.

image

image

  • Step 3: Go back to the Azure Portal, inside your Azure OpenAI service, to access the API key.

The keys and endpoint are located under the Resource Management menu on the left blade (as highlighted in the picture below).

image

  • Step 4: These keys are used to access your Cognitive Service API. Do not share your keys. Store them securely in a vault such as an Azure Key Vault. We also recommend regenerating these keys regularly. Only one key is necessary to make an API call. When regenerating the first key, you can use the second key for continued access to the service.

image
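As suggested in Step 4, a minimal sketch of reading the key from Azure Key Vault at runtime is shown below. The vault name and the secret name azure-openai-key are assumptions for illustration; the sketch relies on the azure-identity and azure-keyvault-secrets packages.

# Sketch: retrieve the Azure OpenAI key from Azure Key Vault instead of hard-coding it.
# Assumes a vault named <your-vault-name> and a secret named "azure-openai-key".
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

vault_url = "https://<your-vault-name>.vault.azure.net"
credential = DefaultAzureCredential()  # picks up your Azure CLI / managed identity login
client = SecretClient(vault_url=vault_url, credential=credential)

api_key = client.get_secret("azure-openai-key").value
# Pass api_key in the "api-key" header of your requests; it never needs to
# appear in source control.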

Examples of API and service utilization

Using Power Apps to access Azure OpenAI services

You can use Microsoft Power Apps to access Azure OpenAI services. Power Apps is a low-code platform that allows you to build custom business applications for web and mobile devices. You can use Power Apps to connect to a variety of data sources, including Azure OpenAI APIs, and to build custom applications that interact with the data provided by the API.

Here's a general outline of how you can use Power Apps to access Azure OpenAI services:

  • Step 1 Access the Power Apps portal and create a flow by clicking Flows on the left blade. Click on "New Flow" and then on "Instant Cloud Flow"

Recording 2023-02-10 at 13 32 10

  • Step 2 In Microsoft Power Apps, you can use variables to store and manipulate data in your app. By initializing a variable, you give it an initial value that can be used as a starting point for further computations or manipulations within your app. This is what we will do in our second step.
    • Click on New Step
    • Search for Initialize variable
    • Use "prompt" as the Name
    • Type: String
    • Value: select Ask in Power Apps (which will be translated to the Initialize variable parameter)

Recording 2023-02-10 at 15 07 04

  • Step 3 This is the time to use what you noted in the step Accessing Azure OpenAI APIs

    • New Step
    • Search for HTTP
    • HTTP Premium
    • Method: POST
    • URL: <Provide your API address URL>
    • Headers:
       -  Content-Type: application/json
       -  api-key: <Provide the API key you documented earlier>
    • Body (the JSON shown below):
    

Remember to remove the key from your code when you're done, and never post it publicly. For production, use a secure way of storing and accessing your credentials like Azure Key Vault. See the Cognitive Services security article for more information.

{
  "prompt": @{triggerBody()['Initializevariable_Value']},
  "max_tokens": 1000,
  "temperature": 1
}

Recording 2023-02-10 at 16 05 00

  • Step 4 Save and test the HTTP API call

    • Save the Flow
    • Click Test on the top right
    • Select Manually
    • Click Test
    • Enter a reference text for the test
    • Run the test
    • Check that you have green checks on each step

Recording 2023-02-10 at 16 34 01

Using Azure Synapse to access Azure OpenAI services

Azure Synapse and Azure OpenAI can be connected to bring the benefits of both platforms to your workflow.

To connect Azure Synapse and Azure OpenAI, you can use Azure Synapse Studio to access OpenAI's GPT-3 language model and use its capabilities to enhance the data in your Azure Synapse workspace. You can use Azure Synapse Studio notebooks to call the OpenAI API, retrieve the results, and perform data analytics or machine learning tasks on the data returned by the API.

The reason to connect Azure Synapse and Azure OpenAI is that Azure Synapse provides a unified workspace for big data analytics and data warehousing, while Azure OpenAI provides access to advanced large language models such as GPT-3. By connecting the two platforms, you can leverage Azure OpenAI's language capabilities in your data analytics workflows, enabling you to extract insights from unstructured data and enhance your overall data analysis capabilities.

For example, you can use the Azure OpenAI API to generate natural language summaries of large datasets stored in Azure Synapse, perform sentiment analysis on customer feedback data, or extract key entities and relationships from text data. These capabilities can greatly enhance the insights you can gain from your data, enabling you to make informed decisions and drive business success.
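As an illustrative, non-distributed sketch (the tutorial linked below shows the scalable SynapseML approach), a Synapse Studio PySpark notebook could summarize a text column by calling the completions endpoint from a simple UDF; the endpoint, deployment name, and key are placeholders from the earlier steps:

# Sketch for a Synapse Studio (PySpark) notebook: summarize a text column by
# calling the Azure OpenAI completions endpoint row by row. For large datasets,
# prefer the distributed SynapseML approach from the tutorial linked below.
import requests
from pyspark.sql import functions as F
from pyspark.sql.types import StringType

ENDPOINT = "https://<your-resource-name>.openai.azure.com"
DEPLOYMENT = "text-davinci-003"
API_KEY = "<your-api-key>"  # better: read this from Azure Key Vault

def summarize(text):
    url = f"{ENDPOINT}/openai/deployments/{DEPLOYMENT}/completions?api-version=2022-12-01"
    body = {"prompt": f"Summarize the following text:\n\n{text}", "max_tokens": 100}
    resp = requests.post(url, headers={"api-key": API_KEY}, json=body)
    resp.raise_for_status()
    return resp.json()["choices"][0]["text"].strip()

summarize_udf = F.udf(summarize, StringType())

# 'spark' is predefined in a Synapse notebook session.
df = spark.createDataFrame([("Customer feedback text goes here...",)], ["feedback"])
df.withColumn("summary", summarize_udf(F.col("feedback"))).show(truncate=False)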

The Azure Synapse and Azure OpenAI tutorial on Microsoft Learn shows how to apply large language models at a distributed scale using Azure OpenAI and Azure Synapse Analytics.

Reference Notebook from the example above:

image