This repository contains many notebooks that explain how Azure AI Search works, including several showcasing how vector search works.
To run the notebooks, first set up the Azure resources and a local environment:

- Run `azd up` on azure-search-openai-demo with GPT-4-vision enabled. This creates the necessary Azure OpenAI, Azure AI Search, and Computer Vision resources.
- Create a `.env` file with these variables, with the values taken from `.azure/ENV-NAME/.env` in the azure-search-openai-demo repository (a sketch of reading these values from Python follows the setup steps):

    ```
    AZURE_OPENAI_SERVICE=YOUR-SERVICE-NAME
    AZURE_OPENAI_DEPLOYMENT_NAME=YOUR-OPENAI-DEPLOYMENT-NAME
    AZURE_OPENAI_ADA_DEPLOYMENT=YOUR-EMBED-DEPLOYMENT-NAME
    AZURE_SEARCH_SERVICE=YOUR-SEARCH-SERVICE-NAME
    AZURE_COMPUTERVISION_SERVICE=YOUR-COMPUTERVISION-SERVICE-NAME
    AZURE_TENANT_ID=YOUR-TENANT-ID
    ```
- Log in to your Azure account using the Azure Developer CLI: `azd auth login`. Specify `--tenant-id` if you deployed that repo to a non-default tenant (a sketch of how the notebooks can authenticate from Python also follows the setup steps).
- Create a Python virtual environment or open the project in a container.
- Install the requirements: `pip install -r requirements.txt`
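The notebooks read their configuration from these environment variables. As a rough sketch of loading the `.env` file (this assumes the `python-dotenv` package, a common choice; the notebooks may load configuration differently):

```python
import os

from dotenv import load_dotenv

# Load the variables from the .env file created above into the process environment.
load_dotenv()

# Derive the service endpoints from the service names in the .env file.
search_endpoint = f"https://{os.environ['AZURE_SEARCH_SERVICE']}.search.windows.net"
openai_endpoint = f"https://{os.environ['AZURE_OPENAI_SERVICE']}.openai.azure.com"
print(search_endpoint)
print(openai_endpoint)
```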
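Because `azd auth login` is used instead of API keys, the notebooks can authenticate with your signed-in identity. Here is a minimal sketch using `azure-identity` and `azure-search-documents`; the credential class and the index name are assumptions for illustration, and the notebooks show the clients they actually use:

```python
import os

from azure.identity import AzureDeveloperCliCredential
from azure.search.documents import SearchClient

# Reuse the login performed with `azd auth login`, scoped to the deployed tenant.
credential = AzureDeveloperCliCredential(tenant_id=os.environ["AZURE_TENANT_ID"])

search_client = SearchClient(
    endpoint=f"https://{os.environ['AZURE_SEARCH_SERVICE']}.search.windows.net",
    index_name="YOUR-INDEX-NAME",  # placeholder: use the index created by azd up
    credential=credential,
)

# Run a simple keyword query to confirm the connection works.
for doc in search_client.search(search_text="hello", top=3):
    print(doc)
```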
These are the available notebooks, in suggested order:
- Vector Embeddings Notebook
- Azure AI Search Notebook
- Image Search Notebook
- Azure AI Search Relevance Notebook
- RAG with Azure AI Search
- RAG Evaluation
You can find video recordings going through the notebooks here.