Azure Functions app to forecast ambulance demand in Lebanon.
Built to support the Lebanese Red Cross (LRC) during the COVID-19 response. Based on this template.
- All relevant code for the forecasts is in `my-function/__init__.py`
- The schedule (how often the forecasts are generated) is set in `function.json`; to change it, edit the cron expression in the field `schedule`
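For reference, a minimal `function.json` for a timer-triggered Python function might look like the sketch below. The `schedule` field holds an NCRONTAB expression (six fields, seconds first); the 6-hour interval here is an assumed example, not this project's actual schedule:

```json
{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "mytimer",
      "type": "timerTrigger",
      "direction": "in",
      "schedule": "0 0 */6 * * *"
    }
  ]
}
```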
Locally:
- Python 3.7 and the packages in `requirements.txt` (`pip install -r requirements.txt`)
- Azure Functions Core Tools and Visual Studio Code
In Azure:
- An Azure Resource Group for Linux resources (BEST PRACTICE: create a new resource group for each project)
- The "Contributor" role in that resource group
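If the resource group does not exist yet, it can be created with the Azure CLI; the group name, location, and assignee below are placeholders, not values from this project:

```shell
# create a resource group (name and location are examples)
az group create --name my-forecast-rg --location westeurope

# grant a user the Contributor role, scoped to that resource group
az role assignment create \
    --assignee user@example.org \
    --role "Contributor" \
    --scope /subscriptions/<subscription-id>/resourceGroups/my-forecast-rg
```

Both commands require an authenticated Azure CLI session (`az login`) and sufficient permissions on the subscription.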
- Test and debug locally. For example, to run the function locally, execute this command from the project root folder:

```shell
func start --functions <my-function> --python --verbose --build remote
```
- When ready, deploy to Azure using Visual Studio Code
You will now be able to monitor your function in the Azure portal. A new resource of type "Application Insights" will be created, where you can monitor runs, errors, etc. Good to know: in the Azure portal you can also check the logs within the Function App (Functions > <my-function> > Code + Test > Logs).
If your function takes data as input/output, the recommended workflow is to store the data in an Azure storage account and download/upload it from/to there.
When you create a new function app with Visual Studio Code, a new storage account is created automatically in the same resource group; you can use this one or an existing one. Good to know: individual files in Azure storage are called 'blobs' and directories 'containers'.
- Create a container within the storage via the Azure portal
- Configure the function in `__init__.py`
The way your function exchanges data with the storage is via `azure-storage-blob`, which needs the right credentials. If you are using the default storage that is created with the function, the credentials are already accessible as an environment variable named `AzureWebJobsStorage`; you can get them and download your data with e.g.

```python
import os
import pickle

from azure.storage.blob import BlobServiceClient

# connection string of the default storage account
credentials = os.environ['AzureWebJobsStorage']

blob_service_client = BlobServiceClient.from_connection_string(credentials)
blob_client = blob_service_client.get_blob_client(container='<my-container>', blob='<my-data-file>')
data = pickle.loads(blob_client.download_blob().readall())
```
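The reverse direction works the same way. A sketch for uploading a forecast, assuming `forecast` is the Python object to store (the container and blob names are placeholders, and `AzureWebJobsStorage` must be set in the environment):

```python
import os
import pickle

from azure.storage.blob import BlobServiceClient

forecast = {'region': 'Beirut', 'expected_calls': 42}  # example payload

credentials = os.environ['AzureWebJobsStorage']
blob_service_client = BlobServiceClient.from_connection_string(credentials)
blob_client = blob_service_client.get_blob_client(container='<my-container>', blob='<my-forecast-file>')

# overwrite=True replaces any previous forecast stored under the same name
blob_client.upload_blob(pickle.dumps(forecast), overwrite=True)
```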
[OPTIONAL] If you are using a different storage account:
- Copy the credentials from the Azure portal
- Add them in the function settings, so that they will be callable within the function as environment variables
- Add them in `local.settings.json`, in order to be able to run the function locally
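For example, a custom connection string could be added to `local.settings.json` roughly as follows (the `MY_STORAGE_CONNECTION_STRING` key is a hypothetical name); note that this file contains secrets and should not be committed to version control:

```json
{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "<default-storage-connection-string>",
    "MY_STORAGE_CONNECTION_STRING": "<other-storage-connection-string>"
  }
}
```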
If your function needs to use an API (e.g. Google Maps) that requires credentials, do NOT store them in `__init__.py`, since this will expose them to whoever has access to the resource group. The recommended workflow is to store them in an Azure Key Vault.
- Create an Azure Key Vault in the same resource group
- Ask to be given the role of "Key Vault Secrets Officer" in the vault (ask the admin of the resource group)
- Add your credentials in the vault under Secrets, via the Azure portal
- Integrate the credentials in your function app
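One way to read a secret from the vault at runtime is via `azure-keyvault-secrets` together with `azure-identity`. A sketch, assuming a vault named `<my-vault>` and a secret named `<my-api-key>` (both placeholders), plus valid Azure credentials (e.g. the function app's managed identity, or `az login` locally):

```python
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient

# DefaultAzureCredential picks up the managed identity in Azure,
# or your local Azure CLI / VS Code login when running locally
credential = DefaultAzureCredential()
client = SecretClient(vault_url='https://<my-vault>.vault.azure.net', credential=credential)

api_key = client.get_secret('<my-api-key>').value
```

Note that the identity reading the secret needs read access to the vault, e.g. the "Key Vault Secrets User" role.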