page type | languages | products/technologies
---|---|---
sample | | |
These samples showcase the following:
- How to use Terraform to provision and deploy Azure Functions from either a Docker image or a deployment package.
- How to create a Docker image that runs a Java Azure Function. The assets include:
  - Dockerfile: the Dockerfile in which the Maven build runs and the artifacts are moved into a final, usable Azure Functions v3 image.
  - docker-compose: a wrapper around the Dockerfile to make it easier to run.
  - Azure Pipeline config: an Azure DevOps pipeline YAML configuration file that runs the Dockerfile to create the Docker image, extracts the test results from the image and publishes them to the pipeline, and pushes the image to a registry.
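As a rough illustration of the shape such a pipeline can take, here is a minimal sketch. The `<image-name>` placeholder matches the one used elsewhere in this sample; the test-results path inside the image, the `main` trigger branch, and the use of `SERVICE_CONNECTION_NAME` for the registry connection are assumptions for illustration, not values taken from this repo:

```yaml
trigger:
  - main

pool:
  vmImage: ubuntu-latest

steps:
  # Build the image; the Maven build (and its tests) run inside the Dockerfile.
  - script: docker build -t <image-name>:$(Build.BuildId) .
    displayName: Build Docker image

  # Copy the test results out of the built image so the pipeline can publish them.
  # The surefire-reports path is an assumed location for this sketch.
  - script: |
      id=$(docker create <image-name>:$(Build.BuildId))
      docker cp "$id:/build/target/surefire-reports" "$(Build.ArtifactStagingDirectory)/test-results"
      docker rm "$id"
    displayName: Extract test results

  - task: PublishTestResults@2
    inputs:
      testResultsFiles: '$(Build.ArtifactStagingDirectory)/test-results/*.xml'

  # Push the image to the registry through the configured service connection.
  - task: Docker@2
    inputs:
      command: push
      containerRegistry: $(SERVICE_CONNECTION_NAME)
      repository: <image-name>
```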
File/folder | Description
---|---
`Sample-Java-Azure-Function` | Sample Java Azure Function, Dockerfile, docker-compose, and Azure Pipeline files.
`Terraform` | Sample Terraform code.
`.gitignore` | Defines what to ignore at commit time.
`CHANGELOG.md` | List of changes to the sample.
`CONTRIBUTING.md` | Guidelines for contributing to the sample.
`README.md` | This README file.
`LICENSE` | The license for the sample.
This Terraform module simplifies provisioning Azure Functions from either a Docker image or a deployment package.
Before using this Terraform module, you will need:
- Basic knowledge of Terraform.
- Terraform 0.12+ installed.
- An Azure subscription for the module to run deployments within.
- An Azure Storage Account for tracking Terraform remote backend state.

This module:
- Provisions a set of Azure Function Apps.
- Supports deployment from a Docker image.
- Supports enabling functions to run from a package.
- Supports Azure resource tags.
- Supports Application Insights integration.
- Supports Service Plan integration.
- Supports Function App Settings configuration.
```hcl
module "function_app" {
  source = "<root directory>/function-app"

  fn_name_prefix                   = var.function_name_prefix
  service_plan_name                = var.service_plan_name
  resource_group_name              = var.resource_group_name
  storage_account_name             = var.storage_account_name
  docker_registry_server_username  = var.docker_username
  docker_registry_server_password  = var.docker_password
  app_insights_instrumentation_key = var.app_insights_instrumentation_key

  fn_app_config = {
    # Variable-derived map keys must be wrapped in parentheses in HCL.
    (var.first_function_name) = {
      image = var.function_image
      zip   = null
      hash  = null
    },
    (var.second_function_name) = {
      image = null
      zip   = var.function_package_location
      hash  = var.function_package_sha
    }
  }
}
```
Clone the repo to your local machine and navigate to the Terraform directory.
Using any of the options in the Terraform documentation, you can configure the following variables:
- `fn_name_prefix`: each function app created will be named in the format `fn_name_prefix-function_app_name`.
- `docker_registry_server_username` / `docker_registry_server_password`: the username and password for the Docker registry, required to pull the images when deploying the function apps.
- `fn_app_config`: a map where the key is the `function_app_name` and the value is an object that defines what is being deployed. The object has one of two possible structures:
  - For a Docker-based deployment, the object has one populated field, `image`, which refers to the Docker image name to deploy.
  - For running from a package, it should contain the fields:
    - `zip`: an HTTP reference to the package. This enables your function app to run from a package by adding a `WEBSITE_RUN_FROM_PACKAGE` setting to your function app settings.
    - `hash`: a hash of the zip file, used as a download integrity check.

In a terminal window, run the following commands:

```shell
cd terraform/function-app
terraform init
terraform apply
```
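If the package zip is available locally when Terraform runs, one way to produce the hash value is Terraform's built-in `filebase64sha256` function. A hedged sketch follows; the file path is a placeholder, not a file in this repo, and you should confirm the hash format your deployment expects:

```hcl
# Compute a base64-encoded SHA256 of the package zip for the integrity check.
# "build/function-package.zip" is a hypothetical local path.
locals {
  function_package_sha = filebase64sha256("build/function-package.zip")
}
```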
Before using this Dockerfile, you will need:
- Basic knowledge of Azure Functions, building images with Dockerfiles, and the Java programming language.
- Docker installed.
Inside the Sample-Java-Azure-Function directory you'll find:
- A sample Java Azure Function.
- A Dockerfile that builds a Java project inside a Docker image and produces a runnable Azure Function image.
- A docker-compose file.
- A sample Azure Pipeline YAML file.
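The multi-stage pattern such a Dockerfile typically follows can be sketched as below. This is an illustration, not the repo's actual Dockerfile: the base image tags, project paths, and Maven command are assumptions, while `<function-app-name>` is the same placeholder used in the setup steps:

```dockerfile
# Stage 1: run the Maven build (and its tests) inside a JDK image.
FROM maven:3-jdk-8 AS build
WORKDIR /src
COPY . .
RUN mvn clean package

# Stage 2: copy the built function artifacts into the Azure Functions v3 Java runtime image.
FROM mcr.microsoft.com/azure-functions/java:3.0-java8
COPY --from=build /src/target/azure-functions/<function-app-name>/ /home/site/wwwroot/
```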
To use the sample Dockerfile provided, you will need to:
- In the "docker-compose.yml" file, replace `<image-name>` with your Docker image name.
- Use the env_sample file to create your own .env file with values for REGISTRY and STORAGE_CONNECTION_STRING.
- In the "Dockerfile" file, replace `<function-app-name>` with your function app name.
- In a terminal window, run the following commands:

```shell
cd Sample-Java-Azure-Function
docker-compose up
```

- Use Postman, curl, or any web browser to test the sample function on port 9010, for example: `http://localhost:9010/api/HttpExample?name=abc`
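The .env file created in the steps above might look like the following sketch; both values are placeholders for illustration, not real credentials, and env_sample remains the authoritative template:

```shell
# .env — consumed by docker-compose; replace both values with your own.
REGISTRY=myregistry.azurecr.io
STORAGE_CONNECTION_STRING="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
```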
Before using this pipeline, you will need:
- Basic knowledge of Azure DevOps pipelines.
- A service connection created in your Azure DevOps organization.
- An Azure Storage Account deployed in your Azure subscription for function logs.
- Create a new repo with the attached pipeline file, along with the Dockerfile and the Azure Function project in the Sample-Java-Azure-Function directory.
- In the "azure-pipeline.yml" file, replace `<image-name>` with your image name.
- In Azure DevOps, create and set the following variables:
  - SERVICE_CONNECTION_NAME
  - STORAGE_CONNECTION_STRING
- Use Azure DevOps to create a new pipeline with the Azure Pipeline YAML configuration.
- Run the pipeline.
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.