This template takes a minimal set of parameters and deploys a VM that can be used as an SAP Application Server, using the latest patched version of the selected operating system. It uses Premium Storage with Managed Disks; the filesystems are created via a custom script.
| Size | SAP VM | Storage (SAPEXE) |
|---|---|---|
| Small | E4ds_v4 (4 vCPU, 32 GiB, 6,044 SAPS) | 1xP10 |
| Medium | E8ds_v4 (8 vCPU, 64 GiB, 12,088 SAPS) | 1xP15 |
| Large | E16ds_v4 (16 vCPU, 128 GiB, 24,175 SAPS) | 1xP20 |
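For reference, a minimal sketch of deploying the template directly with the Azure CLI, assuming an existing resource group (`rg-sap-app` is a placeholder) and a parameter file like the azuredeployparamfile.json mentioned below:

```bash
# Deploy azuredeploy.json into an existing resource group; parameter values
# (e.g. the subnetId) are assumed to come from the parameter file.
az deployment group create \
  --resource-group rg-sap-app \
  --template-file azuredeploy.json \
  --parameters @azuredeployparamfile.json
```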
Steps:
- Fork this repository
- Connect Azure DevOps to your forked repository (https://docs.microsoft.com/en-us/azure/devops/boards/github/connect-to-github?view=azure-devops)
- Create a pipeline similar to the example in this repository (azure-pipelines.yml) by adapting it to your Azure environment
- Add the required variables to the pipeline configuration (see the CLI sketch after this list)
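A hedged sketch of the pipeline-related steps using the Azure DevOps CLI extension; the organization, project, pipeline name, and variable names below are placeholders, and the pipeline definition itself still comes from azure-pipelines.yml:

```bash
# One-time setup: Azure DevOps CLI extension and defaults (placeholders)
az extension add --name azure-devops
az devops configure --defaults organization=https://dev.azure.com/<your-org> project=<your-project>

# Create the pipeline from the YAML file in the forked GitHub repository
az pipelines create \
  --name sap-appserver-deploy \
  --repository <your-github-account>/<your-fork> \
  --repository-type github \
  --branch main \
  --yml-path azure-pipelines.yml \
  --skip-first-run true

# Add the variables the pipeline expects (names depend on azure-pipelines.yml)
az pipelines variable create \
  --pipeline-name sap-appserver-deploy \
  --name resourceGroup \
  --value rg-sap-app
```

Whether the pipeline also triggers automatically on pushes is controlled by the trigger section of azure-pipelines.yml, which ties in with the manual-run recommendation further below.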
There are several ways to handle this situation, which is common for SAP environments. The challenge is that the deployed VM has no access to GitHub to download the diskConfig.sh script, and you might also want to keep details such as the subscription and subnetId in azuredeployparamfile.json private. For me the following solution works fine (Azure CLI sketches follow the list):
- Create a storage account in your Azure subscription and limit its network access to selected networks (the SAP subnets), for example via a private endpoint
- Create a container with read access in this storage account
- Upload the diskConfig.sh file into the container
- Get the URL and update the link to diskConfig.sh in azuredeploy.json of your forked repository
- Preferably, configure the pipeline to run only manually, to avoid automatic deployments on every repository change
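A minimal Azure CLI sketch of the storage account part, with placeholder names (`rg-sap-app`, `sapscriptstore`, `vnet-sap`, `snet-sapapp`). It uses storage firewall rules for the "selected networks" restriction; a private endpoint (`az network private-endpoint create`) would work as well:

```bash
# Storage account for the script: anonymous blob access allowed at account
# level, network access denied by default. All names are placeholders.
az storage account create \
  --name sapscriptstore \
  --resource-group rg-sap-app \
  --sku Standard_LRS \
  --allow-blob-public-access true \
  --default-action Deny

# Allow only the SAP subnet; this kind of rule requires the
# Microsoft.Storage service endpoint on the subnet.
az network vnet subnet update \
  --resource-group rg-sap-app \
  --vnet-name vnet-sap \
  --name snet-sapapp \
  --service-endpoints Microsoft.Storage

az storage account network-rule add \
  --resource-group rg-sap-app \
  --account-name sapscriptstore \
  --vnet-name vnet-sap \
  --subnet snet-sapapp
```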
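And a sketch for the container, the upload, and the URL; the container name `scripts` is again a placeholder (run these from a network that is allowed by the account firewall, or temporarily add your client IP):

```bash
# Container with anonymous read access to its blobs (still reachable only
# from the allowed networks because of the account firewall above)
az storage container create \
  --name scripts \
  --account-name sapscriptstore \
  --public-access blob \
  --auth-mode login

# Upload diskConfig.sh and print its blob URL
az storage blob upload \
  --account-name sapscriptstore \
  --container-name scripts \
  --name diskConfig.sh \
  --file diskConfig.sh \
  --auth-mode login

az storage blob url \
  --account-name sapscriptstore \
  --container-name scripts \
  --name diskConfig.sh \
  --output tsv
```

The printed URL is what replaces the GitHub link to diskConfig.sh in azuredeploy.json of your forked repository.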