Faster deployment time
Stev3nsen opened this issue · 3 comments
Hey guys,
we really appreciate your great work here!
I wanted to discuss how we can speed up our deployment times. We use Azure DevOps to build and deploy multiple (10+) http and event functions to Cloud Functions. For each function we use a gcloud CLI command. A single function deployment takes up to 2 minutes. In total we need more than 20 minutes for the whole project to deploy.
Here is a typical gcloud command we use for our http functions:
gcloud functions deploy $NAME \
  --runtime dotnet3 \
  --project $PROJECT \
  --region $REGION \
  --trigger-http \
  --security-level=secure-always \
  --entry-point $ENTRY_POINT \
  --set-build-env-vars=GOOGLE_BUILDABLE=$GOOGLE_BUILDABLE \
  --set-env-vars=GOOGLE_CLOUD_PROJECT=$PROJECT
These are the dependencies we use:
<ItemGroup>
  <PackageReference Include="FirebaseAdmin" Version="2.2.0" />
  <PackageReference Include="Google.Cloud.Firestore" Version="2.4.0" />
  <PackageReference Include="Google.Cloud.Functions.Hosting" Version="1.0.0" />
  <PackageReference Include="Google.Cloud.SecretManager.V1" Version="1.7.0" />
  <PackageReference Include="Google.Cloud.Storage.V1" Version="3.7.0" />
  <PackageReference Include="Google.Events.Protobuf" Version="1.0.0" />
  <PackageReference Include="Polly" Version="7.2.3" />
</ItemGroup>
Any ideas how we can improve our deployment times?
Are all of your functions in the same project? I am curious how you are using the GOOGLE_BUILDABLE env var.
One optimization we have been discussing is packing multiple functions into one container. There are a lot of idiosyncrasies that we would need to work out, but the idea would be to run one build, producing one container image, that gets booted up with a different GOOGLE_FUNCTION_TARGET for every function you wish to invoke. Would that be possible with how you have your project structured now? Or are you depending on us compiling each function separately?
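To make the idea a little more concrete, here is a rough sketch of the direction (this is not something gcloud functions deploy supports today, and the project path and type names below are just placeholders): the Functions Framework picks the function to serve from the FUNCTION_TARGET environment variable, so in principle one published build could be started once per function:

# One build of the project that contains all of the functions...
dotnet publish MyFunctions -c Release -o out

# ...started once per function target. The type names are placeholders, and PORT
# keeps the two local servers from listening on the same port.
PORT=8080 FUNCTION_TARGET=My.Namespace.FunctionA dotnet out/MyFunctions.dll &
PORT=8081 FUNCTION_TARGET=My.Namespace.FunctionB dotnet out/MyFunctions.dll &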
Thanks for your quick response.
We have a top-level solution (.sln) containing 3 projects (.csproj). Fairly simple:
- Domain.csproj
- Application.csproj
- Infrastructure.csproj
The infrastructure project depends on the domain and application projects. It also contains all of our functions. If we want to deploy two functions A and B, we execute the following (simplified) commands:
gcloud functions deploy A --set-build-env-vars=GOOGLE_BUILDABLE=Infrastructure --entry-point Namespace.To.Our.FunctionA
gcloud functions deploy B --set-build-env-vars=GOOGLE_BUILDABLE=Infrastructure --entry-point Namespace.To.Our.FunctionB
As the README says in the section "Deploying a function with a local project dependency", we have to use the GOOGLE_BUILDABLE env var.
I think your idea to use one container image for multiple function targets could work quite nicely with our project structure. We have several functions that are triggered by document changes in our Firestore database. Could this be a problem?
Is there any possibility to deploy functions in parallel?
Hi @Stev3nsen, as you've probably seen, I don't believe the gcloud functions deploy command has a way to skip the waiting. I've filed an internal feature request for you, but I don't have an ETA.
In the meantime, my recommendation would be to:
- Parallelize the commands themselves using your shell or CD system (see the sketch after this list).
- Try using an infrastructure-as-code tool that can run deploys in parallel, like Terraform's google_cloudfunctions_function resource.
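For the first option, here is a minimal bash sketch for a CD step, using the same simplified flags from earlier in this thread (your real commands would also include --project, --region, the trigger flags, and so on):

#!/usr/bin/env bash
# Start each deployment in the background so they run in parallel.
pids=()

gcloud functions deploy A \
  --set-build-env-vars=GOOGLE_BUILDABLE=Infrastructure \
  --entry-point Namespace.To.Our.FunctionA &
pids+=($!)

gcloud functions deploy B \
  --set-build-env-vars=GOOGLE_BUILDABLE=Infrastructure \
  --entry-point Namespace.To.Our.FunctionB &
pids+=($!)

# Fail the pipeline step if any individual deployment failed.
status=0
for pid in "${pids[@]}"; do
  wait "$pid" || status=1
done
exit "$status"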
I'm going to close this issue in the meantime, since it affects more than just the .NET Functions Framework, but please reach out if you need any other help!