terraform-google-modules/terraform-google-github-actions-runners

Support multiple GitHub Actions runners per compute instance

raj-saxena opened this issue · 4 comments

It might be useful in some cases to register multiple action runners on a large compute instance, both to parallelize jobs and to utilise the instance's remaining capacity while the instance group is scaling up (which takes some time).

I have made and tested the changes in my fork (https://github.com/raj-saxena/terraform-google-github-actions-runners/pull/2/files).
Example:

module "runner-mig" {
  source                     = "github.com/raj-saxena/terraform-google-github-actions-runners//modules/gh-runner-mig-vm?ref=1468894a76e090779f9a0a7a8d004722cf50a585"
  create_network             = true
  project_id                 = var.project_id
  repo_owner                 = var.repo_owner
  gh_token                   = var.gh_token
  gh_runner_instances_per_vm = 3
}
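On the VM itself, the fan-out that a setting like `gh_runner_instances_per_vm = 3` implies could look roughly like the sketch below. This is not the module's actual startup script; it is an illustration assuming GitHub's standard self-hosted runner tooling (`config.sh --unattended` and `svc.sh` from the runner release tarball), where each runner instance needs its own working directory. `REPO_URL` and `RUNNER_TOKEN` are hypothetical inputs.

```shell
#!/usr/bin/env bash
# Sketch: register RUNNER_COUNT self-hosted runner instances on one VM,
# each in its own directory. The actual registration/service commands are
# left commented out, since they require a real repo URL and runner token.
register_runners() {
  local count="${1:-3}"
  local i dir
  for i in $(seq 1 "$count"); do
    dir="/opt/actions-runner-$i"
    echo "registering runner $i in $dir"
    # mkdir -p "$dir" && tar -xzf actions-runner.tar.gz -C "$dir"
    # (cd "$dir" && ./config.sh --unattended \
    #   --url "$REPO_URL" --token "$RUNNER_TOKEN" \
    #   --name "$(hostname)-$i")
    # (cd "$dir" && sudo ./svc.sh install && sudo ./svc.sh start)
  done
}
```

Each registered instance shows up as a separate runner in the repository's runner list, which is what the screenshots below demonstrated.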


@bharathkkb Would this be something that you are interested in? If yes, I'll create a PR soon based on your reply.

@ustuehler-brands4friends would it be possible to give some feedback on the above? 👆🏼

I am not a maintainer of this project, but I would rather have multiple smaller instances on standby or use a different runtime (for example, GKE) to scale up runners quickly.

When multiple jobs are executing on the same machine, they might compete for resources and run unpredictably. Even worse, if the same runner VM accepts jobs from multiple repositories and/or workflows, secrets from one job could be leaked to another job that should not have access to them.

Thanks for your feedback.
Our main reason for avoiding GKE as the runtime is that some of our jobs use containers and run inside them. On GKE this would mean running containers inside the action-runner containers, which comes with its own set of problems and is best avoided.

While the concerns you raise are valid, we are comfortable accepting them as trade-offs in our organisation for now.

Assuming the same may be true for other users of the module, we wanted to build this flexibility into the module for users who are in a similar situation, understand the trade-offs, and are okay with them.

This issue is stale because it has been open 60 days with no activity. Remove the stale label or comment, or this will be closed in 7 days.