griddynamics/mpl

Active modules list cannot be used as a stack in parallel builds

Closed this issue · 10 comments

When running parallel builds, there is a concern about how the active modules list is maintained. Currently it is used as a stack in MPLModule.groovy:

It is pushed like this:

try {
  MPLManager.instance.pushActiveModule(module_path)
  Helper.runModule(module_src, module_path, [CFG: Helper.flatten(cfg)])
}

and popped like this:

finally {
  .........
  MPLManager.instance.popActiveModule()
}

When executing in parallel, the active modules list is the same for all branches, so there might be instances where one branch pops an active module that was pushed by another branch.
Therefore, should we use the active modules list as just a list instead of a stack? We would only need to remove the module from the list by giving its path, as follows.

In MPLManager.groovy, change popActiveModule to:

public popActiveModule(String path) {
  if( activeModules.contains(path) ) {
    activeModules -= path
  }
}
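
For context, here is a minimal scripted-pipeline sketch of how the race can occur (the branch and module names are made up for the example): both branches go through the same MPLManager singleton, so the pop in one branch's finally block can remove the module that the other branch pushed.

// Hypothetical Jenkinsfile sketch - both branches share one MPLManager,
// so the activeModules stack is mutated concurrently
parallel(
  'unit-tests': {
    node {
      MPLModule('Test')               // pushes 'Test' onto the shared list...
    }                                 // ...and pops whatever is on top at that moment
  },
  'integration-tests': {
    node {
      MPLModule('Integration Test')   // may be popped by the other branch instead
    }
  }
)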

Hi @Gayanvir ,

Yeah, I think it's a major issue. Need to check how to fix it for the parallel execution. Will keep you updated.

Thank you

Sooo, unfortunately the fix will not be simple - MPL needs to split the execution context... With configs I think it's simple - CFG will be a full copy of the parent config, so there is no issue there. But the shared context like activeModules, postStepsErrors and modulePostSteps will be affected.
I think we can use the step id from the pipeline to identify the right branch. It will help with choosing the right context and will allow executing any number of parallel threads per pipeline.
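
A rough sketch of that direction (the contexts map and the branchId parameter are assumptions for illustration, not existing MPL code): keep one state holder per parallel branch, keyed by an identifier derived from the pipeline, so push/pop in one branch cannot touch another branch's list.

// Hypothetical sketch only - MPLManager does not currently keep per-branch state
class MPLManager implements Serializable {
  // One context per parallel branch instead of one shared set of fields
  private Map contexts = [:]

  // branchId is assumed to come from the pipeline (e.g. the enclosing block ids)
  private Map contextFor(String branchId) {
    if( !contexts.containsKey(branchId) )
      contexts[branchId] = [activeModules: [], postStepsErrors: [:], modulePostSteps: [:]]
    contexts[branchId]
  }

  public pushActiveModule(String branchId, String path) {
    contextFor(branchId).activeModules << path
  }

  public popActiveModule(String branchId, String path) {
    contextFor(branchId).activeModules -= path
  }
}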

I am new to MPL and Jenkins, so please correct me if I am wrong. What I noticed is that activeModules cannot contain duplicate modules. Here https://github.com/griddynamics/mpl/blob/master/vars/MPLModule.groovy#L36 it says "Loop protection: There is no way to run currently active module again", so I guess activeModules needs to hold unique values. That might not be well suited for parallel stages, but if we keep this restriction, I think we can simply remove a module from activeModules by its path, just like
activeModules -= path

instead of
activeModules.pop()

No, activeModules is also a part of other mechanisms inside MPL, used to determine the poststeps and so on. I was able to run multiple Test modules from parallel stages, but I think that was just luck - and for sure, if we go further with complex parallel builds, it will cause multiple issues.

The infinite loop protection is there to make sure a module will not call itself again (but it can call the parent one).
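
To be concrete, that protection boils down to a containment check before the push - something roughly like this (a simplified sketch of the idea, not the verbatim code from MPLModule.groovy; active_modules here just stands for the manager's current list):

// Simplified sketch of the loop protection idea
// (active_modules stands for the manager's current active modules list)
if( active_modules.contains(module_path) )
  error "Loop detected: module '${module_path}' is already active, not running it again"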

Checking the ways to solve the issue - it seems that parallel builds in the pipeline are still not good for daily use, so be careful: env is shared across the whole pipeline.
Anyway, I think there is another way to pass the build identifier; hopefully I will find it soon.
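
To illustrate why env can't carry a per-branch identifier - in a scripted pipeline, assignments to env are build-global, so parallel branches can overwrite each other (the branch names and the MPL_BRANCH variable are made up for the example):

// Sketch: env is shared by the whole build, not per branch
parallel(
  'a': { env.MPL_BRANCH = 'a'; sleep 1; echo "branch a sees: ${env.MPL_BRANCH}" },
  'b': { env.MPL_BRANCH = 'b'; sleep 1; echo "branch b sees: ${env.MPL_BRANCH}" }
)
// Both echo steps will typically print the same last-written value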

Seems I'm moving forward with the following solution:

  1. Adding StepStartNode / StepEndNode guards around the running of the module (additionally I need to identify them somehow)
  2. Using CpsThread.current().head.get().getEnclosingBlocks() to determine the current branch of the modules (see the sketch after this list)
  3. Following the branch to make sure we use the right MPLManager context.
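
A sketch of step 2, using the call mentioned in the list (this relies on Jenkins workflow-cps internals, and the currentBlockIds() helper is hypothetical, so treat it as an assumption of the approach rather than final code): take the currently executing flow node and collect the ids of its enclosing block-start nodes - that chain is unique per parallel branch and can be used to pick the right MPLManager context.

import com.cloudbees.groovy.cps.NonCPS
import org.jenkinsci.plugins.workflow.cps.CpsThread

// Collect the ids of the blocks enclosing the currently executing step;
// the resulting chain identifies the parallel branch the module runs in
@NonCPS
def currentBlockIds() {
  def node = CpsThread.current().head.get()
  def ids = []
  for( def block : node.getEnclosingBlocks() )
    ids << block.getId()
  return ids
}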

The first step is partially done, and the Pipeline Steps view of the build already looks much better - each module is now in a separate block:
[screenshot: after_guards]

Just improved the view of the Pipeline Steps:

  • Hid the rows that check for the existence of .jenkins project modules - used Helper.pathExists() instead of the fileExists() step.
  • Added LabelAction and ArgumentsAction to store the MPL data in the blocks (a minimal sketch of the LabelAction idea follows below).
    [screenshot: improved_view]
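
For the LabelAction part, a minimal sketch of the idea (this uses internal Jenkins API, and module_name here is just a placeholder for the module being run):

import org.jenkinsci.plugins.workflow.actions.LabelAction
import org.jenkinsci.plugins.workflow.cps.CpsThread

// Attach a human-readable label to the current flow node so the module
// block shows up with its name in the Pipeline Steps view
def node = CpsThread.current().head.get()
node.addAction(new LabelAction("MPL: ${module_name}"))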

Hi @Gayanvir ,
I just prepared the fix - it still needs polishing, but it looks like it's working well. The one thing I still need to check is the complicated tests execution.
Could you please check the fix with your pipeline logic?

Thank you

Hi @sparshev ,
Sorry for the late reply, and I highly appreciate your efforts on this. Our pipeline is not very complex, but it has parallel builds. I tried your fixes and it is working well. Good job!!!

Thanks

@Gayanvir thank you for testing - I will test it a little bit more and merge it into master asap.