- Create the pipeline in a test repo.
- Create a simple Jenkinsfile to load the files.
- Test various configurations:

  ```groovy
  load('src/test.groovy')
  load('vars/test.groovy')
  ```
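A minimal Jenkinsfile for the experiment might look like the sketch below; the stage name is arbitrary and the two paths are simply the layouts being tested.

```groovy
// Sketch of a minimal test Jenkinsfile (stage name and paths are illustrative).
pipeline {
    agent any
    stages {
        stage('load test') {
            steps {
                script {
                    // load() returns whatever the loaded file returns -- see the caveats below.
                    def fromSrc  = load('src/test.groovy')
                    def fromVars = load('vars/test.groovy')
                }
            }
        }
    }
}
```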
It can be done! The files can be loaded as vars and used as functions.
There are some tricks/caveats:
- The script must end with `return this`.
- Unless the var is defined prior to the `pipeline` block, it will go out of scope like any other var when its scope closes. With

  ```groovy
  steps { script { def testScript = load('src/test_src.groovy') } }
  ```

  `testScript` will go out of scope when the `script` block closes. Instead, define the var prior to the `pipeline` block:

  ```groovy
  def testScript

  pipeline {
      agent any
      stages {
          stage('test') {
              steps {
                  script {
                      testScript = load('src/test_src.groovy')
                  }
              }
          }
      }
  }
  ```
- The file must be loaded, and subsequently used, inside `script` blocks.
- You can define a `call()` function in the loaded file and invoke it using the name of the var holding the script:

  ```groovy
  // in the loaded file
  def call() { echo 'this is a call' }
  ```

  ```groovy
  // in the Jenkinsfile
  script { testScript() }
  ```
- You can also use the loaded script's class members inline:

  ```groovy
  script { def aValue = load('src/test_src.groovy').aFunction() }
  ```
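Putting the caveats together, a loadable file could look something like this sketch. The path `src/test_src.groovy`, the function name `aFunction()`, and the echoed messages are illustrative, not required conventions.

```groovy
// src/test_src.groovy -- hypothetical loadable script (names are illustrative)

// Makes the loaded object callable as testScript()
def call() {
    echo 'this is a call'
}

// A regular method, callable as testScript.aFunction()
def aFunction() {
    echo 'aFunction was called'
    return 42
}

// Required: hand the script object back to load()
return this
```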
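And a sketch of how it might be consumed from a Jenkinsfile, with the var declared before the `pipeline` block as described above:

```groovy
def testScript

pipeline {
    agent any
    stages {
        stage('use loaded script') {
            steps {
                script {
                    testScript = load('src/test_src.groovy')
                    testScript()                          // runs call()
                    def aValue = testScript.aFunction()   // regular method call
                    echo "aFunction returned ${aValue}"
                }
            }
        }
    }
}
```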
This is a viable way to load local files as vars and use them as functions. It's not as clean as loading them as shared libraries, but it's a good option for small, simple functions that don't need to be shared across multiple pipelines.