Having trouble getting coverage for "vars" scripts in shared library
Expected Behavior
Get coverage information for tests run via jenkins-spock in jacoco reports
Actual Behavior
I am getting coverage for anything tested with plain Spock - mostly classes in the src directory - but anything in the "vars" directory shows 0, even though the tests have run successfully
Steps to Reproduce
create a library with:
vars/testStep.groovy
def call(String arg) {
    def msg = "I was called with ${arg}"
    echo msg
    return msg
}
and a test spec for it:
import com.homeaway.devtools.jenkins.testing.JenkinsPipelineSpecification

class testStepSpec extends JenkinsPipelineSpecification {
    def step

    void setup() {
        script_class_path = ['vars']
        step = loadPipelineScriptForTest('testStep.groovy')
    }

    def "call Test"() {
        when:
        def result = step.call('test')

        then:
        result == "I was called with test"
        1 * getPipelineMock("echo")("I was called with test")
    }
}
Additional Information
This test runs fine, but the jacoco report shows no coverage for testStep.
It looks like loadPipelineScriptForTest() hides the script from jacoco - is there a way around it?
(may be worth noting that I am using gradle)
interestingly enough, if I use classpath loader and plain Spock like this:
import spock.lang.Specification

class testStepSpec extends Specification {
    def "call Test"() {
        given:
        def step = new testStep()
        step.echo = {}

        expect:
        step.call('test') == "I was called with test"
    }
}
it shows coverage, but obviously I lose all the jenkins-spock bits
Feel like I am talking to myself here, but I did find a solution that seems to split the difference and seems to work. Basically, do not use loadPipelineScriptForTest() to load vars classes. Instead, instantiate the class as you would without jenkins-spock and then call addPipelineMocksToObjects() directly to add the mocks:
import com.homeaway.devtools.jenkins.testing.JenkinsPipelineSpecification

class testStepSpec extends JenkinsPipelineSpecification {
    def "call Test"() {
        given:
        def step = new testStep()
        addPipelineMocksToObjects(step)
        step.echo = {}

        expect:
        step.call('test') == "I was called with test"
    }
}
How can 'new testStep()' work?
I mean, Groovy scripts from /vars are not visible in the test; I can't import or instantiate them.
I have a class in vars that I can't use any other way:
def constants = new GroovyScriptEngine('.').with {
    loadScriptByName('vars/Constants.groovy')
}
Oh, sounds like you do not have vars added as a directory containing sources - here is a basic Gradle example configuring both the sources and the tests (I like to separate my unit tests for the src and vars dirs):
sourceSets {
    main {
        groovy {
            srcDirs = ['src', 'vars']
        }
        resources {
            srcDirs = ['resources']
        }
    }
    test {
        groovy {
            srcDirs = ['test/src', 'test/vars']
        }
        resources {
            srcDirs = ['test/resources']
        }
    }
}
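If it helps, a minimal build.gradle around that sourceSets block might look roughly like this. This is only a sketch, not my actual build - the dependency coordinates and versions (including the jenkins-spock one) are placeholders you would need to adjust:

plugins {
    id 'groovy'
    id 'jacoco'
}

repositories {
    mavenCentral()
}

dependencies {
    // placeholder versions - use whatever matches your library
    implementation 'org.codehaus.groovy:groovy-all:2.4.21'
    testImplementation 'org.spockframework:spock-core:1.3-groovy-2.4'
    testImplementation 'com.homeaway.devtools.jenkins:jenkins-spock:2.0.0'
}

// ... the sourceSets block from above goes here ...

// generate the jacoco report whenever the tests run
test.finalizedBy jacocoTestReport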
I am using Maven, I can't use Gradle - I have compiling issues. Could you please give me a working build.gradle?
I figured out the problem: it was the package declaration in the test - IDEA added it automatically, so I removed it.
But the getPipelineMock calls are not working as they did before; I can't reproduce the issue in a simple example yet, I am on it.
You should be able to do the same thing in Maven, TBH - you just need to look up how to add a source dir (a quick google search suggests this):
<build>
  <sourceDirectory>src/, vars/</sourceDirectory>
  ...
</build>
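One caveat I should add (this is my own sketch, not something I have tested here): Maven's <sourceDirectory> only accepts a single path, so the usual way to add vars/ as an extra source root is the build-helper-maven-plugin's add-source goal, roughly like this (the version is a placeholder, and you still need a Groovy compiler plugin such as gmavenplus configured separately):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>build-helper-maven-plugin</artifactId>
  <version>3.3.0</version>
  <executions>
    <execution>
      <id>add-vars-sources</id>
      <phase>generate-sources</phase>
      <goals>
        <goal>add-source</goal>
      </goals>
      <configuration>
        <sources>
          <!-- add vars/ as an additional compile source root -->
          <source>vars</source>
        </sources>
      </configuration>
    </execution>
  </executions>
</plugin>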
My Gradle config is a bit convoluted at the moment, and much of it is not terribly useful to other people (we do crazy bits like compiling help for vars on the fly) - but I think it would be a good idea to make a clean reference repo, especially since I figured out how to properly test things under CPS (and yes, that does break the code coverage / confuses the heck out of jacoco :-( )
That said, in the interest of full disclosure, once I activated CPS I could not get much use out of jenkins-spock and ended up just using plain Spock with a wrapper to execute CPS code
I went ahead and tried to create a clean-ish reference version with Gradle and CPS support here: mlasevich/jenkins-pipeline-library-reference
Thank you very much for the reference repo, unfortunately I haven't had the opportunity to try it yet, but I am curious :)
Hi @deblaci,
I came across the same problem with jacoco-maven-plugin and finally found a workaround.
<plugins>
  <plugin>
    <groupId>org.jacoco</groupId>
    <artifactId>jacoco-maven-plugin</artifactId>
    <version>${jacoco-maven-plugin.version}</version>
    <executions>
      <!-- should have prepare-agent and/or prepare-agent-integration here; if you have
           both, set configuration.propertyName differently for each, since they are
           used separately by surefire and failsafe in their argLine -->
    </executions>
    <configuration>
      <!-- the path doesn't matter as long as it matches maven-antrun-plugin below -->
      <classDumpDir>${project.build.directory}/class-dumps</classDumpDir>
    </configuration>
  </plugin>
  <!-- workaround to use/overwrite specific classes from the class dumps -->
  <plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-antrun-plugin</artifactId>
    <version>${maven-antrun-plugin.version}</version>
    <executions>
      <execution>
        <id>overwrite-vars-classes</id>
        <!-- has to be done before merging/generating report(s) with jacoco -->
        <phase>post-integration-test</phase>
        <goals>
          <goal>run</goal>
        </goals>
        <configuration>
          <target name="overwrite the vars classes from class-dumps to classes">
            <!-- overwrite because the dumped class is older than the classes restored
                 by jacoco's restore-instrumented-classes -->
            <copy todir="${project.build.outputDirectory}" overwrite="true">
              <fileset dir="${project.build.directory}/class-dumps" includes="*.class">
                <!-- the first .* includes the groovy closures -->
                <filename regex="^foo.*\.[^.]+\.class$" />
              </fileset>
              <mapper type="regexp" from="^([^.]+)\.[^.]+\.class$" to="\1.class" />
            </copy>
          </target>
        </configuration>
      </execution>
    </executions>
  </plugin>
</plugins>
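For anyone assembling this from scratch, the prepare-agent executions that the comment above alludes to might look roughly like this - the execution ids and property names below are just placeholders of my own, not copied from a real build:

<executions>
  <execution>
    <id>prepare-unit-test-agent</id>
    <goals>
      <goal>prepare-agent</goal>
    </goals>
    <configuration>
      <!-- surefire then needs ${surefireArgLine} in its argLine -->
      <propertyName>surefireArgLine</propertyName>
    </configuration>
  </execution>
  <execution>
    <id>prepare-integration-test-agent</id>
    <goals>
      <goal>prepare-agent-integration</goal>
    </goals>
    <configuration>
      <!-- failsafe then needs ${failsafeArgLine} in its argLine -->
      <propertyName>failsafeArgLine</propertyName>
    </configuration>
  </execution>
  <execution>
    <id>report</id>
    <goals>
      <goal>report</goal>
      <goal>report-integration</goal>
    </goals>
  </execution>
</executions>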
Thanks,
Leo