Gradle Jacoco and JUnit 5 - Jenkins

We just ported our unit tests to JUnit 5 and realized that this is still rather early adoption, with few hints to be found on Google.
The most challenging part was getting Jacoco code coverage for the JUnit 5 tests, which we need on Jenkins. Since this took me almost a day to figure out, I thought I'd share. Nevertheless, if you know of a better solution I would be interested to hear it!
buildscript {
    dependencies {
        // dependency needed to run JUnit 5 tests
        classpath 'org.junit.platform:junit-platform-gradle-plugin:1.0.0-M2'
    }
}

// include the jacoco plugin
plugins {
    id 'jacoco'
}

dependencies {
    testCompile "org.junit.jupiter:junit-jupiter-api:5.0.0-M2"
    runtime "org.junit.jupiter:junit-jupiter-engine:5.0.0-M2"
    runtime "org.junit.vintage:junit-vintage-engine:4.12.0-M2"
}

apply plugin: 'org.junit.platform.gradle.plugin'
The problem then seems to be that the junitPlatformTest task defined by the org.junit.platform.gradle.plugin is created too late in the Gradle lifecycle and is therefore unknown when the build script is parsed.
The following hack is needed in order to still be able to define a Jacoco task that observes the junitPlatformTest task.
tasks.whenTaskAdded { task ->
    if (task.name.equals('junitPlatformTest')) {
        System.out.println("ADDING TASK " + task.getName() + " to the project!")
        // configure jacoco to analyze the junitPlatformTest task
        jacoco {
            // this tool version is compatible with
            toolVersion = "0.7.6.201602180812"
            applyTo task
        }
        // create junit platform jacoco task
        project.task(type: JacocoReport, "junitPlatformJacocoReport",
        {
            sourceDirectories = files("./src/main")
            classDirectories = files("$buildDir/classes/main")
            executionData task
        })
    }
}
Finally it is necessary to configure the junitPlatform plugin. The following code allows command-line selection of which JUnit 5 tags shall be run (a sketch of how this can be wired into a Jenkins pipeline follows the configuration block below).
You can run all tests carrying the 'unit' tag with:
gradle clean junitPlatformTest -PincludeTags=unit
You can run all tests that carry neither the unit nor the integ tag with:
gradle clean junitPlatformTest -PexcludeTags=unit,integ
If no tags are provided, all tests will be run (the default).
junitPlatform {
    engines {
        include 'junit-jupiter'
        include 'junit-vintage'
    }
    reportsDir = file("$buildDir/test-results")
    tags {
        if (project.hasProperty('includeTags')) {
            for (String t : includeTags.split(',')) {
                include t
            }
        }
        if (project.hasProperty('excludeTags')) {
            for (String t : excludeTags.split(',')) {
                exclude t
            }
        }
    }
    enableStandardTestTask false
}
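For context, here is a minimal sketch of how such a build might be invoked from a Jenkins scripted pipeline. The stage names, report paths, and the junit/jacoco publishing steps (provided by the Jenkins JUnit and JaCoCo plugins) are assumptions of mine and will need to be adjusted to your setup:

node {
    checkout scm
    stage('Unit tests') {
        // run only tests tagged 'unit' and build the coverage report defined above
        sh 'gradle clean junitPlatformTest junitPlatformJacocoReport -PincludeTags=unit'
    }
    stage('Publish results') {
        // reportsDir above points to build/test-results
        junit 'build/test-results/**/*.xml'
        // the jacoco step is provided by the Jenkins JaCoCo plugin; the exec file is
        // written to build/jacoco by the JacocoTaskExtension applied to junitPlatformTest
        jacoco(execPattern: 'build/jacoco/*.exec',
               classPattern: 'build/classes/main',
               sourcePattern: 'src/main')
    }
}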

Thank you, so the hack now looks like this:
project.afterEvaluate {
    def junitPlatformTestTask = project.tasks.getByName('junitPlatformTest')
    // configure jacoco to analyze the junitPlatformTest task
    jacoco {
        // this tool version is compatible with
        toolVersion = "0.7.6.201602180812"
        applyTo junitPlatformTestTask
    }
    // create junit platform jacoco task
    project.task(type: JacocoReport, "junitPlatformJacocoReport",
    {
        sourceDirectories = files("./src/main")
        classDirectories = files("$buildDir/classes/main")
        executionData junitPlatformTestTask
    })
}

This can also be resolved with direct agent injection:
subprojects {
    apply plugin: 'jacoco'

    jacoco {
        toolVersion = "0.7.9"
    }

    configurations {
        testAgent {
            transitive = false
        }
    }

    dependencies {
        testAgent("org.jacoco:org.jacoco.agent:0.7.9:runtime")
    }

    tasks.withType(JavaExec) {
        if (it.name == 'junitPlatformTest') {
            doFirst {
                jvmArgs "-javaagent:${configurations.testAgent.singleFile}=destfile=${project.buildDir.name}/jacoco/test.exec"
            }
        }
    }
}
The report will then be available via the jacocoTestReport task.
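In case the default wiring does not pick up that file, here is a minimal sketch (assuming the destfile location configured above and the old-style assignment syntax used elsewhere in this thread) of pointing jacocoTestReport at the agent's output explicitly; it would live in the same subprojects block, and the XML report is what Jenkins-side coverage publishers typically consume:

jacocoTestReport {
    // assumption: the injected agent writes to build/jacoco/test.exec as configured above
    executionData = files("$buildDir/jacoco/test.exec")
    sourceDirectories = files('src/main/java')
    classDirectories = files("$buildDir/classes/main")
    reports {
        xml.enabled = true   // consumed by Jenkins coverage publishers
        html.enabled = true
    }
}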

To get a reference to the junitPlatformTest task, another option is to implement an afterEvaluate block in the project like this:
afterEvaluate {
    def junitPlatformTestTask = tasks.getByName('junitPlatformTest')
    // do something with the junitPlatformTestTask
}
See my comments on GitHub for JUnit 5 for further examples.

You just need to add @RunWith(JUnitPlatform.class) to your test class:
import org.junit.platform.runner.JUnitPlatform;
import org.junit.runner.RunWith;

@RunWith(JUnitPlatform.class)
public class ClassTest {
}

Related

Jenkins DSL configure block makes duplicate tags

I'm trying to create a Job DSL script which creates a multibranch pipeline job.
The job is created successfully, but some configuration is missing from the multibranch pipeline job, so I tried to use the "configure" block.
The configure block was indeed applied, but it created a duplicate tag of jenkins.branch.BranchSource. I guess I am missing something; I tried tons of manipulations but nothing worked for me.
Any advice?
This is my Groovy DSL:
multibranchPipelineJob('TestDocker_pipeline_DSL') {
    branchSources {
        git {
            remote(gitUrl)
            credentialsId('Dev_Builder_ssh')
            //includes("(V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
        }
        configure {
            it / sources / data / "jenkins.branch.BranchSource" << "jenkins.plugins.git.GitSCMSource" {
                id("8fd33e1d-07b6-4cc4-8f1c-a18d955b4b6e")
                remote(gitUrl)
                credentialsId('Dev_Builder_ssh')
                traits {
                    "jenkins.scm.impl.trait.RegexSCMHeadFilterTrait" {
                        regex("V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
                    }
                }
            }
        }
    }
    factory {
        workflowBranchProjectFactory {
            scriptPath('main/Docker/DockerJenkinsfileSlave.groovy')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(3)
        }
    }
}
And this is the job XML being created:
Well, after a lot of struggling I think the problem was that I didn't declare some of the tags as plugins in the Groovy DSL, and removing the "git" section also helped.
So the final Groovy script that worked was this one:
branchSources {
    configure {
        it / sources / data / "jenkins.branch.BranchSource" << source(class: "jenkins.plugins.git.GitSCMSource", plugin: "git@3.9.2") {
            remote(gitUrl)
            credentialsId('Dev_Builder_ssh')
            includes('*')
            excludes('')
            ignoreOnPushNotifications(false)
            traits {
                "jenkins.scm.impl.trait.RegexSCMHeadFilterTrait" {
                    regex("(V[0-9]+.[0-9]+([.]+[0-9]+)*)|(master)")
                }
            }
        }
    }
}
Which resulted in this beautiful XML job:

Issues with multibranch pipeline job DSL

I am having issues with the multibranch pipeline Job DSL plugin while automating the creation of a multibranch pipeline job.
The piece I am having issues with is how to set the path to the Jenkinsfile in the repo. I have looked online for documentation but found nothing helpful. I have even tried to find example scripts, but multibranch Job DSL scripts are rare on the internet; in fact, I could not find any that has the Jenkinsfile path set in it.
jobs.groovy
folderName = "${JENKINS_PATH}"
folder(folderName)

multibranchPipelineJob("${folderName}/jenkins_multibranch_devops") {
    branchSources {
        git {
            remote("https://gitlab.com/${REPO_PATH}")
            credentialsId('gitlab_credentials')
            includes('*')
        }
    }
    configure { project ->
        project / factory {
            scriptPath('jenkins/Jenkinsfile')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(14)
        }
    }
}
Here is what I have, and it's failing because I am obviously missing something, which is why I am looking for help.
What am I missing, and where can I find documentation if I plan on adding more and more to this jobs.groovy file? I'd like to know how to figure out what to add, because the current doc page doesn't help at all.
You can set it using this:
multibranchPipelineJob("${folderName}/jenkins_multibranch_devops") {
branchSources {
git {
remote("https://gitlab.com/${REPO_PATH}")
credentialsId('gitlab_credentials')
includes('*')
}
}
factory {
workflowBranchProjectFactory {
scriptPath('jenkins/Jenkinsfile')
}
}
orphanedItemStrategy {
discardOldItems {
numToKeep(14)
}
}
}
Documentation is available through the Job DSL API viewer in your Jenkins installation: https://{your-jenkins}/plugin/job-dsl/api-viewer/index.html

How to use Job DSL with Accurev SCM?

I am using the following Groovy script to create a Job DSL job that uses Accurev as SCM.
Please let me know what the correct script should look like.
job('payer-server') {
    scm {
        accurev {
            /** What to insert here **/
        }
    }
    triggers {
        scm('H/15 * * * *')
    }
    steps {
        maven {
            goals('-e clean install')
            mavenOpts('-Xms256m')
            mavenOpts('-Xmx512m')
            properties skipTests: true
            mavenInstallation('Maven 3.3.3')
        }
    }
}
Currently there is no built-in support for Accurev SCM. Someone already filed a feature request as JENKINS-22138.
But you can use a Configure Block to generate the necessary config XML. There is an example for configuring Subversion, which can be adapted to Accurev.
job('example') {
    configure { project ->
        project.remove(project / scm) // remove the existing 'scm' element
        project / scm(class: 'hudson.plugins.accurev.AccurevSCM') {
            serverName('foo')
            // ...
        }
    }
    triggers {
        // ...
    }
    steps {
        // ...
    }
}
Please leave a comment on the feature request to describe which options of Accurev SCM you need to configure initially.

Gradle Geb saucelabs plugin

I'm following the example from http://www.gebish.org/manual/0.9.2/sauce-labs.html#gradle_geb_saucelabs_plugin but am unable to get it working. My build.gradle script is as follows:
buildscript {
    repositories {
        jcenter()
    }
    dependencies {
        classpath "org.grails:grails-gradle-plugin:2.0.0"
        classpath 'org.gebish:geb-gradle:0.9.2'
    }
}

version "0.1"
group "example"

apply plugin: "grails"
apply plugin: "geb-saucelabs"

repositories {
    grails.central() // creates a maven repo for the Grails Central repository (Core libraries and plugins)
    maven { url "http://repository-saucelabs.forge.cloudbees.com/release" }
}

grails {
    grailsVersion = '2.3.5'
    groovyVersion = '2.1.9'
    springLoadedVersion '1.1.3'
}

dependencies {
    bootstrap "org.grails.plugins:tomcat:7.0.50" // No container is deployed by default, so add this
    compile 'org.grails.plugins:resources:1.2' // Just an example of adding a Grails plugin
    sauceConnect "com.saucelabs:sauce-connect:3.0.28"
}

sauceLabs {
    browsers { //5
        firefox_linux_19 // Could not find property 'reporting' on root project 'gradleGrailsError'.
        chrome_mac
        internetExplorer_vista_9
    }
    task { //6
        testClassesDir = test.testClassesDir
        testSrcDirs = test.testSrcDirs
        classpath = test.classpath
    }
    account { //7
        username = System.getenv("SAUCE_ONDEMAND_USERNAME")
        accessKey = System.getenv("SAUCE_ONDEMAND_ACCESS_KEY")
    }
}
When I run $gradle test, I get the following error: Could not find property 'reporting' on root project... This error occurs on the line which specifies firefox_linux_19 as the browser. Can someone please advise how I can get the geb-saucelabs plugin working correctly? Thanks.
After a lot of trial and error, I got the following to work:
sauceLabs {
    tasks.withType(Test) {
        reports.junitXml.destination = reporting.file("test-results/$name")
        reports.html.destination = reporting.file("test-reports/$name")
    }
    browsers { //5
        firefox_linux_19
        chrome_mac
        internetExplorer_vista_9
    }
    account { //7
        username = System.getenv("SAUCE_ONDEMAND_USERNAME")
        accessKey = System.getenv("SAUCE_ONDEMAND_ACCESS_KEY")
    }
}
The addition of the tasks.withType(Test) block was the key, and I also removed the task closure that was listed in the sample code.

Use PMD's Copy/Paste Detector with Gradle

I'd like to use Copy/Paste Detector in my Gradle build.
This is why I've decided to translate the following Ant task (which I've found here) into Gradle syntax:
<target name="cpd">
    <taskdef name="cpd" classname="net.sourceforge.pmd.cpd.CPDTask" />
    <cpd minimumTokenCount="100" outputFile="/home/tom/cpd.txt">
        <fileset dir="/home/tom/tmp/ant">
            <include name="**/*.java"/>
        </fileset>
    </cpd>
</target>
This is how the translation looks currently:
check << {
    ant.taskdef(name: 'cpd', classname: 'net.sourceforge.pmd.cpd.CPDTask', classpath: configurations.pmd.asPath)
    ant.cpd(minimumTokenCount: '100', outputFile: file('build/reports/pmd/copyPasteDetector.txt').toURI().toString()) {
        fileset(dir: 'src') {
            include(name: '**.java')
        }
    }
}
Unfortunately, calling gradle check yields a net.sourceforge.pmd.cpd.ReportException; the stacktrace is here.
How can I scan my source code with the Copy/Paste Detector using Gradle 1.9?
Thanks!
You can also use my gradle-cpd-plugin. See https://github.com/aaschmid/gradle-cpd-plugin for further information. Applying the cpd plugin automatically adds the cpd task as a dependency of the check task.
Note: I am not very happy with the name cpd for both the extension (see toolVersion) and the task; suggestions are welcome ;-)
Currently it is at version 0.1, but I am working on switching from using CPD's Ant task internally to calling CPD directly. This will include support for all parameters etc. Here is a usage example:
apply plugin: 'cpd'

buildscript {
    repositories {
        mavenCentral()
    }
    dependencies {
        classpath 'de.aaschmid.gradle.plugins:gradle-cpd-plugin:0.1'
    }
}

// optional - default is 5.1.0
cpd {
    toolVersion = '5.0.5'
}

tasks.cpd {
    reports {
        text.enabled = true
        xml.enabled = false
    }
    source = files('src/main/java')
}
People on the Gradle forums suggest using CPD in Gradle like this:
task cpd(dependsOn: ':pmdSetup') {
    // Combine all source sets
    allSource = files {
        allprojects.findAll { proj ->
            proj.hasProperty("sourceSets")
        }.collect { proj ->
            proj.sourceSets.collect { ss ->
                ss.java
            }
        }
    }
    // Declare this task's inputs and outputs.
    inputs.files allSource
    outDir = file("$buildDirName/cpd")
    outputs.dir outDir
    // outputs.files file("$outDir.path/cpd.xml")
    doLast {
        outDir.mkdirs()
        // Keep a reference to the gradle project for use inside the
        // ant closure, where "project" refers to the ant project.
        gproj = project
        ant {
            cpd(minimumTokenCount: '100', format: 'xml',
                    outputFile: outDir.path + '/cpd.xml') {
                fileset(dir: projectDir.getPath()) {
                    // Convert the gradle sourceSet to an ant fileset.
                    allSource.each { file ->
                        include(name: gproj.relativePath(file))
                    }
                }
            }
        }
    }
}
And, of course, apply plugin: 'pmd' beforehand.
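For completeness, here is a minimal sketch of that prerequisite. Applying the Gradle pmd plugin is what creates the pmd configuration whose classpath (configurations.pmd.asPath) the CPD taskdefs in this thread rely on; the toolVersion below is only an example:

apply plugin: 'pmd'

pmd {
    // example version; the CPD classes ship with the PMD artifacts resolved via this configuration
    toolVersion = '5.1.1'
}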
The definition of my outputFile caused the problem.
I adapted this build.gradle and I'm now happy with the following solution:
check << {
    File outDir = new File('build/reports/pmd/')
    // Make sure the output dir exists to prevent a ReportException
    outDir.mkdirs()
    ant.taskdef(name: 'cpd', classname: 'net.sourceforge.pmd.cpd.CPDTask',
            classpath: configurations.pmd.asPath)
    ant.cpd(minimumTokenCount: '100', format: 'text',
            outputFile: new File(outDir, 'cpd.txt')) {
        fileset(dir: "src/main/java") {
            include(name: '**/*.java')
        }
    }
}
Thanks to Andrey Regentov and Perryn Fowler for their input.
