Can the Jenkins JaCoCo plugin produce multiple reports?

I'm building a JaCoCo report in my Jenkins pipeline:
pipeline {
    stages {
        stage("One") {
            steps {
                // build something...
                jacoco()
            }
        }
    }
}
This works well with just the one report. But when I add a second one:
pipeline {
    stages {
        stage("One") {
            steps {
                // build something...
                jacoco()
            }
        }
        stage("Two") {
            steps {
                // build something else...
                jacoco()
            }
        }
    }
}
... then Jenkins shows two JaCoCo report buttons in the sidebar, but they both link to the same report. It looks like the second report is generated but overwrites the first.
Can I configure something so both reports are available?
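A commonly suggested workaround (a sketch, not verified against your plugin version) is to publish a single report once, at the end of the pipeline, covering all of the collected .exec files instead of calling jacoco() per stage. execPattern, classPattern and sourcePattern are standard parameters of the jacoco step; the concrete patterns below are only illustrative:

pipeline {
    agent any
    stages {
        stage("One") {
            steps {
                // build module one... (no jacoco() here)
            }
        }
        stage("Two") {
            steps {
                // build module two...
            }
        }
    }
    post {
        always {
            // Publish one report that covers every collected .exec file.
            // The patterns below are illustrative placeholders.
            jacoco(execPattern: '**/**.exec',
                   classPattern: '**/classes',
                   sourcePattern: '**/src/main/java')
        }
    }
}

This only avoids the overwrite by publishing once; if you truly need two separate report links per build, people sometimes generate HTML reports per module and publish each one under a different name with the HTML Publisher plugin, but that is outside the jacoco step itself.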

Related

User acceptance issue in a Jenkins pipeline: how to implement multiple acceptance steps in a single pipeline

I am trying to create a pipeline in which, after deployment, I perform a functional test and, based on its result, decide whether or not to proceed. I used the Jenkins "input" feature. I get the message asking whether to proceed, but when I click OK nothing happens; the build just stays stuck there. Also, after the first approval I have a second approval step, and only after that should the result be released.
I am not able to understand how to achieve this, as I am new to it. The pipeline code is below:
pipeline {
    agent any
    tools {
        // Install the Maven version configured as "M3" and add it to the path.
        maven "mvn"
        jdk "jdk8"
    }
    stages {
        stage('SCM Checkout') {
            steps {
                println "============= SCM Checkout =============="
            }
        }
        stage('Code Inspection') {
            steps {
                println "============== SonarQube Scanning ======================="
            }
        }
        stage('Build, Package & JUnit') {
            steps {
                println "============== Build, Package & JUnit ================"
            }
        }
        stage('Deploy') {
            steps {
                println "============== Deploy and Split Traffic =================="
            }
        }
        stage('Functional & Performance Test') {
            steps {
                println "=========== Functional and Performance Test ==============="
            }
        }
        stage('A/B Testing') {
            input {
                message "Functional & Performance Test done. Should we continue?"
                ok "OK"
            }
            steps {
                println "=========== A/B Testing ==============="
            }
        }
        stage('Release') {
            input {
                message "A/B Testing done. Should we continue?"
                ok "OK"
            }
            steps {
                println "========= Final Release =================="
            }
        }
    }
}
Is there any other way to achieve this, or how can I improve this code to achieve the desired result?
Use the input feature like this:
stage('Release') {
    steps {
        input message: "A/B Testing done. Should we continue?"
        println "========= Final Release =================="
    }
}
Make sure you also have the Pipeline: Input Step plugin (a component of the Pipeline plugin) installed and activated.
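For the second approval the same pattern simply repeats. A minimal sketch of two sequential approval stages to drop into the existing stages block (messages taken from the question):

stage('A/B Testing') {
    steps {
        input message: "Functional & Performance Test done. Should we continue?", ok: "OK"
        echo "=========== A/B Testing ==============="
    }
}
stage('Release') {
    steps {
        input message: "A/B Testing done. Should we continue?", ok: "OK"
        echo "========= Final Release =================="
    }
}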

Multiconfiguration / matrix build pipeline in Jenkins

What is modern best practice for multi-configuration builds (with Jenkins)?
I want to support multiple branches and multiple configurations.
For example for each version V1, V2 of the software I want builds targeting
platforms P1 and P2.
We have managed to set up multi-branch declarative pipelines. Each build has its own Dockerfile, so it's easy to support multiple platforms.
pipeline {
    agent none
    stages {
        stage('Build, test and deploy for P1') {
            agent {
                dockerfile {
                    filename 'src/main/docker/Jenkins-P1.Dockerfile'
                }
            }
            steps {
                sh 'buildit...'
            }
        }
        stage('Build, test and deploy for P2') {
            agent {
                dockerfile {
                    filename 'src/main/docker/Jenkins-P2.Dockerfile'
                }
            }
            steps {
                sh 'buildit...'
            }
        }
    }
}
This gives one job covering multiple platforms, but there is no separate red/blue status for each platform.
There is a good argument that this does not matter, as you should not release unless the build works on all platforms.
However, I would like a separate status indicator for each configuration. This suggests I should use a multi-configuration build which triggers a parameterised build for each configuration as below (and the linked question):
pipeline {
    parameters {
        choice(name: 'Platform', choices: ['P1', 'P2'], description: 'Target OS platform')
    }
    agent {
        filename someMagicToGetDockerfilePathFromPlatform()
    }
    stages {
        stage('Build, test and deploy for P1') {
            steps {
                sh 'buildit...'
            }
        }
    }
}
There are several problems with this:
A declarative pipeline has more constraints on how it is scripted.
Multi-configuration builds cannot trigger declarative pipelines (even with the parameterized trigger plugin I get "project is not buildable").
This also raises the question: what use are parameters in declarative pipelines?
Is there a strategy that gives the best of both worlds, i.e.:
pipeline as code
separate status indicators
limited repetition?
This is a partial answer. I think others with better experience will be able to improve on it.
This is currently untested. I may be barking up the wrong tree.
Please comment or add a better answer.
Do not use pipeline parameters except where you need user input
Use a hybrid of a scripted and declarative pipeline
(see also https://stackoverflow.com/a/46675227/1569204)
Have a function which declares a pipeline based on parameters:
(see also https://jenkins.io/doc/book/pipeline/shared-libraries/)
Use nodes to create visible indicators in the pipeline (at least in Blue Ocean)
So something like the following:
def build(String platform) {
    def dockerFile
    def indicator
    switch (platform) {
        case 'P1':
            dockerFile = 'foo'
            indicator = 'build for foo'
            break
        case 'P2':
            dockerFile = 'bar'
            indicator = 'build for bar'
            break
    }
    pipeline {
        agent {
            dockerfile {
                filename "$dockerFile"
            }
            node {
                label "$indicator"
            }
        }
        stages {
            stage("$indicator") {
                steps {
                    echo "build it"
                }
            }
        }
    }
}
The relevant code could be moved to a shared library (even if you don't actually need to share it).
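For illustration, if the build function above were exposed from a shared library as the call method of vars/buildPlatform.groovy (the library and file names here are assumptions, not something from the answer), each platform could then have a thin Jenkinsfile of its own, which is what yields one status indicator per configuration:

// Jenkinsfile for the P1 configuration (a sketch; 'my-shared-lib' is a placeholder library name)
@Library('my-shared-lib') _

// Invokes the global variable defined in vars/buildPlatform.groovy
buildPlatform('P1')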
I think the cleanest approach is to have all of this in a pipeline similar to the first one you presented; the only modification I would make is to run the platform stages in parallel, so you actually build and test for both platforms.
To reuse the previous stage's workspace you could use: reuseNode true
Something similar to the following flow would give you parallel builds for the platforms:
pipeline {
    agent 'docker'
    stages {
        stage('Common pre') { ... }
        stage('Build all platforms') {
            parallel {
                stage('Build, test and deploy for P1') {
                    agent {
                        dockerfile {
                            filename 'src/main/docker/Jenkins-P1.Dockerfile'
                            reuseNode true
                        }
                    }
                    steps {
                        sh 'buildit...'
                    }
                }
                stage('Build, test and deploy for P2') {
                    agent {
                        dockerfile {
                            filename 'src/main/docker/Jenkins-P2.Dockerfile'
                            reuseNode true
                        }
                    }
                    steps {
                        sh 'buildit...'
                    }
                }
            }
        }
        stage('Common post parallel') { ... }
    }
}
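As an aside (not from the answers above, so treat it as a sketch to verify on your Jenkins version): newer Declarative Pipeline also has a matrix directive, which expands one stage definition per axis value. That keeps repetition low and shows a separate cell per platform in Blue Ocean, which partly addresses the wish for separate status indicators:

pipeline {
    agent none
    stages {
        stage('Build, test and deploy') {
            matrix {
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'P1', 'P2'
                    }
                }
                stages {
                    stage('Build') {
                        agent {
                            dockerfile {
                                // Assumes the axis value resolves in the filename; verify on your setup.
                                filename "src/main/docker/Jenkins-${PLATFORM}.Dockerfile"
                            }
                        }
                        steps {
                            sh 'buildit...'
                        }
                    }
                }
            }
        }
    }
}

Note this still rolls everything into one job status, so it complements rather than replaces the separate-jobs approach if you need independent red/blue indicators per platform.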

How to add "single conditional steps" under build section using dsl script

I'm currently trying to develop a DSL script that can create a Jenkins job with all required plugins and options.
I think I've almost completed all the sections, but I'm stuck on the Build section, where I have to include "Conditional steps (single)" under Build.
What I want is this (first screenshot), but what I get is this (second screenshot).
Here's the code that I used:
job('Sample_dev') {
    steps {
        conditionalSteps {
            condition {
                alwaysRun()
            }
        }
        maven {
            goals('install')
        }
    }
}
You have made a few mistakes there:
You used the multi-step DSL to achieve a single step.
You pushed maven outside the conditional context, as an individual step.
The DSL for the Maven step declaration is wrong.
Try the following:
job('Sample_dev') {
    steps {
        singleConditionalBuilder {
            condition {
                alwaysRun()
            }
            buildStep {
                maven {
                    targets('install')
                    name('')
                    pom('')
                    properties('')
                    jvmOptions('')
                    usePrivateRepository(false)
                    settings {
                        standard()
                    }
                    globalSettings {
                        standard()
                    }
                    injectBuildVariables(false)
                }
            }
            runner {
                fail()
            }
        }
    }
}
The creator has published most of the API documentation at https://jenkinsci.github.io/job-dsl-plugin. But I would suggest you use the API viewer on your local instance, via http://<your-jenkins-host>:<port>/plugin/job-dsl/api-viewer/index.html, since Job DSL supports auto-generation, so there is a good chance that a plugin not listed on the public site still has DSL support on your installation.

How can I implement a retry option using Jenkins pipeline plugin

Currently, with the Build Flow plugin, we use the following approach. This code will retry twice.
programs_create_servers_retry_count = 2
retry(programs_create_servers_retry_count) {
    build("create_virtual_servers", j_SL_data_center_local: programs_create_servers_dc_1, j_random_id_local: random_id)
}
How can I do the same with the Jenkins Pipeline plugin?
Check the retry step in the Jenkins pipeline DSL: https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#retry-retry-the-body-up-to-n-times
An example piece of code will look like this:
stage('stageName') {
    try {
        ...
    } catch (error) {
        retry(3) {
            do smth
        }
    }
}
Here, 3 is the number of retry attempts.
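Alternatively, since retry is an ordinary pipeline step, you can wrap the downstream build call directly, which maps more closely onto the original build-flow snippet. A sketch in scripted pipeline syntax; the job name and parameter names come from the question, while the parameter types and the way the values are obtained are assumptions:

def dataCenter = 'dc1'                          // placeholder for programs_create_servers_dc_1
def randomId = UUID.randomUUID().toString()     // placeholder for random_id

retry(2) {
    build job: 'create_virtual_servers', parameters: [
        string(name: 'j_SL_data_center_local', value: dataCenter),
        string(name: 'j_random_id_local', value: randomId)
    ]
}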

Visualize Jenkins pipeline or multibranch pipeline jobs

I have one Pipeline job for each component in my Jenkins 2.0. All of them
consist of many stages (build, UT, IT, etc.), so each works as a
pipeline for its component.
The components depend on each other in a specified order, so I used "Build after other projects are built" (I also tried the JobFanIn plugin) to trigger these "mini-pipelines" after each other. This works like a pipeline of "mini-pipelines".
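(For reference, this kind of chaining can also be written into the downstream component's own Jenkinsfile with an upstream trigger; a minimal sketch, where 'component-A' stands in for the upstream job name:)

pipeline {
    agent any
    triggers {
        // Run this component's pipeline after the upstream component builds successfully.
        upstream(upstreamProjects: 'component-A', threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Build') {
            steps {
                echo 'build this component'
            }
        }
    }
}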
I'd like to visualize the relationship between the jobs. For this purpose I've found 2 plugins:
Delivery Pipeline Plugin
Build Pipeline Plugin
Both introduce a new view type, but neither of them supports the "Pipeline" or "Multibranch Pipeline" job types (introduced in Jenkins 2.0); these jobs are not visible in the related dropdown list on the view configuration page.
How can I visualize the relation of these job types? Is there any other plugin which supports these types?
Thinking about this: I don't think a visualisation of multibranch pipelines makes sense in the same way it would for a single-branch build.
The reason is that each branch of a multibranch pipeline can have a different build configuration, e.g. with master triggering a promotion job but another branch doing something else or nothing.
So the best one could do, I think, is trace an individual build number and its links. It can't be done at the job level.
The Jenkins Blue Ocean plugin gives a rich view that visualizes all stage types (parallel, sequential) out of the box.
Say you have a pipeline like this:
pipeline {
    agent any
    stages {
        stage('build') {
            stages {
                stage('compile') {
                    steps {
                        echo "steps for unit test"
                    }
                }
                stage('security scan') {
                    parallel {
                        stage('sonarqube') {
                            steps {
                                echo "steps for parallel sonarqube"
                            }
                        }
                        stage('blackduck') {
                            steps {
                                echo "steps for parallel blackduck"
                            }
                        }
                    }
                }
                stage('package') {
                    steps {
                        echo "steps for package"
                    }
                }
            }
        }
        stage('deployment') {
            stages {
                stage('dev') {
                    steps {
                        echo "Development"
                    }
                }
                stage('pp') {
                    when { branch 'master' }
                    steps {
                        echo "PreProduction"
                    }
                }
                stage('prod') {
                    when {
                        branch 'master'
                        beforeInput true
                    }
                    input {
                        message "Deploy to production?"
                        id "simple-input"
                    }
                    steps {
                        echo "Production"
                    }
                }
            }
        }
    }
}
It will be visualized like this (Blue Ocean screenshot):
Is this what you are looking for?
Note: the view can be customized, but it is per build; you can't create a dashboard from it that combines everything in one place.
