Using multiple CheckStylePublisher in Jenkins pipeline

My Jenkins pipeline definition looks like this (excerpt):
eslint: {
    stage('ES Lint') {
        sh "bin/eslint ..."
        step([$class: 'CheckStylePublisher', pattern: ...])
    }
},
stylelint: {
    stage('Style Lint') {
        sh "..."
        sh "..."
        step([$class: 'CheckStylePublisher', pattern: '...'])
        step([$class: 'CheckStylePublisher', pattern: '...'])
    }
},
...
I have many steps that call the CheckStylePublisher. Unfortunately, when the output is displayed in the Jenkins UI, I see several CheckStylePublisher graphs, but there is no indication of which step each one belongs to.
Is there any way to name the CheckStylePublisher instances and link them to the stage name somehow? Thanks

Related

How to pass parameters from a Jenkinsfile to a Groovy script file, when both files are in different jobs

I have two Jenkins jobs, say Job1 and Job2. Job1 executes from a Jenkinsfile and Job2 executes from a Groovy script.
The parameters are key-value pairs, and they need to be passed from the Jenkinsfile to the Groovy script so that Job2 runs in parallel with Job1.
Please suggest a way to achieve this.
Below is my code:
Jenkinsfile (caller) used in Job1:
#!groovy
library <internal library I am using>

pipeline {
    agent any
    stages {
        stage('callParallelJob') {
            steps {
                script {
                    // ------- Some code ------
                    def gitCommitterEmail = sh(
                        script: 'git --no-pager show -s --format=\'%ae\'',
                        returnStdout: true
                    ).trim()
                    echo "git committer: ${gitCommitterEmail}"
                    appConfig.gitCommitterEmail = gitCommitterEmail
                    def gitBranch = "${env.GIT_BRANCH}".trim()
                    appConfig.gitBranch = gitBranch
                    def jenkinsBuildUrl = "${env.BUILD_URL}".trim()
                    appConfig.jenkinsBuildUrl = jenkinsBuildUrl
                    def ciData = [
                        'git': [
                            'git_repo': "${git_repo}",
                            'git_branch': "${env.BRANCH_NAME}",
                            'git_committer': "${gitCommitterEmail}",
                            'git_commit': "${GIT_COMMIT}"
                        ],
                        'jenkins': [
                            'build_url': "${env.BUILD_URL}",
                            'job_name': "${JOB_NAME}",
                            'timestamp': "${BUILD_TIMESTAMP}"
                        ]
                    ]
                    writeYaml file: 'ci.yml', data: ciData
                    echo 'calling another job'
                    build job: '<job2Namehere>', parameters: [
                        string(name: 'Testparam', value: "123")
                    ], propagate: true, wait: false
                    echo 'called another job'
                }
            }
        }
    }
}
I want to pass the appConfig and ci.yml data to the Groovy script below.
Groovy file used in Job2:
pipeline {
    agent any
    stages {
        stage('Experiment') {
            steps {
                script {
                    sh 'echo ${Testparam}'
                    // ------- Some code here -------
                }
            }
        }
    }
}
You need to configure the called job to accept parameters, then your code should work.
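For reference, here is a minimal sketch of what the called job (Job2) could look like once it declares the parameter. It assumes the Testparam name used by the caller above; everything else is illustrative:
pipeline {
    agent any
    parameters {
        // Must match the name used in the upstream 'build job:' call
        string(name: 'Testparam', defaultValue: '', description: 'Value passed from Job1')
    }
    stages {
        stage('Experiment') {
            steps {
                script {
                    // The parameter is available both as params.Testparam and as an environment variable
                    echo "Received: ${params.Testparam}"
                    sh 'echo ${Testparam}'
                }
            }
        }
    }
}
Note that Jenkins only registers new parameter definitions after the job has run once, so the very first build after adding the parameters block may not see the passed value.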

jenkins pipeline, unstash from a sub job

I have a separate build pipeline that uses a Jenkinsfile to build the code.
I trigger it from a deploy pipeline and want to get the build results.
The reason for this is that devs can define the build steps, but deploy is out of their control.
Here's sample code in Jenkins Job Builder:
- job:
    name: Build and Deploy
    project-type: pipeline
    dsl: |
      node {
        stage('Build') {
          // that job does stash inside a Jenkinsfile
          build job: "Build"
          sh 'cp -rv "../Build/dist/" ./'  // this is a workaround
          stash includes: 'dist/*.zip', name: 'archive'
        }
        stage('Deploy') {
          unstash 'archive'
          sh "..."
        }
      }
So how can I unstash code that was stashed in a sub-job?
P.S.: there's also a workaround with artefacts:
In a sub-job:
archiveArtifacts artifacts: '*.zip', fingerprint: true
main DSL:
dsl: |
  node {
    def build_job_number = 0
    def JENKINS = "http://127.0.0.1:8080"
    stage('Build') {
      def build_job = build job: "Build"
      build_job_number = build_job.getNumber()
    }
    stage('Deploy') {
      sh "wget -c --http-user=${USER} --http-password=${TOKEN} --auth-no-challenge ${JENKINS}/job/Build/${build_job_number}/artifact/name.zip"
      sh "..."
    }
  }
The issue here is that an API token is required.
If you go with archiveArtifacts, you can use copyArtifacts to complement it.
As far as I know, stash/unstash only work within the same job, so your other option would be to tick "Preserve stashes from completed builds" in the pipeline's configuration so you can reuse them.
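For example, if you go the archiveArtifacts route, a sketch combining it with the build-number workaround above via the Copy Artifact plugin could look like this (it assumes the sub-job is still named "Build" and archives its *.zip with archiveArtifacts):
node {
    def build_job_number = 0
    stage('Build') {
        def build_job = build job: "Build"
        build_job_number = build_job.getNumber()
    }
    stage('Deploy') {
        // Copy the archived zip from that specific sub-job build instead of fetching it over HTTP
        copyArtifacts projectName: 'Build',
                      selector: specific("${build_job_number}"),
                      filter: '*.zip',
                      fingerprintArtifacts: true
        sh "..."
    }
}
This avoids the API token entirely; the sub-job's configuration may need to grant the deploy job permission to copy its artifacts.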

How can I rename Jenkins' pull request builder's "status check" display name on GitHub

We have a project on GitHub which has two Jenkins Multibranch Pipeline jobs - one builds the project and the other runs tests. The only difference between these two pipelines is that they have different JenkinsFiles.
I have two problems that I suspect are related to one another:
In the GitHub status check section I only see one check with the following title:
continuous-integration/jenkins/pr-merge — This commit looks good,
which directs me to the test Jenkins pipeline. This means that our build pipeline is not being picked up by GitHub even though it is visible on Jenkins. I suspect this is because both the checks have the same name (i.e. continuous-integration/jenkins/pr-merge).
I have not been able to figure out how to rename the status check message for each Jenkins job (i.e. test and build). I've been through this similar question, but its solution wasn't applicable to us, as Build Triggers aren't available in Multibranch Pipelines.
If anyone knows how to change this message on a per-job basis for Jenkins Multibranch Pipelines that'd be super helpful. Thanks!
Edit (just some more info):
We've set up GitHub/Jenkins webhooks on the repository and builds do get started for both our build and test jobs; it's just that the status check/message doesn't get displayed on GitHub for both (only for test, it seems).
Here is our Jenkinsfile for the build job:
#!/usr/bin/env groovy
properties([[$class: 'BuildConfigProjectProperty', name: '', namespace: '', resourceVersion: '', uid: ''], buildDiscarder(logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')), [$class: 'ScannerJobProperty', doNotScan: false]])
node {
    stage('Initialize') {
        echo 'Initializing...'
        def node = tool name: 'node-lts', type: 'jenkins.plugins.nodejs.tools.NodeJSInstallation'
        env.PATH = "${node}/bin:${env.PATH}"
    }
    stage('Checkout') {
        echo 'Getting out source code...'
        checkout scm
    }
    stage('Install Dependencies') {
        echo 'Retrieving tooling versions...'
        sh 'node --version'
        sh 'npm --version'
        sh 'yarn --version'
        echo 'Installing node dependencies...'
        sh 'yarn install'
    }
    stage('Build') {
        echo 'Running build...'
        sh 'npm run build'
    }
    stage('Build Image and Deploy') {
        echo 'Building and deploying image across pods...'
        echo "This is the build number: ${env.BUILD_NUMBER}"
        // sh './build-openshift.sh'
    }
    stage('Upload to s3') {
        if (env.BRANCH_NAME == "master") {
            withAWS(region: 'eu-west-1', credentials: '****') {
                def identity = awsIdentity();
                s3Upload(bucket: "****", workingDir: 'build', includePathPattern: '**/*');
                cfInvalidate(distribution: 'EBAX8TMG6XHCK', paths: ['/*']);
            }
        }
        if (env.BRANCH_NAME == "PRODUCTION") {
            withAWS(region: 'eu-west-1', credentials: '****') {
                def identity = awsIdentity();
                s3Upload(bucket: "****", workingDir: 'build', includePathPattern: '**/*');
                cfInvalidate(distribution: 'E6JRLLPORMHNH', paths: ['/*']);
            }
        }
    }
}
Try to use GitHubCommitStatusSetter (see this answer for declarative pipeline syntax). You're using scripted pipeline syntax, so in your case it will be something like this (note: this is just a prototype, and it definitely must be adapted to your project's specifics):
#!/usr/bin/env groovy
properties([[$class: 'BuildConfigProjectProperty', name: '', namespace: '', resourceVersion: '', uid: ''], buildDiscarder(logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')), [$class: 'ScannerJobProperty', doNotScan: false]])
node {
    // ...
    stage('Upload to s3') {
        // 'context' is the status-check name that will be shown on GitHub,
        // e.g. "continuous-integration/jenkins/build"
        try {
            setBuildStatus(context, "In progress...", "PENDING");
            if (env.BRANCH_NAME == "master") {
                withAWS(region: 'eu-west-1', credentials: '****') {
                    def identity = awsIdentity();
                    s3Upload(bucket: "****", workingDir: 'build', includePathPattern: '**/*');
                    cfInvalidate(distribution: 'EBAX8TMG6XHCK', paths: ['/*']);
                }
            }
            // ...
            setBuildStatus(context, "Success", "SUCCESS");
        } catch (Exception e) {
            setBuildStatus(context, "Failure", "FAILURE");
        }
    }
}
void setBuildStatus(context, message, state) {
    step([
        $class: "GitHubCommitStatusSetter",
        contextSource: [$class: "ManuallyEnteredCommitContextSource", context: context],
        reposSource: [$class: "ManuallyEnteredRepositorySource", url: "https://github.com/my-org/my-repo"],
        errorHandlers: [[$class: "ChangingBuildStatusErrorHandler", result: "UNSTABLE"]],
        statusResultSource: [$class: "ConditionalStatusResultSource", results: [[$class: "AnyBuildResult", message: message, state: state]]]
    ]);
}
Please check this and this for more details.
You can use the Github Custom Notification Context SCM Behaviour plugin https://plugins.jenkins.io/github-scm-trait-notification-context/
After installing, go to the job configuration. Under "Branch sources" -> "GitHub" -> "Behaviors", click "Add" and select "Custom Github Notification Context" from the dropdown menu. Then type your custom context name into the "Label" field.
This answer is pretty much like #biruk1230's answer, but if you don't want to downgrade your GitHub plugin to work around the bug, you can call the API directly.
void setBuildStatus(String message, String state) {
    env.COMMIT_JOB_NAME = "continuous-integration/jenkins/pr-merge/sanity-test"
    withCredentials([string(credentialsId: 'github-token', variable: 'TOKEN')]) {
        // 'set -x' for debugging. Don't worry, the access token won't actually be logged.
        // Also, the sh command actually executed is not logged verbatim; it is escaped further when written to the log.
        sh """
            set -x
            curl \"https://api.github.com/repos/thanhlelgg/brain-and-brawn/statuses/$GIT_COMMIT?access_token=$TOKEN\" \
                -H \"Content-Type: application/json\" \
                -X POST \
                -d \"{\\\"description\\\": \\\"$message\\\", \\\"state\\\": \\\"$state\\\", \
                \\\"context\\\": \\\"${env.COMMIT_JOB_NAME}\\\", \\\"target_url\\\": \\\"$BUILD_URL\\\"}\"
        """
    }
}
The problem with both methods is that continuous-integration/jenkins/pr-merge will be displayed no matter what.
This will be helpful in combination with #biruk1230's answer.
You can remove Jenkins' status check named continuous-integration/jenkins/something and add a custom status check with GitHubCommitStatusSetter. The effect is similar to renaming the context of the status check.
Install the Disable GitHub Multibranch Status plugin on Jenkins.
It can be applied by setting the behavior option of the Multibranch Pipeline job on Jenkins.
Thanks for your question and other answers!

How to use Jenkins String Parameter in pipeline

We are using Jenkins Pipeline to configure jobs in Jenkins. For a bunch of jobs we need user input, for which we use a parameterised build where the user can enter parameter values; later we use the values in our .jenkinsfile in an sh step like:
sh "./build-apply.sh ${accountnumber} ${volumename} ${vpcname} services ${snapshotid}"
This used to work with
Jenkins 2.16
Pipeline 2.3
Groovy 2.15
However, when I rebuilt Jenkins with:
2.16 or latest 2.26
Pipeline 2.5
Pipeline: Groovy 2.19
The above sh step stopped working. The error is:
groovy.lang.MissingPropertyException: No such property: accountnumber for class: groovy.lang.Binding
Any idea what I am missing? Is the syntax not correct?
For reference, the full Jenkinsfile:
node {
    // Mark the code checkout 'stage'....
    stage 'Checkout'
    git branch: '****', credentialsId: '***', url: '****'

    stage 'Provision Volume'
    withCredentials([[$class: 'AmazonWebServicesCredentialsBinding',
                      accessKeyVariable: '*****',
                      credentialsId: '****',
                      secretKeyVariable: '*****']]) {
        // Run the terraform build
        env.PATH = "${env.PATH}:/jenkins/terraform"
        sh "./build-apply.sh ${accountnumber} ${volumename} ${vpcname} services ${snapshotid}"
    }
}
Copy and paste the code below into the pipeline script:
node {
    stage('BCCdlVsLib') {
        build job: 'BCCdlVsLib', parameters: [
            [$class: 'StringParameterValue', name: 'job_binpath', value: 'BINPATH'],
            [$class: 'StringParameterValue', name: 'job_sourcefile', value: 'SOURCEFILE']
        ]
    }
}
In the job (BCCdlVsLib), enable the option "This project is parameterized" and add 2 string parameters: job_binpath, job_sourcefile.
Print the variables in the pipeline job:
echo job_binpath
echo job_sourcefile
After running the pipeline job, you will get the output below:
BINPATH
SOURCEFILE

How do I use the report plugins (PMD, PHPCPD, Checkstyle, JDepend...) in a Jenkins pipeline?

I'm using Jenkins 2.x with a Jenkinsfile to run a pipeline.
I have built a job using Jenkinsfile and I want to invoke the Analysis Collector Plugin so I can view the report.
Here is my current Jenkinsfile:
#!groovy
node {
    stage 'Build'
    echo "My branch is: ${env.BRANCH_NAME}"
    sh 'cd gitlist-PHP && ./gradlew clean build dist'

    stage 'Report'
    step([$class: 'JUnitResultArchiver', testResults: 'gitlist-PHP/build/logs/junit.xml'])
    step([$class: 'hudson.plugins.checkstyle.CheckStylePublisher', checkstyle: 'gitlist-PHP/build/logs/phpcs.xml'])
    step([$class: 'hudson.plugins.dry.DryPublisher', CopyPasteDetector: 'gitlist-PHP/build/logs/phpcpd.xml'])

    stage 'mail'
    mail body: 'project build successful',
         from: 'siregarpandu@gmail.com',
         replyTo: 'xxxx@yyyy.com',
         subject: 'project build successful',
         to: 'siregarpandu@gmail.com'
}
I want to invoke the Checkstyle, JUnit and DRY plugins from Jenkins. How do I configure these plugins in the Jenkinsfile? Do these plugins support pipelines?
The following configuration works for me:
step([$class: 'CheckStylePublisher', pattern: 'target/scalastyle-result.xml, target/scala-2.11/scapegoat-report/scapegoat-scalastyle.xml'])
For junit, the configuration is even easier:
junit 'target/test-reports/*.xml'
step([$class: 'hudson.plugins.checkstyle.CheckStylePublisher', checkstyle: 'gitlist-PHP/build/logs/phpcs.xml'])
Also, according to the source code repo, the argument 'checkstyle' should be named 'pattern'.
Repo:
https://github.com/jenkinsci/checkstyle-plugin/blob/master/src/main/java/hudson/plugins/checkstyle/CheckStylePublisher.java#L42
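Applied to the Jenkinsfile from the question, the corrected step would then look roughly like this (a sketch keeping the original report path):
step([$class: 'CheckStylePublisher', pattern: 'gitlist-PHP/build/logs/phpcs.xml'])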
This is how I handle this:
PMD
stage('PMD') {
    steps {
        sh 'vendor/bin/phpmd . xml build/phpmd.xml --reportfile build/logs/pmd.xml --exclude vendor/ || exit 0'
        pmd canRunOnFailed: true, pattern: 'build/logs/pmd.xml'
    }
}
PHPCPD
stage('Copy paste detection') {
    steps {
        sh 'vendor/bin/phpcpd --log-pmd build/logs/pmd-cpd.xml --exclude vendor . || exit 0'
        dry canRunOnFailed: true, pattern: 'build/logs/pmd-cpd.xml'
    }
}
Checkstyle
stage('Checkstyle') {
    steps {
        sh 'vendor/bin/phpcs --report=checkstyle --report-file=`pwd`/build/logs/checkstyle.xml --standard=PSR2 --extensions=php --ignore=autoload.php --ignore=vendor/ . || exit 0'
        checkstyle pattern: 'build/logs/checkstyle.xml'
    }
}
JDepend
stage('Software metrics') {
    steps {
        sh 'vendor/bin/pdepend --jdepend-xml=build/logs/jdepend.xml --jdepend-chart=build/pdepend/dependencies.svg --overview-pyramid=build/pdepend/overview-pyramid.svg --ignore=vendor .'
    }
}
You can find the full example here: https://gist.github.com/Yuav/435f29cad03bf0006a85d31f2350f7b4
Reference links
https://jenkins.io/doc/pipeline/steps/
It appears that the plugins need to be modified to support working as Pipeline Steps, so if they have not been updated, they don't work.
Here is a list of compatible plugins that have been updated:
https://github.com/jenkinsci/pipeline-plugin/blob/master/COMPATIBILITY.md
And here is the documentation about how the plugins need to be updated to support Pipelines:
https://github.com/jenkinsci/pipeline-plugin/blob/master/DEVGUIDE.md
