Jenkins: change the build history of a job from a different job

I have two Jenkins jobs:
build the project
deploy it
Both work well, and I can trigger the deploy job from the project's build job.
Steps:
Build with parameters in the application's job >> check "deploy on dev" >> build
Add a yellow star badge to the build history of the application job - with a Groovy post-build action (code below)
Trigger the deploy job as a post-build action
Question
After the deploy job has finished and failed, how can I change the build history of the application job (yellow star >> e.g. a red one) - from the deploy job?
if ("true".equals(manager.build.buildVariables.get('DEPLOY_ON_DEV'))) {
manager.addBadge("star-gold.gif", "SNAPSHOT deployed on DEV")
}

This took me a while to develop, but now it works like a charm. Put it in Post-build Actions → Add post-build action → Groovy Postbuild → Groovy script:
import hudson.model.Build
import hudson.model.Cause
import hudson.model.Project
import jenkins.model.Jenkins
import org.jvnet.hudson.plugins.groovypostbuild.GroovyPostbuildAction

def log = manager.listener.logger
log.println(' ----------------')
log.println(' Groovy Postbuild')

// decorate this build
manager.addShortText('SNAPSHOT deployed on DEV', 'black', 'gold', '1px', 'black')
manager.addInfoBadge('SNAPSHOT deployed on DEV')
manager.addBadge('star-gold.png', 'SNAPSHOT deployed on DEV')

// decorate upstream builds
Jenkins jenkins = Jenkins.getInstance()
List<Project> projects = jenkins.getAllItems(Project.class)
log.println(" This build: '${manager.build}' --> " + manager.build.getResult())
log.println(' Decorating the following upstream builds:')
//log.println(manager.build.getUpstreamBuilds()) // prints "[:]", so using this to get the upstream builds doesn't work
for (Cause cause : manager.build.getCauses()) {
    for (Project project : projects) {
        if (cause.toString().contains(project.getName())) {
            int no = cause.getUpstreamBuild()
            Build usb = project.getBuildByNumber(no)
            log.println("  ${usb}")
            usb.getActions().add(GroovyPostbuildAction.createShortText(
                'SNAPSHOT deployed on DEV', 'black', 'gold', '1px', 'black'))
            usb.getActions().add(GroovyPostbuildAction.createInfoBadge(
                'SNAPSHOT deployed on DEV'))
            usb.getActions().add(GroovyPostbuildAction.createBadge(
                'star-gold.png', 'SNAPSHOT deployed on DEV'))
        }
    } // for (projects)
} // for (causes)
log.println(' ----------------')
Note:
This adds the badges regardless of the build result, but I'm confident you can easily add the appropriate if (see the sketch below). For removing badges, see the Groovy Postbuild Plugin's page.
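For instance, a minimal sketch of such a check for the deploy job's Groovy Postbuild step, assuming your plugin version provides createErrorBadge (see GroovyPostbuildAction.java in the references) and reusing usb from the loop above:
import hudson.model.Result
import org.jvnet.hudson.plugins.groovypostbuild.GroovyPostbuildAction

// Hypothetical variant of the loop body: pick the badge from this
// (deploy) build's result instead of always adding the gold star.
def result = manager.build.getResult()
if (result != null && result.isBetterOrEqualTo(Result.SUCCESS)) {
    usb.getActions().add(GroovyPostbuildAction.createBadge(
        'star-gold.png', 'SNAPSHOT deployed on DEV'))
} else {
    usb.getActions().add(GroovyPostbuildAction.createErrorBadge(
        'SNAPSHOT deployment on DEV failed'))
}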
References:
Jenkins set a badge as a pre-build step
Jenkins main module 1.622 API
Groovy Postbuild Plugin
GroovyPostbuildAction.java

Related

How to load a variable within a pipeline using a property file from another freestyle project?

I have a Jenkins "freestyle" project which triggers a "pipeline" project (in fact my "freestyle" project is mentioned as a trigger in the "Build Triggers" section of the pipeline project).
How can I grab the values of variables from a ".properties" file created by each build of the "parent/freestyle" project?
Currently I have checked "archive artifacts" on the "parent/freestyle" project and added the following code to my "child/pipeline":
node
{
    load "${WORKSPACE}/variables.properties"
    echo "${PARAM_FROM_TRIGGER}"
}
pipeline
{
    agent any
    stages
    {
        stage('STEP1')
        {
            steps
            {
                sh '''
                    #!/bin/bash
                    echo 'STEP 1'
                '''
            }
        }
    }
}
I encounter an exception after the "child/pipeline" build:
java.nio.file.NoSuchFileException:
/var/lib/jenkins/workspace/my_pipeline/variables.properties
How could I load values from my property file?
Since you're already archiving the .properties file, I think you're looking for the Copy Artifact Plugin.
You can use the command:
copyArtifacts(projectName: 'sourceproject');
to copy the artifacts from parent/freestyle into the workspace of child/pipeline.
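Putting it together, a minimal sketch of the child pipeline (assuming the parent job is named 'sourceproject' and archives variables.properties at the root of its artifacts):
node {
    // Copy variables.properties from the parent job's archived artifacts
    // into this pipeline's workspace (Copy Artifact Plugin).
    // selector: upstream() picks the build that triggered this one;
    // use lastSuccessful() if this job can also run standalone.
    copyArtifacts(projectName: 'sourceproject',
                  filter: 'variables.properties',
                  selector: upstream())
    // The file now exists locally, so the original load works.
    load "${WORKSPACE}/variables.properties"
    echo "${PARAM_FROM_TRIGGER}"
}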

Set project version in Jenkins Pipeline with the Artifactory Gradle plugin

We want to define the project version for our Gradle project during the Jenkins pipeline build; the version will include a timestamp and a git commit id (20180625180158-b8ad8df0dc0356a91707eaa241de7d62df6a29f2).
def defineVersion() {
    sh "git rev-parse HEAD > .git/commit-id"
    def commitId = readFile('.git/commit-id').trim() // trim the trailing newline
    def timestamp = getCurrentTimestamp()
    return timestamp + '-' + commitId
}
This function will determine the version I want to publish our artifact with.
Next I use the Artifactory Gradle plugin to publish, but I can't find a way to set/override the project version. I want the jar to be published with version 20180625180158-b8ad8df0dc0356a91707eaa241de7d62df6a29f2
version = defineVersion() // how can we incorporate this version in our Gradle build/publish?
gradleBuild = Artifactory.newGradleBuild()
gradleBuild.useWrapper = true
gradleBuild.deployer(
    repo: env.BRANCH_NAME == 'master' ? 'libs-releases-local' : 'libs-snapshots-local',
    server: Artifactory.server('artifactory-global'))
gradleBuild.run tasks: 'clean build artifactoryPublish'
How can we achieve this? Also I would like to pass other parameters like -x test to the run command to skip tests in this stage.
Apparently you can add parameters through the switches parameter: https://jenkins.io/doc/pipeline/steps/artifactory/
With this you can add the necessary parameters, e.g. '-x test -Pversion=' + version.
For my use case I added a version property to my build.gradle: version = "${version}", so it can be overridden with the command above.
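Put together, a minimal sketch of the run call with switches (untested; it assumes the gradleBuild deployer setup from the question and the version computed above):
// Extra Gradle arguments go through 'switches'; -Pversion overrides the
// 'version' placeholder property declared in build.gradle.
gradleBuild.run tasks: 'clean build artifactoryPublish',
                switches: "-x test -Pversion=${version}"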

Gradle artifactoryPublish: Jenkins pipeline does not publish properties

I'm trying to set up a Jenkins pipeline for publishing a zip file to JFrog Artifactory.
I am using the com.jfrog.artifactory plugin to do so. This works great from command-line Gradle: I can run the artifactoryPublish task to publish the artifacts and tie them back to the module, which in turn ties back to the artifacts.
The artifacts show up with the properties:
build.name = `projectname`
build.number = `some large number`
And I can click from them to the build/module and back to the artifact.
However, when I run this from a Jenkinsfile pipeline, the artifacts get published and tied back to the module, but the module does not successfully tie back to the artifacts.
The artifacts do not receive the build.name and build.number properties, and I cannot click from the module back to the artifacts, as the module cannot find or resolve the paths back to the artifacts (a zip file and a generated pom).
I am passing the params from Jenkins like:
ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER} which seems to work on other projects... but for whatever reason I cannot shake it.
I can include more of the Jenkinsfile if that would help debug, but I'm really just checking out a repository and trying to publish it.
I have been reading the documentation here heavily:
https://www.jfrog.com/confluence/display/RTF/Gradle+Artifactory+Plugin
and haven't been able to make it work through the -Pproject stuff.
Does anyone have any idea what else I can try? I don't really want to use the Jenkins pipeline Artifactory plugin directly, because it's so nice to be able to deploy from the command line too.
build.gradle:
publishing {
    publications {
        ManualUpdaterPackage(MavenPublication) {
            artifact assembleManualUpdaterPackage
        }
    }
}

artifactory {
    contextUrl = "${artifactoryUrl}" // the base Artifactory URL if not overridden by the publisher/resolver
    publish {
        defaults {
            publications('ManualUpdaterPackage')
        }
        repository {
            repoKey = project.version.endsWith('-SNAPSHOT') ? snapshotRepo : releaseRepo
            username = "${artifactory_user}"
            password = "${artifactory_password}"
            maven = true
        }
    }
}

task assembleManualUpdaterPackage(type: Zip) {
    dependsOn anotherTask
    from(packageDir + "/")
    include '**'
    // archiveName "manualUpdaterPackage-${version}.zip"
    destinationDir(file(manualUpdaterZipDir))
}
jenkinsfile snip:
withCredentials(
    [
        [
            $class          : 'UsernamePasswordMultiBinding',
            credentialsId   : 'validcreds',
            passwordVariable: 'ORG_GRADLE_PROJECT_artifactory_password',
            usernameVariable: 'ORG_GRADLE_PROJECT_artifactory_user'
        ]
    ]
) {
    withEnv(
        [
            "ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}",
            "ORG_GRADLE_PROJECT_buildInfo.build.name=${artifactName}",
            "ORG_GRADLE_PROJECT_buildInfo.build.url=${env.JOB_URL}"
        ]
    ) {
        sh 'chmod +x gradlew'
        sh "./gradlew --no-daemon clean artifactoryPublish"
    }
}
https://www.jfrog.com/confluence/display/RTF/Working+With+Pipeline+Jobs+in+Jenkins#WorkingWithPipelineJobsinJenkins-GradleBuildswithArtifactory
Eventually my coworker recommended looking into the Artifactory Pipeline Gradle plugin instead. It is very nice to work with and we've had much quicker success with it.
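For reference, a minimal scripted-pipeline sketch of that approach, based on the JFrog documentation linked above (the server ID 'my-artifactory' and the repository name are placeholders):
def server = Artifactory.server('my-artifactory') // configured Artifactory server ID (placeholder)
def rtGradle = Artifactory.newGradleBuild()
rtGradle.useWrapper = true
rtGradle.deployer repo: 'libs-snapshots-local', server: server

// run() collects build info, so the deployed artifacts are linked to the
// build and get the build.name/build.number properties set.
def buildInfo = rtGradle.run tasks: 'clean artifactoryPublish'
server.publishBuildInfo buildInfo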

Using waitForQualityGate in a Jenkins declarative pipeline

The following SonarQube (6.3) analysis stage in a declarative pipeline in Jenkins 2.50 is failing with this error in the console log: http://pastebin.com/t2ja23vC. More specifically:
SonarQube installation defined in this job (SonarGate) does not match any configured installation. Number of installations that can be configured: 1.
Update: after changing "SonarQube" to "SonarGate" in the Jenkins settings (under SonarQube servers, so it'll match the Jenkinsfile), I get a different error: http://pastebin.com/HZZ6fY6V
java.lang.IllegalStateException: Unable to get SonarQube task id and/or server name. Please use the 'withSonarQubeEnv' wrapper to run your analysis.
The stage is a modification of the example from the SonarQube docs: https://docs.sonarqube.org/display/SCAN/Analyzing+with+SonarQube+Scanner+for+Jenkins#AnalyzingwithSonarQubeScannerforJenkins-AnalyzinginaJenkinspipeline
stage ("SonarQube analysis") {
steps {
script {
STAGE_NAME = "SonarQube analysis"
if (BRANCH_NAME == "develop") {
echo "In 'develop' branch, don't analyze."
}
else { // this is a PR build, run sonar analysis
withSonarQubeEnv("SonarGate") {
sh "../../../sonar-scanner-2.9.0.670/bin/sonar-scanner"
}
}
}
}
}
stage ("SonarQube Gatekeeper") {
steps {
script {
STAGE_NAME = "SonarQube Gatekeeper"
if (BRANCH_NAME == "develop") {
echo "In 'develop' branch, skip."
}
else { // this is a PR build, fail on threshold spill
def qualitygate = waitForQualityGate()
if (qualitygate.status != "OK") {
error "Pipeline aborted due to quality gate coverage failure: ${qualitygate.status}"
}
}
}
}
}
I also created a webhook, sonarqube-webhook, with the URL http://****/sonarqube-webhook/. Should it be like that, or http://****/sonarqube/sonarqube-webhook? To access the server dashboard I use http://****/sonarqube.
In SonarQube's Quality Gates section I created a new quality gate (screenshot omitted).
I am not sure if the setting in SonarGate is correct. I do use jenkins-mocha to generate an lcov.info file that is used in Sonar to generate the coverage data.
Perhaps the quality gate is the wrong setting to use? The end result should be that the job fails in Jenkins if the coverage % is not met.
Finally, I am not sure if the SonarQube-related settings in the Jenkins system configuration are required at all (screenshots omitted; the port in the second one is 9000, not the truncated 900 shown).
The SonarQube Jenkins plugin scans the build output for two specific lines, which it uses to get the SonarQube report task properties and the project URL. If your invocation of sonar-scanner does not output these lines, the waitForQualityGate() call won't have the task ID to look them up, so you will have to figure out the correct settings to make it more verbose.
See the extractSonarProjectURLFromLogs and extractReportTask methods in the SonarUtils class of the plugin to understand how they work:
ANALYSIS SUCCESSFUL, you can browse <project URL> is used to add a link to the badge (in the build history)
Working dir: <dir with report-task.txt> is used to pass the task ID to the waitForQualityGate step
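For context, the usual pattern from the SonarQube Scanner for Jenkins docs is to run the scanner inside withSonarQubeEnv (which captures the report task from the log) and then wait with a timeout; a scripted sketch:
stage('SonarQube analysis') {
    withSonarQubeEnv('SonarGate') { // captures the task ID from the scanner output
        sh '../../../sonar-scanner-2.9.0.670/bin/sonar-scanner'
    }
}
stage('Quality Gate') {
    timeout(time: 10, unit: 'MINUTES') { // don't hang forever if the webhook never arrives
        def qg = waitForQualityGate()
        if (qg.status != 'OK') {
            error "Pipeline aborted due to quality gate failure: ${qg.status}"
        }
    }
}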
This was discovered to be a bug in the SonarQube Scanner for Jenkins when jobs run on a Jenkins slave (if the job runs on the master, it works). You can read more here: https://jira.sonarsource.com/browse/SONARJNKNS-282
I have tested this using a test build of v2.61 of the scanner plug-in and found it working.
The solution is to upgrade to v2.61 once it is released.
This stage will then work:
stage ("SonarQube analysis") {
steps {
withSonarQubeEnv('SonarQube') {
sh "../../../sonar-scanner-2.9.0.670/bin/sonar-scanner"
}
def qualitygate = waitForQualityGate()
if (qualitygate.status != "OK") {
error "Pipeline aborted due to quality gate coverage failure: ${qualitygate.status}"
}
}
}
If you're running SonarQube in a Docker container, check that the memory isn't exhausted. We were maxing out, which seemed to be our issue.

Copy artifacts of multiple builds on same job in Jenkins

I'm using MultiJob plugin and have a job (Job-A) that triggers Job-B several times.
My requirement is to copy some artifacts (XML files) from each build.
The difficulty is that using the Copy Artifact Plugin with the "last successful build" option will only take the last build of Job-B, while I need to copy from all builds that were triggered by the same build of Job-A.
The flow looks like:
Job-A starts and triggers:
Job-A -->
    Job-B build #1
    Job-B build #2
    Job-B build #3
** copy artifacts of all last 3 builds, not just #3 **
Note: Job-B could be executed on different slaves in the same run (I set the slave dynamically via a parameter on the upstream Job-A).
When all builds are completed, I want Job-A to copy the artifacts from builds #1, #2 and #3, not just from the last build.
How can I do this?
Here is a more generic Groovy script; it uses the Groovy plugin and the Copy Artifact plugin; see the instructions in the code comments.
It simply copies artifacts from all downstream jobs into the upstream job's workspace.
If you call the same job several times, you could use the build number in copyArtifact's 'target' parameter to keep the artifacts separate.
// This script copies artifacts from downstream jobs into the upstream job's workspace.
//
// To use, add an "Execute system Groovy script" build step into the upstream job
// after the invocation of other projects/jobs, and specify
// "/var/lib/jenkins/groovy/copyArtifactsFromDownstream.groovy" as the script.
import hudson.plugins.copyartifact.*
import hudson.model.AbstractBuild
import hudson.Launcher
import hudson.model.BuildListener
import hudson.FilePath

for (subBuild in build.builders) {
    println(subBuild.jobName + " => " + subBuild.buildNumber)
    copyTriggeredResults(subBuild.jobName, Integer.toString(subBuild.buildNumber))
}

// Inspired by http://kevinormbrek.blogspot.com/2013/11/using-copy-artifact-plugin-in-system.html
def copyTriggeredResults(projName, buildNumber) {
    def selector = new SpecificBuildSelector(buildNumber)
    // CopyArtifact(String projectName, String parameters, BuildSelector selector,
    //              String filter, String target, boolean flatten, boolean optional)
    def copyArtifact = new CopyArtifact(projName, "", selector, "**", null, false, true)
    // use reflection because a direct call invokes a deprecated method
    // perform(Build<?, ?> build, Launcher launcher, BuildListener listener)
    def perform = copyArtifact.class.getMethod("perform", AbstractBuild, Launcher, BuildListener)
    perform.invoke(copyArtifact, build, launcher, listener)
}
I suggest the following approach:
Use an Execute System Groovy script build step (from the Groovy Plugin) to execute the following script:
import hudson.model.*

// get the upstream job
def jobName = build.getEnvironment(listener).get('JOB_NAME')
def job = Hudson.instance.getJob(jobName)
def upstreamJob = job.upstreamProjects.iterator().next()

// prepare the build numbers
def n1 = upstreamJob.lastBuild.number
def n2 = n1 - 1
def n3 = n1 - 2

// set parameters
def pa = new ParametersAction([
    new StringParameterValue("UP_BUILD_NUMBER1", n1.toString()),
    new StringParameterValue("UP_BUILD_NUMBER2", n2.toString()),
    new StringParameterValue("UP_BUILD_NUMBER3", n3.toString())
])
Thread.currentThread().executable.addAction(pa)
This script will create three environment variables which correspond to the last three build numbers of the upstream job.
Add three Copy Artifact build steps to copy artifacts from the last three builds of the upstream project (use the environment variables from the script above to set the build numbers):
Run the build and check the build log; you should see something like this:
Copied 2 artifacts from "A" build number 4
Copied 2 artifacts from "A" build number 3
Copied 1 artifact from "A" build number 2
Note: the script perhaps needs to be adjusted to catch unusual cases like "the upstream project has only two builds", "the current job doesn't have an upstream job", or "the current job has more than one upstream job" - for instance, as sketched below.
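A hedged sketch of such guards, in the same System Groovy context as above:
// Guard against missing upstream projects and short build histories.
def upstreamProjects = job.upstreamProjects
if (upstreamProjects.isEmpty()) {
    throw new IllegalStateException("Job '${jobName}' has no upstream project")
}
if (upstreamProjects.size() > 1) {
    println("Warning: more than one upstream project found; using the first one")
}
def upstream = upstreamProjects.iterator().next()
// Use at most the three most recent builds that actually exist.
def lastNumber = upstream.lastBuild?.number ?: 0
def buildNumbers = ((lastNumber - 2)..lastNumber).findAll { it > 0 }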
You can use the following example from an "Execute Shell" build step.
Please note it can be run only on the Jenkins master machine, and the job calling this step must also be the one that triggered the MultiJob.
#--------------------------------------
# Copy Artifacts from MultiJob Project
#--------------------------------------
PROJECT_NAME="MY_MULTI_JOB"
ARTIFACT_PATH="archive/target"
TARGET_DIRECTORY="target"
mkdir -p $TARGET_DIRECTORY
runCount="TRIGGERED_BUILD_RUN_COUNT_${PROJECT_NAME}"
for ((i=1; i<=${!runCount}; i++))
do
    buildNumber="${PROJECT_NAME}_${i}_BUILD_NUMBER"
    cp $JENKINS_HOME/jobs/$PROJECT_NAME/builds/${!buildNumber}/$ARTIFACT_PATH/* $TARGET_DIRECTORY
done
#--------------------------------------
#--------------------------------------
