How to do a parameterized remote trigger from a post-build task in Jenkins?

I have a Jenkins job with a post-build task. The post-build task is a regex expression. If the regex condition is met, I want to do a parameterized remote trigger to start another Jenkins build. From the post-build task I see that the regex condition can trigger a script. Is it possible to have it trigger a parameterized remote trigger instead?
Basically, I only want to run the second build if the regex condition is met in the first build. I don't want to have a script that executes a curl call to achieve it. Is there any other way?

I achieved this by using the Groovy Postbuild plugin. You select it under the Post-build Actions of your job configuration in Jenkins. My Groovy script performs a regex on the build log of the original job, and if the regex condition is met, it triggers a new build.
Below is the example from the Groovy plugin documentation:
import hudson.model.*
import hudson.console.HyperlinkNote
import hudson.AbortException
import java.util.concurrent.CancellationException

// 'build' is the current build; 'foo' holds the parameter value to pass down.
def job = Hudson.instance.getJob('MyJobName')
def anotherBuild
try {
    def params = [
        new StringParameterValue('FOO', foo),
    ]
    def future = job.scheduleBuild2(0, new Cause.UpstreamCause(build), new ParametersAction(params))
    println "Waiting for the completion of " + HyperlinkNote.encodeTo('/' + job.url, job.fullDisplayName)
    anotherBuild = future.get()
} catch (CancellationException x) {
    throw new AbortException("${job.fullDisplayName} aborted.")
}

Related

Check if Jenkins node is online for the job, otherwise send email alert

Having a Jenkins job dedicated to a specific node, I'd like to get a notification if the job can't run because the node is offline. Is it possible to set up this functionality?
In other words, the default Jenkins behaviour is to wait for the node if the job is started while the node is offline (the job sits in 'pending' status). In that case I want the job to fail (or not start at all) and send a 'node offline' mail.
This node check should be inside the job, because the job is executed rarely and I don't care whether the node is offline when it's not needed. I've tried an external node-watching plugin, but it doesn't do exactly what I want: it sends an email every time the node goes offline, which is redundant in my case.
I found an answer: you can add a command-line or PowerShell build step that invokes curl and processes the result
curl --silent $JENKINS_URL/computer/$JENKINS_NODENAME/api/json
The resulting JSON contains an offline property with a true/false value.
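If you would rather do the equivalent check from an "Execute system Groovy script" build step instead of shelling out to curl, a rough sketch (the node name special_node is a placeholder, and anonymous read access to the API is assumed) could be:
import groovy.json.JsonSlurper
import hudson.AbortException

// Fetch the node's API JSON and fail the build if it reports offline.
// 'special_node' is a placeholder; adjust authentication as needed.
def rootUrl = jenkins.model.Jenkins.instance.rootUrl
def status = new JsonSlurper().parseText(new URL(rootUrl + 'computer/special_node/api/json').text)
if (status.offline) {
    throw new AbortException('Node special_node is offline')
}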
I don't think checking whether the node is available can be done inside the job you want to run (e.g. JobX). The act of checking, specifically for your JobX at the time of execution, will itself need a job to run in, and I don't know of a plugin or configuration option that does this. JobX can't check whether the node is free for JobX.
I use a lot of flow jobs (in the process of converting them to pipeline logic) where JobA triggers JobB. Here JobA could run on master, check the node needed by JobB (JobX in your case), and trigger it if the node is up.
JobA would need to be a freestyle job with an 'Execute system Groovy script > Groovy command' build step. The Groovy code below is pulled together from a number of working examples, so it is untested:
import hudson.model.*
import hudson.AbortException
import java.util.concurrent.CancellationException

def allNodes = jenkins.model.Jenkins.instance.nodes
def triggerJob = false

// Look for the target node and check that it is online.
for (node in allNodes) {
    if (node.getComputer().isOnline() && node.nodeName == "special_node") {
        println node.nodeName + " " + node.getComputer().countBusy() + " " + node.getComputer().getOneOffExecutors().size()
        triggerJob = true
        break
    }
}

if (triggerJob) {
    println("triggering child build as node available")
    def job = Hudson.instance.getJob('JobB')
    def anotherBuild
    try {
        def params = [
            new StringParameterValue('ParamOne', '123'),
        ]
        def future = job.scheduleBuild2(0, new Cause.UpstreamCause(build), new ParametersAction(params))
        anotherBuild = future.get()
    } catch (CancellationException x) {
        throw new AbortException("${job.fullDisplayName} aborted.")
    }
} else {
    println("failing parent build as node not available")
    build.getExecutor().interrupt(hudson.model.Result.FAILURE)
    throw new InterruptedException()
}
To get the 'node offline' email, you can then add a post-build action that sends an email on failure.

Jenkins - how to show downstream jobs build result on Gerrit patch set?

Below is my use case:
I have jobs A, B and C. A is the upstream job, and B and C are downstream jobs. When a patch set is created in Gerrit, the patchset-created event triggers job A, and based on the result of that job we trigger B and C. After B and C have executed, I want to display the result of all three jobs on the Gerrit patch set, like:
Job A SUCCESS
Job B SUCCESS
Job C FAILED
Right now I only see the Job A build result showing up on the Gerrit patch set, as:
Job A SUCCESS
Is there any way to do this?
Do the following:
1) Configure all jobs (A, B and C) to trigger when the patch set is created.
2) Configure the jobs B and C to depend on job A
2.1) Click on "Advanced..." in Gerrit Trigger job configuration
2.2) Add the job A on the "Other jobs on which this job depends" field
With this configuration, jobs B and C will wait for job A to finish before they start, and you'll get the result you want.
The best way to solve this is to create a small wrapper pipeline job. Let's name it Build_ABC.
Configure Build_ABC to trigger on the Gerrit event you wish. The job will be responsible for running the other three builds, and if any of them fails, Build_ABC will fail and report this back to Gerrit. You will not be able to see immediately from the Gerrit message which job failed, but you will be able to see it in your Jenkins pipeline overview.
The scripted pipeline below calls Build_A and waits for the result. If the build succeeds, it continues to execute Build B and C in parallel. In my example I made Build C fail, which caused the whole pipeline job to fail.
This is a revised version of my first answer and the script has grown a bit. Since the individual build results are required in the message posted to Gerrit, the pipeline has been changed to catch the individual results and record them. If build A fails, builds B and C will be skipped and their status will be Skipped.
Next, it is possible to use the gerrit review ssh command-line tool to perform a manual review. This way a custom message can be generated that includes the individual build results (e.g. "Build A: Pass Build B: Pass Build C: Failed").
I haven't figured out how to make it a multi-line comment, but there is also an option to use JSON on the command line; have a look at that.
def build_a = "Skipped"
def build_b = "Skipped"
def build_c = "Skipped"
def build_result = "+1"

try {
    stage("A") {
        try {
            build(job: '/Peter/Build_A', wait: true)
            build_a = "Pass"
        } catch (e) {
            build_a = "Failed"
            // Rethrow, otherwise the wrapper build would not be marked as failed.
            // Throwing here also prevents B and C from running.
            throw e
        }
    }
    stage("After A") {
        parallel B: {
            try {
                build(job: '/Peter/Build_B', wait: true)
                build_b = "Pass"
            } catch (e) {
                build_b = "Failed"
                // Rethrow, otherwise the wrapper build would not be marked as failed.
                throw e
            }
        }, C: {
            try {
                build(job: '/Peter/Build_C', wait: true)
                build_c = "Pass"
            } catch (e) {
                build_c = "Failed"
                // Rethrow, otherwise the wrapper build would not be marked as failed.
                throw e
            }
        }
    }
} catch (e) {
    build_result = "-1"
    // Rethrow, otherwise the wrapper build would not be marked as failed.
    throw e
} finally {
    node('master') {
        // Perform a custom review using the Gerrit Trigger environment variables.
        sh "ssh -p ${env.GERRIT_PORT} ${env.GERRIT_HOST} gerrit review --verified ${build_result} -m '\"Build A: ${build_a} Build B: ${build_b} Build C: ${build_c}\"' ${env.GERRIT_PATCHSET_REVISION}"
    }
}
Next, you should configure the Gerrit Trigger to ignore the results from Jenkins, or else there will be a double vote.
One more advantage is that with the Blue Ocean plugin you get a nice graphical representation of your build, and you can examine what went wrong by clicking on the individual jobs.

How to trigger a different jobs in a master job

I have a job which contains a string parameter (called JOB_NAME); in this string parameter I will just fill in another job name.
In the same job I added a "Trigger/call builds on other projects" build step, in which I just provide $JOB_NAME, but it is not working.
My second question is how to fill the $JOB_NAME field with an existing job using a regular expression or something else.
Can someone provide clear steps? I am not that expert in Jenkins.
Thanks a lot.
You can use the Groovy Postbuild Plugin to achieve this.
Once you install the plugin, go to Post-build Actions, choose the Groovy Postbuild option and add the following script to it.
Then, whenever you run your job, it will ask for JOB_NAME (as you have defined it as a parameter in your job), and whatever project name you enter will be triggered as a downstream job.
import hudson.model.*
import jenkins.model.*

// 'manager' is provided by the Groovy Postbuild plugin.
def build = Thread.currentThread().executable
def jobPattern = manager.build.buildVariables.get("JOB_NAME")

// Find every job whose name matches the pattern passed in via JOB_NAME.
def matchedJobs = Jenkins.instance.items.findAll { job ->
    job.name =~ /$jobPattern/
}
matchedJobs.each { job ->
    println "Triggering ${job.name} downstream..."
    job.scheduleBuild(1, new Cause.UpstreamCause(build),
        new ParametersAction([
            new StringParameterValue("PROPERTY1", "PROPERTY1VALUE"),
            new StringParameterValue("PROPERTY2", "PROPERTY2VALUE")
        ]))
}

Get parameters of Jenkins build by job name and build id

I am using the Jenkins Pipeline plugin and I need to get all parameters of a particular build, by its id and job name, from another job.
So basically I need something like this:
def job = JobRegistry.getJobByName(jobName)
def build = job.getBuild(buildId)
Map parameters = build.getParameters()
println parameters['SOME_PARAMETER']
I figured it out.
I can retrieve the parameters like this:
def parameters = Jenkins.instance.getAllItems(Job).
    find { job -> job.fullName == jobName }.
    getBuildByNumber(buildId.toInteger()).
    getAction(hudson.model.ParametersAction)
println parameters.getParameter('SOME_PARAMETER').value
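If you call this from inside a Pipeline script, one option is to wrap the same lookup in a small @NonCPS helper so the non-serializable Jenkins objects stay out of the CPS-transformed code. A rough sketch (job name, build number and parameter name are placeholders, and running it in the sandbox will need script approvals):
@NonCPS
def getBuildParameter(String jobName, int buildNumber, String paramName) {
    // Placeholders: pass the full job name (including folders), the build
    // number and the parameter name you want to read.
    def job = jenkins.model.Jenkins.instance.getItemByFullName(jobName, hudson.model.Job)
    def action = job?.getBuildByNumber(buildNumber)?.getAction(hudson.model.ParametersAction)
    return action?.getParameter(paramName)?.value
}

// Hypothetical usage from a pipeline step:
echo "Value: ${getBuildParameter('MyFolder/MyJob', 42, 'SOME_PARAMETER')}"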
I suggest reviewing "Pipeline Syntax" in a pipeline job (the link at the bottom of the Pipeline section); there you can see the Global Variable Reference for docker, pipeline, env, etc.
What you need, JOB_NAME / BUILD_ID, is provided in the "env" list.
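For the current build itself these are available directly in a pipeline, for example:
// The env global exposes the current build's own job name and build id.
echo "Running ${env.JOB_NAME} #${env.BUILD_ID}"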

Call a jenkins job by using a variable for build the name

I am trying to launch a job from a parameterized trigger and I would like to compute its name from a given variable.
Is it possible to set in the field
Build Triggers / Projects to build
a value like this
${RELEASE}-MAIN-${PROJECT}-LOAD_START
?
Unfortunately, this isn't possible with the Build Triggers. I looked for a solution for this "higher-order build job" that would allow you to create a dynamic build name with one of the parameterized build plugins, but I couldn't find one.
However, using the Groovy Postbuild Plugin, you can do a lot of powerful things. Below is a script that can be modified to do what you want. In particular, notice that it gets environment variables using build.buildVariables.get("MY_ENV_VAR"). The environment variable TARGET_BUILD_JOB specifies the name of the build job to build. In your case, you would want to compute TARGET_BUILD_JOB from these two environment variables:
build.buildVariables.get("RELEASE")
build.buildVariables.get("PROJECT")
The script is commented, so even if you're not familiar with Groovy, which is based on Java, it should hopefully make sense.
import hudson.model.*
import hudson.model.queue.*
import hudson.model.labels.*
import org.jvnet.jenkins.plugins.nodelabelparameter.*

def failBuild(msg) {
    throw new RuntimeException("[GROOVY] User message, exiting with error: " + msg)
}

// Get the current build job
def thr = Thread.currentThread()
def build = thr?.executable

// Get the parameters for the current build job
// For ?:, see "Elvis Operator" (http://groovy.codehaus.org/Operators#Operators-ElvisOperator)
def currentParameters = build.getAction(ParametersAction.class)?.getParameters() ?:
    failBuild("There are no parameters to pass down.")

def nodeName = build.getBuiltOnStr()
def newParameters = new ArrayList(currentParameters)
newParameters << new NodeParameterValue("param_NODE",
    "Target node -- the node of the previous job", nodeName)

// Retrieve information about the target build job
def targetJobName = build.buildVariables.get("TARGET_BUILD_JOB")
def targetJobObject = Hudson.instance.getItem(targetJobName) ?:
    failBuild("Could not find a build job with the name $targetJobName. (Are you sure the spelling is correct?)")
println("$targetJobObject, $targetJobName")
def buildNumber = targetJobObject.getNextBuildNumber()

// Add information about the downstream job to the log
def jobUrl = targetJobObject.getAbsoluteUrl()
println("Starting downstream job $targetJobName ($jobUrl)" + "\n")
println("======= DOWNSTREAM PARAMETERS =======")
println("$newParameters")

// Start the downstream build job if this build job was successful
boolean targetBuildQueued = targetJobObject.scheduleBuild(5,
    new Cause.UpstreamCause(build),
    new ParametersAction(newParameters)
)

if (targetBuildQueued) {
    println("Build started successfully")
    println("Console (wait a few seconds before clicking): $jobUrl/$buildNumber/console")
} else {
    failBuild("Could not start target build job")
}
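For the job name pattern from the question, only the lookup of the target job needs to change. A hedged sketch of that part, reusing build and failBuild from the script above and assuming RELEASE and PROJECT are parameters of the current build:
// RELEASE and PROJECT are assumed to be build parameters of the current job.
def release = build.buildVariables.get("RELEASE")
def project = build.buildVariables.get("PROJECT")
def targetJobName = "${release}-MAIN-${project}-LOAD_START"
def targetJobObject = Hudson.instance.getItem(targetJobName) ?:
    failBuild("Could not find a build job with the name $targetJobName.")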
