How to get the build number of another job in a Jenkins pipeline - jenkins

I have created a pipeline where one job does the build and another job sends the mail notification. In the mail notification job I need to get the build job's latest BUILD_NUMBER (whether it succeeded or failed), and I should publish the build job's latest URL in the mail notification about the build status.
How can I achieve this?

I think there are multiple ways you could achieve this. Let's call your two builds A (build) and B (mail notification).
1. Call B from A and pass whatever information B needs as parameters (a sketch for this follows the snippet below).
2. Use a small Groovy script together with the Jenkins Job API.
For #2, I'm thinking you would have something like this in your B job:
def jobname = "<name of Job A>"
def job = Jenkins.instance.getItemByFullName(jobname)
def build = job.getLastBuild()            // latest build, whatever its result
def buildNumber = build.getNumber()       // the BUILD_NUMBER you are after
def buildUrl = build.getAbsoluteUrl()     // full URL you can put in the mail
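For #1, a minimal sketch of what Job A could run once its build stage is done (the job name and parameter names are only illustrative):
// In Job A: trigger the mail notification job and hand over the details it needs
build job: 'B',
      wait: false,
      parameters: [
          string(name: 'UPSTREAM_BUILD_NUMBER', value: "${env.BUILD_NUMBER}"),
          string(name: 'UPSTREAM_BUILD_URL', value: "${env.BUILD_URL}"),
          string(name: 'UPSTREAM_RESULT', value: "${currentBuild.currentResult}")
      ]
Job B can then read these values as ordinary build parameters and put them into the notification mail.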

Related

Get the result of each Jenkins job and send it by email

In my company I have a pipeline that runs several jobs. I want to get the result of each job, write each of these results to a file or a variable, and later have them emailed to me. Is there such a possibility? To be clear: I don't want the result of the pipeline, but the result of each of the jobs that are inside it.
I even tried making requests via the API, but then every pipeline would need its own code for that, which is not feasible from a maintenance point of view.
When you trigger a job inside a pipeline, you use the build job step.
This step has a property called propagate that:
If enabled (default state), then the result of this step is that of the downstream build (e.g., success, unstable, failure, not built, or aborted). If disabled, then this step succeeds even if the downstream build is unstable, failed, etc.; use the result property of the return value as needed.
You can write a wrapper for calling jobs that stores the result of each job (and maybe other data useful for debugging, such as the build URL), so you can use it later to construct the contents of an email.
E.g.
jobResults = [:]   // no 'def' so the map is visible inside the function below

def buildJobAndStoreResult(jobName, jobParams) {
    def run = build job: jobName, parameters: jobParams, propagate: false
    jobResults[jobName] = [
        result: run.result,
        url: run.absoluteUrl   // useful for linking to the build from the email
    ]
}
Then you can construct the body of the email by iterating over the map, e.g.
def emailBody = "SUMMARY\n\n"
jobResults.each { it ->
    emailBody += "${it.key}: ${it.value.result}\n"
}
And use the mail step to send out a report.
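A minimal sketch of that mail step (the recipient address is a placeholder):
mail to: 'team@example.com',
     subject: "Job results summary for ${env.JOB_NAME} #${env.BUILD_NUMBER}",
     body: emailBody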
It's worth thinking about whether you want your pipeline to fail after sending the email if any of the called jobs failed, and about adding links from your email report to the failed jobs and the calling pipeline.

How to discard old builds in Jenkins, based on their completion status?

I have a Jenkins job set up to poll a git repository for changes every 10 minutes. If it doesn't find one (99 times out of 100, this is what happens), it aborts the build early and marks it as UNSTABLE. If it finds a change, it goes through with the build and marks it as SUCCESS, storing its artifacts. I know I can use a plugin to discard old builds, but it only allows for these options:
As you can see, there is no option to filter by completion status.
Ideally, I want to discard all but the latest UNSTABLE build and keep all SUCCESS or FAILED builds and their artifacts. If this is not possible, simply discarding all UNSTABLE builds would also work.
Note: I am using a declarative Pipeline
One possibility would be to discard builds programmatically. Get your job object with def job = Jenkins.instance.getItem("JobName"). Since you are using a declarative Pipeline, job is of type WorkflowJob [1], and you can get all of its builds with job.getBuilds(). Now you can check the result of each build (WorkflowRun objects [2]) and decide whether you want to delete it or not. Something like the following should work:
def job = Jenkins.instance.getItem("JobName")
job.getBuilds().each {
    // result is null while a build is still running, hence the null-safe call
    if (it.result?.toString() == "UNSTABLE") {
        it.delete()
        job.save()
    }
}
You could create a new job that executes the code above and is triggered after YourJob has been built.
[1] https://javadoc.jenkins.io/plugin/workflow-job/org/jenkinsci/plugins/workflow/job/WorkflowJob.html
[2] https://javadoc.jenkins.io/plugin/workflow-job/org/jenkinsci/plugins/workflow/job/WorkflowRun.html
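If you also want to keep the most recent UNSTABLE build, as described in the question, a variation of the same idea could look like this (an untested sketch; "JobName" is a placeholder):
def job = Jenkins.instance.getItemByFullName("JobName")
// getBuilds() returns builds newest first, so drop(1) spares the latest unstable one
def unstable = job.getBuilds().findAll { it.result?.toString() == "UNSTABLE" }
unstable.drop(1).each { it.delete() }
job.save()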

Passing data (variable/parameter) from one downstream job to upstream job in order to pass the data to another downstream job

I have a scenario where I am triggering two downstream jobs one after another sequentially from an upstream job.
I need to return data xyz = 3.1416 (a parameter/variable) generated in the first downstream job (Job A) back to the upstream job, or read that data from the upstream job.
I want to do that because the upstream job needs to pass this data to the other downstream job (Job B).
All these jobs are pipeline jobs.
I am writing the upstream job as an abstraction layer and to automate triggering the two downstream jobs sequentially.
structure / flowchart of jobs
There are a couple of approaches to solving this problem. Calling jobs both upstream and downstream isn't a good idea, because you can create a circular reference between them (i.e. A calls B, which calls A again, which calls B...). Your Jenkins probably won't break, because it is limited by the number of executors, but still...
Solution A: Use artifacts
You can store your values in JSON or YAML files, archive them with the archiveArtifacts() step, and fetch them in other jobs with the Copy Artifact plugin. That way, jobs and builds can share information amongst themselves.
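A minimal sketch of that idea, assuming the Copy Artifact plugin is installed (job and file names are placeholders):
// In Job A: write the value to a file and archive it
writeFile file: 'result.txt', text: 'xyz=3.1416'
archiveArtifacts artifacts: 'result.txt'

// In the upstream job: copy the artifact from Job A's last successful build and read it
copyArtifacts projectName: 'Job_A', selector: lastSuccessful()
def xyz = readFile('result.txt').split('=')[1].trim()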
Solution B: Use buildVariables
There's a way for downstream jobs to return values back to upstream jobs, using a resource known as buildVariables. Here's the code in the upstream job:
def ret = build job: 'downstream_job'
print "The returned value from the triggered job was ${ret.buildVariables.RETURNED_VALUE}"
And in the downstream job:
pipeline {
    agent any
    environment {
        RETURNED_VALUE = ""
    }
    stages {
        stage('Doing something') {
            steps {
                script {
                    print("Hi, I was triggered!")
                    env.RETURNED_VALUE = "Blah blah blah"
                }
            }
        }
    }
}
Through buildVariables you can access any environment variable set by the downstream job, but not its build parameters.
Best regards.

jenkins, how to run multiple remote jobs without stopping on failure

I have a Jenkins job that I'm using to aggregate the execution of multiple other jobs that only perform testing. Because they are testing jobs, I want all of them to run regardless of any failures. I do want to keep track of whether or not there has been a failure, so that I can set the end result to FAILURE rather than SUCCESS if need be.
At the moment I am calling 1 remote job via bash script and jenkins-cli. I have a 2nd child job that is local, so I'm using "trigger/call builds on other jobs" build step to run that one.
Any ideas on how to accomplish this?
If you can use the Build Flow plugin it is easy; if you use Pipeline it is possible too, but I can't give you an example off the top of my head (a rough Pipeline sketch follows the Build Flow example below).
https://wiki.jenkins-ci.org/display/JENKINS/Build+Flow+Plugin:
def result = SUCCESS
ignore(FAILURE) {
    def job1 = build('job1')
    result = job1.result.combine(result)
}
ignore(FAILURE) {
    def job2 = build('job2')
    result = job2.result.combine(result)
}
build.result = result.combine(build.result)
http://javadoc.jenkins.io/hudson/model/Result.html
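For a Pipeline job, a rough equivalent could look like the sketch below, assuming the test jobs live on the same Jenkins and can be triggered with the build step (job names are placeholders). propagate: false keeps the aggregator running even when a triggered job fails:
def anyFailed = false
for (name in ['test-job-1', 'test-job-2']) {
    def run = build job: name, propagate: false   // don't abort on failure
    if (run.result != 'SUCCESS') {
        anyFailed = true
    }
}
if (anyFailed) {
    currentBuild.result = 'FAILURE'   // report the overall failure at the end
}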

Notify about / check for SCM poll failure in Jenkins

I would like to check for or get notifications about SCM poll failures in Jenkins (for example, when the repository URL has changed, or a branch got deleted). I thought about these options:
a) A Jenkins console script, which would list such faulty jobs
b) Configuring/installing plugin for Jenkins to notify me somehow about that fact (e-mail, anything)
c) External script/executable (bash, python, ...), which would list builds which failed in last X hours due to SCM poll failure
As you mentioned in your question, one way to tackle this problem is with a script, for example a Groovy Postbuild script.
Since Groovy Postbuild scripts run on the master, you can access each job's scm-polling.log on the file system using standard IO functions.
For example, assuming a Windows master, here is some (untested) pseudocode to give you some ideas:
def error = false;
def jobsDirectory = new File("C:\\Jenkins\\jobs");
jobsDirectory.eachFile {
    def pollingLog = new File(it.path + "\\scm-polling.log");
    // not every job has a polling log, so check for it before reading
    if (pollingLog.exists() && pollingLog.text =~ "ERROR") {
        manager.listener.logger.println(it.path + " has polling errors.");
        error = true;
    }
}
if (error) {
    manager.buildFailure();   // mark the build as FAILURE
}
Once you have marked the build as a failure, you can use the standard email functionality of Jenkins to send an email, or format it to look nicer using the Email-ext plugin.
