How to fetch Build ID of Job triggered from another job - jenkins

I have the following situation in a Jenkinsfile of Job A:
...
... // Some execution
...
call Job B
// When Job B runs successfully
params.some_var_used_in_Job_C = BUILD ID of Job B
call Job C
I have to know the BUILD ID of Job B after it succeeds, and I need to pass it as a parameter to Job C. Can anyone suggest how I can do this?
Also, is it possible to pass some variable from Job B back to Job A (so that I can send that value to Job C later)?

Should be as simple as this:
node {
    stage('Test') { // for display purposes
        def jb = build wait: true, job: 'JobB'
        println jb.fullDisplayName
        println jb.id
        // this will show everything available but needs admin privs to execute
        println jb.properties
    }
}
If you want to pass a simple string from Job B back to Job A, then in Job B you can set an environment variable:
env.someVar = "some value"
Then back in Job A:
println jb.buildVariables.someVar
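Putting both parts together, here is a minimal scripted-pipeline sketch of what Job A could look like. The parameter names JOB_B_BUILD_ID and SOME_VAR are made up for illustration; Job C would need to declare whatever string parameters you actually use:
node {
    stage('Trigger B, then C') {
        // run Job B and wait for it to finish
        def jb = build wait: true, job: 'JobB'

        // build id of Job B and a variable Job B exported via env.someVar
        def jobBId = jb.id
        def someVar = jb.buildVariables.someVar

        // pass both values on to Job C as string parameters
        build job: 'JobC', wait: true, parameters: [
            string(name: 'JOB_B_BUILD_ID', value: jobBId),
            string(name: 'SOME_VAR', value: someVar)
        ]
    }
}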

@Kaus Untwale's answer is correct. I've copied his answer into a declarative pipeline and added error handling.
From an upstream job:
pipeline {
    agent any
    stages {
        stage('Run job') {
            steps {
                // mark the build as unstable on error
                // remove this if not needed
                catchError(buildResult: 'UNSTABLE') {
                    script {
                        def jb = build wait: true, job: 'test2', propagate: false
                        println jb
                        println jb.fullDisplayName
                        println jb.id
                        // throw an error if the build failed
                        // this still allows you to get the job infos you need
                        if (jb.result == 'FAILURE') {
                            error('Downstream job failed')
                        }
                    }
                }
            }
        }
    }
}
Get the build within a downstream job:
// job: test2
pipeline {
    agent any
    stages {
        stage('Upstream') {
            steps {
                script {
                    // upstream build if available
                    def upstream = currentBuild.rawBuild.getCause(hudson.model.Cause$UpstreamCause)
                    echo upstream?.shortDescription
                    // the run of that cause holds more info
                    def upstreamRun = upstream?.getUpstreamRun()
                    echo upstreamRun?.number.toString()
                }
            }
        }
    }
}
See the API docs for the Run class. You'll also need to approve some calls from the downstream example in the in-process script approval, or disable the Groovy Sandbox on that job.
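If you keep the sandbox enabled, the script approval entries Jenkins asks for will look roughly like the following. Treat these as illustrative only; the exact list depends on your Jenkins and plugin versions, so approve whatever the RejectedAccessException actually reports:
method org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper getRawBuild
method hudson.model.Run getCause java.lang.Class
method hudson.model.Cause$UpstreamCause getUpstreamRun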

Related

How to block upstream/downstream build in Jenkins declarative pipeline?

I have 3 downstream build jobs which are triggered when 1 upstream job 'Project U' has been built successfully. Example:
triggers {
    pollSCM('H/5 * * * *')
    upstream(upstreamProjects: 'Project U', threshold: hudson.model.Result.SUCCESS)
}
This works as expected, however, if code changes are committed to all parts at the same time, the upstream and downstream builds start building simultaneously.
I want to avoid this, because the downstream builds will run twice, and the first run is quite useless as the upstream commit has not been built yet. So I would like to configure the downstream jobs to block their build while the upstream job is building.
I know how to do this for a Jenkins Freestyle job in the user interface (also see this answer).
But I cannot find out how to do this in a Jenkins declarative pipeline.
This approach works:
waitUntil {
    def job = Jenkins.instance.getItemByFullName("Project U")
    !job.isBuilding() && !job.isInQueue()
}
When this downstream job is started, it will check whether the upstream job is either active or queued, and if so, wait until its build has finished.
I haven't been able to find out how to programmatically access the current job's upstream job(s), so the job names need to be copied. (There is a method getBuildingUpstream(), which would be much more convenient, but I haven't found a way to obtain the current Project object from the Job instance.)
I finally ended up creating this function in the Jenkins shared library:
/*
vars/waitForJobs.groovy

Wait until none of the upstream jobs is building or queued any more.

Parameters:
    upstreamProjects    String with a comma-separated list of Jenkins jobs to check
                        (use the same format as in the upstream trigger)
*/
def call(Map params) {
    def projects = params['upstreamProjects'].split(', *')
    echo 'Checking the following upstream projects:'
    if (projects.size() == 0) {
        echo 'none'
    } else {
        projects.each { project ->
            echo "${project}"
        }
    }
    waitUntil {
        def running = false
        projects.each { project ->
            def job = Jenkins.instance.getItemByFullName(project)
            if (job == null) {
                error "Project '${project}' not found"
            }
            if (job.isBuilding()) {
                echo "Waiting for ${project} (executing)"
                running = true
            }
            if (job.isInQueue()) {
                echo "Waiting for ${project} (queued for execution)"
                running = true
            }
        }
        return !running
    }
}
The nice thing is that I can just copy the parameter from the upstream trigger, because it uses the exact same format. Here's an example of what it looks like:
pipeline {
    [...]
    triggers {
        pollSCM('H/5 * * * *')
        upstream(upstreamProjects: 'Project U1, Project U2, Project U3', threshold: hudson.model.Result.SUCCESS)
    }
    [...]
    stages {
        stage('Wait') {
            steps {
                script {
                    // Wait for upstream jobs to finish
                    waitForJobs(upstreamProjects: 'Project U1, Project U2, Project U3')
                }
            }
            [...]
        }
    }
}

How to get parent Jenkins job's name and build number from child job

I have a Jenkins Build Flow parent job with a DSL script that starts a Build Flow child job, also with a DSL script. Is there a way (Groovy API) to get the parent job's name and build number in the child job?
Here is the Groovy script to get your Upstream Job details.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    def cause = currentBuild.getBuildCauses('org.jenkinsci.plugins.workflow.support.steps.build.BuildUpstreamCause')
                    if (cause.size() > 0) { // this build was triggered by an upstream job
                        def parentJobName = cause[0].upstreamProject
                        def parentBuildNumber = cause[0].upstreamBuild
                        echo "Parent Job: $parentJobName"
                        echo "Parent Build Number: $parentBuildNumber"
                    }
                }
            }
        }
    }
}
Update
You can get all the causes without passing the class:
def cause = currentBuild.getBuildCauses()
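As a rough sketch of what that returns: each cause is a map-like object with at least a _class and a shortDescription entry, and upstream causes additionally carry the upstreamProject and upstreamBuild fields used above:
script {
    currentBuild.getBuildCauses().each { cause ->
        echo "Cause: ${cause._class} - ${cause.shortDescription}"
        // upstream-specific fields are only present when another job triggered this build
        if (cause.containsKey('upstreamProject')) {
            echo "Triggered by ${cause.upstreamProject} #${cause.upstreamBuild}"
        }
    }
}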

Is there any way to get the current status (running, successful, failure, aborted) of a Jenkins pipeline from another pipeline

I have two pipelines: pipeline A (application build) and pipeline B (app check). Pipeline A triggers pipeline B and both run simultaneously.
In pipeline B, before a specific stage (run check), I need to verify whether pipeline A was successful; if not, wait and check for some time until pipeline A has finished. Pipeline B can then proceed with the check if A was successful, or exit with a failure.
What I need to know is: is there any way to check the build status of pipeline A from pipeline B using pipeline A's build number? I pass the build number of pipeline A to pipeline B.
I looked for an env variable for the status check but couldn't find any.
Since you pass the build number of Pipeline A to Pipeline B, you can create Pipeline B like below. Here you can use waitUntil for the waiting.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    echo "Waiting"
                    def jobName = "JobA"
                    def buildNum = "92"
                    waitUntil { !isPending(jobName, buildNum) }
                    if (getStatus(jobName, buildNum).equals('SUCCESS')) {
                        echo "Job A is Successful"
                    } else {
                        echo "Job A Failed"
                    }
                }
            }
        }
    }
}

def isPending(def jobName, def buildNumber) {
    def buildA = Jenkins.instance.getItemByFullName(jobName).getBuild(buildNumber)
    return buildA.isInProgress()
}

def getStatus(def jobName, def buildNumber) {
    def status = Jenkins.instance.getItemByFullName(jobName).getBuild(buildNumber).getResult().toString()
    return status
}

Jenkins Pipeline - how to execute sequential sub-jobs and propagate the error while keeping the sequence

I am trying to have a pipeline that executes multiple sequential jobs.
The problem is that if I have the "propagate false" flag, the jobs are executed but the pipeline build always returns 'Success' regardless of the sub-jobs' status.
If I want the pipeline to reflect the 'Fail' status when a sub-job fails and remove the propagate flag, the sequence breaks at that failure point and no more jobs are executed.
Can you help me find the best way to achieve this?
I hope I was clear. Thank you very much.
pipeline {
    stages {
        stage('Tests') {
            steps {
                parallel(
                    'TestSet': {
                        build wait: true, job: 'Test A'
                        build wait: true, job: 'Test B'
                        build wait: true, job: 'Test C'
                    }
                )
            }
        }
    }
}
When you run the build step it actually returns a RunWrapper object (see the Java docs).
The RunWrapper has a getResult() function which lets you get the result of the build that was executed, alongside many other properties of the executed build, like the build number.
You can then run your jobs with the propagate: false option, save the results, examine them after all builds are finished, and then run your required logic.
For example:
pipeline {
    stages {
        stage('Tests') {
            steps {
                script {
                    parallel(
                        'TestSet': {
                            // Collect the results of all build executions into a map
                            // propagate: false keeps the sequence going even if a job fails
                            def results = [:]
                            results['Test A'] = build(wait: true, propagate: false, job: 'Test A').getResult()
                            results['Test B'] = build(wait: true, propagate: false, job: 'Test B').getResult()
                            results['Test C'] = build(wait: true, propagate: false, job: 'Test C').getResult()
                            // Analyze the results
                            def failedJobs = results.findAll { it.value != 'SUCCESS' } // you can also use Result.SUCCESS instead of the string
                            if (failedJobs) {
                                error "The following jobs have failed: ${failedJobs.collect { it.key }.join(', ')}"
                            }
                        }
                    )
                }
            }
        }
    }
}

How can I use the 'parallel' option in a Jenkins pipeline in the 'post' section?

I looked at many pipeline examples of how to write the post-build section in a pipeline script, but never found the answer I was looking for.
I have 4 jobs, say Jobs A, B, C and D. I want Job A to run first, and if it is successful it should trigger Jobs B, C and D in parallel. If Job A fails, it should trigger only Job B. Something like below:
pipeline {
    agent any
    stages {
        stage('Build_1') {
            steps {
                sh '''
                Build Job A
                '''
            }
        }
    }
    post {
        failure {
            sh '''
            Build Job B
            '''
        }
        success {
            sh '''
            Build Job B,C,D in parallel
            '''
        }
    }
}
I tried using the 'parallel' option in the post section but it gave me errors. Is there a way to build Jobs B, C and D in parallel in the post 'success' section?
Thanks in advance!
The parallel keyword can actually work inside a post condition as long as it is encapsulated inside a script block, as the script block is just a fallback to the scripted pipeline, which allows you to run the parallel execution step wherever you want.
The following should work fine:
pipeline {
    agent any
    stages {
        stage('Build_1') {
            steps {
                // Build Job A
            }
        }
    }
    post {
        failure {
            // run job B
            build job: 'Job-B'
        }
        success {
            script {
                // run jobs B, C, D in parallel
                def jobs = ['Job-B', 'Job-C', 'Job-D']
                parallel jobs.collectEntries { job ->
                    ["Building ${job}": {
                        build job: job
                    }]
                }
            }
        }
    }
}
This is just an example and specific parameters or configuration (for the build keyword) can be added to each job execution according to your needs.
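For instance, a hedged sketch of what one of those build calls could look like with options and parameters attached (the parameter names ENVIRONMENT and RUN_SMOKE_TESTS are invented here; use whatever the downstream job actually declares):
build job: job,
      wait: true,
      propagate: false,
      parameters: [
          string(name: 'ENVIRONMENT', value: 'staging'),
          booleanParam(name: 'RUN_SMOKE_TESTS', value: true)
      ]
This would simply replace the plain build job: job call inside the collectEntries closure above.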
The error message is quite clear about this:
Invalid step "parallel" used - not allowed in this context - The parallel step can only be used as the only top-level step in a stages step
The more restrictive declarative syntax does not allow the usage of parallel in the post section at the moment.
If you don't want to switch to the scripted syntax, another option that should work: build jobs B, C and D in parallel in a second stage and move the failure condition into the post section of your first stage. As a result, jobs B, C and D will run if A is successful. If A is not successful, only job B will run.
pipeline {
    agent any
    stages {
        stage('one') {
            steps {
                // run job A
            }
            post {
                failure {
                    // run job B
                }
            }
        }
        stage('two') {
            steps {
                parallel(
                    // run job B, C, D
                )
            }
        }
    }
}
