How to get the build number from a triggered job - Jenkins

In my Jenkins pipeline, I trigger a job like this:
stage('Run downstream') {
    parallel {
        stage('partA') {
            steps {
                script {
                    if (env.GIT_BRANCH == 'origin/master') {
                        build job: 'downstream', wait: true
                    }
                }
            }
        }
        stage('partB') {
            steps {
                script {
                    if (env.GIT_BRANCH == 'origin/master') {
                        build job: 'downstream', wait: true, parameters: [
                            string(name: 'param', value: 'overriden value')
                        ]
                    }
                }
            }
        }
    }
}
The downstream job creates an artifact which I'd like to copy back to the triggering job. How would I get the build number for each invocation of the job so that I can pull their artifacts?

I changed:
build job: 'downstream', wait: true
to:
triggeredBuild = build job: 'downstream', wait: true
buildNumber = triggeredBuild.getNumber()
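The object returned by the build step is a RunWrapper, so getNumber() gives the downstream build number. With that in hand, a minimal sketch of copying the artifact back into the triggering job (this assumes the Copy Artifact plugin is installed; the artifact filter is a placeholder):

script {
    def triggeredBuild = build job: 'downstream', wait: true
    // specific() selects exactly the run that was just triggered
    copyArtifacts(
        projectName: 'downstream',
        selector: specific("${triggeredBuild.getNumber()}"),
        filter: '**/*.jar',          // hypothetical artifact pattern
        fingerprintArtifacts: true
    )
}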

Related

Jenkins groovy if condition within steps does not work

I have the following stage in the Groovy script of a Jenkins job:
stage('Remove servers') {
    when {
        expression { params.DO_REMOVE == true }
    }
    steps {
        script {
            parallel RemoveSource: {
                sh """set -x
                    export KUBECONFIG=${source_config}
                    kubectl get ns ${source_namespace} || exists="False"
                """
                echo "${exists}"
                if ("${exists}" != "False") {
                    build job: 'RemoveFCC',
                        parameters: [string(name: 'Branch', value: Branch),
                                     booleanParam(name: 'build_ansible', value: false),
                                     string(name: 'pipeline', value: 'yes')]
                } else {
                    echo "Server does not exist. skipped fcc run"
                }
            },
            RemoveTarget: {
                sh """set -x
                    export KUBECONFIG=${target_config}
                    kubectl get ns ${target_namespace} || exists="False"
                """
                echo "${exists}"
                if ("${exists}" != "False") {
                    build job: 'RemoveFCC',
                        parameters: [string(name: 'Branch', value: Branch),
                                     booleanParam(name: 'build_ansible', value: false),
                                     string(name: 'pipeline', value: 'yes')]
                } else {
                    echo "Server does not exist. skipped fcc run"
                }
            }
        }
    }
}
Even though echo "${exists}" prints False, the if condition still gets executed. I am not sure what I am missing here. I have tried things like using when instead of if.
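A likely cause, though it is not confirmed in the thread: the exists="False" assignment happens inside the shell, so it never reaches the Groovy side of the comparison. Capturing the command's exit status directly avoids that shell-to-Groovy handoff; a rough sketch for one of the branches:

// Sketch only: returnStatus hands the kubectl exit code straight back to Groovy,
// so no shell variable has to survive the sh step.
def nsExists = sh(
    script: "export KUBECONFIG=${source_config} && kubectl get ns ${source_namespace}",
    returnStatus: true
) == 0
if (nsExists) {
    build job: 'RemoveFCC',
        parameters: [string(name: 'Branch', value: Branch),
                     booleanParam(name: 'build_ansible', value: false),
                     string(name: 'pipeline', value: 'yes')]
} else {
    echo "Server does not exist. skipped fcc run"
}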

How to run the same scripted pipeline multiple times in Jenkins?

I have searched the internet for a way to run the same pipeline multiple times but have not found an answer.
Basically, the pipeline has 5 stages and I want it to run 5 times; each iteration will execute the stages in the same order as in the code.
node {
    def build_ok = true
    try {
        stage('#1 SoftSync 4.5.1 CPU Usage Test') {
            build job: 'SoftSync_4.5.1_CPU_Usage_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_CPU_Usage_Test.robot')]
        }
    } catch(e) {
        build_ok = false
        echo e.toString()
    }
    try {
        stage('#2 SoftSync 4.5.1 Improvments to system time management Test ') {
            build job: 'SoftSync_4.5.1_Improvments_To_System_TimeManagement_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Improvments_to_system_time_management.robot')]
        }
    } catch(e) {
        build_ok = false
        echo e.toString()
    }
    try {
        stage('#3 SoftSync 4.5.1 Telematics and statistics Test') {
            build job: 'SoftSync_4.5.1_Telematics_and_statistics_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Telementry_and_Statistics.robot')]
        }
    } catch(e) {
        build_ok = false
        echo e.toString()
    }
    //try {
    //stage ('#4 SoftSync 4.5.1 PTP Profiles Slave Lock Test') {
    build job: 'SoftSync_4.5.1_PTP_Profiles_SlaveLock_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/PTP_Bundle/SoftSync_PTP_Lock_validation.robot')]
    //}
    //} catch(e) {
    //    build_ok = false
    //    echo e.toString()
    //}
    try {
        stage('#5 SoftSync 4.5.1 Alarms Test') {
            build job: 'SoftSync_4.5.1_Alarms_Test', parameters: [string(name: 'LOG_LEVEL', value: 'debug'), string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_Alarms_Test.robot')]
        }
    } catch(e) {
        build_ok = false
        echo e.toString()
    }
    if (build_ok) {
        currentBuild.result = "SUCCESS"
    } else {
        currentBuild.result = "FAILURE"
    }
    post {
        always {
            junit allowEmptyResults: true, testResults: '/var/lib/jenkins/output/*.xml'
        }
    }
}
I have seen some usages with arrays and the .each function, but all for parallel execution, which is not helpful in my case.
Any ideas?
You can wrap your stages in a loop, something like below:
node {
    def itrList = ["Run1", "Run2", "Run3"]
    itrList.each { val ->
        def build_ok = true
        try {
            stage("#1 SoftSync 4.5.1 CPU Usage Test with $val") {
                echo "111111"
            }
        } catch (e) {
            build_ok = false
            echo e.toString()
        }
        try {
            stage("#2 SoftSync 4.5.1 Improvments to system time management Test with $val") {
                echo "22222222222"
            }
        } catch (e) {
            build_ok = false
            echo e.toString()
        }
        if (build_ok) {
            currentBuild.result = "SUCCESS"
        } else {
            currentBuild.result = "FAILURE"
        }
    }
}
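If .each gives trouble with closure serialization on older workflow-cps versions, a plain counted loop does the same thing; a sketch with one of the real downstream jobs from the question slotted in (5 is the number of repetitions wanted):

node {
    for (int i = 1; i <= 5; i++) {
        def build_ok = true
        try {
            stage("#1 SoftSync 4.5.1 CPU Usage Test (run ${i})") {
                build job: 'SoftSync_4.5.1_CPU_Usage_Test',
                    parameters: [string(name: 'LOG_LEVEL', value: 'debug'),
                                 string(name: 'FILE_PATH', value: 'TLV_SoftSync/Management_Bundle/SoftSync_CPU_Usage_Test.robot')]
            }
        } catch (e) {
            build_ok = false
            echo e.toString()
        }
        // ...repeat the same try/catch pattern for the remaining stages...
        currentBuild.result = build_ok ? "SUCCESS" : "FAILURE"
    }
}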

Can we run a Jenkinsfile in a pipeline?

I have a pipeline with a Generic Webhook trigger from Bitbucket; this job triggers another job.
currentBuild.displayName = "Generic-Job-#" + env.BUILD_NUMBER
pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'actorName', value: '$.actor.display_name'],
                [key: 'TAG', value: '$.push.changes[0].new.name'],
                [key: 'REPONAME', value: '$.repository.name'],
                [key: 'GIT_URL', value: '$.repository.links.html.href'],
            ],
            token: '11296ae8d97b2134550f',
            causeString: ' Triggered on $actorName version $TAG',
            printContributedVariables: true,
            printPostContent: true
        )
    }
    stages {
        stage('Build Job DEVELOPMENT') {
            when {
                expression { return params.TARGET_ENV == 'DEVELOPMENT' }
            }
            steps {
                build job: 'DEVELOPMENT',
                    parameters: [
                        [$class: 'StringParameterValue', name: 'FROM_BUILD', value: "${BUILD_NUMBER}"],
                        [$class: 'StringParameterValue', name: 'TAG', value: "${TAG}"],
                        [$class: 'StringParameterValue', name: 'GITURL', value: "${GIT_URL}"],
                        [$class: 'StringParameterValue', name: 'REPONAME', value: "${REPONAME}"],
                        [$class: 'StringParameterValue', name: 'REGISTRY_URL', value: "${REGISTRY_URL}"],
                    ]
            }
        }
    }
}
Another Pipeline
pipeline {
    agent any
    stages {
        stage('Cleaning') {
            steps {
                cleanWs()
            }
        }
        def jenkinsFile
        stage('Loading Jenkins file') {
            jenkinsFile = fileLoader.fromGit('Jenkinsfile', "${GIT_URL}", "${TAG}", null, '')
        }
        jenkinsFile.start()
    }
}
Can I run a Jenkinsfile from within a pipeline? Every project I make has a different Jenkinsfile, so they can't all be the same, but when I run this it doesn't execute the Jenkinsfile.
it works for me :D
Sample Pipeline
stage 'Load a file from GitHub'
def jenkinsFile = fileLoader.fromGit('<path-jenkinsfile>', "<path-git>", "<branch>", '<credentials>', '')
stage 'Run method from the loaded file'
jenkinsFile
pipeline {
    agent any
    stages {
        stage('Print Hello World Ke #1') {
            steps {
                echo "Hello Juan"
            }
        }
    }
}
Before running the pipeline, you must install the "Pipeline Remote Loader" plugin.
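For plain scripted files there is also the built-in load step, which needs no extra plugin; a rough sketch, assuming the checked-out repository contains a Groovy script that ends with return this and defines a start() method:

node {
    // Check out the repository that holds the remote script
    // (GIT_URL and TAG are the webhook variables from the generic job above).
    checkout([$class: 'GitSCM',
              branches: [[name: "${TAG}"]],
              userRemoteConfigs: [[url: "${GIT_URL}"]]])
    // Evaluate the file in this build and call a method it defines.
    def remote = load 'Jenkinsfile'
    remote.start()
}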

Jenkins pipeline execute job and get status

pipeline {
    agent { label 'master' }
    stages {
        stage('test') {
            steps {
                script {
                    def job_exec_details = build job: 'build_job'
                    if (job_exec_details.status == 'Failed') {
                        echo "JOB FAILED"
                    }
                }
            }
        }
    }
}
I have a pipeline that executes a build job. How can I get the job's result in a Jenkins pipeline?
It should be getResult(), and the status should be FAILURE, not Failed.
So your whole code should look like this:
pipeline {
    agent { label 'master' }
    stages {
        stage('test') {
            steps {
                script {
                    // wait: true makes the current job wait for build_job to finish;
                    // propagate: false keeps a downstream failure from failing this build outright.
                    def job_exec_details = build job: 'build_job', propagate: false, wait: true
                    if (job_exec_details.getResult() == 'FAILURE') {
                        echo "JOB FAILED"
                    }
                }
            }
        }
    }
}
Here is a second way of getting the result:
pipeline {
    agent { label 'master' }
    stages {
        stage('test') {
            steps {
                build(job: 'build_job', propagate: true, wait: true)
            }
        }
    }
    post {
        success {
            echo 'Job result is success'
        }
        failure {
            echo 'Job result is failure'
        }
    }
}
You can read more about the build step in the Jenkins Pipeline documentation.

Jenkins trigger another job

I have a pipeline that triggers the same job if it fails. When the second job is triggered, the first pipeline stays open until the second one succeeds or fails. I would like to know if I can close the first pipeline once the second one has been triggered.
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    input message: 'Proceed?', ok: 'Yes', submitter: 'admin'
                }
                echo "helloworld"
            }
            post {
                aborted {
                    script {
                        retry(1) {
                            input "Retry the job ?"
                            build(job: 'pipelines/testCS')
                        }
                    }
                }
                success {
                    script {
                        sh 'echo "continue"'
                    }
                }
            }
        }
        stage('deploy') {
            steps {
                sh 'echo "deploy"'
            }
        }
    }
    post {
        aborted {
            echo "pipeline has been aborted"
        }
    }
}
Simply pass wait: false to the build step:
build(job: 'pipelines/testCS', wait: false)
See the build step documentation for all parameters.
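Dropped into the retry block from the question, that looks roughly like this; note that with wait: false the build step returns immediately and does not report the downstream result:

post {
    aborted {
        script {
            retry(1) {
                input "Retry the job ?"
                // wait: false only schedules pipelines/testCS and lets this run finish
                build(job: 'pipelines/testCS', wait: false)
            }
        }
    }
}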
