Jenkins pipeline: execute a job and get its status

pipeline {
agent { label 'master' }
stages {
stage('test') {
steps {
script {
def job_exec_details = build job: 'build_job'
if (job_exec_details.status == 'Failed') {
echo "JOB FAILED"
}
}
}
}
}
}
I have a pipeline that executes a build job. How can I get the job result in a Jenkins pipeline?

It should be getResult(), and the result string is FAILURE, not Failed.
So your whole code should look like this:
pipeline {
agent { label 'master' }
stages {
stage('test') {
steps {
script {
def job_exec_details = build job: 'build_job', propagate: false, wait: true // wait: true makes this job wait for build_job to finish; propagate: false prevents build_job's result from failing this build automatically
if (job_exec_details.getResult() == 'FAILURE') {
echo "JOB FAILED"
}
}
}
}
}
}
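If you also want the calling pipeline to reflect the downstream result instead of only logging it, one option is to mark the current build unstable. A minimal sketch, assuming the same 'build_job' name and the unstable step from Pipeline: Basic Steps:
script {
    def job_exec_details = build job: 'build_job', propagate: false, wait: true
    def result = job_exec_details.getResult()
    if (result != 'SUCCESS') {
        // Mark this build UNSTABLE instead of failing it outright
        unstable("build_job finished with result ${result}")
    }
}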

Here is a second way of getting results:
pipeline {
agent { label 'master' }
stages {
stage('test') {
steps {
build(job: 'build_job', propagate: true, wait: true)
}
}
}
post {
success {
echo 'Job result is success'
}
failure {
echo 'Job result is failure'
}
}
}
You can read more about the 'build' step here.
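As a side note, the object returned by the build step is a RunWrapper, which also offers result-comparison helpers besides getResult(). A rough sketch, again assuming the 'build_job' name from the question:
script {
    def job_exec_details = build job: 'build_job', propagate: false, wait: true
    // resultIsWorseOrEqualTo compares against the usual Jenkins result ordering
    if (job_exec_details.resultIsWorseOrEqualTo('FAILURE')) {
        error "build_job failed or was aborted"
    } else if (job_exec_details.getResult() == 'UNSTABLE') {
        echo "build_job is unstable"
    }
}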

How to use parallelsAlwaysFailFast() in scripted pipeline?

How to use parallelsAlwaysFailFast() in Jenkins Scripted Pipeline?
I could not find any example for this.
Edited: here is the code I use and the 'Blue Ocean' screenshot:
stage("Build") {
parallel([
failFast: true,
"Stage 1":{
stage("Stage 1") {
stage("a1") {
println("a1")
};
stage("a2") {
println("a2")
}
}
},
"Stage 2":{
stage("Stage 2") {
stage("b1") {
sh '''pwd'''
};
stage("b2") {
echo '''Here we can see the InterruptedException'''
catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
error "Failing the stage"
}
}
}
}
])
}
[Blue Ocean screenshot]
How can I make all the stages that are executed in parallel fail as well?
Thanks.
As far as I know, there is no way to change that behavior for all the future parallel stages.
However, one can change it for any given set of parallel stages, like this:
def parallel_stages = [:].asSynchronized()
parallel_stages['one'] = {
stage ('One') {
script {
println "One"
}
}
}
parallel_stages['two'] = {
stage ('Two') {
script {
println "Two"
}
}
}
// Here you set failFast for this particular set of parallel stages
parallel_stages.failFast = true
parallel parallel_stages
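For completeness: the option from the question title does exist in Declarative pipelines, where it applies to every parallel block in the pipeline. A minimal declarative sketch (not a scripted-pipeline solution):
pipeline {
    agent any
    options {
        // Declarative-only: all parallel stages in this pipeline fail fast
        parallelsAlwaysFailFast()
    }
    stages {
        stage('Build') {
            parallel {
                stage('Stage 1') { steps { echo 'a1' } }
                stage('Stage 2') { steps { error 'Failing the stage' } }
            }
        }
    }
}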

How to continue Jenkins stage on failure

I would like to run a pipeline with 2 stages. If any stage fails, the next stage should still be started (not skipped). Currently, if the 1st stage fails, the next stage is skipped.
Thank you for any help.
pipeline {
options { buildDiscarder(logRotator(numToKeepStr: '5', artifactNumToKeepStr: '5')) }
agent { label 'docker_gradle' }
triggers {
cron(env.BRANCH_NAME == 'develop' || env.BRANCH_NAME == 'master' ? '@daily' : '')
}
stages {
stage('Init') {
steps {
sh 'chmod +x gradlew'
}
}
stage('task1') {
when { anyOf { branch 'feature/*'; branch 'develop' }}
steps {
container(name: 'gradle') {
sh 'gradle clean task1'
}
}
}
stage('task2') {
when { anyOf { branch 'feature/*'; branch 'develop' }}
steps {
container(name: 'gradle') {
sh 'gradle clean task2'
}
}
}
}
post {
always {
script {
currentBuild.result = currentBuild.result ?: 'SUCCESS'
cucumber buildStatus: 'UNSTABLE',
failedFeaturesNumber: 1,
failedScenariosNumber: 1,
skippedStepsNumber: 1,
failedStepsNumber: 1,
reportTitle: 'Report',
fileIncludePattern: '**/cucumber.json',
sortingMethod: 'ALPHABETICAL',
trendsLimit: 100
}
}
}
}
1. You can change sh 'gradle clean task1' to
sh 'gradle clean task1 || true'
This will make sh return success even if the shell script fails.
2. You can also use try/catch.
Check this link: https://www.jenkins.io/doc/book/pipeline/syntax/#flow-control
For example:
stage("task1"){
steps {
script {
try {
sh 'gradle clean task1'
} catch (err) {
echo err.getMessage()
}
}
}
}
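Another option, if you want the failed stage to stay visibly failed in the UI while the pipeline continues, is the catchError step. A hedged sketch reusing the gradle task and container from the question:
stage('task1') {
    when { anyOf { branch 'feature/*'; branch 'develop' }}
    steps {
        // The stage is marked FAILURE, the overall build becomes UNSTABLE,
        // and the pipeline moves on to the next stage
        catchError(buildResult: 'UNSTABLE', stageResult: 'FAILURE') {
            container(name: 'gradle') {
                sh 'gradle clean task1'
            }
        }
    }
}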

Jenkins pipeline updateGitlabCommitStatus not working

GITLAB_VERSION: GitLab Enterprise Edition 13.9.3-ee
JENKINS_VERSION: 2.263.4
I have created a Jenkins pipeline which is triggered by a change in GitLab, but it's not updating the GitLab status.
pipeline {
agent any
stages {
stage('cloning from gitlab'){
steps{
git credentialsId: '7d13ef14-ee65-497b-8fba-7519f5012e81', url: 'git@git.MYDOMAIN.com:root/popoq.git'
}
}
stage('build') {
steps {
echo 'Notify GitLab'
updateGitlabCommitStatus name: 'Jenkins-build', state: 'pending'
echo 'build step goes here'
}
}
stage('echoing') {
steps{
echo "bla blaa bla"
}
}
stage('test') {
steps {
echo 'Notify GitLab'
echo 'test step goes here'
updateGitlabCommitStatus name: 'Jenkins-build', state: 'success'
}
}
}
}
It's not showing any pipeline in GitLab. Any suggestions?
I think you are missing a "gitlabBuilds" command in an "options" block declaring the steps you will have in your build.
options {
gitLabConnection('xxx-gitlab')
gitlabBuilds(builds: ['step1', 'step2', 'step3'])
}
Then you can reference those steps with "updateGitlabCommitStatus", but it is better to use the "gitlabCommitStatus" command, like this:
pipeline {
agent any
options {
gitLabConnection('xxx-gitlab')
gitlabBuilds(builds: ['step1', 'step2', 'step3'])
}
stages {
stage('step1'){
steps{
gitlabCommitStatus(name:'step1') {
git credentialsId: '7d13ef14-e', url: 'xxxxxxx'
}
} // end steps
} // end stage
stage('step2'){
steps{
gitlabCommitStatus(name:'step2') {
.......
}
} // end steps
} // end stage
} // end stages
} // end pipeline
For example, a full pipeline using gitlabCommitStatus in every stage:
pipeline {
agent {
label 'agent_gradle'
}
options {
gitLabConnection('Gitlab Jenkins integration API connection test')
gitlabBuilds(builds: ['step1', 'step2'])
}
stages {
stage('Build') {
steps {
gitlabCommitStatus(name: 'step1') {
container(name: 'gradle') {
echo 'Building the application...'
}
}
}
}
stage('Test') {
steps {
gitlabCommitStatus(name: 'step2') {
container(name: 'gradle') {
echo 'Testing the application...'
}
}
}
}
}
}
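A sketch of one common addition (assuming the plugin's 'failed' state name): if the pipeline fails before a step declared in gitlabBuilds is reached, that step can stay 'pending' in GitLab, so a post block can report a terminal state explicitly:
post {
    failure {
        // Explicitly report failure so the declared builds do not stay 'pending' in GitLab
        updateGitlabCommitStatus name: 'step1', state: 'failed'
        updateGitlabCommitStatus name: 'step2', state: 'failed'
    }
}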

Jenkins trigger another job

I have a pipeline that triggers the same job if it fails, but when the second job is triggered, the first pipeline stays open until the second one succeeds or fails. I would like to know if I can close the first pipeline once the second one has been triggered.
pipeline {
agent any
stages {
stage('test') {
steps {
script {
input message: 'Proceed?', ok: 'Yes', submitter: 'admin'
}
echo "helloworld"
}
post {
aborted{
script{
retry(1) {
input "Retry the job ?"
build(job: 'pipelines/testCS')
}
}
}
success {
script{
sh 'echo "continue"'
}
}
}
}
stage('deploy'){
steps{
sh 'echo "deploy"'
}
}
}
post {
aborted {
echo "pipeline has been aborted"
}
}
}
Simply pass wait: false to the build step:
build(job: 'pipelines/testCS', wait: false)
See the documentation for all parameters.
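Applied to the post block from the question, it would look roughly like this (same 'pipelines/testCS' job name):
post {
    aborted {
        script {
            retry(1) {
                input "Retry the job ?"
                // wait: false triggers the downstream build and returns immediately,
                // so this pipeline can finish instead of waiting for it
                build(job: 'pipelines/testCS', wait: false)
            }
        }
    }
}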

How to modify variable defined in script block in declarative pipeline of jenkins

I have declared a variable TENTATIVE_VERSION in my pipeline, and I need to set/modify it with a value coming from executing a shell script (or from a script block in another stage). How can I do this? My current script is something like this:
pipeline {
agent {
label 'machine1'
}
stages {
stage('Non-Parallel Stage') {
agent{label "machine2"}
steps {
script {
TENTATIVE_VERSION="1.0" // working
// TENTATIVE_VERSION = "sh echo 123" // not working
}
}
}
stage('Parallel Stage') {
parallel {
stage('A') {
agent {label 'machine3'}
steps {
echo "On other machine"
echo "${TENTATIVE_VERSION}"
build job: 'otherJob', parameters: [[$class: 'StringParameterValue', name: 'VERSION', value: "${TENTATIVE_VERSION}"],
[$class: 'StringParameterValue', name: 'RELEASE', value: '1']]
}
}
stage('B') {
agent {label "machine4"}
steps {
script {
STATUS_S = "OK"
}
echo "On a machine"
}
}
stage('C') {
agent {label "machine5"}
steps {
script {
STATUS_R = "OK"
}
echo "On a machine"
}
}
}
}
}
}
Try the following:
pipeline {
agent {
label 'machine1'
}
stages {
stage('Non-Parallel Stage') {
agent{label "machine2"}
steps {
script {
TENTATIVE_VERSION = sh(returnStdout: true, script: "echo 123").trim()
}
}
}
}
}
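The captured value then behaves like a global Groovy variable, so a later stage (even on another agent) can read it, for example when calling the downstream job. A sketch reusing the 'otherJob' call from the question:
stage('A') {
    agent { label 'machine3' }
    steps {
        echo "Tentative version is ${TENTATIVE_VERSION}"
        build job: 'otherJob', parameters: [
            string(name: 'VERSION', value: "${TENTATIVE_VERSION}"),
            string(name: 'RELEASE', value: '1')
        ]
    }
}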
