I have a multibranch job in my Jenkins, where I have a webhook set up from my GitHub to my Jenkins that sends every pull request change and issue comment.
What I'm trying to do is let GitHub send the pull request changes for indexing purposes, but not run the job unless the developer adds the comment 'test' on the GitHub pull request.
This is my Jenkinsfile:
pipeline {
    agent { label 'mac' }
    stages {
        stage ('Check Build Cause') {
            steps {
                script {
                    def cause = currentBuild.buildCauses.shortDescription
                    echo "${cause}"
                }
            }
        }
        stage ('Test') {
            when {
                expression {
                    currentBuild.buildCauses.shortDescription == "[GitHub pull request comment]"
                }
            }
            steps {
                sh 'bundle exec fastlane test'
            }
        }
    }
}
So if the trigger isn't a GitHub pull request comment, I don't want anything to run. I've tried this, but it doesn't work. I've printed the currentBuild.buildCauses.shortDescription variable and it prints [GitHub pull request comment], but the stage still won't run with my when expression.
How can I do this? Thanks
So the problem is actually that currentBuild.buildCauses.shortDescription returns an ArrayList instead of a plain String.
I didn't realize that [GitHub pull request comment] was meant to be an array, so I managed to fix the issue with just an array index:
currentBuild.buildCauses.shortDescription[0]
This returns the correct build trigger, GitHub pull request comment. So for anyone who also stumbles upon this issue, this is how I fixed it:
pipeline {
    agent { label 'mac' }
    stages {
        stage ('Test') {
            when {
                expression {
                    currentBuild.buildCauses.shortDescription[0] == "GitHub pull request comment"
                }
            }
            steps {
                sh 'bundle exec fastlane test'
            }
        }
    }
}
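As a side note (not part of the original fix): if a build can end up with more than one cause, relying on index [0] can be brittle. A small sketch of a more defensive check, which looks at every entry in the buildCauses list instead of only the first one:
pipeline {
    agent { label 'mac' }
    stages {
        stage ('Test') {
            when {
                expression {
                    // buildCauses is a list; match any entry instead of only the first one
                    currentBuild.buildCauses.any { cause ->
                        cause.shortDescription == 'GitHub pull request comment'
                    }
                }
            }
            steps {
                sh 'bundle exec fastlane test'
            }
        }
    }
}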
Related
In my Jenkins pipeline I have several post-success actions in different stages that define environment variables depending on the stage results (in the success case). I am also using parallel and sequential stages, but I do not believe this affects the result. Below is a trimmed excerpt:
stage('stage1') {
    agent {
        label 'agent1'
    }
    when {
        expression {
            return Jenkins.instance.getItem("${'job1'}").isBuildable()
        }
        environment name: 'job0', value: 'success'
    }
    steps {
        catchError(buildResult: 'FAILURE', stageResult: 'FAILURE') {
            build job: 'job1'
        }
    }
    post {
        success {
            script {
                env.job1 = "success"
                echo "job1: ${env.job1} !"
            }
        }
    }
}
Now whenever the build result is set to FAILURE within a stage by catchError, the post actions in the other stages do not evaluate the related stage result, but the build result instead, which I do not want (and believe they should not do), referring to https://www.jenkins.io/doc/book/pipeline/syntax/#post :
The post section defines one or more additional steps that are run upon the completion of a Pipeline’s or stage’s run (depending on the location of the post section within the Pipeline)
But in my case, after the first error no more stage-level post-success actions are executed.
My workaround so far is to modify catchError and add a post-failure action that sets another variable, which is evaluated in the very last stage to set the build result:
stage('stage1') {
    steps {
        catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
            build job: 'job1'
        }
    }
    post {
        success { .. }
        failure {
            script {
                env.testFail = "true"
                echo "testFail: ${testFail} !"
            }
        }
    }
}
..
stage('set buildResult') {
    when {
        environment name: 'testFail', value: 'true'
    }
    steps {
        script {
            bat '''
            echo "testFail variable found. Changing buildresult to FAILURE."
            exit 1
            '''
        }
    }
}
Is there anything I missed in the docs, or can anybody confirm this behavior? I would appreciate being able to remove my workaround and let catchError do all the work for me.
I have already searched through the official docs and the mailing list, including the Jenkins issues, but have not found information about this anywhere.
This is my first time contributing to Stack Overflow, so please forgive me if I made mistakes or did not express myself clearly (leaving reader-only mode).
Jenkins - v2.204.2
Pipeline - v2.6
Pipeline Declarative - v1.8.4
In Jenkins Pipeline, how can I copy the artifacts from a previous build to the current build?
I want to do this even if the previous build failed.
Stuart Rowe recommended on the Pipeline Authoring SIG Gitter channel that I look at the Copy Artifact Plugin, and he also gave me some sample Jenkins Pipeline syntax to use.
Based on his advice, I came up with this fuller Pipeline example, which copies the artifacts from the previous build into the current build, whether the previous build succeeded or failed.
pipeline {
    agent any
    stages {
        stage("Zeroth stage") {
            steps {
                script {
                    if (currentBuild.previousBuild) {
                        try {
                            copyArtifacts(projectName: currentBuild.projectName,
                                          selector: specific("${currentBuild.previousBuild.number}"))
                            def previousFile = readFile(file: "usefulfile.txt")
                            echo("The current build is ${currentBuild.number}")
                            echo("The previous build artifact was: ${previousFile}")
                        } catch (err) {
                            // ignore error
                        }
                    }
                }
            }
        }
        stage("First stage") {
            steps {
                echo("Hello")
                writeFile(file: "usefulfile.txt", text: "This file ${env.BUILD_NUMBER} is useful, need to archive it.")
                archiveArtifacts(artifacts: 'usefulfile.txt')
            }
        }
        stage("Error") {
            steps {
                error("Failed")
            }
        }
    }
}
Suppose you only want a single file from the previous build; you can even use curl to place the file in the workspace before the mvn invocation.
stage('Copy csv') {
    steps {
        sh "mkdir -p ${env.WORKSPACE}/dump"
        sh "curl http://<jenkins-url>:<port>/job/<job-folder>/job/<job-name>/job/<release>/lastSuccessfulBuild/artifact/dump/sample.csv/*view*/ -o ${env.WORKSPACE}/dump/sample.csv"
    }
}
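If the Jenkins instance is not open to anonymous reads, the same idea can be combined with withCredentials. This is only a sketch; the credentials ID 'jenkins-api-token' is a hypothetical username/API-token credential, and the URL placeholders are the same as above:
stage('Copy csv (authenticated)') {
    steps {
        withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                          usernameVariable: 'JENKINS_USER',
                                          passwordVariable: 'JENKINS_TOKEN')]) {
            sh 'mkdir -p "$WORKSPACE/dump"'
            // single-quoted so the shell, not Groovy, expands the secret variables
            sh 'curl -fsSL --user "$JENKINS_USER:$JENKINS_TOKEN" "http://<jenkins-url>:<port>/job/<job-folder>/job/<job-name>/job/<release>/lastSuccessfulBuild/artifact/dump/sample.csv" -o "$WORKSPACE/dump/sample.csv"'
        }
    }
}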
You can use the Copy Artifact Plugin.
For configuration, visit https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
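For reference, a minimal declarative sketch of the plugin's copyArtifacts step might look like the following; the project name 'upstream-job' and the filter are placeholders:
pipeline {
    agent any
    stages {
        stage('Copy artifacts') {
            steps {
                // copy artifacts from the last successful build of a hypothetical 'upstream-job'
                copyArtifacts(projectName: 'upstream-job',
                              selector: lastSuccessful(),
                              filter: '**/*.txt',
                              fingerprintArtifacts: true)
            }
        }
    }
}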
I have a Bitbucket repo, and I want to start my Jenkins pipeline job only after a commit with a tag like "release-1.0.*".
So I set my job up with this pipeline script:
pipeline {
    agent any
    stages {
        stage ('Prepare') {
            when {
                tag "release*"
            }
            steps {
                git branch: 'tag1', url: 'git#bitbucket.org:m*********ny/tests.git'
            }
        }
        stage ('Deploy') {
            steps {
                sshPublisher(publishers: [sshPublisherDesc(configName: "JenkinsSrv", transfers: [sshTransfer(execCommand: 'pwd')])])
            }
        }
    }
    post ('POST BUILD') {
        always {
            echo 'This is post action!!!'
        }
    }
}
Also, I turned on the Bitbucket webhook plugin, so that my repo notifies Jenkins about new changes.
But my solution doesn't work. Help me resolve this case.
According to the official documentation for a Jenkins pipeline, the option you are looking for is the changelog condition inside the when directive. For example:
when { changelog 'release*' }
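As a sketch of how that condition could sit inside a stage (the pattern is a regular expression matched against the commit messages of the build's changeset, so it will likely need tuning for your naming scheme):
pipeline {
    agent any
    stages {
        stage('Prepare') {
            when {
                // only run this stage when a commit message in the changeset matches the pattern
                changelog '.*release-1\\.0\\..*'
            }
            steps {
                echo 'Triggered by a release commit'
            }
        }
    }
}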
Currently I'm able to use a post directive in my Jenkinsfile. Is there a way to trigger a pre-build step similar to this?
post {
    always {
        sh '''rm -rf build/workspace'''
    }
}
I believe this newer question may have the answer: Is there a way to run a pre-checkout step in declarative Jenkins pipelines?
pre is a cool feature idea, but doesn't exist yet. skipDefaultCheckout and checkout scm (which is the same as the default checkout) are the keys:
pipeline {
    agent { label 'docker' }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('clean_workspace_and_checkout_source') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        stage('build') {
            steps {
                echo 'i build therefore i am'
            }
        }
    }
}
I have looked at many pipeline examples and at how to write the post-build section in a pipeline script, but I never got the answer I was looking for.
I have 4 jobs, say Job A, B, C and D. I want Job A to run first, and if it is successful it should trigger Jobs B, C and D in parallel. If Job A fails, it should trigger only Job B. Something like below:
pipeline {
    agent any
    stages {
        stage('Build_1') {
            steps {
                sh '''
                Build Job A
                '''
            }
        }
    }
    post {
        failure {
            sh '''
            Build Job B
            '''
        }
        success {
            sh '''
            Build Job B,C,D in parallel
            '''
        }
    }
}
I tried using the 'parallel' option in the post section, but it gave me errors. Is there a way to build Jobs B, C and D in parallel in the post 'success' section?
Thanks in advance!
The parallel keyword can actually work inside a post condition as long as it is encapsulated inside a script block, as the script block is just a fallback to the scripted pipeline, which allows you to run parallel execution steps wherever you want.
The following should work fine:
pipeline {
    agent any
    stages {
        stage('Build_1') {
            steps {
                // Build Job A
            }
        }
    }
    post {
        failure {
            // run job B
            build job: 'Job-B'
        }
        success {
            script {
                // run jobs B, C, D in parallel
                def jobs = ['Job-B', 'Job-C', 'Job-D']
                parallel jobs.collectEntries { job ->
                    ["Building ${job}" : {
                        build job: job
                    }]
                }
            }
        }
    }
}
This is just an example; specific parameters or configuration (for the build keyword) can be added to each job execution according to your needs.
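For instance, a parameterised, non-propagating call might look like the following; the parameter name and value here are only placeholders:
build job: 'Job-C',
      parameters: [string(name: 'ENVIRONMENT', value: 'staging')],
      propagate: false, // do not mark this build as failed if Job-C fails
      wait: true        // block until Job-C finishes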
The error message is quite clear about this:
Invalid step "parallel" used - not allowed in this context - The parallel step can only be used as the only top-level step in a stages step
The more restrictive declarative syntax does not allow the usage of parallel in the post section at the moment.
If you don't want to switch to the scripted syntax, another option that should work is to build Jobs B, C and D in parallel in a second stage and move the failure condition into the post section of your first stage. As a result, Jobs B, C and D will run if A is successful; if A is not successful, only Job B will run.
pipeline {
    agent any
    stages {
        stage('one') {
            steps {
                // run job A
            }
            post {
                failure {
                    // run job B
                }
            }
        }
        stage('two') {
            steps {
                parallel(
                    // run job B, C, D
                )
            }
        }
    }
}