Jenkins job triggered on changeset or first build

The changeset declaration causes a Jenkins pipeline stage to execute when files matching the changeset specification are changed between runs of the pipeline.
This is all well and good, but on the first run of the pipeline the stage will be skipped, since no changes are detected.
How do you write a when condition that is triggered on a change of files or the first run of the pipeline?

You can do something like the following:
stage('Example Deploy') {
    when {
        expression {
            return (currentBuild.changeSets.size() > 0 || currentBuild.number == 1)
        }
    }
    steps {
        echo 'RUN==================='
    }
}
Update
The same condition with the changeset directive:
when {
    anyOf {
        changeset '**/*.c'
        expression { currentBuild.number == 1 }
    }
}
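A slightly more robust variant of the first-run check (a sketch, not from the original answer): currentBuild.number == 1 only matches build #1, whereas checking previousBuild also covers jobs whose build history has been reset.

```groovy
when {
    anyOf {
        changeset '**/*.c'
        // First run: no previous build exists yet
        expression { currentBuild.previousBuild == null }
    }
}
```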

Related

Jenkins - Fail a job if condition is not met

In the Jenkins job which tests some values, I have something like this:
stage('Check value') {
    if value == 5:
        // return failure of a job
}
stage('Send results') {
    ....
}
In one stage those values are being checked; if value == 5, the job should fail. I've tried exit 1 and return -1, but neither works.
If you gracefully return from a stage, the next stage will always run. So one way to get around this is to run all other stages conditionally. Another solution is simply to generate an error. Refer to the following pipeline.
pipeline {
    agent any
    stages {
        stage('Build_1') {
            steps {
                echo "Running!!!"
                script {
                    def value = 5
                    if (value == 5) {
                        currentBuild.result = 'ABORTED'
                        error("Abort the build.")
                        // Alternatively, throw an exception:
                        // throw new Exception("ERRRRORRRRRR")
                    }
                }
            }
        }
        stage('Build_2') {
            steps {
                echo "Running!!! 22222"
            }
        }
    }
}
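The first approach mentioned above, running the remaining stages conditionally, could be sketched like this (the FAILED_CHECK variable name is made up for illustration):

```groovy
pipeline {
    agent any
    stages {
        stage('Check value') {
            steps {
                script {
                    def value = 5
                    // Record the outcome instead of failing the build here
                    env.FAILED_CHECK = (value == 5) ? 'true' : 'false'
                }
            }
        }
        stage('Send results') {
            // Skip this stage when the check failed
            when {
                expression { env.FAILED_CHECK != 'true' }
            }
            steps {
                echo 'Sending results'
            }
        }
    }
}
```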

Build stages in Jenkins only when specific files are changed but use a function

I want to update my Jenkins pipeline in such a way that certain stages are only built when specific files are changed (git is already integrated in the pipeline). I found a promising solution on this site, which would go like this for my use case (this code runs successfully):
stage("TEST STAGE 1") {
    when {
        anyOf { changeset "*dir1/*"; changeset "*dir2/*"; changeset "*somefile" }
    }
    steps {
        // Do stuff
    }
}
But I have more stages (TEST STAGE 2 and TEST STAGE 3) which should also be triggered only when these files are changed. To avoid writing the same code over and over (which would be bad practice), I implemented a function (I got the code from here):
def runStage(changeset) {
    return {
        changeset ==~ ("*dir1/*"||"*dir2/*"||"*somefile")
    }
}
I call this function in the TEST stages (also TEST 2 and 3):
stage("TEST STAGE 1") {
    when {
        expression { runStage(changeset) }
    }
    steps {
        // Do stuff
    }
}
But now my pipeline fails when entering the first TEST stage. I get this error:
hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: changeset for class: WorkflowScript
Do you have an idea what I am doing wrong?
I found a solution.
This is my function:
def runStage() {
    CHANGE_SET = sh(
        script: 'git log -2 --name-only --oneline --pretty="format:"',
        returnStdout: true
    ).trim()
    echo "Current changeset: ${CHANGE_SET}"
    return (CHANGE_SET ==~ "(.*)dir1(.*)|(.*)dir2(.*)|(.*)somefile")
}
I call it in my pipeline stage like this:
stage("TEST STAGE 1") {
    when {
        expression { runStage() }
    }
    steps {
        // stuff
    }
}
I would have preferred using changeset in the when block instead of git log, but it looks like that can't be done in my case.
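For reference, a similar check can be written without shelling out to git, using the currentBuild.changeSets API (a sketch following the same pattern as above; not from the original answer):

```groovy
// Collect the paths touched by this build's changesets and
// match them against the same regex used above.
def filesChanged(String pattern) {
    def paths = currentBuild.changeSets.collectMany { cs ->
        cs.items.collectMany { item -> item.affectedPaths as List }
    }
    return paths.any { it ==~ pattern }
}

// Usage: when { expression { filesChanged("(.*)dir1(.*)|(.*)dir2(.*)|(.*)somefile") } }
```

The same caveat as in the first question applies: changeSets is empty on the first build.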

Jenkins: triggeredBy 'UpstreamCause' always evaluates to false in When condition

In Jenkins I have an Upstream Project A that triggers a Downstream Project B.
Project A:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo "Some message"
                build(job: 'B', wait: false)
            }
        }
    }
}
Project B:
pipeline {
    agent none
    tools {
        maven "maven"
    }
    stages {
        stage('Triggered By Upstream') {
            when {
                triggeredBy "UpstreamCause"
            }
            steps {
                echo "Triggered by Upstream project"
            }
        }
    }
}
Here project A successfully triggers project B, but the stage in project B is skipped because the when condition evaluates to false. It seems like a bug to me. Does anyone know what's wrong with this code?
Apparently there is no cause named UpstreamCause here; the build step produces a BuildUpstreamCause, which evaluates to true when the pipeline is triggered from an upstream project. So the when clause in project B should be written as:
when {
    triggeredBy "BuildUpstreamCause"
}
in order for the stage to be run when triggered from an upstream project.
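If in doubt about which cause class a build actually has, you can print the causes from a step (a debugging sketch, not part of the original answer; getBuildCauses() is part of the currentBuild RunWrapper API):

```groovy
steps {
    script {
        // Each entry includes the cause class name to match in triggeredBy
        echo "Build causes: ${currentBuild.getBuildCauses()}"
    }
}
```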

Jenkins multibranch pipeline postbuild only on specific branch (declarative)

How can you create a pipeline that runs its stages on every branch, but the post-build action only on specific branches?
I have seen the when { branch 'production' } option, but it seems to me that this only works for stage blocks and not for post blocks.
Is there a way to do something like
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                bat script: "echo build"
            }
            post {
                always {
                    when {
                        branch 'production'
                    }
                    bat script: "echo publish"
                }
            }
        }
    }
}
The if (env.BRANCH_NAME == 'production') approach seems to be for scripted pipelines only.
The question "Conditional post section in Jenkins pipeline" has an answer: use script and an if inside the post condition.
post {
    always {
        script {
            if (env.BRANCH_NAME == 'production') {
                // I have not verified whether your step works here;
                // conditional emailext works as expected.
                bat script: "echo publish"
            }
        }
    }
}

Job DSL to create "Pipeline" type job

I have installed the Pipeline Plugin, which used to be called the Workflow Plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
    definition {
        cps {
            script('logic-here')
            sandbox()
        }
    }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo 'logic'
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
or load it from a file located in the workspace:
pipelineJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
            sandbox()
        }
    }
}
Example:
Seed-job file structure:
jobs
\- productJob.groovy
logic
\- productPipeline.jenkinsfile
then productJob.groovy content:
pipelineJob('product-job') {
    definition {
        cps {
            script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
            sandbox()
        }
    }
}
I believe this question is asking how to use the Job DSL to create a pipeline job that references the Jenkinsfile for the project, rather than combining the job creation with the detailed step definitions as the answers to date have done. This makes sense: Jenkins job creation and metadata configuration (description, triggers, etc.) could belong to Jenkins admins, while the dev team controls what the job actually does.
@meallhour, is the below what you're after? (Works as of Job DSL 1.64.)
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { }  // required; otherwise it may try to tag the repo, which you may not want
                }
                // The single line below also works, but it only covers the
                // 'master' branch and may not give you enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' })
            }
        }
    }
}
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The example above worked for me. Here's another example based on what worked for me:
pipelineJob('Your App Pipeline') {
    def repo = 'https://github.com/user/yourApp.git'
    def sshRepo = 'git@git.company.com:user/yourApp.git'
    description("Your App Pipeline")
    keepDependencies(false)
    properties {
        githubProjectUrl(repo)
        rebuild {
            autoRebuild(false)
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(sshRepo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { }  // required; otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
}
If you build the pipeline first through the UI, you can use the config.xml file and the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
In Job DSL, pipeline is still called workflow, see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
First you need to install the Job DSL plugin, then create a freestyle project in Jenkins and select Process Job DSLs from the dropdown in the Build section.
Select Use the provided DSL script and provide the following script:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage name 1') {
                            steps {
                                // your logic here
                            }
                        }
                        stage('Stage name 2') {
                            steps {
                                // your logic here
                            }
                        }
                    }
                }
            ''')
        }
    }
}
Or you can create your job by pointing to a Jenkinsfile located in a remote git repository:
pipelineJob("job-name") {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("<REPO_URL>")
                        credentials("<CREDENTIAL_ID>")
                    }
                    branch('<BRANCH>')
                }
            }
            scriptPath("<JENKINS_FILE_PATH>")
        }
    }
}
If you are using a git repo, add a file called Jenkinsfile at the root directory of your repo; it should contain your pipeline definition.
