Jenkins DSL job configuring slave - jenkins

I have a Job DSL file which creates a couple of jobs, such as pipeline and freestyle jobs. I want to know the syntax (for the DSL file only, not the Jenkinsfile) that lets me run these jobs on a specific agent (slave). A code sample is given below. I tried label('JenkinsEC2Slave'), but that actually runs my DSL seed job on the slave, not the jobs created by the DSL. The labels come from the EC2 plugin, and the agents should be launched on demand.
pipelineJob('Build_Docker_Images') {
    label('JenkinsEC2Slave')
    configure {
        it / definition / lightweight(true)
    }
    triggers {
        scm('@midnight')
    }
    concurrentBuild(false)
    parameters {
        stringParam('ECR_REPO', 'xxxxxxxxxxx.dkr.ecr.eu-west-2.amazonaws.com')
    }
    definition {
        cpsScm {
            scm {
                scriptPath('ci-cd/pipelines/base_docker_images/Jenkinsfile')
                git {
                    branches('*/master')
                    remote {
                        url('git@github.com:xxxxxxxxxx.git')
                        credentials('jenkins-key')
                    }
                }
            }
        }
    }
}

You can use labels to select build agents in Jenkins. label is also a property in Job DSL, which allows you to specify labels for a job. Quoting the Job DSL API viewer:
job('example') {
    label('x86 && ubuntu')
}
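Note that label() above is documented for freestyle jobs; for a pipelineJob the node the pipeline itself runs on is normally picked inside the Jenkinsfile rather than in the DSL. A minimal declarative sketch, reusing the EC2 label from the question:
// Jenkinsfile (declarative) -- sketch only
pipeline {
    agent { label 'JenkinsEC2Slave' }   // all stages run on an on-demand EC2 agent with this label
    stages {
        stage('Build') {
            steps {
                echo 'running on the EC2 agent'
            }
        }
    }
}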

Related

Configure job-dsl using scriptler

I'm going to write a freestyle job using Job DSL. The job should build a Scriptler script. I have experience writing a similar job for a pipeline, but I don't know what it should look like for a job that builds a Scriptler script.
pipelineJob('test') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://gitlab.com/repo.git')
                        credentials('cred')
                    }
                    branch('*/master')
                }
                scriptPath('src/test.Jenkinsfile')
            }
            lightweight()
        }
    }
}
A freestyle job just needs to run the script from the Scriptler catalog. That's all, nothing more.
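A rough sketch of such a freestyle job: since I am not aware of a dedicated Scriptler step in Job DSL, the builder is injected through a generic configure block, and the builder class and field names below are assumptions based on the Scriptler plugin, so they may need adjusting:
freeStyleJob('run-scriptler-script') {
    // Assumption: the Scriptler plugin's builder class and its fields.
    configure { project ->
        project / 'builders' << 'org.jenkinsci.plugins.scriptler.builder.ScriptlerBuilder' {
            scriptId('my-script.groovy')   // hypothetical script id from the Scriptler catalog
            propagateParams(false)
        }
    }
}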

Jenkins Pipeline: how can I disable the current job from inside a pipeline?

Is there a way to disable a Jenkins job from inside a pipeline?
I use the Disable Failed Job plugin (https://plugins.jenkins.io/disable-failed-job) very heavily, so that when a Jenkins job fails, it is automatically disabled.
This is handy for the workflow that I have.
This plugin unfortunately does not work with Jenkins Pipeline.
Stuart Rowe helped me come up with the following:
pipeline {
    agent any
    stages {
        stage('disable build') {
            steps {
                script {
                    // See:
                    // https://ci.jenkins.io/pipeline-syntax/globals#currentBuild
                    // https://javadoc.jenkins.io/plugin/workflow-support/org/jenkinsci/plugins/workflow/support/steps/build/RunWrapper.html
                    // https://javadoc.jenkins.io/jenkins/model/ParameterizedJobMixIn.ParameterizedJob.html#setDisabled-boolean-
                    currentBuild.rawBuild.getParent().setDisabled(true)
                }
            }
        }
    }
}
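Since the original use case is to disable the job only when it fails (mirroring the Disable Failed Job plugin), the same call can be moved into a post/failure block; note that in a sandboxed pipeline, access to currentBuild.rawBuild still needs script approval. A sketch:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                echo 'build steps go here'
            }
        }
    }
    post {
        failure {
            script {
                // Disable this job after a failed build; requires script approval
                // for rawBuild when the Groovy sandbox is enabled.
                currentBuild.rawBuild.getParent().setDisabled(true)
            }
        }
    }
}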

Pass configuration into a Jenkins Pipeline

I'm trying to find a way to pass a configuration for a Multibranch Pipeline job into the Jenkinsfile when it executes.
My goal is to configure something like the following:
Branch : Server
"master" : "prodServer"
"develop" : "devServer"
"release/*", "hotfix/*" : "stagingServer"
"feature/Thing-I-Want-To-Change-Regularly" : "testingServer"
where I can then write a Jenkinsfile like this:
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                //branch is in config branches
            }
            steps {
                //deploy to server
            }
        }
    }
}
I'm having trouble finding a way to achieve this. EnvInject Plugin seems to be the solution for non-Pipeline projects, but it's currently got security issues and only partial Pipeline support.
If you want to deploy to different servers depending on the branch, in Multibranch Pipelines you can use:
when { branch 'master' } (declarative)
or
${env.BRANCH_NAME} (scripted)
to access which branch you are on, and then add logic to deploy to the corresponding server based on this.
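A minimal sketch of that logic, using the branch-to-server mapping from the question; pickServer() is just an illustrative helper, and the echo stands in for the real deploy step:
// Jenkinsfile sketch: map the current branch to a target server, then deploy to it.
def pickServer(String branch) {
    if (branch == 'master') {
        return 'prodServer'
    }
    if (branch == 'develop') {
        return 'devServer'
    }
    if (branch.startsWith('release/') || branch.startsWith('hotfix/')) {
        return 'stagingServer'
    }
    return 'testingServer'   // e.g. feature/Thing-I-Want-To-Change-Regularly
}

pipeline {
    agent any
    stages {
        stage('Example Deploy') {
            steps {
                echo "Deploying to ${pickServer(env.BRANCH_NAME)}"
            }
        }
    }
}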
Going to post my current best approach, a global config value, and hope something better comes along.
In Manage Jenkins -> Configure System -> Global Properties you can define global environment variables which can be accessed from Jenkins jobs. A MY_BRANCH variable defined there could be accessed from a pipeline:
when { branch MY_BRANCH }
Or it could even hold a regex and be used like this:
when { expression { BRANCH_NAME ==~ MY_BRANCH } }
However, this has the disadvantage that the Environment Variables are shared between every Jenkins job, not just across all branches of a single job. So careful naming will be necessary.

Jenkins Job DSL: Create Maven Job which calls another Job DSL file

I am trying to create a Jenkins Job DSL script which creates a Maven project. The created Maven project should execute another Job DSL script which is in my Bitbucket Git repo. I am having a hard time finding the right Job DSL tags/methods I need to achieve this. Can anyone help me out with this?
mavenJob('CD/cd-parent-master') {
    scm {
        git {
            remote {
                name('origin')
                url('-------')
                credentials('Bitbucket_Access')
            }
        }
    }
    description('MavenJob which calls another Job DSL')
    /* Tag which specifies that the created MavenJob should process another Job
       DSL and the location of that Job DSL on my Jenkins FileSystem. */
    wrappers {
        mavenRelease { }
    }
    jdk('JDK 8')
    mavenInstallation('connecteddrive Maven 3')
    goals('clean deploy')
}
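I am not aware of a Maven-project-specific tag for this, but the usual way to have one job process another Job DSL script is the "Process Job DSLs" build step (dsl in Job DSL). A sketch using a freestyle seed job, since a Maven project only offers pre/post build steps; the repo URL and script path are placeholders:
job('CD/cd-parent-seed') {
    scm {
        git {
            remote {
                name('origin')
                url('https://bitbucket.example/your/repo.git')   // placeholder
                credentials('Bitbucket_Access')
            }
        }
    }
    steps {
        dsl {
            external('jobs/other.dsl.groovy')   // hypothetical path of the nested Job DSL in the repo
            removeAction('IGNORE')
        }
    }
}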

Jenkins Job DSL: How to read Pipeline DSL script from file?

I want to generate my pipeline plugin based jobs via Job DSL, which is contained in a git repository that is checked out by Jenkins.
However, I think it is not very nice to have the pipeline scripts as quoted Strings inside of the Job DSL script. So I want to read them into a string and pass that to the script() function:
definition {
    cps {
        sandbox()
        script(new File('Pipeline.groovy').text)
    }
}
Where do I have to put Pipeline.groovy for this to work? I tried putting it right next to my DSL script, and also in the resources/ folder of my DSL sources. But Jenkins always throws a "file not found".
Have you tried readFileFromWorkspace()? It should be able to find the files you check out from Git.
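For example, something like this should work when Pipeline.groovy sits next to the DSL script in the seed job's workspace (the job name is just a placeholder):
pipelineJob('generated-pipeline') {
    definition {
        cps {
            sandbox()
            // readFileFromWorkspace resolves the path relative to the seed job's workspace
            script(readFileFromWorkspace('Pipeline.groovy'))
        }
    }
}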
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The Job DSL below creates a pipeline job, which pulls the actual job from a Jenkinsfile:
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { }  // required as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it
                // only covers the 'master' branch and may not give you
                // enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
        }
    }
}
You/your Jenkins admins may want to separate Jenkins job creation from the actual job definition. That seems sensible to me ... it's not a bad idea to centralize the scripts to create the Jenkins jobs in a single Jenkins DSL repo.
