I need to back-port a Jenkins pipeline to the old Jenkins job format with Job DSL. I'm stuck on the agent section:
agent {
    dockerfile {
        label 'buildDockerNode'
        dir 'devops/k8s/build'
    }
}
How can I do this in the old Jenkins job format? In the Job DSL I only see a label configuration corresponding to this Pipeline syntax.
Any idea is appreciated.
By using pipelineJob, the agent is configured by the pipeline DSL itself, so there is no need to define it again in the Job DSL: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob
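For illustration, a minimal seed-job sketch could look like this (job name and repository URL are placeholders; the agent block stays in the Jenkinsfile itself):
pipelineJob('my-dockerfile-build') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://example.com/your/repo.git') } // placeholder URL
                    branch('*/master')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}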
To ease the transformation to Job DSL, I would recommend the jenkins-pipelayer library: it abstracts Job DSL for you, and you can use properties files to configure your pipeline. The docs are here: https://github.com/SAP/jenkins-pipelayer/blob/master/USAGE.md#template-engine
I found the solution with the buildInDocker wrapper:
https://jenkinsci.github.io/job-dsl-plugin/#path/job-wrappers-buildInDocker
job('example-2') {
    wrappers {
        buildInDocker {
            dockerfile()
            volume('/dev/urandom', '/dev/random')
            verbose()
        }
    }
}
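If the Dockerfile lives in a subdirectory, as with dir 'devops/k8s/build' in the original pipeline, the dockerfile() method appears to accept a context path and Dockerfile name; treat the signature below as an assumption and check the Job DSL API viewer for your plugin version:
job('example-2') {
    wrappers {
        buildInDocker {
            // assumed signature: dockerfile(contextPath, dockerfileName)
            dockerfile('devops/k8s/build', 'Dockerfile')
            verbose()
        }
    }
}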
I am converting a Jenkins freestyle build to a pure pipeline-based job.
The current configuration uses the Build Blocker plugin (https://plugins.jenkins.io/build-blocker-plugin/).
How can I use it in a Declarative pipeline?
pipeline {
    agent { label 'docker-u20' }
    // Don't run while "test-job" is running
    blockon("test-job") // pseudocode for the behaviour I'm after
    stages {
        // ...
    }
}
I did try looking through various Jenkins docs but haven't found anything yet.
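For what it's worth, the Build Blocker plugin exposes a job property rather than a declarative option, so one possible approach is to set that property through the properties step in a script block. The class and field names below are assumptions and should be verified against the installed plugin version; also note that Declarative pipelines manage some job properties themselves:
pipeline {
    agent { label 'docker-u20' }
    stages {
        stage('Build') {
            steps {
                script {
                    // assumed property class and field names from the Build Blocker plugin
                    properties([[$class: 'BuildBlockerProperty',
                                 useBuildBlocker: true,
                                 blockingJobs: 'test-job']])
                }
                // actual build steps go here
            }
        }
    }
}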
I would like to create a Jenkins declarative pipeline structured as follows:
mainPipeline.groovy
stage1.groovy
stage2.groovy
stage3.groovy
mainPipeline looks like the following:
pipeline {
    stages {
        stage('stage1') {
            // Call a method from the file stage1.groovy
        }
        stage('stage2') {
            // Call a method from the file stage2.groovy
        }
    }
}
I have two main questions:
How do I link these files to a Library?
How do I configure the Jenkins pipeline so that Jenkins knows not only the main Jenkinsfile (mainPipeline) but also the submodules?
I would not recommend separating your Jenkinsfile into separate files, since there are better options:
You can execute jobs within your pipeline with the Pipeline: Build Step plugin. Use this for stages that are going to be used by multiple jobs. For example, I use this to deploy my applications in a common deploy job.
You can extend Jenkins with your own libraries, which you can load per job or for all jobs. See: Extending with Shared Libraries. A rough sketch follows at the end of this answer.
For both methods, the defining Jenkinsfiles/Groovy scripts can come from SCM.
If you really want to load scripts from the project path, then check this question. If you want to use multiple Jenkinsfiles from the project path, you can just add more Jenkinsfiles as "Project Recognizers" when you configure the job.
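As a rough illustration of the shared library option (library name, file names, and job names here are made up), a global step lives in the library's vars/ directory and is called from the Jenkinsfile; a build step call is shown for the first option as well:
// vars/deployApp.groovy in the shared library repository (hypothetical name)
def call(String environment) {
    echo "Deploying to ${environment}"
    // deployment logic shared by multiple jobs goes here
}
// Jenkinsfile of a job using the library
@Library('my-shared-library') _ // library name as configured in Jenkins (assumption)
pipeline {
    agent any
    stages {
        stage('stage1') {
            steps {
                deployApp('staging') // step provided by vars/deployApp.groovy
            }
        }
        stage('stage2') {
            steps {
                build job: 'common-deploy-job' // Pipeline: Build Step plugin; job name is a placeholder
            }
        }
    }
}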
I'm fairly new to Jenkins and a total newbie to Bamboo. I have a Jenkins Pipeline and I'm trying to create an equivalent in Bamboo (I believe it's called a Plan).
I've got some groovy code that I want to run in my Bamboo plan.
I'll simplify the code below for brevity and clarity.
Assume this file is called me_myEvent.groovy and is stored at https://github.com/myuser/repo1
def processEvent( Map args ) {
    String strArg1 = args.myArg1;
    String strArg2 = args.myArg2;
    // etc...
}
My Jenkins pipeline has a global pipeline library (myGitLibraryFromGlobal) linking to https://github.com/myuser/repo1 and my pipeline is:
@Library('myGitLibraryFromGlobal@master') _
pipeline {
    agent any
    stages {
        stage('First Stage') {
            steps {
                script {
                    def myObj = new com.mysite.me_myEvent();
                    def returnVal = myObj.processEvent(myArg1: 'foo', myArg2: 'bar');
                }
            }
        }
    }
}
I've got the GitHub repo saved in Bamboo as a global linked repository called abc123.
Can I achieve the same thing in Bamboo using the script task? What would this look like in Bamboo?
The short answer is no: Atlassian Bamboo doesn't support Groovy DSL or scripted Groovy pipelines. Also, keep in mind that when you run a Jenkins Groovy pipeline, Jenkins adds its own environment to the script execution; it is not just running a "bare" Groovy script (i.e. one without exposed Jenkins commands and variables).
If you need to run a "bare" Groovy script following the idea of Configuration as Code, one solution is to create a Bamboo Java/YAML spec and add a ScriptTask to the job.
// this is called when the plan and stages are created
new Job("JobName", "JobKey").tasks(
        new VcsCheckoutTask(), // to download the groovy script
        new ScriptTask().inlineBody("groovy me_myEvent.groovy").interpreterBinSh())
Note: your Bamboo build agent needs Groovy pre-installed.
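For orientation, a slightly fuller Java Specs sketch might look roughly like this; the plan, project, and job keys and names are placeholders, and the builder classes should be verified against the Bamboo Specs version in use:
import com.atlassian.bamboo.specs.api.builders.plan.Job;
import com.atlassian.bamboo.specs.api.builders.plan.Plan;
import com.atlassian.bamboo.specs.api.builders.plan.Stage;
import com.atlassian.bamboo.specs.api.builders.project.Project;
import com.atlassian.bamboo.specs.builders.task.CheckoutItem;
import com.atlassian.bamboo.specs.builders.task.ScriptTask;
import com.atlassian.bamboo.specs.builders.task.VcsCheckoutTask;

// Placeholder keys and names; the ScriptTask shells out to the pre-installed groovy binary.
Plan plan = new Plan(new Project().key("PRJ").name("My Project"), "My Plan", "PLAN")
        .stages(new Stage("Default Stage")
                .jobs(new Job("JobName", "JOB")
                        .tasks(new VcsCheckoutTask()
                                        .checkoutItems(new CheckoutItem().defaultRepository()),
                                new ScriptTask()
                                        .inlineBody("groovy me_myEvent.groovy")
                                        .interpreterBinSh())));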
Another solution is to use the Bamboo plugin Groovy Tasks for Bamboo.
According to the documentation at https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.helpers.wrapper.MavenWrapperContext.buildName, the following code should update the build name shown in the Build History of Jenkins jobs:
// define the build name based on the build number and an environment variable
job('example') {
    wrappers {
        buildName('#${BUILD_NUMBER} on ${ENV,var="BRANCH"}')
    }
}
Unfortunately, it does not.
Is there any way to change build name from Jenkins Job DSL script?
I know I can change it from a Jenkins Pipeline script, but that is not what I need for this particular job. All I use in the job is steps:
steps {
    shell("docker cp ...")
    shell("git clone ...")
    ...
}
I would like to emphasise that I am looking for a native Jenkins Job DSL solution, not a Jenkins Pipeline script one or any other hacky workaround like manipulating environment variables.
I have managed to solve my issue today.
The script did not work because it requires the build-name-setter plugin to be installed in Jenkins. After I installed it, the script works perfectly.
Unfortunately, by default the Job DSL processor does not report missing plugins. The parameter that enables this is described here: https://issues.jenkins-ci.org/browse/JENKINS-37417
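For reference, when the seed job runs Job DSL through the pipeline jobDsl step, there is a failOnMissingPlugin option that makes exactly this kind of problem fail loudly; the target path below is a placeholder, and the flag should be verified against the installed Job DSL plugin version:
// seed pipeline step: fail instead of silently ignoring DSL that needs an absent plugin
jobDsl targets: 'jobs/**/*.groovy',
       failOnMissingPlugin: true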
Here's a minimal pipeline changing the build's display name and description. IMHO this is pretty straightforward.
pipeline {
    agent any
    environment {
        VERSION = "1.2.3-SNAPSHOT"
    }
    stages {
        stage("set build name") {
            steps {
                script {
                    currentBuild.displayName = "v${env.VERSION}"
                    currentBuild.description = "#${BUILD_NUMBER} (v${env.VERSION})"
                }
            }
        }
    }
}
It results in the corresponding build name and description being shown in Jenkins' UI.
setBuildName("your_build_name") in a groovyPostBuild step may do the trick as well.
It needs the Groovy Postbuild plugin.
I have a Jenkins Ivy job that uses the "Inject environment variables to the build process" step. I am writing a DSL script so that I can dynamically create this job with the job-dsl-plugin.
I set up the following lines for this:
steps {
    envInjectBuilder {
        propertiesFilePath('/tmp/file')
    }
}
but the steps method can only be applied to a free-style job and not to an Ivy job. I get this in the console output:
Processing DSL script ivyJob.groovy
java.lang.IllegalStateException: steps cannot be applied for Ivy jobs
Doesn't the DSL plug-in support EnvInject for an Ivy job? If it doesn't, is there a way I can do this programmatically? I know EnvInject is compatible with Ivy jobs since I can create that very job manually.
Thanks.
The EnvInject plugin allows variables to be injected at several points in a build's lifecycle. A build step is only one possibility. For an Ivy project type, the job property and wrapper options will work:
ivyJob('example') {
    environmentVariables {
        env('ONE', '1')
        propertiesFile('env.properties')
        keepBuildVariables(true)
    }
    wrappers {
        environmentVariables {
            env('ONE', '1')
            envs(FOO: 'bar', TEST: '123')
            propertiesFile('env.properties')
        }
    }
}
See the Job DSL API Viewer for details:
https://jenkinsci.github.io/job-dsl-plugin/#path/ivyJob-environmentVariables
https://jenkinsci.github.io/job-dsl-plugin/#path/ivyJob-wrappers-environmentVariables