How to pass parameters to a pipelineJob in DSL - Jenkins

I have several pipeline jobs that are identical except for their parameters. The goal is to create these jobs from a DSL script, passing in the parameters, without any code duplication.
I followed this article. If you implement the steps described there and then run the DSL script below, the script runs fine.
TL;DR
The article adds a shared library and has the Jenkinsfile use that shared library.
My approach is very similar. The difference is that I want to create my build jobs via DSL and override the default parameters of the Jenkinsfile from settings in the DSL.
The question is: how can I pass/override parameters of the Jenkinsfile?
// BTW I'll run the code below in a loop. Open to any suggestions
pipelineJob('AwesomeBild') {
    description("A pipeline created by dsl")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://github.com/jalogut/jenkinsfile-shared-library-sample.git') }
                    branches('master')
                    // how can I pass params to the file
                    scriptPath('Jenkinsfile')
                    extensions { }
                }
            }
        }
    }
}
Edit
Parameters worked well. Here is the latest version of the DSL file.
pipelineJob('AwesomeBild') {
    description("A pipeline created by dsl")
    parameters {
        stringParam( "key", "value" )
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://github.com/jalogut/jenkinsfile-shared-library-sample.git') }
                    branches('master')
                    // how can I pass params to the file
                    scriptPath('Jenkinsfile')
                    extensions { }
                }
            }
        }
    }
}

The solution is simple:
just use $key, and keep the single quotes:
scriptPath('Jenkinsfile$key')
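Since the question mentions generating these jobs in a loop, here is a rough sketch of how several such jobs could be created from one map of parameters; the job names and key values below are made up for illustration.

// Hypothetical seed script: one pipelineJob per map entry, differing only in its parameter
[
    AwesomeBuildDev : 'dev',
    AwesomeBuildProd: 'prod'
].each { jobName, keyValue ->
    pipelineJob(jobName) {
        description("A pipeline created by dsl")
        parameters {
            stringParam('key', keyValue)
        }
        definition {
            cpsScm {
                scm {
                    git {
                        remote { url('https://github.com/jalogut/jenkinsfile-shared-library-sample.git') }
                        branches('master')
                        scriptPath('Jenkinsfile')
                        extensions { }
                    }
                }
            }
        }
    }
}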

Related

Jenkins pipelineJob DSL not interpreting variables in pipeline script

I'm trying to generate Jenkins pipelines using the pipelineJob function in the Job DSL plugin, but cannot pass parameters from the DSL to the pipeline script. I have several projects that use what is essentially the same Jenkinsfile, with differences only in a few steps. I'm trying to use the Job DSL plugin to generate these pipelines on the fly, with the values I want changed interpolated to match the parameters passed to the DSL.
I've tried just about every combination of string interpolation that I can in the pipeline script, as well as in the DSL, but cannot get Jenkins/Groovy to interpolate variables in the pipeline script.
I'm calling the job DSL in a pipeline step:
def projectName = "myProject"
def envs = ['DEV', 'QA', 'UAT']
def repositoryURL = 'myrepo.com'

jobDsl targets: ['jobs/*.groovy'].join('\n'),
       additionalParameters: [
           project: projectName,
           environments: envs,
           repository: repositoryURL
       ],
       removedJobAction: 'DELETE',
       removedViewAction: 'DELETE'
The DSL is as follows:
pipelineJob("${project} pipeline") {
displayName('Pipeline')
definition {
cps {
script(readFileFromWorkspace(pipeline.groovy))
}
}
}
pipeline.groovy:
pipeline {
    agent any
    environment {
        REPO = repository
    }
    parameters {
        choice name: "ENVIRONMENT", choices: environments
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}
The variables that I pass in additionalParameters are interpolated in the Job DSL script; a pipeline with the correct name does get generated. The problem is that the variables are not passed to the pipeline script read from the workspace: the Jenkins configuration for the generated pipeline looks exactly the same as the file, without any interpolation of the variables.
I've made a number of attempts at getting the string to interpolate, including a lot of variations of "${environments}", ${environments}, $environments, \$environments... I can't find any that work. I've also tried reading the file as a GString:
script("${readFileFromWorkspace('pipeline.groovy')}")
Does anyone have any ideas as to how I can make variables propagate down to the pipeline script? I know that I could just use a loop to do string.replaceAll() on the script text, but that seems cumbersome; there's got to be a better way.
I've come up with a way to make this work. It's not what I'd prefer, which is having the string contents of the file implicitly interpreted during job creation, but it does work; it just adds an extra step.
import groovy.text.SimpleTemplateEngine

def fileContents = readFileFromWorkspace "pipeline.groovy"
def engine = new SimpleTemplateEngine()
template = engine.createTemplate(fileContents).make(binding.getVariables()).toString()

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps {
            script(template)
        }
    }
}
This reads a file from your workspace, then uses it as a template with the binding variables. The other change needed to make this work is escaping any variables used in your Jenkinsfile script, like \${VARIABLE}, so that they are expanded at run time rather than at the time you create the job. Any variables you want expanded at job creation should be referenced as ${VARIABLE}.
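For illustration, a templated pipeline.groovy might look like the sketch below, assuming the same repository and environments bindings from the jobDsl call above; note which dollar signs are escaped.

// pipeline.groovy used as a SimpleTemplateEngine template (a sketch)
pipeline {
    agent any
    environment {
        REPO = "${repository}"  // not escaped: substituted when the seed job creates the pipeline
    }
    parameters {
        // the choices expression below renders as a Groovy list literal, e.g. ['DEV', 'QA', 'UAT']
        choice name: "ENVIRONMENT", choices: ${environments.inspect()}
    }
    stages {
        stage('Deploy') {
            steps {
                // escaped with a backslash: kept literal in the generated job, expanded at run time
                echo "Deploying \${env.REPO} to \${params.ENVIRONMENT}..."
            }
        }
    }
}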
You could achieve what you're trying to do by defining environment variables in the pipelineJob and then using those variables in your pipeline.
They are a bit limited, because environment variables are strings, but it should work for basic cases.
Example:
//job-dsl
pipelineJob('example') {
    environmentVariables {
        // these vars could be specified by parameters of this job
        env('repository', 'blah')
        env('environments', 'a,b,c') // comma-separated string
    }
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}
And then in the pipeline:
//pipeline.groovy
pipeline {
    agent any
    environment {
        REPO = env.repository
    }
    parameters {
        // note the need to split the comma-separated string
        choice name: "ENVIRONMENT", choices: env.environments.split(',')
    }
}
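The pipeline above is abbreviated; Declarative needs a stages section, so a fuller sketch that combines this snippet with the Deploy stage from the question would be:

//pipeline.groovy — fuller sketch combining the answer's snippet with the question's Deploy stage
pipeline {
    agent any
    environment {
        REPO = env.repository
    }
    parameters {
        choice name: "ENVIRONMENT", choices: env.environments.split(',')
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}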
You need to pass the complete job name as a variable, without quotes. E.g., if JOBNAME is a parameter containing the entire job name:
pipelineJob(JOBNAME) {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

How to define a pipeline stage once and use it multiple times

So I have a use case with a Jenkinsfile that I know is not common, and I haven't found a solution for it yet.
Background
We currently have a multi-branch pipeline job configured to build multiple branches. It is used to run system testing of the products across multiple releases. The Jenkins job:
Clone all required repositories
Deploy the environment
Execute the automated test cases
Undeploy the environment
To avoid having to define the same Jenkinsfile on each branch, we created a shared library. The shared library defines the Declarative Pipeline stages for the Jenkinsfile. The shared library has the following:
/* File name: vars/myStep.groovy */
def call(Map pipelineParams) {
    callASharedLibraryFunction()
    properties([
        parameters(sharedLibraryGetParameters(pipelineParams))
    ])
    pipeline {
        // snip
        stages {
            stage("clone repos") { }
            stage("Deploy environment") { }
            stage("Executed Tests") { }
            stage("Undeploy environment") { }
        }
        // post directives
    }
}
And the Jenkinsfile simply defines a map and then calls myStep.
e.g.:
/* Sample Jenkinsfile */
pipelineParams = [
    FOO: "foo"
]
myStep pipelineParams
The problem
We now have a need for another Jenkins job, where some of the stages will be the same. For example, the new jobs will need to
Clone all required repositories
Deploy the environment
Do something else
Changing the behaviour of a common stage (e.g. cloning the repos) should take effect across all the jobs that define this stage. I know we could use the when directive in the stage, but from a usability perspective I want the jobs to be separate, as they exercise different things, and the users of one job don't care about the additional stages the other job runs.
I want to avoid code duplication; better yet, I don't want to duplicate the stage code at all (including steps, when, post, etc.).
Is there a way a shared library can define the stage "implementation", with all its directives (steps, when, post, etc.), once, but have it called multiple times?
e.g.:
/* File: vars/cloneReposStageFunction.groovy */
def call() {
    stage("Clone Repos") { }
}

/* File: vars/myStep.groovy */
def call(Map pipelineParams) {
    pipeline {
        // snip
        stages {
            cloneReposStageFunction()
            stage("Deploy environment") { }
            stage("Executed Tests") { }
            stage("Undeploy environment") { }
        }
        // post directives
    }
}

/* File: vars/myNewStep.groovy */
def call(Map pipelineParams) {
    pipeline {
        // snip
        stages {
            cloneReposStageFunction()
            stage("Deploy environment") { }
            stage("Do something else") { }
        }
        // post directives
    }
}
This is an open Jenkins feature request.
I've seen different ways to template a pipeline, but they are all far from what you'd like to achieve.
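One workaround, not mentioned in the answer above, is to drop down to a scripted pipeline inside the shared library: in scripted syntax, stage is an ordinary step, so a stage defined once in a global variable can be reused from several job definitions. A rough sketch follows; the file names and echo steps are placeholders.

/* File: vars/cloneReposStage.groovy — hypothetical: the stage defined once */
def call() {
    stage("Clone Repos") {
        echo "cloning required repositories..." // checkout steps would go here
    }
}

/* File: vars/myScriptedStep.groovy — scripted pipeline reusing the shared stage */
def call(Map pipelineParams) {
    node {
        cloneReposStage()
        stage("Deploy environment") { echo "deploying..." }
        stage("Executed Tests") { echo "testing..." }
        stage("Undeploy environment") { echo "undeploying..." }
    }
}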

Trigger an action within Jenkins declarative pipeline right after a stage ends or just before a stage begins?

I have the following Jenkinsfile:
pipeline {
    agent any
    environment { }
    stages {
        stage('stageA') {
            steps {
                ... Do something with arg1, arg2 or arg3
            }
        }
        stage('stageB') {
            steps {
                ... Do something with arg1, arg2 or arg3
            }
        }
        ...
    }
}
Is there anywhere I can specify a universal "pre-stage" or "post-stage" set of actions to perform? A use-case would be sending logging information at the end of a stage to a log manager, but it would be preferable to not copy and paste those invocations at the end of each and every stage.
As far as I know, there is no generic post- or pre-stage hook in Jenkins pipelines. You can define post steps in a post section, but you need one per stage.
However, if you don't want to repeat yourself, you have some options.
Use a shared lib
The place to put repeating code is a shared library. That way you can declare your own steps using Groovy.
You need another repository to define a shared lib, but apart from that it is a pretty straightforward approach, and you can reuse the code in all of your Jenkins pipelines.
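For the logging use case from the question, such a shared-library step could wrap the stage body and emit the pre/post messages in one place. A minimal sketch, assuming a hypothetical vars/loggedStage.groovy and a scripted pipeline (in Declarative you cannot call a step like this from inside the stages section):

/* File: vars/loggedStage.groovy — hypothetical custom step wrapping a stage */
def call(String name, Closure body) {
    stage(name) {
        echo "[pre-stage] ${name} starting"   // e.g. notify your log manager here
        try {
            body()
        } finally {
            echo "[post-stage] ${name} finished" // runs even if the body fails
        }
    }
}

/* Usage in a scripted Jenkinsfile */
node {
    loggedStage("Build") {
        echo "building..."
    }
    loggedStage("Test") {
        echo "testing..."
    }
}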
Use a function
If you declare a function outside of the pipeline block, you can call it from any stage. This is not really documented and might be prevented in the future; as far as I understand it, it messes with the coordination between master and agents. However, it works:
pipeline {
    agent any
    stages {
        stage ("First") {
            steps {
                writeFile file: "resultFirst.txt", text: "all good"
            }
            post {
                always {
                    cleanup "first"
                }
            }
        }
        stage ("Second") {
            steps {
                writeFile file: "resultSecond.txt", text: "all good as well"
            }
            post {
                always {
                    cleanup "second"
                }
            }
        }
    }
    post {
        always {
            cleanup "global" // this is only triggered after all stages, not after every stage
        }
    }
}

void cleanup(String stage) {
    echo "cleanup ${stage}"
    archiveArtifacts artifacts: "result*"
}

How to add "single conditional steps" under build section using dsl script

I'm currently trying to develop a DSL script that can create a Jenkins job with all the required plugins and options.
I think I've almost completed all the sections, but I'm stuck on the Build section, where I have to include "conditional steps (single)" under Build.
What I wanted is this, but what I get is this.
Here's the code I used:
job('Sample_dev') {
    steps {
        conditionalSteps {
            condition {
                alwaysRun()
            }
        }
        maven {
            goals('install')
        }
    }
}
You have made a few mistakes there:
You used the multi-step conditionalSteps DSL to achieve a single conditional step.
You pushed the maven block outside the conditional context, as an individual step.
You used the wrong DSL for the Maven step declaration.
Try the following:
job('Sample_dev') {
    steps {
        singleConditionalBuilder {
            condition {
                alwaysRun()
            }
            buildStep {
                maven {
                    targets('install')
                    name('')
                    pom('')
                    properties('')
                    jvmOptions('')
                    usePrivateRepository(false)
                    settings {
                        standard()
                    }
                    globalSettings {
                        standard()
                    }
                    injectBuildVariables(false)
                }
            }
            runner {
                fail()
            }
        }
    }
}
The creator has published most of the reference at https://jenkinsci.github.io/job-dsl-plugin. But I would suggest you use your local instance and access it via http://<your-jenkins-host>:<port>/plugin/job-dsl/api-viewer/index.html, as Job DSL supports auto-generation of the API, so there is a good chance that a plugin not listed above still has DSL support.

Conditional loops in Job DSL

I'm taking the build type, i.e. either a Maven job or a freestyle job, as an input parameter (using the parameterized build plugin) and, based on the input condition, creating the corresponding job.
My input parameter is "maven" (to create the Maven job); the else block is for the freestyle job.
if (params[build_type] == "maven") {
    mavenJob('example') {
        using(template_job)
        scm {
            svn {
                location(svn_url)
            }
        }
    }
}
freeStyleJob('example') {
    using(template_job)
    scm {
        svn {
            location(svn_url)
        }
    }
}
I'm facing the following error message, and I'm very new to Groovy, so please excuse me. Looking forward to any suggestions. Thanks.
Processing provided DSL script
ERROR: (script, line 1) No such property: params for class: script
The Job DSL script inherits the build parameters as variables. So if you have a parameter named build_type, you can use it directly as a variable:
if (build_type == "maven") {
mavenJob('example') {
using(template_job)
scm {
svn {
location(svn_url)
}
}
}
}
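For completeness, the freestyle branch from the question would follow the same pattern in an else block. A sketch, reusing the same template_job and svn_url variables:

if (build_type == "maven") {
    mavenJob('example') {
        using(template_job)
        scm {
            svn { location(svn_url) }
        }
    }
} else {
    freeStyleJob('example') {
        using(template_job)
        scm {
            svn { location(svn_url) }
        }
    }
}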
See: User Power Moves: Parameterized Seed Job
