Create variable in shared library for Jenkinsfile

I'm new to shared libraries in Jenkins, and fairly new to Groovy as well.
I have several multibranch pipelines for different projects. I have set up email notifications for each job using an environment variable containing a list of email addresses, which works just fine. However, several jobs share the same email addresses (depending on the project), and I'd like to create a shared library with a master email list so I don't have to update the list in each job individually if, say, I want to add or remove someone. I'm having trouble defining a variable in a library that can be used later in the Jenkinsfile. This is a simplified version of what I've been trying:
Shared library (basically a copy-paste of the environment variables I was originally using in the individual Jenkinsfiles/jobs, which works):
Jenkinsfile-shared-libraries\vars\masterEmailList.groovy
def call() {
    environment {
        project1EmailList = "user1@xyz.com, user2@xyz.com, user3@xyz.com"
        project2EmailList = "user2@xyz.com, user4@xyz.com, user5@xyz.com"
    }
}
Jenkinsfile:
@Library('Jenkinsfile-shared-libraries') _
pipeline {
    agent any
    stages {
        stage('email list for project 1') {
            steps {
                masterEmailList()
                echo env.project1EmailList
            }
        }
    }
}
The echo returns "null" rather than the email list of the project like I would expect.
Any guidance would be much appreciated!
Cheers.

The "Defining global variables" section of https://www.jenkins.io/doc/book/pipeline/shared-libraries/#defining-global-variables helped solve this one.
shared library:
Jenkinsfile-shared-libraries\vars\masterEmailList.groovy
def project1EmailList() {
    "user1@xyz.com, user2@xyz.com, user3@xyz.com"
}
def project2EmailList() {
    "user2@xyz.com, user4@xyz.com, user5@xyz.com"
}
Jenkinsfile:
@Library('Jenkinsfile-shared-libraries') _
pipeline {
    agent any
    stages {
        stage('email list for project 1') {
            steps {
                script {
                    echo masterEmailList.project1EmailList()
                }
            }
        }
    }
}
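For completeness, the resolved list can be fed straight into a notification step. A minimal sketch (the post block and message contents are illustrative, not from the original question):
post {
    failure {
        // notify the shared project 1 list resolved from the library
        mail to: masterEmailList.project1EmailList(),
             subject: "Build failed: ${env.JOB_NAME} #${env.BUILD_NUMBER}",
             body: "See ${env.BUILD_URL}"
    }
}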

Related

Jenkins - set options in a shared library for all pipelines that use the shared library

I have a bunch of repositories which use (parts of) the same Jenkins shared library for running tests, docker builds, etc. So far the shared library has greatly reduced the maintenance costs for these repos.
However, it turned out that basically all pipelines use the same set of options, e.g.:
#Library("myExample.jenkins.shared.library") _
import org.myExample.Constants
pipeline {
options {
disableConcurrentBuilds()
parallelsAlwaysFailFast()
}
agent {
label 'my-label'
}
stages {
stage {
runThisFromSharedLibrary(withParameter: "foo")
runThatFromSharedLibrary(withAnotherParameter: "bar")
...
...
In other words, I need to copy-and-paste the same option snippets in any new specific pipeline that I create.
Also, this means that I need to edit separately each Jenkinsfile (along with any peer-review processes we use internally) when I decide to change the set of options.
I'd very much like to remove this maintenance overhead somehow.
How can I delegate the option-setting to a shared library, or otherwise configure the required options for all pipelines at once?
Two options will help you the most:
1. Use global variables at the master/agent level.
Go to Jenkins --> Manage Jenkins --> Configure System --> Global properties.
Check the Environment variables box, then add a name and value for each variable.
You will then be able to use them normally in your Jenkins pipelines, as in the snippet below.
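A minimal sketch (DEPLOY_TARGET is a hypothetical global property, not one the answer defines):
pipeline {
    agent any
    stages {
        stage('Use global variable') {
            steps {
                // DEPLOY_TARGET was defined under Manage Jenkins --> Configure System
                echo "Deploying to ${env.DEPLOY_TARGET}"
            }
        }
    }
}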
2. Wrap the whole pipeline in a function inside the shared library.
The Jenkinsfile will then look like this:
@Library('shared-library') _
customServicePipeline(agent: 'staging',
                      timeout: 3,
                      server: 'DEV')
Shared library function:
// vars/customServicePipeline.groovy
def call(Map pipelineParams = [:]) {
    pipeline {
        agent { label "${pipelineParams.agent}" }
        tools {
            maven 'Maven-3.8.6'
            jdk 'JDK 17'
        }
        options {
            // pass the timeout as a number, not a GString
            timeout(time: pipelineParams.timeout, unit: 'MINUTES')
        }
        stages {
            stage('Prep') {
                steps {
                    echo 'prep started'
                    pingServer(pipelineParams.get("server"))
                }
            }
        }
    }
}
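If callers may omit keys from the map, defaults can be supplied when the map is read. A small sketch (the default values here are assumptions, not part of the answer):
// inside the call(Map pipelineParams = [:]) body, before the pipeline block
def agentLabel = pipelineParams.get('agent', 'any')        // fall back to any agent
int timeoutMin = pipelineParams.get('timeout', 10) as int  // fall back to 10 minutes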

Defining agent labels in a single separated file

I am currently facing a problem. I have about 90 Jenkinsfiles, and we recently updated one of the Jenkins agents, which now has a new label. This means we have to go and update every Jenkinsfile with the new label of that agent. You'll agree that this is a bit of a pain, especially since we will have to do it every time we update the agent. I was thinking we could define all of the agents in a single file (variable=value) and then reference the variables in our Jenkinsfiles, so the next time we upgrade an agent we make the change in that one file instead of in 90 Jenkinsfiles.
Yes, you can do this. I'm assuming you have the agent details in the same SCM repo as the Pipelines. In that case, you can do something like the below.
pipeline {
    agent { label getAgentFromFile() }
    stages {
        stage('Hello6') {
            steps {
                script {
                    echo "Hello Something"
                }
            }
        }
    }
}
def getAgentFromFile() {
    def agentLabel = "default"
    node {
        // use the readFile step so the file is read from the node's workspace;
        // java.io.File would read from the controller's filesystem instead
        agentLabel = readFile('agent.txt').trim()
        println agentLabel
    }
    return agentLabel
}
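If the label should come from a shared library rather than a file in the repo, a global variable can serve the same purpose. A sketch (vars/agentLabels.groovy and the label values are hypothetical):
// vars/agentLabels.groovy in the shared library
def linux() { 'linux-agent-v2' }

// Jenkinsfile
@Library('shared-library') _
pipeline {
    agent { label agentLabels.linux() }
    stages {
        stage('Build') {
            steps {
                echo 'building'
            }
        }
    }
}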

Is it possible to pass stages into a Jenkinsfile via pipeline parameters

I'm currently working with a Jenkinsfile I can't directly add code to, as it is not developed by my team. However, I was thinking there might be some way to get the owners of the Jenkinsfile (just another team in my company) to allow us to add "pre" and "post" type variables to the Jenkinsfile, where we could then pass in the stages and logic.
A sample Jenkinsfile today might look like
pipeline {
    stages {
        stage('Clean-Up WS') {
            steps {
                cleanWs()
            }
        }
        stage('Do more....
And the desired Jenkinsfile might look like
def x = stage('Clean-Up WS') {
    steps {
        cleanWs()
    }
}
pipeline {
    stages {
        x()
        stage('Do more....
Where x in the above example could be passed in via a Jenkins parameter
I've played around with the above and tried similar syntax, but nothing seems to work.
Does anyone know if anything like this possible using Jenkinsfiles?

Jenkins pipelineJob DSL not interpreting variables in pipeline script

I'm trying to generate Jenkins pipelines using the pipelineJob function in the Job DSL plugin, but cannot pass parameters from the DSL to the pipeline script. I have several projects that use what is essentially the same Jenkinsfile, with differences only in a few steps. I'm trying to use the Job DSL plugin to generate these pipelines on the fly, with the values I want changed in them interpolated to match the parameters to the DSL.
I've tried just about every combination of string interpolation that I can in the pipeline script, as well as in the DSL, but cannot get Jenkins/Groovy to interpolate variables in the pipeline script.
I'm calling the job DSL in a pipeline step:
def projectName = "myProject"
def envs = ['DEV', 'QA', 'UAT']
def repositoryURL = 'myrepo.com'

jobDsl targets: ['jobs/*.groovy'].join('\n'),
       additionalParameters: [
           project: projectName,
           environments: envs,
           repository: repositoryURL
       ],
       removedJobAction: 'DELETE',
       removedViewAction: 'DELETE'
The DSL is as follows:
pipelineJob("${project} pipeline") {
displayName('Pipeline')
definition {
cps {
script(readFileFromWorkspace(pipeline.groovy))
}
}
}
pipeline.groovy:
pipeline {
    agent any
    environment {
        REPO = repository
    }
    parameters {
        choice name: "ENVIRONMENT", choices: environments
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}
The variables that I pass in additionalParameters are interpolated in the Job DSL script; a pipeline with the correct name does get generated. The problem is that the variables are not passed to the pipeline script read from the workspace - the Jenkins configuration for the generated pipeline looks exactly the same as the file, without any interpolation of the variables.
I've made a number of attempts at getting the string to interpolate, including a lot of variations of "${environments}", ${environments}, $environments, \$environments...I can't find any that work. I've also tried reading the file as a GStringImpl:
script("${readFileFromWorkspace('pipeline.groovy')}")
Does anyone have any ideas as to how I can make variables propagate down to the pipeline script? I know that I could just use a for loop to do string.replaceAll() on the script text, but that seems cumbersome; there's got to be a better way.
I've come up with a way to make this work. It's not what I'd prefer, which is having the string contents of the file implicitly interpreted during job creation, but it does work; it just adds an extra step.
import groovy.text.SimpleTemplateEngine

def fileContents = readFileFromWorkspace "pipeline.groovy"
def engine = new SimpleTemplateEngine()
template = engine.createTemplate(fileContents).make(binding.getVariables()).toString()

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps {
            script(template)
        }
    }
}
This reads a file from your workspace, then uses it as a template with the binding variables. The other change needed to make this work is escaping any variables used in your Jenkinsfile script, like \${VARIABLE}, so that they are expanded at runtime rather than at the time you build the job. Any variables you want expanded at job creation should be referenced as ${VARIABLE}.
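A sketch of the two expansion times in a templated pipeline.groovy (the stage content is illustrative; repository comes from additionalParameters above):
pipeline {
    agent any
    environment {
        // unescaped: filled in once, when the seed job creates the pipeline
        REPO = "${repository}"
    }
    stages {
        stage('Deploy') {
            steps {
                // escaped: survives templating and expands at pipeline runtime
                echo "Deploying \${env.REPO}"
            }
        }
    }
}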
You could achieve what you're trying to do by defining environment variables in the pipelineJob and then using those variables in your pipeline.
They are a bit limited because environment variables are strings, but it should work for basic stuff.
Ex.:
//job-dsl
pipelineJob('example') {
    environmentVariables {
        // these vars could be specified by parameters of this job
        env('repository', 'blah')
        env('environments', 'a,b,c') // comma-separated string
    }
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}
And then in the pipeline:
//pipeline.groovy
pipeline {
    agent any
    environment {
        REPO = env.repository
    }
    parameters {
        // note the need to split the comma-separated string
        choice name: "ENVIRONMENT", choices: env.environments.split(',')
    }
    // ... stages as before ...
}
You need to use the complete job name as a variable without the quotes. E.g., if JOBNAME is a parameter containing the entire job name:
pipelineJob(JOBNAME) {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}
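For example, the seed pipeline could supply the name via additionalParameters (a sketch; the job name and target path are hypothetical):
jobDsl targets: 'jobs/pipeline.groovy',
       additionalParameters: [JOBNAME: 'team-folder/myProject-pipeline']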

How to define pipeline stage once and using it multiple times

So I have a use case with Jenkinsfile that I know is not common, and I haven't found a solution for it yet.
Background
We currently have a multi-branch pipeline job configured to build multiple branches. This is used to run system testing of the products across multiple releases. The Jenkins job:
1. Clones all required repositories
2. Deploys the environment
3. Executes the automated test cases
4. Undeploys the environment
In order to avoid having to define the same Jenkinsfile on each branch, we created a shared library. The shared library defines the Declarative Pipeline stages for the Jenkinsfile. The shared library has the following:
/* File: vars/myStep.groovy */
def call(Map pipelineParams) {
    callASharedLibraryFunction()
    properties([
        parameters(sharedLibraryGetParameters(pipelineParams))
    ])
    pipeline {
        // snip
        stages {
            stage("clone repos") { }
            stage("Deploy environment") { }
            stage("Executed Tests") { }
            stage("Undeploy environment") { }
        }
        // post directives
    }
}
And the Jenkinsfile simply defines a map and then calls myStep.
e.g.:
/* Sample Jenkinsfile */
pipelineParams = [
    FOO: "foo"
]
myStep pipelineParams
The problem
We now have a need for another Jenkins job, where some of the stages will be the same. For example, the new job will need to:
1. Clone all required repositories
2. Deploy the environment
3. Do something else
And changing the behaviour of a common stage (e.g. cloning the repos) should take effect across all the jobs that define this stage. I know we can use the when directive in a stage; however, from a usability perspective, I want the jobs to be different, as they are exercising different things, and the users of one job don't care about the additional stages the other job runs.
I want to avoid code duplication, and better yet, I don't want to duplicate the stage code. (including steps, when, post, etc..).
Is there a way a shared library can define the stage "implementation" with all the directives (steps, when, post, etc) once, but have it get called multiple times?
e.g.:
/* File: vars/cloneReposStageFunction.groovy */
def call() {
    stage("Clone Repos") { }
}

/* File: vars/myStep.groovy */
def call(Map pipelineParams) {
    pipeline {
        // snip
        stages {
            cloneReposStageFunction()
            stage("Deploy environment") { }
            stage("Executed Tests") { }
            stage("Undeploy environment") { }
        }
        // post directives
    }
}

/* File: vars/myNewStep.groovy */
def call(Map pipelineParams) {
    pipeline {
        // snip
        stages {
            cloneReposStageFunction()
            stage("Deploy environment") { }
            stage("Do something else") { }
        }
        // post directives
    }
}
It's an open Jenkins feature request.
I've seen different ways to template a pipeline, but they are far from what you'd like to achieve.
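In the meantime, one common workaround is to share the stage's step logic (rather than the whole stage directive) through a global variable, so only the thin stage wrapper is duplicated. A sketch under that assumption (vars/cloneReposSteps.groovy is a hypothetical name, and checkout scm stands in for the real clone logic):
/* File: vars/cloneReposSteps.groovy */
def call() {
    // single definition of the clone logic, reused by every pipeline
    checkout scm
}

/* In vars/myStep.groovy and vars/myNewStep.groovy */
stage("Clone Repos") {
    steps {
        cloneReposSteps()
    }
}
This keeps the step logic in one place, though each pipeline still has to declare the stage wrapper itself.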
