Dynamic assignment of values as default parameters in Jenkinsfile - jenkins

Whenever I run this pipeline in Jenkins I have to manually copy-paste some values from a YAML file in a remote GitLab repository. What I would like to achieve is an auto-fill of the parameter values from that YAML file.
This is what my Jenkinsfile and the YAML file look like:
Jenkinsfile
pipeline {
    agent {
        docker {
            image 'artifactory...'
            args "..."
        }
    }
    parameters {
        string(name: 'BACKEND_TAG_1', defaultValue: '', description: 'Tag...')
        string(name: 'BACKEND_TAG_2', defaultValue: '', description: 'Tag...')
    }
    stages {
        stage('prepare') {
            steps {
                script {
                    dir('application') {
                        git url: env.PIPELINE_APPLICATION_GIT_URL, branch: env.PIPELINE_APPLICATION_GIT_BRANCH
                    }
                    Values = readYaml file: 'application/values.yaml'
                }
            }
        }
    }
}
values.yaml
version:
  default: 0.1.2
  company_tag_1: 0.1.124
  company_tag_2: 0.1.230
So I need to loop over the parameters and assign the corresponding values:
Values.each { Value ->
    Value.version.minus('company')
    /* This value should be assigned to the corresponding BACKEND_TAG_* parameter,
       e.g.: BACKEND_TAG_1.default=company_tag_1
             BACKEND_TAG_2.default=company_tag_2
    */
}
Reading the YAML works fine, but I don't know how to proceed with assigning the values.

I presume you would like the parameters to be populated before you click the Build button; that is, after clicking "Build with Parameters", you want to see the parameters pre-filled from your YAML file.
If this is the case, you can use the Active Choices Parameter or Extended Choice Parameter plugin for this purpose. These plugins can run a Groovy script, so you can write a small script that reads the YAML and selects the parameter values automatically.
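For example, a rough sketch of an Active Choices Groovy script for BACKEND_TAG_1, assuming the YAML is reachable over HTTP(S) from the Jenkins controller and SnakeYAML is on the classpath (the URL below is a placeholder, not from the post):

// Active Choices "Choice Parameter" script sketch for BACKEND_TAG_1.
def text = new URL('https://gitlab.example.com/group/app/-/raw/main/values.yaml').text
def values = new org.yaml.snakeyaml.Yaml().load(text)
// Active Choices pre-selects the first element of the returned list.
return [values.version.company_tag_1.toString()]

The same script with company_tag_2 would back BACKEND_TAG_2.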

Related

Using same Jenkinsfile for two separate jobs in same repo

I have two separate Jenkins jobs that will run on one repository: my Jenkinsfile has a step that runs with the property enableZeroDownTime enabled. The purpose of the second Jenkins job is to run the step with the property enableZeroDownTime disabled. Does anyone know how I can control this using the same Jenkinsfile? Can I pass it using some parameter based on a properties file? I am really confused about this.
stage('CreateCustomer') {
    steps {
        script {
            common.runStage("#CreateCustomer")
            common.runStage("#SetOnboardingCustomerManifest")
            common.runStage("#enableZeroDownTime")
        }
    }
}
Solution
I currently run multiple pipelines that use the same Jenkinsfile. The change to conditionally execute a stage is trivial.
stage('CreateCustomer') {
    when {
        environment name: 'enableZeroDownTime', value: 'true'
    }
    steps {
        script {
            common.runStage("#CreateCustomer")
            common.runStage("#SetOnboardingCustomerManifest")
            common.runStage("#enableZeroDownTime")
        }
    }
}
The CreateCustomer stage will only run when the enableZeroDownTime parameter is set to true (it can be a String parameter with the value true, or a boolean parameter).
The trick here is that you cannot add the parameters{} block to your declarative pipeline. For example, if you had the following:
parameters {
    string(name: 'enableZeroDownTime', defaultValue: 'true')
}
Both pipelines would default to true. If you had the following:
parameters {
    string(name: 'enableZeroDownTime', defaultValue: '')
}
Both pipelines would default to a blank value. Even if you manually save a different default value to the pipeline after creation, it will be overwritten on the next run with the blank default value.
Instead, you simply need to remove the parameters{} block altogether and add the parameters manually through the web interface.
Additionally...
Additionally, it is possible to have two pipelines use the same Jenkinsfile with different parameters. For example, let's say Pipeline A had an enableZeroDownTime parameter defaulting to true and Pipeline B had no parameters at all. In this case you can add an environment variable of the same name and set its value to the following ternary expression:
environment {
    enableZeroDownTime = "${params.enableZeroDownTime != null ? "${params.enableZeroDownTime}" : false}"
}
You can then reference this variable in the when directive (or anywhere in the pipeline) without fear of the pipeline throwing a null pointer exception.

Multiple Jenkins parameterized build value pass value to Jenkinsfile for declarative pipeline

Hi, my Jenkins project is a parameterised build. I have 3 variables: 1 choice and 2 string parameters. The choice parameter is do_you_want_to_deploy and the string parameters are git_tag and git_branch. I want to know how I can pass these values to a Jenkinsfile.
In a freestyle project, I select 'Extra Variables' and then get Key and Value fields. For the key I put deploy_location, the value is ${do_you_want_to_deploy}. Key is which_tag, value is ${git_tag}. Key is which_branch, value is ${git_branch}. I am doing this for Ansible. How can I add verbose -vvv as well? This is for a pipeline project. Below is my code:
ansiblePlaybook(
    vaultCredentialsId: 'VaultId',
    inventory: 'host-inventory.yml',
    playbook: 'myPlaybook.yml'
)
I also need to pass the same values to a downstream project. How can this be done?
Hi, my Jenkins project is a parameterised build. I have 3 variables: 1 choice and 2 string parameters. The choice parameter is do_you_want_to_deploy and the string parameters are git_tag and git_branch. I want to know how I can pass these values to a Jenkinsfile.
In a Jenkinsfile there is a parameters block for defining variables. For your use case, the parameter definitions may look like below. By "choice" in your explanation I assumed you need a toggle; if you need a list of items instead, use the choice parameter type.
pipeline {
    ...
    parameters {
        booleanParam(name: 'do_you_want_to_deploy', defaultValue: false, description: 'Description of do_you_want_to_deploy')
        string(name: 'git_tag', defaultValue: '', description: 'Description of git_tag')
        string(name: 'git_branch', defaultValue: '', description: 'Description of git_branch')
    }
    stages {
        stage('Example') {
            steps {
                ansiblePlaybook(
                    ...
                )
            }
        }
    }
}
In a freestyle project, I select 'Extra Variables' and then get Key and Value fields. For the key I put deploy_location, the value is ${do_you_want_to_deploy}. Key is which_tag, value is ${git_tag}. Key is which_branch, value is ${git_branch}. I am doing this for Ansible. How can I add verbose -vvv as well?
The Ansible plugin has an extraVars option that can be used to pass a number of variables from the pipeline. There is another option named extras that takes a string and can be used to pass additional variables, switches, etc.
Together, the ansiblePlaybook call may look like below:
ansiblePlaybook(
    vaultCredentialsId: 'VaultId',
    inventory: 'host-inventory.yml',
    playbook: 'myPlaybook.yml',
    extras: '-vvv',
    extraVars: [
        deploy_location: params.do_you_want_to_deploy,
        which_tag: params.git_tag,
        which_branch: params.git_branch
    ]
)
I also need to pass the same values to a downstream project. How can this be done?
As you can see from the ansiblePlaybook example above, the parameters can be accessed via the params object.
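For the downstream project, a sketch using the built-in build step to forward the same values (downstream-job is a hypothetical job name; the downstream job is assumed to declare matching parameters):

// Trigger the downstream job with the current parameter values.
build job: 'downstream-job', parameters: [
    booleanParam(name: 'do_you_want_to_deploy', value: params.do_you_want_to_deploy),
    string(name: 'git_tag', value: params.git_tag),
    string(name: 'git_branch', value: params.git_branch)
]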

Jenkins pipelineJob DSL not interpreting variables in pipeline script

I'm trying to generate Jenkins pipelines using the pipelineJob function in the Job DSL plugin, but cannot pass parameters from the DSL to the pipeline script. I have several projects that use what is essentially the same Jenkinsfile, with differences in only a few steps. I'm trying to use the Job DSL plugin to generate these pipelines on the fly, with the values I want changed in them substituted to match the parameters passed to the DSL.
I've tried just about every combination of string interpolation that I can, in both the pipeline script and the DSL, but cannot get Jenkins/Groovy to interpolate variables in the pipeline script.
I'm calling the job DSL in a pipeline step:
def projectName = "myProject"
def envs = ['DEV','QA','UAT']
def repositoryURL = 'myrepo.com'

jobDsl targets: ['jobs/*.groovy'].join('\n'),
       additionalParameters: [
           project: projectName,
           environments: envs,
           repository: repositoryURL
       ],
       removedJobAction: 'DELETE',
       removedViewAction: 'DELETE'
The DSL is as follows:
pipelineJob("${project} pipeline") {
displayName('Pipeline')
definition {
cps {
script(readFileFromWorkspace(pipeline.groovy))
}
}
}
pipeline.groovy:
pipeline {
    agent any
    environment {
        REPO = repository
    }
    parameters {
        choice name: "ENVIRONMENT", choices: environments
    }
    stages {
        stage('Deploy') {
            steps {
                echo "Deploying ${env.REPO} to ${params.ENVIRONMENT}..."
            }
        }
    }
}
The variables that I pass in additionalParameters are interpolated in the Job DSL script; a pipeline with the correct name does get generated. The problem is that the variables are not passed to the pipeline script read from the workspace - the Jenkins configuration for the generated pipeline looks exactly the same as the file, without any interpolation of the variables.
I've made a number of attempts at getting the string to interpolate, including a lot of variations of "${environments}", ${environments}, $environments, \$environments... I can't find any that work. I've also tried reading the file as a GStringImpl:
script("${readFileFromWorkspace('pipeline.groovy')}")
Does anyone have any ideas as to how I can make variables propagate down to the pipeline script? I know that I could just use a for loop to do string.replaceAll() on the script text, but that seems cumbersome; there's got to be a better way.
I've come up with a way to make this work. It's not what I'd prefer, which is having the string contents of the file implicitly interpreted during job creation, but it does work; it just adds an extra step.
import groovy.text.SimpleTemplateEngine

def fileContents = readFileFromWorkspace "pipeline.groovy"
def engine = new SimpleTemplateEngine()
template = engine.createTemplate(fileContents).make(binding.getVariables()).toString()

pipelineJob("${project} pipeline") {
    displayName('Pipeline')
    definition {
        cps {
            script(template)
        }
    }
}
This reads a file from your workspace, then uses it as a template with the binding variables. The other change needed to make this work is escaping any variables used in your Jenkinsfile script, like \${VARIABLE}, so that they are expanded at runtime rather than when the job is built. Any variables you want expanded at job creation should be referenced as ${VARIABLE}.
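For instance, a hypothetical fragment of pipeline.groovy under this scheme, where project is a binding variable expanded at job creation and WORKSPACE is left for runtime:

pipeline {
    agent any
    stages {
        stage('Info') {
            steps {
                echo "Generated for: ${project}"   // expanded by SimpleTemplateEngine at job creation
                echo "Workspace: \${WORKSPACE}"    // escaped, so it survives until pipeline runtime
            }
        }
    }
}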
You could achieve what you're trying to do by defining environment variables in the pipelineJob and then using those variables in your pipeline.
They are a bit limited because environment variables are strings, but it should work for basic stuff
Ex.:
//job-dsl
pipelineJob('example') {
    environmentVariables {
        // these vars could be specified by parameters of this job
        env('repository', 'blah')
        env('environments', 'a,b,c') // comma-separated string
    }
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}
And then in the pipeline:
//pipeline.groovy
pipeline {
    agent any
    environment {
        REPO = env.repository
    }
    parameters {
        choice name: "ENVIRONMENT", choices: env.environments.split(',')
        // note the need to split the comma-separated string above
    }
}
You need to use the complete job name as a variable, without quotes. E.g., if JOBNAME is a parameter containing the entire job name:
pipelineJob(JOBNAME) {
    displayName('Pipeline')
    definition {
        cps {
            script(readFileFromWorkspace('pipeline.groovy'))
        }
    }
}

Define your own global variable for a Jenkins job (not for ALL jobs!!)

I have a Jenkins job that has a string input parameter holding the build flags for the make command. My problem is that some users forget to change the parameter values when we have a release branch. So I want to overwrite the existing string input parameter (or create a new one) that should be used if the job is a release job.
This is the statement I want to add:
If branch "release" then ${params.build_flag} = 'DEBUGSKIP=TRUE'
and the code that is not working is:
pipeline {
    agent none
    parameters {
        string(name: 'build_flag', defaultValue: 'DEBUGSKIP=TRUE', description: 'Flags to pass to build')
        If {
            allOf {
                branch "*release*"
                expression {
                    ${params.build_flag} = 'DEBUGSKIP=TRUE'
                }
            }
        } else {
            ${params.build_flag} = 'DEBUGSKIP=FALSE'
        }
    }
}
The code above explains what I want to do, but I don't know how to do it.
If you can, see if you could use the Jenkins EnvInject Plugin with your pipeline, using the supported use case:
Injection of EnvVars defined in the "Properties Content" field of the Job Property
These EnvVars are injected into the script environment and will be inaccessible via the "env" Pipeline global variable (as in here).
Or write the right values to a file, and use that file's content as the "Properties Content" of a downstream job (as shown there).
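A plugin-free alternative (a different technique from the answer above) is to compute the effective value at runtime, since params is read-only in a declarative pipeline. A minimal sketch, assuming a multibranch pipeline where BRANCH_NAME is set; EFFECTIVE_BUILD_FLAG is a name invented here:

pipeline {
    agent none
    parameters {
        string(name: 'build_flag', defaultValue: 'DEBUGSKIP=FALSE', description: 'Flags to pass to build')
    }
    environment {
        // Release branches always skip debug; otherwise honour whatever the user entered.
        EFFECTIVE_BUILD_FLAG = "${env.BRANCH_NAME?.contains('release') ? 'DEBUGSKIP=TRUE' : params.build_flag}"
    }
    stages {
        stage('Build') {
            agent any
            steps {
                // Single quotes: the shell expands the injected environment variable.
                sh 'make $EFFECTIVE_BUILD_FLAG'
            }
        }
    }
}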

Looking for a Jenkins plugin to allow per-branch default parameter values

I have a multi-branch pipeline job set to build by Jenkinsfile every minute if new changes are available from the git repo. I have a step that deploys the artifact to an environment if the branch name is of a certain format. I would like to be able to configure the environment on a per-branch basis without having to edit Jenkinsfile every time I create a new such branch. Here is a rough sketch of my Jenkinsfile:
pipeline {
    agent any
    parameters {
        string(description: "DB name", name: "dbName")
    }
    stages {
        stage("Deploy") {
            steps {
                deployTo "${params.dbName}"
            }
        }
    }
}
Is there a Jenkins plugin that will let me define a default value for the dbName parameter per branch in the job configuration page? Ideally something like the mock-up below:
The values should be able to be reordered to set priority. The plugin stops checking for matches after the first one. Matching can be exact or regex.
If there isn't such a plugin currently, please point me to the closest open-source one you can think of. I can use it as a basis for coding a custom plugin.
A possible plugin you could use as a starting point for a custom plugin is the Dynamic Parameter Plugin
Here is a workaround:
Using the Jenkins Config File Provider plugin, create a config JSON with parameters defined in it per branch. Example:
{
    "develop": {
        "dbName": "test_db",
        "param2": "value"
    },
    "master": {
        "dbName": "prod_db",
        "param2": "value1"
    },
    "test_branch_1": {
        "dbName": "zss_db",
        "param2": "value2"
    },
    "default": {
        "dbName": "prod_db",
        "param2": "value3"
    }
}
In your Jenkinsfile:
final commit_data = checkout(scm)
BRANCH = commit_data['GIT_BRANCH']

configFileProvider([configFile(fileId: '{Your config file id}', variable: 'BRANCH_SETTINGS')]) {
    def config = readJSON file: "$BRANCH_SETTINGS"
    def branch_config = config."${BRANCH}"
    if (branch_config) {
        echo "using config for branch ${BRANCH}"
    } else {
        branch_config = config.default
    }
    echo branch_config.'dbName'
}
You can then use branch_config.'dbName', branch_config.'param2', etc. You can even assign a value to a global variable and use it throughout your pipeline, as in the sketch below.
The config file can easily be edited via the Jenkins UI (provided by the plugin) to provision for new branches/params in the future. This doesn't need access to any non-sandbox methods.
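For instance, a minimal sketch of that promotion, continuing the snippet above (DB_NAME is a name chosen here for illustration; deployTo is the step from the question's Jenkinsfile):

configFileProvider([configFile(fileId: '{Your config file id}', variable: 'BRANCH_SETTINGS')]) {
    def config = readJSON file: "$BRANCH_SETTINGS"
    // Fall back to the "default" block when the branch has no entry.
    DB_NAME = (config."${BRANCH}" ?: config.default).dbName   // no 'def': visible to later stages
}
deployTo "${DB_NAME}"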
Not really an answer to your question, but possibly a workaround...
I don't know what the rest of your parameter list looks like, but if it is a static list, you could potentially add a "use Default" option as its first entry.
When the job is run, if the value is "use Default", then gather the default from a file stored in the SCM branch and use that, as sketched below.
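A minimal sketch of that convention (defaults.json and its dbName key are illustrative names, versioned in each branch; readJSON comes from the Pipeline Utility Steps plugin):

script {
    def dbName = params.dbName
    if (dbName == 'use Default') {
        def defaults = readJSON file: 'defaults.json'   // defaults live in the SCM branch
        dbName = defaults.dbName
    }
    deployTo "${dbName}"
}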
