I have created a Jenkins pipeline to run a job (e.g. Pipeline A runs job B). Job B has multiple parameters, one of which is a choice parameter with several different choices. I need Pipeline A to run job B with all of the different choices at once (Pipeline A runs Job B with every choice in one go). I am not too familiar with the Jenkins declarative syntax, but I am guessing I would use some sort of for loop to iterate over all of the available choices?
I have searched and searched through Stack Overflow/Google for an answer but have not had much luck.
You can define the options in a separate file outside your jobs, in a shared library:
// vars/choices.groovy
@groovy.transform.Field // makes the list accessible as choices.my_choices from the jobs
def my_choices = [
    "Option A",
    "Option B", // etc.
]
You can then use these choices when defining the job:
// Job_1 Jenkinsfile
@Library('my-shared@master') _
properties([
    parameters([
        [$class: 'ChoiceParameterDefinition',
         name: 'MY_IMPORTANT_OPTION',
         choices: choices.my_choices as List,
         description: '',
        ],
        // ... other parameters ...
    ])
])
pipeline {
    agent any
    stages {
        // ...
In Job 2, you can iterate over the values:
// Job_2 Jenkinsfile
@Library('my-shared@master') _
pipeline {
    agent any
    stages {
        stage('Trigger Job_1 for every choice') {
            steps {
                script {
                    for (String option : choices.my_choices) {
                        build job: 'Job_1',
                              wait: false,
                              parameters: [string(name: 'MY_IMPORTANT_OPTION', value: option)] // etc.
                    }
                }
            }
        }
    }
}
When Job_2 runs, it will asynchronously trigger a number of builds of Job_1, each with a different parameter.
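If you instead need Job_2 to wait for all of the triggered builds (and fail if any of them fails), one option is to collect them into a map and hand it to parallel. A minimal scripted sketch, assuming the same shared-library choices variable as above:
// Job_2 Jenkinsfile (scripted sketch) - trigger every choice and wait for all of them
@Library('my-shared@master') _
node {
    stage('Trigger all options') {
        def branches = [:]
        for (String option : choices.my_choices) {
            def opt = option // capture the loop variable for the closure
            branches[opt] = {
                build job: 'Job_1',
                      wait: true,
                      parameters: [string(name: 'MY_IMPORTANT_OPTION', value: opt)]
            }
        }
        parallel branches
    }
}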
I have 2 Jenkins jobs, say ParentJob and ChildJob.
The ParentJob has an Active Choices Parameter, say ENV, with the below groovy script:
return ['A', 'B', 'C']
The ChildJob also has a similar Active Choices Parameter, say ENV, with the same groovy script. Additionally, there is an Active Choices Reactive Parameter, say ENV_URL, with ENV as the Reference Parameter and the following groovy script:
if (ENV.equals("A")) {
    return ["https://a.com"]
} else if (ENV.equals("B")) {
    return ["https://b.com"]
} else {
    return ["https://c.com"]
}
Now, I'm calling ChildJob from my ParentJob using a pipeline script. I set ENV to "A" in my ParentJob, which internally calls ChildJob.
ParentJob pipeline code:
pipeline {
    agent {
        node {
        }
    }
    stages {
        stage('ChildJob') {
            steps {
                script {
                    JOB_NAME = "ChildJob"
                    def myJob = build job: "${JOB_NAME}", parameters: [
                        string(name: 'ENV', value: "${ENV}")
                    ]
                }
            }
        }
    }
}
The Active Choices Parameter ENV in the ChildJob is set to A.
However, the Active Choices Reactive Parameter ENV_URL is empty and IS NOT SET to the value "https://a.com".
Basically, I would like the Active Choices Reactive Parameter to set its value based on the Reference Parameter, which is set from the parent job.
Any suggestions on how this can be achieved?
I don't think this is possible. The best option for you is to pass the parameter from the parent job:
JOB_NAME = "ChildJob"
URL = getURLByEnv("${ENV}") // retrieve the URL with the same logic used in your child job
build job: "${JOB_NAME}", parameters: [
    string(name: 'ENV', value: "${ENV}"),
    string(name: 'url', value: "${URL}")
]
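Note that getURLByEnv is not a built-in step; it is a helper you would define yourself (in the Jenkinsfile or in a shared library), mirroring the ENV-to-URL mapping from the child job's reactive parameter. A minimal sketch:
// hypothetical helper duplicating the child job's ENV -> URL logic
def getURLByEnv(String env) {
    if (env == 'A') {
        return 'https://a.com'
    } else if (env == 'B') {
        return 'https://b.com'
    } else {
        return 'https://c.com'
    }
}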
I'm running Jenkins version 2.249.3 and trying to create a pipeline that removes all old instances.
for (String Name : ClustersToRemove) {
    buildRemoveJob(Name, removeClusterBuilds, removeClusterBuildsResults)
    parallel removeClusterBuilds
}
And what the method does is:
def buildRemoveJob(Name, removeClusterBuilds, removeClusterBuildsResults) {
    removeClusterBuilds[clusterName] = {
        //Random rnd = new Random()
        //sleep (Math.abs(rnd.nextInt(Integer.valueOf(rangeToRandom)) + Integer.valueOf(minimumRunInterval)))
        removeClusterBuildsResults[clusterName] = build job: 'Delete_Instance', wait: true, propagate: false, parameters: [
            [$class: 'StringParameterValue', name: 'Cluster_Name', value: clusterName],
        ]
    }
}
But I only get one downstream job being launched.
I found this bug https://issues.jenkins.io/browse/JENKINS-55748, but it looks like someone must have solved this issue since it's a very common scenario.
Also here - How to run the same job multiple times in parallel with Jenkins? - I found documentation, but it does not seem to apply to running the same job.
The version of build pipeline plugin is 1.5.8
From the parallel command documentation:
Takes a map from branch names to closures and an optional argument failFast which will terminate all branches upon a failure in any other branch:
parallel firstBranch: {
// do something
}, secondBranch: {
// do something else
},
failFast: true|false
So you should first create a map of all executions and then run them all in parallel.
In your example, you should first iterate over the strings and create the executions map, then pass it to the parallel command. Something like this:
def executions = ClustersToRemove.collectEntries {
    ["building ${it}": {
        stage("Build") {
            removeClusterBuildsResults[it] = build job: 'Delete_Instance', wait: true, propagate: false,
                parameters: [[$class: 'StringParameterValue', name: 'Cluster_Name', value: it]]
        }
    }]
}
parallel executions
or without the variable:
parallel ClustersToRemove.collectEntries {
...
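If you also want the remaining branches to be aborted as soon as one of them fails, you can add the failFast flag from the documentation quoted above to the executions map before calling parallel:
executions.failFast = true // optional: abort the other branches on the first failure
parallel executions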
To be straightforward: it depends on the number of agents you have. If you have a single agent, the other triggered builds go into the queue. Hope that answers your question.
I'm trying to solve the same problem as this SO question: How to trigger a jenkins build on specific node using pipeline plugin?
The only difference in my case is that the job I'm triggering is another scripted pipeline job. So the second step in the proposed solution does not apply in my case:
Install Node and Label parameter plugin
In test_job's configuration, select 'This build is parameterized' and add a Label parameter and set the parameter name to 'node'
In pipeline script, use code (code omitted)
My question is how to define the:
org.jvnet.jenkins.plugins.nodelabelparameter.LabelParameterDefinition
parameter inside my scripted pipeline parameterized job (not through the GUI).
What I have tried:
properties([[$class : 'RebuildSettings',
autoRebuild : false,
rebuildDisabled: false],
parameters([org.jvnet.jenkins.plugins.nodelabelparameter.LabelParameterDefinition(name: 'node')])])
The easiest way to generate the code you need for your parameterized scripted pipeline is to:
Go to Pipeline Snippet Generator
Select "properties: Set job properties"
Check "This project is parameterized"
Click "Add parameter" and select "Label"
Click "Generate pipeline script"
This gives you:
properties([
[$class: 'RebuildSettings', autoRebuild: false, rebuildDisabled: false],
parameters([
[$class: 'LabelParameterDefinition',
allNodesMatchingLabel: false,
defaultValue: '',
description: '',
name: 'node',
nodeEligibility: [$class: 'AllNodeEligibility'],
triggerIfResult: 'allCases']
]
)
])
But in my case this wasn't even necessary. All you need is a regular string parameter with a custom name, let's say "node", and then do:
node(params.node){}
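Put together, a minimal scripted sketch (the parameter name node and the default label are only examples):
properties([
    parameters([
        // a plain string parameter; its value is used as a node label below
        string(name: 'node', defaultValue: 'my-label', description: 'Node or label to run on')
    ])
])

node(params.node) {
    stage('Build') {
        echo "Running on ${env.NODE_NAME}"
    }
}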
If your use case is to have a generic pipeline executed on a particular agent node, you can use the 'Agent-Server-parameter' plugin. It lets you add an agent-name parameter, offering the agent of your choice from a drop-down, to a parameterized pipeline (call it the 'Master' pipeline), and you can then use that agent-name parameter in your pipeline script (e.g. calling sample.groovy inside the Master parameterized pipeline).
Other parameters (string, boolean, choice) can be defined within the pipeline itself (without the GUI).
See the example of sample.groovy below, which I call from the Master job.
#!groovy
/* This Groovy implementation is a pipeline for a sample project */
pipeline {
    agent { label params['agent-name'] } // the agent can also be configured per stage
    options {
        timeout(time: 1, unit: 'HOURS', activity: true) // abort if nothing happens
        timestamps() // prepend timestamps on the console output
    }//options
    parameters {
        booleanParam(
            name: 'BOO_PARAM1', defaultValue: false,
            description: 'Enable Parameter 1')
        booleanParam(
            name: 'BOO_PARAM2', defaultValue: false,
            description: 'Enable Parameter 2')
        string(name: 'MY_PATH', defaultValue: 'C:\\SampleProject', description: '')
        choice(name: 'RUN_JOBON_NODE', choices: ['YES', 'NO'], description: '')
    }//parameters
    environment {
        /* Environment variable definition and its use */
        BOO_PARAM1 = "${params.BOO_PARAM1}"
    }//environment
    stages {
        /* The agent is set once for the whole pipeline but can be changed per stage */
        stage('Hello') {
            when {
                expression { return params.BOO_PARAM1 }
            }
            steps {
                echo "Hello stage on ${params['agent-name']}"
            }
        }//stage
    }//stages
}//pipeline
Note: the post-build stage is excluded.
The 'agent-server-parameter' plugin gives you the leverage to have a generic pipeline (common stages) executed on different nodes.
The pipeline executes two jobs: Job X and Job Y.
The pipeline has a choice parameter which is required by Job Y.
Choice parameter options: A, B and C.
Job Y has 3 conditions:
if Choice==A then do task 1
else if Choice==B then do task 2
else do task 3
I am getting stuck at declaring the choice conditions in the stage of Job Y.
P.S. I tried the Active Choices parameter but can't work through it.
Can anybody help out with the logic for this problem?
I am assuming that Job Y is called from your pipeline as a downstream job. Thus somewhere (probably at the end of your pipeline) you will have:
build job: 'CloudbeeFolder1/Path/To/JobY', propagate: false, wait: false, parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: "${env.SOME_VALUE}"]]
Then in JobY on the "other side" you have:
environment {
PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
}
This gets the value of your parameter into an environment variable in JobY.
Depending on what the tasks are you could perform them in a batch (or sh) file by passing PARAM_FROM_PIPELINE like so:
stages {
    stage("Do Tasks") {
        steps {
            bat "mybatchfile.bat ${env.PARAM_FROM_PIPELINE}"
        }
    }
}
Finally in mybatchfile.bat you can read the value of ${env.PARAM_FROM_PIPELINE} like so:
@ECHO OFF
SET PARAM_VAL=%1
ECHO PARAM VALUE IS: %PARAM_VAL%
IF "%PARAM_VAL%"=="A" (
    REM DO TASK1
) ELSE (
    IF "%PARAM_VAL%"=="B" (
        REM DO TASK2
    ) ELSE (
        REM DO TASK3
    )
)
If you don't want to encapsulate the if-else logic in a batch file you can use a script {...} block in your Jenkinsfile to use scripted pipeline.
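For example, the same branching could live directly in the Jenkinsfile (a sketch; the echo steps are placeholders for the real tasks):
stage("Do Tasks") {
    steps {
        script {
            if (env.PARAM_FROM_PIPELINE == 'A') {
                echo 'running task 1' // placeholder for task 1
            } else if (env.PARAM_FROM_PIPELINE == 'B') {
                echo 'running task 2' // placeholder for task 2
            } else {
                echo 'running task 3' // placeholder for task 3
            }
        }
    }
}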
I am building a pipeline workflow in Jenkins v2.8. What I would like to achieve is to build one step which triggers the same job multiple times at the same time with different parameters.
Example: I have a workflow called "Master" which has one step. This step reads my parameter "Number", which is a checkbox with a multiple-selection option, so a user can trigger the workflow and select options for Number like "1, 2, 3". What I would like to achieve is that when this step is executed, it calls my job "Master_Child" and triggers "Master_Child" with the 3 different parameters at the same time.
I tried to do it in this way:
stage('MyStep') {
    steps {
        echo 'Deploying MyStep'
        script {
            env.NUMBER.split(',').each {
                build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")]
            }
        }
    }
}
But with this it reads the first parameter, triggers Master_Child with parameter 1, and waits until that job is finished; only then does it trigger the same job with parameter 2.
If I use wait: false on the job call, then the pipeline just fires off these jobs with the different parameters, but its result no longer depends on whether a sub job fails.
Any ideas how to implement that ?
Thank you in advance.
I resolved my problem in this way.
stage('MyStage') {
    steps {
        echo 'Deploying MyStep'
        script {
            def numbers = [:]
            env.NUMBER.split(',').each {
                numbers["numbers${it}"] = {
                    build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")]
                }
            }
            parallel numbers
        }
    }
}
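If, in addition, you want the parent build to see which downstream builds failed without aborting the other branches, a variation of the same script block (just a sketch, reusing the propagate: false pattern from the other answers on this page) keeps the build results:
def branches = [:]
def results = [:]
env.NUMBER.split(',').each {
    def num = it.trim() // capture the current value for the closure
    branches["numbers${num}"] = {
        results[num] = build(job: 'Master_Child',
                             parameters: [string(name: 'NUMBER', value: num)],
                             propagate: false) // do not fail this branch immediately
    }
}
parallel branches
results.each { num, run -> echo "Master_Child for NUMBER=${num} finished with ${run.result}" }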
Set wait to false in the build job step: wait: false
stage('MyStep') {
    steps {
        echo 'Deploying MyStep'
        script {
            env.NUMBER.split(',').each {
                build job: 'Master_Child', parameters: [string(name: 'NUMBER', value: "$it")], wait: false
            }
        }
    }
}