We have multiple jobs, 'primary', 'secondary' and 'backup'. All of them need the same parameters (release versions, e.g. '1.5.1'), and there are around 15 of them.
parameters {
    string(name: 'service1', defaultValue: 'NA', description: 'Version')
    string(name: 'service2', defaultValue: 'NA', description: 'Version')
    string(name: 'service3', defaultValue: 'NA', description: 'Version')
}
My pipeline looks like the below. How can I use the same parameters for all three build jobs without specifying them three times?
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    stages {
        stage('Invoke pipeline primary') {
            steps {
                build job: 'primary'
            }
        }
        stage('Invoke pipeline secondary') {
            steps {
                build job: 'secondary'
            }
        }
        stage('backup') {
            steps {
                build job: 'backup'
            }
        }
    }
}
I've found this answer here, but it seems to use scripted Groovy syntax, and I'm not sure whether it can also be used in a declarative pipeline like the above.
When I tried it, I got the below:
Running on Jenkins in PipelineTest
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Invoke pipeline primary)
[Pipeline] build
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: No item named null found
Finished: FAILURE
When I run this primary pipeline by itself, it runs as expected.
Thanks!
Edit: I tried the answer provided by @hakamairi but get the error below. I'm not great at the DSL, but I tried a few different variations and none worked; all failed with similar errors about expecting a ParameterValue.
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    parameters {
        string(name: 'service1', defaultValue: 'NA', description: 'Version')
        string(name: 'service2', defaultValue: 'NA', description: 'Version')
    }
    stages {
        stage('Invoke pipeline PrimaryRelease') {
            steps {
                build job: 'PythonBuildTest', parameters: params
            }
        }
    }
}
Error:
java.lang.UnsupportedOperationException: must specify $class with an implementation of interface java.util.List
    at org.jenkinsci.plugins.structs.describable.DescribableModel.resolveClass(DescribableModel.java:503)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.coerce(DescribableModel.java:402)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.injectSetters(DescribableModel.java:361)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.instantiate(DescribableModel.java:284)
    at org.jenkinsci.plugins.workflow.steps.StepDescriptor.newInstance(StepDescriptor.java:201)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:208)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:153)
    at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:122)
    at sun.reflect.GeneratedMethodAccessor956.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
    at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
    at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:157)
    at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:133)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:155)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:159)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
    at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
Caused: java.lang.IllegalArgumentException: Could not instantiate {job=PythonBuildTest, parameters={service1=NA,
I think you can define the parameters at the pipeline level and just pass them along in the build calls.
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    parameters {
        string(name: 'service1', defaultValue: 'NA', description: 'Version')
        string(name: 'service2', defaultValue: 'NA', description: 'Version')
        string(name: 'service3', defaultValue: 'NA', description: 'Version')
    }
    stages {
        stage('Invoke pipeline primary') {
            steps {
                build job: 'primary', parameters: ([] + params)
            }
        }
        stage('Invoke pipeline secondary') {
            steps {
                build job: 'secondary', parameters: ([] + params)
            }
        }
        stage('backup') {
            steps {
                build job: 'backup', parameters: ([] + params)
            }
        }
    }
}
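If the build step rejects parameters: params (or [] + params) because it expects a list of parameter values rather than a map, a workaround is to convert the map explicitly inside a script block. This is only a sketch, not taken from the original answers, and it assumes every parameter value can be treated as a plain string:

stage('Invoke pipeline primary') {
    steps {
        script {
            // Turn the read-only params map into a list of string parameters
            // that the build step can accept.
            def downstreamParams = params.collect { name, value ->
                string(name: name, value: String.valueOf(value))
            }
            build job: 'primary', parameters: downstreamParams
        }
    }
}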
I mostly use the scripted approach, and something like the below works:
def all_params = [
    string(name: 'service1', defaultValue: 'NA', description: 'Version'),
    string(name: 'service2', defaultValue: 'NA', description: 'Version'),
    string(name: 'service3', defaultValue: 'NA', description: 'Version'),
]
properties([parameters(all_params)])
It should be possible to wrap the above code in a script block and use it in a declarative pipeline as well.
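A minimal sketch of that wrapping (an assumption on my part, not tested in the original thread), using the same three service parameters as above:

pipeline {
    agent any
    stages {
        stage('Register shared parameters') {
            steps {
                script {
                    // Define the shared parameter list once and attach it to
                    // the job via the properties step.
                    def all_params = [
                        string(name: 'service1', defaultValue: 'NA', description: 'Version'),
                        string(name: 'service2', defaultValue: 'NA', description: 'Version'),
                        string(name: 'service3', defaultValue: 'NA', description: 'Version'),
                    ]
                    properties([parameters(all_params)])
                }
            }
        }
    }
}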
Related
def BUILD_USER = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
pipeline {
    agent { label "master" }
    parameters {
        string(name: 'BUILD', defaultValue: '123')
        booleanParam(name: 'Deploy', defaultValue: 'true')
        booleanParam(name: 'Upgrade_Config', defaultValue: 'true')
        booleanParam(name: 'SchemaComparison', defaultValue: 'false')
        booleanParam(name: 'Publish_Server', defaultValue: 'false')
    }
    stages {
        stage('Start Deployment') {
            agent { label "master" }
            steps {
                script {
                    sh '''
                        rm -rf /params/parameters
                        cd /params
                        echo $BUILD_USER
                        python3 buildParameters.py --Build=$BUILD --Publish_VM=$Publish_Server --userName=BUILD_USER --Upgrade_Config=$Upgrade_Config
                    '''
                    file = readFile('/params/parameters.txt')
                }
            }
        }
        stage('UpgradeConfigurations') {
            when {
                expression { params.Deploy == true }
            }
            agent { label "master" }
            environment {
                file = "${file}"
            }
            steps {
                script {
                    println("${file}")
                    build(job: 'UpgradeConfigurations',
                          parameters: [file(name: 'parameters', file: "${file}"),
                                       string(name: 'build_uniqe_id', value: "${BUILD_USER}"),
                                       booleanParam(name: 'Deploy', value: "${Deploy}"),
                                       booleanParam(name: 'SchemaComparison', value: "${SchemaComparison}")],
                          propagate: false, wait: false)
                }
            }
        }
    }
}
The buildParameters.py file generates some additional parameters in a parameters.txt file on the master VM, and I am trying to pass it to the downstream job UpgradeConfigurations.
The UpgradeConfigurations job does get started, but the file parameters are not getting passed to it.
I have tried using base64file as well, but no luck.
I referred to the pipeline build step doc:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
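No answer to this one appears in the thread; one possible workaround, purely as a sketch and not something confirmed by the original poster, is to skip file parameters entirely and pass the file contents as an ordinary string parameter (the parameter name parameters_txt below is made up for illustration):

script {
    // Read the generated file and hand its contents to the downstream job
    // as a plain string parameter instead of a file parameter.
    def contents = readFile('/params/parameters.txt')
    build job: 'UpgradeConfigurations',
          parameters: [string(name: 'parameters_txt', value: contents)],
          propagate: false, wait: false
}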
We're trying to start a downstream build which expects "UserName" and "UserPassword" as parameters.
withCredentials([usernamePassword(
    credentialsId: params.deployCredentialsId,
    usernameVariable: 'MY_USER',
    passwordVariable: 'MY_PASS',
)]) {
    build(job: "deploy/nightly",
          parameters: [stringParam(name: "UserName", value: MY_USER),
                       password(name: "UserPassword", value: MY_PASS),
                       ... more parameters
          ])
}
but the downstream job never sees the UserName / UserPassword parameters. Is there a bug in the above definition, or should I look at the downstream job?
You need to look in the downstream job. It should have a 'parameters' block that looks like:
parameters {
    string(defaultValue: "", description: 'foo', name: "UserName")
    string(defaultValue: "", description: 'foo', name: "UserPassword")
}
Then in your stage you can do this:
stage('PrintParameter') {
    steps {
        sh 'echo ${UserName}'
    }
}
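The same value is also exposed on the params map inside Groovy, so as an alternative sketch (not part of the original answer) you could write:

stage('PrintParameter') {
    steps {
        // Read the parameter from the params map instead of relying on
        // environment-variable injection in the shell step.
        echo "UserName is ${params.UserName}"
    }
}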
Let's say you have two Jenkins pipeline jobs called job-A and job-B, you want to call job-B from job-A while passing some parameters, and you want to access those parameters in job-B. The example would look like the below:
job-A
pipeline {
    agent any;
    stages {
        stage('Build Job-B') {
            steps {
                build(job: "job-B",
                      parameters: [string(name: "username", value: "user"),
                                   password(name: "password", value: "fake")],
                      propagate: false, wait: true, quietPeriod: 5)
            }
        }
    }
}
job-B
pipeline {
    agent any;
    stages {
        stage('echo parameter') {
            steps {
                sh "echo $params"
            }
        }
    }
}
Note:
params is available by default in every pipeline job, whether you use a scripted or a declarative pipeline.
You do not need to define any parameters in the downstream job.
Edit: I am trying to run a simple pipeline job which triggers another pipeline job and sends it parameter values.
I tried a simplified use case in the example below.
Pipeline - Parent
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Invoke sample_pipleline') {
            steps {
                CommitID = 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'
                Branch = "master"
                input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'depServer')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
                build job: 'simple_child',
                    parameters: [string(name: 'CommitID', value: 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'), string(name: 'Environment', value: depServer), string(name: 'Branch', value: 'master')],
                    quietPeriod: 1
            }
        }
    }
}
Pipeline - child
pipeline {
    agent any
    parameters {
        string defaultValue: '', description: 'K', name: 'depServer'
    }
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('CodePull') {
            steps {
                echo "Testing"
                echo "${depServer}"
            }
        }
    }
}
When I run the parent pipeline, it does not trigger the child pipeline but fails with the error below.
Started by user ARAV
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Windows_aubale in C:\Users\arav\Documents\Proj\Automation\Jenkins\Jenkins_slave_root_directory\workspace\sample_parent2
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Invoke sample_pipleline)
[Pipeline] input
Input requested
Approved by ARAV
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: depServer for class: groovy.lang.Binding
After implementing the suggested changes, with some tweaks, the parent job triggers the child job, but the child log shows that it does not receive the parameter passed.
Started by upstream project "sample_parent" build number 46
originally caused by:
Started by user ARAV
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /var/jenkins_home/workspace/simple_child
[Pipeline] {
[Pipeline] stage
[Pipeline] { (CodePull)
[Pipeline] echo
Testing
[Pipeline] echo
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Please help me understand what I am doing wrong here.
Appreciate your help!!
If your child pipeline has a parameter named "depServer":
parameters {
    string name: 'depServer', defaultValue: '', description: ''
}
You should provide a value for it:
build job: 'simple_child',
    parameters: [string(name: 'depServer', value: 'SOMETHING')]
Finally, you should address it:
steps {
    echo "Testing"
    echo "${params.depServer}"
}
groovy.lang.MissingPropertyException: No such property: depServer for
class: groovy.lang.Binding
This means that you have not defined a variable named depServer.
Fix it by assigning the result of the input step to a variable:
steps {
    script {
        def input_env = input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'depServer')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
        build job: 'simple_child',
            parameters: [string(name: 'CommitID', value: 'aa21a592d1039cbce043e5cefea421efeb5446a5'), string(name: 'Environment', value: input_env.depServer), string(name: 'Branch', value: "master")],
            quietPeriod: 1
    }
}
I've added a script block to be able to create and assign a variable.
The input actually returns a HashMap that looks like this:
[depServer:QA, user:someUser]
That's why we have to write input_env.depServer as the argument for the build job.
Thank you so much zett42 and MaratC! The code that finally worked is as follows (combining both answers):
Parent script:
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Invoke sample_pipleline') {
            steps {
                script {
                    def CommitID = 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'
                    def Branch = "master"
                    def userip = input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'input_env')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
                    def depServer = userip.input_env
                    echo "${depServer}"
                    build job: 'simple_child',
                        parameters: [string(name: 'CommitID', value: "${CommitID}"),
                                     string(name: 'Environment', value: "${depServer}"),
                                     string(name: 'Branch', value: "${Branch}")],
                        quietPeriod: 1
                }
            }
        }
    }
}
Child script:
pipeline {
    agent any
    parameters {
        string defaultValue: '', description: 'K', name: 'Environment'
    }
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('CodePull') {
            steps {
                echo "Testing"
                echo "${params.Environment}"
            }
        }
    }
}
I have created a pipeline which takes parameter values from the user. I want to trigger a Jenkins job using these parameters.
How can I pass the parameter values to the build's parameters?
Here is my code:
pipeline {
    agent any
    parameters {
        string(name: 'SYSTEM', defaultValue: '', description: 'Enter array. Example:SYS-123')
        string(name: 'EMail', defaultValue: '', description: 'Enter email id')
    }
    stages {
        stage('Example') {
            steps {
                echo "Hello ${params.SYSTEM}"
                echo "Hello ${params.EMail}"
            }
        }
        stage('core-rest-api-sanity') {
            steps {
                build job: 'xyz', parameters: [string(name: 'E-Mail', value: ${params.EMail}), string(name: 'SYSTEM', value: ${params.SYSTEM})]
            }
        }
    }
}
In the above code, I am taking email and system details from the user. Then I want to trigger my job "xyz", which requires these parameters.
In the build step, pass params.EMail and params.SYSTEM directly as Groovy expressions instead of wrapping them in ${...}:

pipeline {
    agent any
    parameters {
        string(name: 'SYSTEM', defaultValue: '', description: 'Enter array. Example:SYS-123')
        string(name: 'EMail', defaultValue: '', description: 'Enter email id')
    }
    stages {
        stage('Example') {
            steps {
                echo "Hello ${params.SYSTEM}"
                echo "Hello ${params.EMail}"
            }
        }
        stage('core-rest-api-sanity') {
            steps {
                build job: 'xyz', parameters: [string(name: 'E-Mail', value: params.EMail),
                                               string(name: 'SYSTEM', value: params.SYSTEM)]
            }
        }
    }
}
I have this Jenkins pipeline which has multiple stages. Inside these stages, multiple jobs are executed.
When I build the job, I'd like to have a set of checkboxes, and the pipeline should build only the jobs I've checked inside the pipeline stages. Are there any plugins or methods I can use to achieve this?
Sample pipeline code.
As per the example below, there are jobs called job_A1, job_B1, job_C1, job_D1, job_A2, job_B2, job_C2 and job_D2. If I click Build with Parameters, it should prompt me with checkboxes, and I should be able to check any jobs I want so that the pipeline builds only the ones I checked.
Thanks in advance.
pipeline {
    agent { label 'server01' }
    stages {
        stage('Build 01') {
            steps {
                parallel(
                    "BUILD A1": {
                        build job: 'job_A1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B1": {
                        build job: 'job_B1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C1": {
                        build job: 'job_C1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D1": {
                        build job: 'job_D1',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                )
            }
        }
        stage('Build 02') {
            steps {
                parallel(
                    "BUILD A2": {
                        build job: 'job_A2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD B2": {
                        build job: 'job_B2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD C2": {
                        build job: 'job_C2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                    "BUILD D2": {
                        build job: 'job_D2',
                            parameters: [
                                string(name: 'PARAM01', value: "$PARAM01"),
                                string(name: 'PARAM02', value: "$PARAM02")
                            ]
                    },
                )
            }
        }
    }
}
Thanks @mbn217 for your answer, but the Extended Choice parameter didn't help much in my scenario.
Anyway, I could do it using boolean parameters and checking them inside the pipeline in a script block.
Example pipeline script
stage('BUILD A') {
    steps {
        script {
            if (params.get('boolA', true)) {
                build job: '_build_A', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "A is not selected to build"
            }
        }
    }
}
stage('BUILD B') {
    steps {
        script {
            if (params.get('boolB', true)) {
                build job: '_build_B', parameters: [string(name: 'param1', value: "$param1"), string(name: 'param2', value: "$param2")]
            } else {
                echo "B is not selected to build"
            }
        }
    }
}
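The snippet above assumes the matching checkbox parameters are declared on the pipeline; a minimal sketch of that declaration (not shown in the original answer), assuming the names boolA and boolB used above:

parameters {
    // One checkbox per job; the names must match the keys looked up
    // with params.get(...) in the stages above.
    booleanParam(name: 'boolA', defaultValue: true, description: 'Build A')
    booleanParam(name: 'boolB', defaultValue: true, description: 'Build B')
}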
You can use the Extended Choice Parameter plugin to accomplish what you want. Basically, you will need to parameterize the job names too using this Jenkins plugin.
You can use a list of checkboxes, as shown in the screenshot.