Unable to pass a parameter from one pipeline job to another - jenkins

Edit: I am trying to run a simple pipeline job which triggers another pipeline job and passes parameter values to it.
I tried a simplified use case in the example below.
Pipeline - Parent
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Invoke sample_pipleline') {
            steps {
                CommitID = 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'
                Branch = "master"
                input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'depServer')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
                build job: 'simple_child',
                    parameters: [string(name: 'CommitID', value: 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'), string(name: 'Environment', value: depServer), string(name: 'Branch', value: 'master')],
                    quietPeriod: 1
            }
        }
    }
}
Pipeline - Child
pipeline {
    agent any
    parameters {
        string defaultValue: '', description: 'K', name: 'depServer'
    }
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('CodePull') {
            steps {
                echo "Testing"
                echo "${depServer}"
            }
        }
    }
}
When I run the parent pipeline, it does not trigger the child pipeline; it fails with the error below.
Started by user ARAV
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Windows_aubale in C:\Users\arav\Documents\Proj\Automation\Jenkins\Jenkins_slave_root_directory\workspace\sample_parent2
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Invoke sample_pipleline)
[Pipeline] input
Input requested
Approved by ARAV
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: depServer for class: groovy.lang.Binding
After implementing the suggested changes, with some tweaks, the parent job triggers the child job, but the child log shows that it does not receive the parameter that was passed.
Started by upstream project "sample_parent" build number 46
originally caused by:
Started by user ARAV
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] Start of Pipeline
[Pipeline] node
Running on Jenkins in /var/jenkins_home/workspace/simple_child
[Pipeline] {
[Pipeline] stage
[Pipeline] { (CodePull)
[Pipeline] echo
Testing
[Pipeline] echo
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Please help me understand what I am doing wrong here.
I appreciate your help!

If your child pipeline has a parameter named "depServer":
parameters {
    string name: 'depServer', defaultValue: '', description: ''
}
You should provide a value for it:
build job: 'simple_child',
    parameters: [string(name: 'depServer', value: 'SOMETHING')]
Finally, you should reference it through params:
steps {
    echo "Testing"
    echo "${params.depServer}"
}
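Putting the three pieces together, a minimal end-to-end sketch might look like the following (the 'FRST' value is just one of the choices from the question, used as a placeholder):
// Parent job: triggers simple_child with a value for depServer
pipeline {
    agent any
    stages {
        stage('Trigger child') {
            steps {
                build job: 'simple_child',
                    parameters: [string(name: 'depServer', value: 'FRST')]
            }
        }
    }
}

// Child job ('simple_child'): declares the parameter and reads it via params
pipeline {
    agent any
    parameters {
        string name: 'depServer', defaultValue: '', description: ''
    }
    stages {
        stage('CodePull') {
            steps {
                echo "Testing"
                echo "${params.depServer}"
            }
        }
    }
}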

groovy.lang.MissingPropertyException: No such property: depServer for class: groovy.lang.Binding
This means that you have not defined a variable named depServer.
Fix it by assigning the result of the input step to a variable:
steps {
    script {
        def input_env = input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'depServer')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
        build job: 'simple_child',
            parameters: [string(name: 'CommitID', value: 'aa21a592d1039cbce043e5cefea421efeb5446a5'), string(name: 'Environment', value: input_env.depServer), string(name: 'Branch', value: "master")],
            quietPeriod: 1
    }
}
I've added a script block to be able to create and assign a variable.
The input step actually returns a HashMap that looks like this:
[depServer:QA, user:someUser]
That's why we have to write input_env.depServer as the argument for the build job.
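Because submitterParameter: 'user' is set, the returned map also contains the approving user under the user key. (If the input step defined only a single parameter and no submitterParameter, it would return that value directly instead of a map.) A small sketch of reading both values:
script {
    // input returns a map such as [depServer:QA, user:someUser]
    def input_env = input id: 'Environment', message: 'Approval',
        parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'depServer')],
        submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
    echo "Selected environment: ${input_env.depServer}"
    echo "Approved by: ${input_env.user}"
}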

Thank you so much zett42 and MaratC! The code that finally worked is as follows (combining both answers):
Parent script:
pipeline {
    agent any
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('Invoke sample_pipleline') {
            steps {
                script {
                    def CommitID
                    def depServer
                    def Branch
                    CommitID = 'e2b6cdf1e8018560b3ba51cbf253de4f33647b5a'
                    Branch = "master"
                    userip = input id: 'Environment', message: 'Approval', parameters: [choice(choices: ['FRST', 'QA', 'PROD'], description: '', name: 'input_env')], submitter: 'FCMIS-SFTP-LBAAS', submitterParameter: 'user'
                    depServer = userip.input_env
                    echo "${depServer}"
                    build job: 'simple_child',
                        parameters: [string(name: 'CommitID', value: "${CommitID}"),
                                     string(name: 'Environment', value: "${depServer}"),
                                     string(name: 'Branch', value: "${Branch}")],
                        quietPeriod: 1
                }
            }
        }
    }
}
Child script:
pipeline {
    agent any
    parameters {
        string defaultValue: '', description: 'K', name: 'Environment'
    }
    options {
        buildDiscarder logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '5')
    }
    stages {
        stage('CodePull') {
            steps {
                echo "Testing"
                echo "${params.Environment}"
            }
        }
    }
}
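Note that the parent passes CommitID and Branch as well, while the child above only declares Environment. If the child also needs those values, it can declare them the same way (a sketch reusing the parameter names sent by the parent):
parameters {
    string defaultValue: '', description: '', name: 'CommitID'
    string defaultValue: '', description: '', name: 'Environment'
    string defaultValue: '', description: '', name: 'Branch'
}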

Related

How to pass file parameters to upstream pipeline job in build step plugin

def BUILD_USER = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
pipeline {
    agent { label "master" }
    parameters {
        string(name: 'BUILD', defaultValue: '123')
        booleanParam(name: 'Deploy', defaultValue: 'true')
        booleanParam(name: 'Upgrade_Config', defaultValue: 'true')
        booleanParam(name: 'SchemaComparison', defaultValue: 'false')
        booleanParam(name: 'Publish_Server', defaultValue: 'false')
    }
    stages {
        stage('Start Deployment') {
            agent { label "master" }
            steps {
                script {
                    sh '''
                        rm -rf /params/parameters
                        cd /params
                        echo $BUILD_USER
                        python3 buildParameters.py --Build=$BUILD --Publish_VM=$Publish_Server --userName=BUILD_USER --Upgrade_Config=$Upgrade_Config
                    '''
                    file = readFile('/params/parameters.txt')
                }
            }
        }
        stage('UpgradeConfigurations') {
            when {
                expression { params.Deploy == true }
            }
            agent { label "master" }
            environment {
                file = "${file}"
            }
            steps {
                script {
                    println("${file}")
                    build(job: 'UpgradeConfigurations', parameters: [file(name: 'parameters', file: "${file}"), string(name: 'build_uniqe_id', value: "${BUILD_USER}"), booleanParam(name: 'Deploy', value: "${Deploy}"), booleanParam(name: 'SchemaComparison', value: "${SchemaComparison}")], propagate: false, wait: false)
                }
            }
        }
    }
}
The buildParameters.py script generates some additional parameters in a parameters.txt file on the master VM, and I am trying to pass that file to the upstream job UpgradeConfigurations.
The UpgradeConfigurations job gets started, but the file parameters are not passed to it.
I have tried using base64File as well, but with no luck.
Referred Build Plugin doc:
https://www.jenkins.io/doc/pipeline/steps/pipeline-build-step/
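One workaround worth sketching (an assumption, not a fix for the file() parameter itself): pass the generated file's text as an ordinary string parameter and write it back to a file in the downstream job. The parameter name parameters_text below is hypothetical and would have to be declared in UpgradeConfigurations.
script {
    // Read the file produced by buildParameters.py and forward its contents
    def paramText = readFile('/params/parameters.txt')
    build(job: 'UpgradeConfigurations',
          parameters: [string(name: 'parameters_text', value: paramText)],
          propagate: false, wait: false)
}

// In UpgradeConfigurations (assuming a 'parameters_text' parameter is declared):
//     writeFile file: 'parameters.txt', text: params.parameters_text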

Can we run a Jenkinsfile in a pipeline?

I have a pipeline with a Generic Webhook Trigger from Bitbucket; this job triggers another job.
currentBuild.displayName = "Generic-Job-#" + env.BUILD_NUMBER
pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'actorName', value: '$.actor.display_name'],
                [key: 'TAG', value: '$.push.changes[0].new.name'],
                [key: 'REPONAME', value: '$.repository.name'],
                [key: 'GIT_URL', value: '$.repository.links.html.href'],
            ],
            token: '11296ae8d97b2134550f',
            causeString: ' Triggered on $actorName version $TAG',
            printContributedVariables: true,
            printPostContent: true
        )
    }
    stages {
        stage('Build Job DEVELOPMENT') {
            when {
                expression { return params.TARGET_ENV == 'DEVELOPMENT' }
            }
            steps {
                build job: 'DEVELOPMENT',
                    parameters: [
                        [$class: 'StringParameterValue', name: 'FROM_BUILD', value: "${BUILD_NUMBER}"],
                        [$class: 'StringParameterValue', name: 'TAG', value: "${TAG}"],
                        [$class: 'StringParameterValue', name: 'GITURL', value: "${GIT_URL}"],
                        [$class: 'StringParameterValue', name: 'REPONAME', value: "${REPONAME}"],
                        [$class: 'StringParameterValue', name: 'REGISTRY_URL', value: "${REGISTRY_URL}"],
                    ]
            }
        }
    }
}
Another Pipeline
pipeline {
    agent any
    stages {
        stage('Cleaning') {
            steps {
                cleanWs()
            }
        }
        def jenkinsFile
        stage('Loading Jenkins file') {
            jenkinsFile = fileLoader.fromGit('Jenkinsfile', "${GIT_URL}", "${TAG}", null, '')
        }
        jenkinsFile.start()
    }
}
Can I run a Jenkinsfile inside a pipeline? Every project I make has a different Jenkinsfile, so they can't all be the same, but when I run this it doesn't execute the loaded Jenkinsfile.
It works for me :D
Sample pipeline:
stage 'Load a file from GitHub'
def jenkinsFile = fileLoader.fromGit('<path-jenkinsfile>', "<path-git>", "<branch>", '<credentials>', '')
stage 'Run method from the loaded file'
jenkinsFile
pipeline {
    agent any
    stages {
        stage('Print Hello World Ke #1') {
            steps {
                echo "Hello Juan"
            }
        }
    }
}
Before running the pipeline, you must install the "Pipeline Remote Loader" plugin.
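For reference, the same idea written as a scripted pipeline, combining the pieces from the question (a sketch: it assumes the Pipeline Remote Loader plugin is installed, that GIT_URL and TAG come from the Generic Webhook Trigger variables shown earlier, and that the loaded file exposes a start() method, as the question's code expects):
node {
    stage('Load Jenkinsfile from Git') {
        // fileLoader.fromGit(path, repoUrl, branch, credentialsId, labelExpression)
        jenkinsFile = fileLoader.fromGit('Jenkinsfile', "${GIT_URL}", "${TAG}", null, '')
    }
    stage('Run the loaded file') {
        jenkinsFile.start()
    }
}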

How to display the selected parameter in Jenkins?

There is a Groovy pipeline job that asks the user for parameters interactively. After they are entered, I cannot display the selected parameters.
Here is my code:
node {
    stage('Input Stage') {
        Tag = sh(script: "echo 123'\n'456'\n'789'\n'111", returnStdout: true).trim()
        input(
            id: 'userInput', message: 'Choice values: ',
            parameters: [
                [$class: 'ChoiceParameterDefinition', name: 'Tags', choices: "${Tag}"],
                [$class: 'StringParameterDefinition', defaultValue: 'default', name: 'Namespace'],
            ]
        )
    }
    stage('Second Stage') {
        println("${ChoiceParameterDefinition(Tags)}") // does not work
        println("${ChoiceParameterDefinition(Namespace)}") // does not work
    }
}
How to display the selected parameter correctly?
You need to capture the return value of the input step in a variable. This should work:
node {
    stage('Input Stage') {
        Tag = sh(script: "echo 123'\n'456'\n'789'\n'111", returnStdout: true).trim()
        script {
            def userInputs = input(
                id: 'userInput', message: 'Choice values: ',
                parameters: [
                    [$class: 'ChoiceParameterDefinition', name: 'Tags', choices: "${Tag}"],
                    [$class: 'StringParameterDefinition', defaultValue: 'default', name: 'Namespace'],
                ]
            )
            env.TAGS = userInputs['Tags']
            env.NAMESPACE = userInputs['Namespace']
        }
    }
    stage('Second Stage') {
        echo "${env.TAGS}"
        echo "${env.NAMESPACE}"
    }
}
References:
Jenkins Declarative Pipeline: How to read choice from input step?
Read interactive input in Jenkins pipeline to a variable
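If you prefer declarative syntax (as in the first reference above), the same capture-then-use pattern can be sketched like this (the choices are hard-coded here instead of being generated by the sh step):
pipeline {
    agent any
    stages {
        stage('Input Stage') {
            steps {
                script {
                    def userInputs = input(
                        id: 'userInput', message: 'Choice values: ',
                        parameters: [
                            choice(name: 'Tags', choices: ['123', '456', '789', '111']),
                            string(name: 'Namespace', defaultValue: 'default')
                        ]
                    )
                    // Stash the selections in env so later stages can read them
                    env.TAGS = userInputs['Tags']
                    env.NAMESPACE = userInputs['Namespace']
                }
            }
        }
        stage('Second Stage') {
            steps {
                echo "${env.TAGS}"
                echo "${env.NAMESPACE}"
            }
        }
    }
}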

Passing credentials to downstream build step in Jenkins pipeline

We're trying to start a downstream build which expects "UserName" and "UserPassword" as parameters.
withCredentials([usernamePassword(
    credentialsId: params.deployCredentialsId,
    usernameVariable: 'MY_USER',
    passwordVariable: 'MY_PASS',
)]) {
    build(job: "deploy/nightly",
        parameters: [stringParam(name: "UserName", value: MY_USER), password(name: "UserPassword", value: MY_PASS),
                     ... more parameters
    )
}
but the downstream job never sees the UserName / UserPassword parameters. Is there a bug in the above definition, or should I look at the downstream job?
You need to look in the downstream job. It should have a 'parameters' block that looks like:
parameters {
    string(defaultValue: "", description: 'foo', name: "UserName")
    string(defaultValue: "", description: 'foo', name: "UserPassword")
}
Then in your stage you can do this:
stage('PrintParameter') {
    steps {
        sh 'echo ${UserName}'
    }
}
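A further note on the credential values themselves: if you want the secret masked in the downstream job's UI, it can be declared as a password parameter instead of a string (a sketch using the parameter names from the question):
parameters {
    string(defaultValue: "", description: 'deploy user', name: "UserName")
    password(defaultValue: "", description: 'deploy password', name: "UserPassword")
}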
Let's say you have two Jenkins pipeline jobs called job-A and job-B, and you want to call job-B from job-A, passing some parameters that you then want to access in job-B. The example would look like the below:
job-A
pipeline {
    agent any;
    stages {
        stage('Build Job-B') {
            steps {
                build(job: "job-B", parameters: [string(name: "username", value: "user"), password(name: "password", value: "fake")], propagate: false, wait: true, quietPeriod: 5)
            }
        }
    }
}
job-B
pipeline {
    agent any;
    stages {
        stage('echo parameter') {
            steps {
                sh "echo $params"
            }
        }
    }
}
Note:
params is available by default in every pipeline job, whether you use a scripted or a declarative pipeline.
You do not need to define any parameters in the downstream job.
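Individual values can also be read by name (a sketch; the parameter names are the ones job-A passes above):
stage('echo individual parameters') {
    steps {
        // 'username' is passed by job-A above; avoid echoing real secrets
        // such as 'password' in build logs
        echo "username is ${params.username}"
    }
}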

Pass (same) parameters to multiple build jobs in a Jenkins pipeline

We have multiple jobs, 'primary', 'secondary' and 'backup'. All of them need the same parameters (release versions, e.g. '1.5.1'), and there are around 15 of them.
parameters {
    string(name: 'service1', defaultValue: 'NA', description: 'Version')
    string(name: 'service2', defaultValue: 'NA', description: 'Version')
    string(name: 'service3', defaultValue: 'NA', description: 'Version')
}
My pipeline is like the below; how can I use the same parameters for all three build jobs without having to specify them three times?
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    stages {
        stage('Invoke pipeline primary') {
            steps {
                build job: 'primary'
            }
        }
        stage('Invoke pipeline secondary') {
            steps {
                build job: 'secondary'
            }
        }
        stage('backup') {
            steps {
                build job: 'backup'
            }
        }
    }
}
I've found this answer, but it seems to use scripted Groovy syntax and I'm not sure whether it can also be used in a declarative pipeline like the above.
When I tried it, I got the output below:
Running on Jenkins in PipelineTest
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Invoke pipeline primary)
[Pipeline] build
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: No item named null found
Finished: FAILURE
When I run this primary pipeline by itself, it runs as expected.
Thanks!
Edit: I tried the answer provided by @hakamairi but got the error below. I'm not great at the DSL, but I tried a few different variations and none worked; all had similar errors around expecting a ParamValue.
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    parameters {
        string(name: 'service1', defaultValue: 'NA', description: 'Version')
        string(name: 'service2', defaultValue: 'NA', description: 'Version')
    }
    stages {
        stage('Invoke pipeline PrimaryRelease') {
            steps {
                build job: 'PythonBuildTest', parameters: params
            }
        }
    }
}
Error:
java.lang.UnsupportedOperationException: must specify $class with an implementation of interface java.util.List
    at org.jenkinsci.plugins.structs.describable.DescribableModel.resolveClass(DescribableModel.java:503)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.coerce(DescribableModel.java:402)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.injectSetters(DescribableModel.java:361)
    at org.jenkinsci.plugins.structs.describable.DescribableModel.instantiate(DescribableModel.java:284)
    at org.jenkinsci.plugins.workflow.steps.StepDescriptor.newInstance(StepDescriptor.java:201)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:208)
    at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:153)
    at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:122)
    at sun.reflect.GeneratedMethodAccessor956.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.codehaus.groovy.reflection.CachedMethod.invoke(CachedMethod.java:93)
    at groovy.lang.MetaMethod.doMethodInvoke(MetaMethod.java:325)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1213)
    at groovy.lang.MetaClassImpl.invokeMethod(MetaClassImpl.java:1022)
    at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:42)
    at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
    at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:157)
    at org.kohsuke.groovy.sandbox.GroovyInterceptor.onMethodCall(GroovyInterceptor.java:23)
    at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onMethodCall(SandboxInterceptor.java:133)
    at org.kohsuke.groovy.sandbox.impl.Checker$1.call(Checker.java:155)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:159)
    at org.kohsuke.groovy.sandbox.impl.Checker.checkedCall(Checker.java:129)
    at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.methodCall(SandboxInvoker.java:17)
Caused: java.lang.IllegalArgumentException: Could not instantiate {job=PythonBuildTest, parameters={service1=NA,
I think you can define the parameters at the pipeline level and just pass them along in the build calls.
// This will kick off the three pipeline scripts required to do a release in PROD
pipeline {
    agent any
    parameters {
        string(name: 'service1', defaultValue: 'NA', description: 'Version')
        string(name: 'service2', defaultValue: 'NA', description: 'Version')
        string(name: 'service3', defaultValue: 'NA', description: 'Version')
    }
    stages {
        stage('Invoke pipeline primary') {
            steps {
                build job: 'primary', parameters: ([] + params)
            }
        }
        stage('Invoke pipeline secondary') {
            steps {
                build job: 'secondary', parameters: ([] + params)
            }
        }
        stage('backup') {
            steps {
                build job: 'backup', parameters: ([] + params)
            }
        }
    }
}
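If you prefer to be explicit about the conversion, something like the following should also work (a sketch; note that boolean values end up in their string form here):
steps {
    script {
        // Convert the current build's params map into a list of string parameters
        // before forwarding it to the downstream job
        def forwarded = params.collect { k, v -> string(name: k, value: String.valueOf(v)) }
        build job: 'primary', parameters: forwarded
    }
}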
I mostly use the scripted approach, and something like the below works:
def all_params = [
    string(name: 'service1', defaultValue: 'NA', description: 'Version'),
    string(name: 'service2', defaultValue: 'NA', description: 'Version'),
    string(name: 'service3', defaultValue: 'NA', description: 'Version'),
]
properties([parameters(all_params)])
It should be possible to wrap the above code in a script block and use it in a declarative pipeline as well.
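A sketch of that wrapping idea (an untested assumption: calling properties() from a script block inside a declarative pipeline can interact with the pipeline's own parameter handling, so treat it as an illustration rather than a drop-in solution):
pipeline {
    agent any
    stages {
        stage('Register shared parameters') {
            steps {
                script {
                    def all_params = [
                        string(name: 'service1', defaultValue: 'NA', description: 'Version'),
                        string(name: 'service2', defaultValue: 'NA', description: 'Version'),
                        string(name: 'service3', defaultValue: 'NA', description: 'Version'),
                    ]
                    properties([parameters(all_params)])
                }
            }
        }
    }
}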
