Return value between pipelines - Jenkins

I have two pipelines that work in isolation from each other, and a third that invokes both of them and requires information returned by one of them.
From pipeline C I invoke service-docker-build, and I want to obtain the Docker image tag created in that pipeline.
def dockerBuildResponse = build job: "service-docker-build", propagate: false
IMAGE_TAG = dockerBuildResponse.getBuildVariables()["IMAGE_TAG"]
In the service-docker-build pipeline I define this variable
IMAGE_TAG = Globals.nexus3PrdUrl.replaceAll("^https?://", "") + "/" + Globals.image.imageName()
But in pipeline C, IMAGE_TAG is still null.
Any idea how I can pass this value to the invoking pipeline?
Regards
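
A common cause here: a bare Groovy assignment like IMAGE_TAG = ... creates a script binding variable, which may not surface through getBuildVariables() depending on the pipeline type. Setting the value explicitly on env in the downstream pipeline usually makes it visible to the caller. A minimal sketch, reusing the job and variable names from the question (the Globals helper is assumed to exist in that pipeline's shared library):

```groovy
// In service-docker-build: export the tag through env so the
// upstream build can read it with getBuildVariables().
env.IMAGE_TAG = Globals.nexus3PrdUrl.replaceAll("^https?://", "") + "/" + Globals.image.imageName()

// In pipeline C: run the downstream build and read the variable back.
def dockerBuildResponse = build job: "service-docker-build", propagate: false, wait: true
def imageTag = dockerBuildResponse.buildVariables["IMAGE_TAG"]
echo "Downstream image tag: ${imageTag}"
```

Note that buildVariables only contains what the downstream build actually put into its environment; with propagate: false you may also want to check dockerBuildResponse.result before trusting the value.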

Related

How to call a method inside the triggers block in a Jenkinsfile

I have a pipeline which needs to be scheduled to run at a particular time. There are some dynamic parameters that need to be passed while running the pipeline.
I have created a function that gives me the desired parameter value. However, this pipeline does not get triggered, because the function call is not resolved inside the triggers block and is treated as a string.
getlatest is the method I created, which takes in 3 parameters. The value of this method is not resolved and is instead treated as a string. The pipeline runs as expected if I hardcode some value for version.
triggers{
parameterizedCron("H/5 * * * * % mod=test; version=getlatest('abc','xyz','lmn');")
}
The problem is that the code that calculates the parameter, just like any other code in Jenkins, needs an executor to run. To get an executor, you need to run your pipeline. To run your pipeline, you need to give Jenkins the parameters. But to give Jenkins the parameters, you need to run your code.
So there's a chicken and egg problem, there.
To break out of this cycle, you may want to run scripted pipeline before you run the declarative one:
node('built-in') { // or "master", or any other
def version = getlatest('abc','xyz','lmn')
def cron_parameters = "H/5 * * * * % mod=test; version=${version}"
println "cron_parameters is ${cron_parameters}"
env.CRON_PARAM = cron_parameters
}
pipeline {
agent { node { label "some_label" } }
triggers {
parameterizedCron(env.CRON_PARAM)
}
// ...
}
I've never seen this tried before, so I don't know whether Jenkins is capable of what you are doing. Instead, remove the parameter and create an environment variable called VERSION and assign the function result to that:
environment {
VERSION = getlatest('abc','xyz','lmn')
}
And reference this VERSION variable instead of your input parameter.
How to reference:
env.VERSION or ${VERSION} or ${env.VERSION}
Examples:
currentBuild.displayName=env.VERSION
env.SUBJECT="Checkout Failure on ${VERSION}"
string(name: 'VERSION', value: "${env.VERSION}")

Using same Jenkinsfile for two separate jobs in same repo

I have two separate Jenkins jobs that will run on one repository. My Jenkinsfile has a step that runs with the property enableZeroDownTime enabled. The purpose of the 2nd Jenkins job is to run the step with the property enableZeroDownTime disabled. Does anyone know how I can control this using the same Jenkinsfile? Can I pass it using some parameter based on a properties file? I am really confused about this.
stage('CreateCustomer') {
steps {
script {
common.runStage("#CreateCustomer")
common.runStage("#SetOnboardingCustomerManifest")
common.runStage("#enableZeroDownTime")
}
}
}
Solution
I currently run multiple pipelines that use the same Jenkinsfile. The change to conditionally execute a stage is trivial.
stage('CreateCustomer') {
when {
environment name: 'enableZeroDownTime', value: 'true'
}
steps {
script {
common.runStage("#CreateCustomer")
common.runStage("#SetOnboardingCustomerManifest")
common.runStage("#enableZeroDownTime")
}
}
}
The CreateCustomer stage will only run when the enableZeroDownTime parameter is set to true (it can be a String parameter with value true, or a boolean parameter).
The trick here is that you cannot add the parameters{} block to your declarative pipeline. For example, if you had the following:
parameters {
string(name: 'enableZeroDownTime', defaultValue: 'true')
}
Both pipelines would default to true. If you had the following:
parameters {
string(name: 'enableZeroDownTime', defaultValue: '')
}
Both pipelines would default to a blank default value.
Even if you manually save a different default value to the pipeline after creation it will be overwritten next run with a blank default value.
Instead, you simply need to remove the parameters{} block altogether and manually add the parameters through the web interface.
Additionally...
Additionally, it is possible to have two pipelines use the same Jenkinsfile with different parameters. For example, let's say pipeline A had an enableZeroDownTime parameter defaulting to true and pipeline B had no parameters at all. In this case you can add an environment variable of the same name and set its value with the following ternary expression:
environment {
enableZeroDownTime = "${params.enableZeroDownTime != null ? "${params.enableZeroDownTime}" : false}"
}
You can then reference this parameter in the when declarative (or anywhere in the pipeline) without fear of the pipeline throwing a null pointer exception.
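
Putting the two pieces together, a sketch of how the ternary default and the when guard combine (stage and parameter names taken from the question; the steps body is a placeholder):

```groovy
pipeline {
    agent any
    environment {
        // Falls back to false when the job defines no such parameter,
        // so the when condition below never hits a null.
        enableZeroDownTime = "${params.enableZeroDownTime != null ? params.enableZeroDownTime : false}"
    }
    stages {
        stage('CreateCustomer') {
            when {
                environment name: 'enableZeroDownTime', value: 'true'
            }
            steps {
                echo "zero-downtime path enabled"
            }
        }
    }
}
```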

Define your own global variable for a Jenkins job (not for ALL jobs!!)

I have a Jenkins job that has a string input parameter holding the build flags for the make command. My problem is that some users forget to change the parameter values when we have a release branch. So I want to overwrite the existing string input parameter (or create a new one) that should be used if the job is a release job.
This is the statement I want to add:
If branch "release" then ${params.build_flag} = 'DEBUGSKIP=TRUE'
and the code that is not working is:
pipeline {
agent none
parameters {
string(name: 'build_flag', defaultValue: 'DEBUGSKIP=TRUE', description: 'Flags to pass to build')
If {
allOf {
branch "*release*"
expression {
${params.build_flag} = 'DEBUGSKIP=TRUE'
}
}
}else{
${params.build_flag} = 'DEBUGSKIP=FALSE'
}
}
The code above explains what I want to do, but I don't know how to do it.
If you can, see if you could use the Jenkins EnvInject plugin with your pipeline, using the supported use-case:
Injection of EnvVars defined in the "Properties Content" field of the Job Property
These EnvVars are being injected to the script environment and will be inaccessible via the "env" Pipeline global variable (as in here)
Or writing the right values in a file, and using that file content as "Properties Content" of a downstream job (as shown there).
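
Alternatively, if EnvInject is not an option, the intent of the question can often be expressed in plain Declarative Pipeline: keep build_flag as a parameter and derive the effective value from the branch name in an environment block. A sketch, assuming a multibranch job (BRANCH_NAME is only populated there); EFFECTIVE_FLAGS is a name introduced here for illustration:

```groovy
pipeline {
    agent none
    parameters {
        string(name: 'build_flag', defaultValue: 'DEBUGSKIP=FALSE',
               description: 'Flags to pass to make')
    }
    environment {
        // Force DEBUGSKIP=TRUE on release branches, otherwise honour
        // whatever the user typed into the parameter.
        EFFECTIVE_FLAGS = "${"${env.BRANCH_NAME}".contains('release') ? 'DEBUGSKIP=TRUE' : params.build_flag}"
    }
    stages {
        stage('Build') {
            agent any
            steps {
                sh "make ${EFFECTIVE_FLAGS}"
            }
        }
    }
}
```

This keeps the parameter visible for ordinary branches while making release builds immune to a forgotten parameter value.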

Get parameters of Jenkins build by job name and build id

I am using the Jenkins Pipeline plugin and I need to get all parameters of a particular build, by its id and job name, from another job.
So, basically I need something like this:
def job = JobRegistry.getJobByName(jobName)
def build = job.getBuild(buildId)
Map parameters = build.getParameters()
println parameters['SOME_PARAMETER']
I figured it out.
I can retrieve parameters like this:
def parameters = Jenkins.instance.getAllItems(Job)
.find {job -> job.fullName == jobName }
.getBuildByNumber(buildId.toInteger())
.getAction(hudson.model.ParametersAction)
println parameters.getParameter('SOME_PARAMETER').value
I suggest you review "Pipeline Syntax" in a pipeline job; at the bottom of that page you can see the Global Variable Reference, covering docker/pipeline/env/etc.
So what you need, JOB_NAME / BUILD_ID, is given in the "env" list.
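
For the current build those two values are available directly in any step, for example:

```groovy
// Inside any pipeline script block: both values come from the "env"
// global variable documented in the Global Variable Reference.
echo "Running ${env.JOB_NAME} build #${env.BUILD_ID}"
```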

Jenkins Buildflow plugin: how to make variable numbers of jobs in parallel?

I have a job made with Build Flow; this job receives a parameter like job1, job2, job1 job2.
In my DSL I separate the values of the parameter with a split(","), so now I have an array: ["job1","job2","job1 job2"].
Now I want to make the DSL run a subjob with X builds in parallel, where X is the size of the array, and iterate to pass each position of the array as a parameter to the build of the subjob.
Try this within your DSL:
subjob = "yourJobName"
subjobIteration = []
["job1","job2","job1 job2"].each{ parameter ->
//add the closure to trigger the subjob to the list
subjobIteration.add({build( subjob, parameter )})
}
parallel( subjobIteration )
This snippet uses the Syntax for Parallel job-executions documented here.
Groovy passes the list subjobIteration to the "parallel" DSL correctly by default, so no further steps are needed.
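
Note that Build Flow's parallel accepts a list of closures; in a regular (non-Build-Flow) Pipeline the equivalent fan-out uses a map of named branches. A sketch of the same idea, with an assumed subjob name and a hypothetical PARAM parameter on that subjob:

```groovy
def subjob = "yourJobName"
def branches = [:]
["job1", "job2", "job1 job2"].each { parameter ->
    // Each map entry becomes one parallel branch; the closure
    // captures its own copy of "parameter" per iteration.
    branches["run-${parameter}"] = {
        build job: subjob, parameters: [string(name: 'PARAM', value: parameter)]
    }
}
parallel branches
```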