Ensure Jenkins pipeline uses the same node for downstream jobs - jenkins

Case:
I have 3 machines (A, B, C) as slaves, all sharing the same node label (e.g. 'build').
I have a pipeline which may trigger different downstream jobs, and I need to make sure that the job and all of its downstream jobs use the same node (for sharing some files etc.). How can I do that?
a) I could pass the node label to the downstream job, but I am not sure the downstream job will take the same node. (The parent job uses slave 'A' and I pass the node label 'build' to the downstream job, but the downstream job might take slave 'B'.)
b) Is there some way to get the slave at runtime while the pipeline is executing, so that I can pass the slave name to the downstream job?
Or is there a better way to do this?

I advise you to try the NodeLabel Parameter Plugin.
Once installed, check the 'This project is parameterized' option and select 'Node' from the 'Add Parameter' drop-down.
It will populate all nodes in a drop-down when building the job with parameters.
It also has some other options which may help you.
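From a pipeline, a sketch of triggering such a parameterized job with a concrete node (the NodeParameterValue class and its fields come from the NodeLabel Parameter Plugin; job and parameter names here are placeholders you would adjust to your setup):

build job: 'downstream', parameters: [
    [$class: 'NodeParameterValue', name: 'node', labels: ['A'], nodeEligibility: [$class: 'AllNodeEligibility']]
]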

The most important question to me would be: why do they need to run on the very same node?
Anyway, one way to achieve this would be to retrieve the name of the node inside the node block of the first pipeline, like this (CAUTION: I was not able to verify the code written below):
// Code for upstream job
@NonCPS
def getNodeName(def context) {
    context.toComputer().name
}

def nodeName = 'undefined'
node('build') {
    nodeName = getNodeName(getContext(hudson.FilePath))
}
build job: 'downstream', parameters: [string(name: 'nodeName', value: nodeName)]
In the downstream job you use that string parameter as input to your node block. Of course you should make sure that the downstream job actually is parameterized in the first place, having a string parameter named nodeName:
node(params.nodeName) {
    // do some stuff
}
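If the @NonCPS helper feels too heavyweight, a simpler sketch of the upstream part relies on env.NODE_NAME, which (as noted in another answer below) holds the name of the node the block is running on:

def nodeName = 'undefined'
node('build') {
    // NODE_NAME is set by Jenkins while inside a node block
    nodeName = env.NODE_NAME
}
build job: 'downstream', parameters: [string(name: 'nodeName', value: nodeName)]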

Even with static agents, workspaces are eventually cleaned up, so don't rely on the existence of files in the workspace between your builds.
Just archive whatever you need in the upstream job (using the archiveArtifacts step) and then use the Copy Artifact Plugin in downstream jobs to get what you need there. You'll probably need to parameterize the downstream jobs to pass them a reference to the upstream artifact(s) you need (there are plenty of selectors available in the Copy Artifact plugin that you can play with to achieve what you want).
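A minimal sketch of that pairing, with hypothetical job and path names (archiveArtifacts is a core pipeline step; copyArtifacts is contributed by the Copy Artifact plugin):

// Upstream pipeline: archive what the downstream job will need
node('build') {
    // ... steps that produce build/output/ ...
    archiveArtifacts artifacts: 'build/output/**'
}

// Downstream pipeline: fetch it from the last successful upstream build
node {
    copyArtifacts projectName: 'upstream-job', selector: lastSuccessful()
}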

If you are triggering child jobs manually from a pipeline, then you can use syntax like this to pass a specific node label:
build job: 'test_job', parameters: [[$class: 'LabelParameterValue', name: 'node', label: 'tester1']]
build job: 'test_job', parameters: [[$class: 'LabelParameterValue', name: 'node', label: 'tester2']]
You should be able to get the name of the current node via ${env.NODE_NAME}.
Found at: How to trigger a jenkins build on specific node using pipeline plugin?
Docs reference: https://jenkins.io/doc/pipeline/steps/pipeline-build-step/
But yes, if you want to manipulate files from this job in other jobs, then you will need to use e.g. the aforementioned Copy Artifact plugin, because the workspaces of the jobs are independent and each job will have different contents.
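Combining this with the answers above, a sketch that pins the child job to whatever agent the current build landed on (assuming test_job defines a NodeLabel parameter named 'node'):

build job: 'test_job', parameters: [
    [$class: 'LabelParameterValue', name: 'node', label: env.NODE_NAME]
]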

Related

Jenkins: having problems passing environment variable for use in another job (maybe a bug)

I seem to have found a bug when trying to pass environment variables from one Jenkins job to another.
I have a Jenkins job which contains a PowerShell build step. My question is not about the PowerShell script, as that does exactly what I want (it goes to Artifactory, finds a list of all the builds and then gets the build number of the latest one). The script ends up with the Artifactory build number as a text string in '$LATEST_BUILD_NO_SLASH' (for clarity, this is not the Jenkins build number). This is eventually stored in an environment variable called 'LATEST_BUILD_NUM_VAL'.
This is definitely creating an environment variable with my value stored in it, as it can be seen in the 'Environment Variables' list.
This environment variable is passed in the standard way in the parameterized build step.
My issue is that when I use this environment variable in a downstream build, having passed it using 'LATEST_BUILD_NUM = ${LATEST_BUILD_NUM_VAL}', I get the literal string '${LATEST_BUILD_NUM_VAL}' as the value passed to the downstream job.
But if I pass a Jenkins-created environment variable, i.e. 'LATEST_BUILD_NUM = ${JOB_BASE_NAME}', I get the correct value in the downstream job.
I have spent all day banging my head around this and don't really know where to go from here. I seem to be creating the environment variable correctly, as it is in the environment variables list and it works if I use a standard environment variable. I have declared 'LATEST_BUILD_NUM' as a parameter in my downstream build.
Is there any other way of achieving what I am trying to do?
I have checked in the 'Jenkins Issues' log for issues with parameterised builds and I can't find anything similar to my issue.
In case it is of any relevance, the Jenkins Environment Injector plugin is v2.1.6 and the Parameterized Trigger plugin is v2.35.2.
This is easy to achieve in Jenkins Pipeline:
Your second job (JobB) is called from your first job (JobA) as a downstream job. Thus somewhere (probably at the end of your JobA pipeline) you will have:
build job: 'CloudbeeFolder1/Path/To/JobB', propagate: false, wait: false, parameters: [[$class: 'StringParameterValue', name: 'MY_PARAM', value: "${env.SOME_VALUE}"]]
Then in JobB on the "other side" you have:
environment {
PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
}
This gets the value of your parameter into an environment variable in JobB.
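Putting it together, a minimal JobB sketch under those assumptions (the parameters block mirrors the MY_PARAM string parameter; names come from the example above):

pipeline {
    agent any
    parameters {
        string(name: 'MY_PARAM', defaultValue: '')
    }
    environment {
        PARAM_FROM_PIPELINE = "${params.MY_PARAM}"
    }
    stages {
        stage('Use parameter') {
            steps {
                echo "Received: ${env.PARAM_FROM_PIPELINE}"
            }
        }
    }
}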
Alternatively, with freestyle jobs: in the first job, under post-build actions, select 'Trigger parameterized build on other projects'. In that build step, give the name of the downstream job.
Select 'Add Parameters' and add 'Predefined parameters'; give the parameter in key=value format, e.g. Temp=${BUILD_ID}.
In the second job, select 'This project is parameterized', choose an option such as 'String Parameter', name it Temp, and use this parameter in your shell or anywhere else as $Temp.

Trigger Multibranch Job from another

I have a job in Jenkins and I need to trigger another one when it ends (if it ends successfully).
The second job is a multibranch job, so I want to know if there's any way, when triggering it, to pass the branch I want. For example, if I start the first job on the develop branch, I need it to trigger the second one for the develop branch as well.
Is there any way to achieve this?
Just think of the multibranch job as a folder containing the real jobs, named after the available branches:
Using Pipeline Job
When using the pipeline build step you'll have to use something like:
build(job: 'JOB_NAME/BRANCH_NAME'). Of course you may use a variable to specify the branch name.
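For example, to trigger the matching branch of another multibranch job from within a multibranch pipeline (a sketch; BRANCH_NAME is set automatically there, and branch names containing '/' must be URL-encoded in the job path, e.g. feature%2Fxyz):

// 'second-multibranch-job' is a placeholder name
build job: "second-multibranch-job/${env.BRANCH_NAME}"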
Using Freestyle Job
When triggering from a Freestyle job you most probably have to:
- use the Parameterized Trigger plugin, as the plain old downstream build plugin still has issues triggering pipeline jobs (at least in the version we're using);
- as the job name, use the same pattern as described above: JOB_NAME/BRANCH_NAME.
It should be possible to use a job parameter to specify the branch name here; however, I didn't give it a try.
Yes, you can call a downstream job by adding the post-build step 'Trigger/Call builds on other projects' (you may need to install the Parameterized Trigger Plugin).
In the Parameters section you define variables for the downstream job, associated with variables from the current job.
The parameters (e.g. multibranch_PARAM1 and multibranch_PARAM2) must also be configured in the downstream job.
Sometimes you want to call one or more subordinate multibranch jobs and have them build all of their branches, not just one. A script can retrieve the branch names and build them.
Because the script calls the Jenkins API, it should be in a shared library to avoid sandbox restrictions. The script should clear non-serializable references before calling the build step.
Shared library script jenkins-lib/vars/mbhelper.groovy:
def callMultibranchJob(String name) {
    def item = jenkins.model.Jenkins.get().getItemByFullName(name)
    def jobNames = item.allJobs.collect { it.fullName }
    item = null // CPS -- remove reference to non-serializable object
    for (jobName in jobNames) {
        build job: jobName
    }
}
Pipeline:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    library 'jenkins-lib'
                    mbhelper.callMultibranchJob 'multibranch-job-one'
                    mbhelper.callMultibranchJob 'multibranch-job-two'
                }
            }
        }
    }
}

Jenkinsfile in a node with hostname

I would like to execute my job on a remote node, passing the domain name as the node argument.
Does someone know how to write this Jenkinsfile?
I can't execute it in the following way:
node('jenkins.mydomain.com') {
    build 'remote_exec'
}
There are actually two major issues within your two lines of code :-)
node('jenkins.mydomain.com') {
This will allocate a build agent with the label jenkins.mydomain.com. If you have only one build agent carrying this label, this should work. But it's a label, not the hostname! (Note: I'm not entirely sure whether dots are allowed in labels, but you could just as well call it whateverserver.)
So this would allocate an executor slot (to run the code within the closure) on a build agent matching the given label...
build 'remote_exec'
and then trigger yet another build, for the job called remote_exec. This job (assuming it exists, and that you don't have a third issue there ^^) will then be built on an agent matching its own labels, ignoring the one given in the node(label) step.
If you want the remote_exec job to run on a specific build agent only, then add the node step there!
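So, as a sketch, remote_exec's own Jenkinsfile would allocate the agent itself (assuming an agent actually carries that label):

// Jenkinsfile of remote_exec
node('jenkins.mydomain.com') {
    checkout scm
    // actual build steps of remote_exec go here
}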

Get one of the values of a Jenkins job and pass it to other Jenkins jobs of a pipeline

I have a pipeline p1 which has 3 jobs: J1, J2 and J3. J1 has its own $BUILD_NUMBER. I want to pass exactly the same build number to the other Jenkins jobs (J2 and J3) of pipeline p1. How can I do that?
You can pass it as an argument to the other job:
${BUILD_NUMBER}
To elaborate: first you need to add a parameter in your target job in which you want to receive the build number.
Then you need to build this parameterized job from your original job.
Make sure that the name you give here is the same as the name of the parameter in the target job.
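If J1 itself triggers the other jobs, the same idea in pipeline syntax as a sketch (UPSTREAM_BUILD_NUMBER is a hypothetical parameter name that J2 and J3 would have to declare as a string parameter):

// In J1: pass our build number along to J2 and J3
build job: 'J2', parameters: [string(name: 'UPSTREAM_BUILD_NUMBER', value: env.BUILD_NUMBER)]
build job: 'J3', parameters: [string(name: 'UPSTREAM_BUILD_NUMBER', value: env.BUILD_NUMBER)]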

How can I get the details of the project which triggered a specific job?

I am triggering a job (child job) on 'Server B' from a job (parent job) on 'Server A' through a Python script. I have 2-3 parent jobs. How can I know which parent job triggered the child job?
Can I pass the parent job name to the child job?
Or
Can I get the parent name directly from the child job? (Environment variable / using Python scripts)
Every build has an environment variable JOB_NAME. You can pass this as a string parameter to your child job.
The following description is provided in /env-vars.html:
JOB_NAME
Name of the project of this build, such as "foo" or "foo/bar". (To strip off folder paths from a Bourne shell script, try: ${JOB_NAME##*/})
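If both jobs lived on the same controller, the pipeline equivalent would be a sketch like this (PARENT_JOB_NAME is a hypothetical parameter the child job would need to define); for the cross-server case, see the answers below:

// Parent pipeline: hand our own job name to the child
build job: 'child-job', parameters: [string(name: 'PARENT_JOB_NAME', value: env.JOB_NAME)]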
Passing the job name as an environment variable, as mentioned by OltzU, may be the way to go, but that depends on how you are triggering the child job. If you are directly triggering the child job from the parent job using a post-build step, you can use something like the Parameterized Remote Trigger plugin to pass the job name along. If you are using a script in the parent job to fire off the child job, you can pass the job name along as a parameter.
If you can't pass the triggering job as a parameter, you can programmatically get the build trigger(s) using Groovy. Groovy is really the only language that fully integrates with the Jenkins API, so if you want to use another language (Python), you are stuck with the REST API or the Jenkins CLI, which are both limited in what they can give you (e.g. neither can give you the triggering job, to my knowledge).
If you want to use Groovy to get the triggering job, you will need the Groovy Plugin, which you will run as a build step in your child job. Here's a snippet of code to get the chain of upstream jobs that triggered your build. You may need to modify the code depending on the type of trigger that is used.
def getUpstreamProjectTriggers(causes) {
    def upstreamCauses = []
    for (cause in causes) {
        if (cause.class.toString().contains("UpstreamCause")) {
            upstreamCauses.add(cause.getUpstreamProject())
        }
    }
    return upstreamCauses
}
getUpstreamProjectTriggers(build.getCauses())
From here, if you want to use the triggering job in, say, a Python script, you would need to use Groovy to set the triggering job in an environment variable. This SO thread gives more info on that, or you can skip to my answer in that thread to see how I do it.
In the child Jenkinsfile this Groovy code will get the name of the triggering job:
String getTriggeringProjectName() {
    if (currentBuild.upstreamBuilds) {
        return currentBuild.upstreamBuilds[0].projectName
    } else {
        return ""
    }
}
currentBuild.upstreamBuilds is a list of RunWrapper objects.
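A minimal usage sketch in the child Jenkinsfile, combined with the function above:

node {
    // Prints an empty string when the build was not triggered by an upstream job
    echo "Triggered by: ${getTriggeringProjectName()}"
}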
You could use an additional parameter for your child job.
