Build Jenkins job with Build periodically option with different arguments

I have a Jenkins job which accepts only BranchName (Bitbucket) as an input parameter.
I have scheduled this job to run every morning at 7 AM with the Build periodically option and the schedule H 7 * * *.
When triggered automatically, it takes the default input parameter, development.
Now my requirement is that I need to pass a few other branch names as the input parameter automatically.
One option I tried is a downstream job with another branch name, but that works for only one branch and is not an elegant solution.
Is there an easy way I can achieve this?
[Screenshot: job configuration]

If you only want to run a known set of branches, you can either:
Create an upstream job for every branch that triggers the build with different parameters.
Create one upstream job that holds a list of branch names, loops over that list, and executes the jobs in parallel (see the sketch below).
If you need to get the list of branches dynamically, I would assume that running sh script: 'git branch', returnStdout: true plus some Groovy string splitting is the easiest way to achieve that.
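A minimal scripted-pipeline sketch of the second option (the downstream job name 'my-build-job' is an assumption; BranchName matches your parameter):

// Known set of branches; could also be built dynamically, e.g. from
// sh(script: 'git branch -r', returnStdout: true).trim().split('\n')
def branches = ['development', 'release', 'feature-x']
def runs = [:]
branches.each { branch ->
    runs[branch] = {
        // Trigger the existing parameterized job once per branch.
        build job: 'my-build-job',
              parameters: [string(name: 'BranchName', value: branch)]
    }
}
parallel runs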

Related

Is it possible to run a Jenkins job on a slave, use an Excel file created as output from the first job, and run the next job on the master?

I am trying to run a Jenkins job on a slave. An Excel file is created as a result of this first job.
I want to run a second parameterized job on the master after the first job is completed, depending on a value from the Excel file.
I have tried the following options so far:
1. Using the Join Plugin. This doesn't work because the second job is parameterized and I have to take an input from the Excel file; there is no option to provide options or read the parameter from a file.
2. Pipeline on master. For some reason, when I create a pipeline on the master and execute the first slave job, the slave job waits for a slot to run, since one job is already running and the main job is waiting for the job on the slave to run. So it results in a deadlock.
Pipeline (scripted, not declarative) sounds like the way to go.
Something like:
node('MySlaveLabel') {
    // ... build steps that produce the Excel file ...
    stash includes: 'myExcelFile.xls', name: 'myExcelFile'
}
node('MyMasterLabel') {
    unstash 'myExcelFile'
    // ... examine your Excel file here ...
    // ... add conditional statements ...
}
As long as the node blocks are not nested, you will need only 1 executor on the slave and 1 on the master.
If for some reason you actually need the jobs to call each other:
Use the build 'anotherProject' syntax (see the sketch after this list).
Make sure you have enough executors on the slave.
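A hedged example of that syntax (the parameter name and value are placeholders):

// Pass a value parsed from the Excel file on to the second job.
def valueFromExcel = 'some-value'   // replace with the parsed value
build job: 'anotherProject',
      parameters: [string(name: 'EXCEL_VALUE', value: valueFromExcel)]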

Passing git changes to a file for use in later Jenkins jobs

ENVIRONMENT
I have a series of jobs in Jenkins where job #1 is an initial build and job #n is deployment to production.
Only the first few jobs are connected, and all jobs are parameterized.
I only build once, and if that build is successful, post-build steps trigger a job to deploy that build to dev.
After deployment to dev I have to manually go to Jenkins and click to run the job to deploy to the next region/environment.
THE PROBLEM
I am able to successfully pass $GIT_COMMIT to the downstream jobs because, as an environment variable available during the job run, I can write it to a file in the workspace for later use.
However, $CHANGES is an email-ext-specific variable, and I am having trouble writing its contents to a file I can pass to downstream jobs for the purpose of tracking what the current build is deploying in a given environment.
My unfamiliarity with Groovy and my weak Google-fu have made the pre-send and post-send scripts difficult to work with for getting the data I want passed to downstream jobs.
WHAT I HAVE TRIED
What works
I am able to send HTML emails using email-ext
I am able to pass $GIT_COMMIT to a file and use it in downstream jobs
I am able to load the contents of a file into an email using email-ext
What doesn't work
I cannot seem to use Groovy in a pre-send script to output $CHANGES to a file.
Trying to output $CHANGES to a file in a post-send script also does not work, but that probably wouldn't be best anyway, since it would likely come after any opportunity to pass the file to a downstream job.
WHAT I HAVE NOT TRIED
I have seen suggestions to use the changelog recorded by the SCM process, which is apparently stored as XML that must then be parsed by either the initial build job or the downstream jobs if it is to be formatted for inclusion in an HTML email.
HELP
If anyone has any suggestions on what to try next, I would appreciate it. I'm pulling my hair out.
You can use the following Groovy script to access the build environment parameters of an arbitrary job on the same Jenkins instance.
To execute the script you have to install the Groovy plugin and run the script as a system Groovy script.
import jenkins.model.Jenkins

def job = Jenkins.instance.getItemByFullName('freestyle-test')
def numbuilds = job.builds.size()
if (numbuilds == 0) { return }
// job.builds is sorted newest first, so use lastBuild for the latest run.
def lastbuild = job.lastBuild
println 'JOB: ' + job.fullName
println ' -> lastbuild: ' + lastbuild.displayName + ' = ' + lastbuild.result
println ' -> lastbuild someEnv: ' + lastbuild.environment.get('SOME_ENV')
The script is coupled to the job by its name.
The selected build is the latest.
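For the changelog route mentioned in the question, a sketch under the same assumptions (a system Groovy script, where the Groovy plugin binds 'build' to the current build); the output file name is hypothetical:

// Collect the SCM changes recorded for this build.
def lines = []
build.changeSet.each { entry ->
    lines << "${entry.author}: ${entry.msg}"
}
// Write them into the workspace so a downstream job can read the file.
build.workspace.child('changes.txt').write(lines.join('\n'), 'UTF-8')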

How to execute single Jenkins job on multiple platforms with different parameters?

I have created one build job in Jenkins. I want to use that same job to run on multiple nodes, but with different job parameters.
You can use https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin
Create a new MultiJob project; in the Build section, create a MultiJob phase, and trigger your job with "Predefined parameters" and "Node label parameter" to specify the node to run on and the values of the parameters you want to trigger your job with.
Good luck!
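If you would rather avoid the extra plugin, the same effect can be sketched in a scripted pipeline (the node labels and parameter values here are assumptions):

// Run the same work on several nodes, each with its own parameter value.
def runs = [:]
[linux: 'valueA', windows: 'valueB'].each { label, value ->
    runs[label] = {
        node(label) {
            // Replace with the real build steps that consume the parameter.
            echo "Running on ${label} with MY_PARAM=${value}"
        }
    }
}
parallel runs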

How can I analyze console output data from a triggered jenkins job?

I have 2 jobs, 'job1' and 'job2'. I will be triggering 'job2' from 'job1'. I need to get the console output of the 'job1' build that triggered 'job2' and process it in 'job2'.
The difficult part is to find out the build number of the upstream job (job1) that triggered the downstream job (job2). Once you have it, you can access the console log, e.g. via ${BUILD_URL}consoleOutput, as pointed out by ivoruJavaBoy.
How can a downstream build determine by what job and build number it has been triggered? This is surprisingly difficult in Jenkins:
Solution A ("explicit" approach): use the Parameterized Trigger Plugin to "manually" pass a build number parameter from job1 to job2. Apart from the administrative overhead, there is one bigger drawback with that approach: job1 and job2 will no longer run asynchronously; there will be exactly one run of job2 for each run of job1. If job2 is slower than job1, then you need to plan your resources to avoid builds of job2 piling up in the queue.
So some solution "B" will be better, where you extract the implicit upstream information from Jenkins' internal data. You can use Groovy for that; a simpler approach may be to use the Job Exporter Plugin in job2. That plugin will create a "properties" (text) file in the workspace of a job2 build. That file contains two entries with exactly the information you're looking for:
build.upstream.number Number of upstream job that triggered this job. Only filled if the build was triggered by an upstream project.
build.upstream.project Upstream project that triggered this job.
Read that file, then use the information to read the console log via the URL above.
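If you prefer the Groovy route over the plugin, a minimal sketch as a system Groovy script in job2 (assumes the Groovy plugin's 'build' binding and that job2 was actually triggered by an upstream build):

import hudson.model.Cause
import jenkins.model.Jenkins

// Ask Jenkins which upstream build caused this one.
def cause = build.getCause(Cause.UpstreamCause)
if (cause != null) {
    def upstream = Jenkins.instance.getItemByFullName(cause.upstreamProject)
                          .getBuildByNumber(cause.upstreamBuild)
    // Full console text of the triggering job1 build.
    println upstream.log
}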
You can use the Post Build Task plugin; then you can fetch the console output with a wget command:
wget -O console-output.log ${BUILD_URL}consoleOutput

How to change the configuration of a parametrized build based on the selected parameter?

Choice parameter: Name, with choices a, b, c
I have a parametrized build with choice parameters like a, b, c, d. When I select a, it has to do a fresh checkout, and when I select b, it has to update the Jenkins workspace.
Right now, whichever parameter I choose (a, b, or c), the checkout policy is always fresh checkout.
Can anyone let me know how to set different properties based on the selected parameter?
You would have to forego the Jenkins SCM checkout configuration and maintain the SCM checkout through a script (either Execute Shell or Execute Batch Command). Your script would then handle the logic of doing a certain type of checkout/update based on the passed parameters.
Late edit:
You could configure a Pre-scm Build Step and there run either a Conditional Build Step or a plain shell execution (bash or batch). In that shell, test the param, and if it matches the choice that needs a fresh checkout (a in your example), wipe the local workspace/checkout folder from the shell script.
When the rest of the job runs, it will do a fresh checkout (since the workspace/checkout folder is empty). With the other param options, it will run the job normally, doing an update.
I haven't tried this. Your biggest issue may be that the pre-scm build step does not have access to environment variables at that time.
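If a Pipeline job is an option instead of the freestyle setup above, the conditional logic can sit right next to the checkout; a minimal scripted-pipeline sketch, assuming a choice parameter named Name and a pipeline script loaded from SCM:

node {
    // 'a' requires a fresh checkout in the question's example: wipe first.
    if (params.Name == 'a') {
        deleteDir()
    }
    // With an empty workspace this is a fresh checkout; otherwise an update.
    checkout scm
}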
