Aggregating test results in Jenkins with Parameterized Jobs

I understand this post is similar to:
Aggregating results of downstream jobs with no tests in Jenkins
and also to:
Aggregating results of downstream parameterised jobs in Jenkins
Nonetheless, I am not able to figure out how to make this work for my case. I am currently using Jenkins 1.655.
I have jobs A, B, C, with A being the upstream job. What I want is for A to call B and for B to call C. Each needs to block and wait for the completion of the next; if one fails, all fail. B and C generate unit test reports, so I want to aggregate these reports in A and then publish the result in A. Here is the current setup of the jobs:
Job A:
Build Steps
Execute shell: echo $(date) > aggregate
Trigger Parameterized Build Job: Job B
Post Build Steps
Aggregate downstream test results
Record fingerprints of files to track usage: set Files to fingerprint to aggregate
Publish JUnit test result report (report files from B and C)
Job B:
Build Steps
Copy artifacts from another project: copy the aggregate file from the upstream job
Run tests to generate unit test reports
Trigger Parameterized Build Job: Job C
It ultimately fails here because aggregate is only archived in the Post Build Steps of Job A. How can I archive an artifact in a Build Step?
Post Build Steps
Aggregate downstream test results (unit test.xml generated)
Record fingerprints of files to track usage: set Files to fingerprint to aggregate
I won't post Job C here for simplicity, but it follows pretty much the same pattern as B.
Summing it up: I want interlinked jobs that depend on each other using the Parameterized Trigger plugin, where the upstream job aggregates the test results of all downstream jobs.
Any help appreciated, thanks!

If you have no limitation on where to run your jobs, you can always make them run on the same workspace/machine; this will solve all your issues.
If for some reason you can't run them in the same workspace, then instead of the Copy Artifact plugin you can use Jenkins URLs to the workspace (guessing you're using the Parameterized Trigger Plugin): it is easy to wget the aggregate file of job A from the triggered job. The plugin also defines TRIGGERED_BUILD_NUMBER_<job> ("Last build number triggered") in A, which helps you keep track of the B and C builds you triggered so you can fetch their artifacts from there.
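For illustration, a minimal Groovy sketch of that idea, here with Job A fetching an archived test report back from the B build it triggered (the job name JobB and the artifact path are assumptions; adapt to your setup):
// TRIGGERED_BUILD_NUMBER_JobB is exported by the Parameterized Trigger Plugin
// into Job A's build environment after the trigger step
def triggered  = System.getenv("TRIGGERED_BUILD_NUMBER_JobB")
def jenkinsUrl = System.getenv("JENKINS_URL") ?: "http://localhost:8080/"
// download B's archived report into A's workspace for the JUnit publisher
new File("test-report-from-B.xml").text =
    new URL("${jenkinsUrl}job/JobB/${triggered}/artifact/test-report.xml").text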
Hope it helps!

Related

I have Job A and B in Jenkins. How can I always execute a file in the most recent build of A in the build steps in job B

I have these Jenkins jobs A and B. Job A builds a bunch of files for my project. In job B, I want to execute a command that runs a file from the most recent build of job A.
My execution works, but only because I have hard-coded the build number and am picking the files from where Jenkins stores them in my C:\JenkinsData directory; I would rather have them referenced from the workspace instead.
See image for clarification: [image: Jenkins build steps illustration]
For example, my last build right now is 70; I want to always execute those same files, but from the most recent build.
Or, if it is even better that way, can I execute those same files from job A, since the built files are in its workspace?
You could get the last build number using this API and its "number" property:
/lastBuild/api/json
e.g.:
http://localhost:8080/job/yourJobName/lastBuild/api/json
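For example, a small Groovy sketch that fetches the JSON and reads the "number" property (job name and URL taken from the example above):
import groovy.json.JsonSlurper

// query the last build of the job and print its build number
def api  = new URL("http://localhost:8080/job/yourJobName/lastBuild/api/json")
def json = new JsonSlurper().parse(api)
println "last build number: ${json.number}"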
This could help:
Jenkins - Get last completed build status

How can I analyze console output data from a triggered jenkins job?

I have 2 jobs 'job1' and 'job2'. I will be triggering 'job2' from 'job1'. I need to get the console output of the 'job1' build which triggered 'job2' and want to process it in 'job2'.
The difficult part is to find out the build number of the upstream job (job1) that triggered the downstream job (job2). Once you have it, you can access the console log, e.g. via ${BUILD_URL}consoleOutput, as pointed out by ivoruJavaBoy.
How can a downstream build determine by what job and build number it has been triggered? This is surprisingly difficult in Jenkins:
Solution A ("explicit" approach): use the Parameterized Trigger Plugin to "manually" pass a build number parameter from job1 to job2. Apart from the administrational overhead, there is one bigger drawback with that approach: job1 and job2 will no longer run asynchronously; there will be exactly one run of job2 for each run of job1. If job2 is slower than job1, then you need to plan your resources to avoid builds of job2 piling up in the queue.
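A minimal Groovy sketch of Solution A, assuming job1 passes its build number as a parameter named UPSTREAM_BUILD_NUMBER (a name chosen here for illustration); job2 can then reconstruct the console URL and fetch the plain-text log via the consoleText endpoint:
// job2 side: read the passed parameter from the build environment
def upstreamNumber = System.getenv("UPSTREAM_BUILD_NUMBER")
def jenkinsUrl     = System.getenv("JENKINS_URL") ?: "http://localhost:8080/"
// fetch job1's console log for the triggering build
def consoleLog = new URL("${jenkinsUrl}job/job1/${upstreamNumber}/consoleText").text
println "fetched ${consoleLog.readLines().size()} lines from job1 #${upstreamNumber}"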
So, some solution "B" will be better, where you extract the implicit upstream information from Jenkins' internal data. You can use Groovy for that; a simpler approach may be to use the Job Exporter Plugin in job2. That plugin will create a "properties" (text) file in the workspace of a job2 build. That file contains two entries with exactly the information that you're looking for:
build.upstream.number: Number of the upstream build that triggered this job. Only filled if the build was triggered by an upstream project.
build.upstream.project: Upstream project that triggered this job.
Read that file, then use the information to read the console log via the URL above.
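As a rough Groovy sketch (the properties file name is an assumption; check what the Job Exporter Plugin actually writes into your workspace):
// read the exported properties and fetch the upstream build's console log
def props = new Properties()
new File("job-exporter.properties").withInputStream { props.load(it) }

def upProject  = props.getProperty("build.upstream.project")
def upNumber   = props.getProperty("build.upstream.number")
def jenkinsUrl = System.getenv("JENKINS_URL") ?: "http://localhost:8080/"
def consoleLog = new URL("${jenkinsUrl}job/${upProject}/${upNumber}/consoleText").text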
You can use the Post Build Task plugin, then you can get the console output with a wget command:
wget -O console-output.log ${BUILD_URL}consoleOutput

Copying artifacts from multiple upstream jobs at join in Jenkins

Is it possible to have a Jenkins job that has been triggered by the Join plugin copy artifacts from multiple upstream jobs?
I'm trying to set up a Jenkins configuration with a "diamond" of jobs: my-trigger runs and spawns two jobs, my-fork1 and my-fork2, that can run concurrently and take varying amounts of time, and the Join plugin sets off the job my-join once both forks have completed.
Each of my-trigger, my-fork1 and my-fork2 creates and fingerprints artifacts (say, text files).
I want to copy the artifacts from each of the upstream jobs in my-join using the "Copy artifacts from another project" tool, with the "Which build" parameter set to "Upstream build that triggered this job". However, I see output like this in the console of my-join:
Building remotely on build-machine in workspace /path/to/workspace/my-join
Copied 1 artifact from "my-trigger" build number 63
Copied 1 artifact from "my-fork1" build number 63
Unable to find a build for artifact copy from: my-fork2
and the job fails. In this case, my-fork2 finished first, so my-fork1 triggered the join step. I believe that that means that my-join only has record of my-fork1 and my-trigger as being upstream. If my-fork1 finishes first, then my-fork2 kicks off the join, and the job fails when trying to copy from my-fork1.
If I change the configuration to copy the artifact from the build "Latest successful build" then the build succeeds, but my-trigger may run many times in succession so there would be no guarantee that my-join is joining related artifacts.
How can I get the join step to copy artifacts from multiple forks upstream?
Note: the second point of this question seems to be asking the same thing, but the only answer there doesn't address it, and has been accepted.
Thanks
tensorproduct
If your builds are parameterized with a unique parameter for each run of the join-diamond, you can use that parameter in the CopyArtifact plugin to determine which build to copy from. You would want to specify "Latest successful build" and qualify it with the parameter and value.
We have a similar situation where I work; multiple simultaneous runs of a join-diamond. The parameter in the build allows the downstream jobs to get the correct artifacts from the upstream jobs.
Step by Step settings of the provided solution from Jason Swager:
Project dependencies:
diamond->fork->diamond_ready
Project "fork":
String parameter "UNIQUE_ID" (only dummy not used inside)
(Creates an artifcat and Archive the artifacts)
Project "diamond_ready"
String parameter: UNIQUE_ID
Copy artifacts from another project
Project name: fork
Parameter filters: UNIQUE_ID=${UNIQUE_ID}
Project "diamond":
Trigger parameterized build on other project
Projects to build: fork
Predefined parameters: UNIQUE_ID=${BUILD_TAG}
Join Trigger:
Post-Join Actions:
Trigger parameterized build on other projects
Projects to build: diamond_ready
Predefined Generator parameters: UNIQUE_ID=${BUILD_TAG}
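For comparison, the same diamond can also be expressed with the Build Flow plugin mentioned elsewhere in this thread. A sketch, using the job names from the question and the UNIQUE_ID wiring from the steps above:
// the flow build's BUILD_TAG doubles as the unique id for this run of the diamond
def uid = build.environment["BUILD_TAG"]
build("my-trigger")
parallel(
    { build("my-fork1", UNIQUE_ID: uid) },
    { build("my-fork2", UNIQUE_ID: uid) }
)
build("my-join", UNIQUE_ID: uid)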

Jenkins Parallel Trigger and Wait

I have 4 jobs which needs to be executed in the following sequence
JOB A
|------> JOB B
|------> JOB C
         |------> JOB D
In the above,
A should trigger B & C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins and couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin which I haven't tried that would help me in resolving this? Or can this be achieved in a different way? Please advise.
Use a DSL script with the Build Flow plugin.
Try this example for your execution:
build("job A")
parallel
(
{build("job B")}
{build("job C")}
)
build("job D")
Try the Locks and Latches plugin.
This may not be the optimal way, but it should work. Use the Parameterized Trigger Plugin. In Job A, add a build step (NOT a Post Build Action) that starts both Jobs B and C in the same build step AND blocks until they finish. In Job C, add a build step (NOT a Post Build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal, though: Job A is held open waiting for B and C to finish, and then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C are started, but have a Promotion on Job A that will execute your final actions after jobs B, C and D are successful.
I am trying to build a similar system. I am building a certification pipeline where I need to run packager/build/deploy jobs and corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger a release job that can do an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. Initially I tried the Parameterized Trigger Plugin with blocking builds. I could not set up artifact archiving, fingerprinting, and the downstream build relationship this way, since archiving artifacts only works post-build. Then I put the Parameterized Trigger in a post-build activity. This way I was able to set up downstream builds, fingerprinting, and aggregated test results, but build failures did not bubble up the upstream job chain, and the upstream jobs were non-blocking.
I was finally able to achieve this using these plugins-
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
System looks like this
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job -
Create it as a MultiJob and create a fingerprint file in a shell build step:
echo $(date +%s) > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
# copy the fingerprint straight into the build's archive directory
# (note: this assumes the job runs on the master node, since it writes under $JENKINS_HOME)
ARCHIVEDIR=$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive
mkdir -p $ARCHIVEDIR
cp fingerprint.txt $ARCHIVEDIR
Create MultiJob Phase consisting of build/deploy job.
Build/deploy job is itself a multijob
follow the same steps for creating the build/deploy job as above, relative to fingerprinting.
Copy the fingerprint.txt artifact from upstream job
Setup MultiJob phase in deploy job that triggers the test job
create a new fingerprint file and force archive it similar to above step
Collect JUnit results in the final test job.
In the trigger Job, use Join Plugin to execute the Release Job by choosing 'Run Post Build Actions at join' and execute the release project only on stable build of Trigger Job.
This way all the steps show up in the Build Pipeline view, and the Trigger job blocks until all downstream builds finish, setting its status to the worst downstream result to give a decision point for the release job.
Multijob Plugin
If you'd like to stop the mess with downstream/upstream job chain definitions, or when you want to add a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel, add context to your build flow by implementing parameter inheritance from the MultiJob down to all its Phases and Jobs. Phases are sequential, while jobs inside each Phase are parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin

How do I make a Jenkins job start after multiple simultaneous upstream jobs succeed?

In order to get the fastest feedback possible, we occasionally want Jenkins jobs to run in parallel. Jenkins has the ability to start multiple downstream jobs (or 'fork' the pipeline) when a job finishes. However, Jenkins doesn't seem to have any way of making a downstream job start only if all branches of that fork succeed (or 'joining' the fork back together).
Jenkins has a "Build after other projects are built" button, but I interpret that as "start this job when any upstream job finishes" (not "start this job when all upstream jobs succeed").
Here is a visualization of what I'm talking about. Does anyone know if a plugin exists to do what I'm after?
Edit:
When I originally posted this question in 2012, Jason's answer (the Join and Promoted Build plugins) was the best, and the solution I went with.
However, dnozay's answer (The Build Flow plugin) was made popular a year or so after this question, which is a much better answer. For what it's worth, if people ask me this question today, I now recommend that instead.
Pipeline plugin
You can use the Pipeline Plugin (formerly workflow-plugin).
It comes with many examples, and you can follow this tutorial.
e.g.
// build
stage 'build'
...
// deploy
stage 'deploy'
...
// run tests in parallel
stage 'test'
parallel 'functional': {
...
}, 'performance': {
...
}
// promote artifacts
stage 'promote'
...
Build flow plugin
You can also use the Build Flow Plugin. It is simply awesome - but it is deprecated (development frozen).
Setting up the jobs
Create jobs for:
build
deploy
performance tests
functional tests
promotion
Setting up the upstream
in the upstream (here build) create a unique artifact, e.g.:
echo ${BUILD_TAG} > build.tag
archive the build.tag artifact.
record fingerprints to track file usage; if any job copies the same build.tag file and records fingerprints, you will be able to track the parent.
Configure to get promoted when promotion job is successful.
Setting up the downstream jobs
I use 2 parameters PARENT_JOB_NAME and PARENT_BUILD_NUMBER
Copy the artifacts from upstream build job using the Copy Artifact Plugin
Project name = ${PARENT_JOB_NAME}
Which build = ${PARENT_BUILD_NUMBER}
Artifacts to copy = build.tag
Record fingerprints; that's crucial.
Setting up the downstream promotion job
Do the same as the above, to establish upstream-downstream relationship.
It does not need any build step. You can perform additional post-build actions like "hey QA, it's your turn".
Create a build flow job
// start with the build
parent = build("build")
parent_job_name = parent.environment["JOB_NAME"]
parent_build_number = parent.environment["BUILD_NUMBER"]
// then deploy
build("deploy")
// then your qualifying tests
parallel (
{ build("functional tests",
PARENT_BUILD_NUMBER: parent_build_number,
PARENT_JOB_NAME: parent_job_name) },
{ build("performance tests",
PARENT_BUILD_NUMBER: parent_build_number,
PARENT_JOB_NAME: parent_job_name) }
)
// if nothing failed till now...
build("promotion",
PARENT_BUILD_NUMBER: parent_build_number,
PARENT_JOB_NAME: parent_job_name)
// knock yourself out...
build("more expensive QA tests",
PARENT_BUILD_NUMBER: parent_build_number,
PARENT_JOB_NAME: parent_job_name)
good luck.
There are two solutions that I have used for this scenario in the past:
Use the Join Plugin on your "deploy" job and specify "promote" as the targeted job. You would have to specify "Functional Tests" and "Performance Tests" as the joined jobs and start them in some fashion post-build. The Parameterized Trigger Plugin is good for this.
Use the Promoted Builds Plugin on your "deploy" job, specify a promotion that fires when the downstream jobs are completed, and specify the Functional and Performance test jobs. As part of the promotion action, trigger the "promote" job. You still have to start the two test jobs from "deploy".
There is a CRITICAL aspect to both of these solutions: fingerprints must be correctly used. Here is what I found:
The "build" job must ORIGINATE a new fingerprinted file. In other words, it has to fingerprint some file that Jenkins thinks was originated by the initial job. Double check the "See Fingerprints" link of the job to verify this.
All downstream linked jobs (in this case, "deploy", "Functional Tests" and "Performance tests") need to obtain and fingerprint this same file. The Copy Artifacts plugin is great for this sort of thing.
Keep in mind that some plugins allow you change the order of fingerprinting and downstream job starting; in this case, the fingerprinting MUST occur before a downstream job fingerprints the same file to ensure the ORIGIN of the fingerprint is properly set.
The Multijob plugin works beautifully for that scenario. It also comes in handy if you want a single "parent" job to kick off multiple "child" jobs but still be able to execute each of the children manually, by themselves. This works by creating "phases", to which you add 1 to n jobs. The build only continues when the entire phase is done, so if a phase has multiple jobs, they all must complete before the rest are executed. Naturally, it is configurable whether the build continues if there is a failure within the phase.
Jenkins recently announced first class support for workflow.
I believe the Workflow Plugin is now called the Pipeline Plugin and is the (current) preferred solution to the original question, inspired by the Build Flow Plugin. There is also a Getting Started Tutorial in GitHub.
Answers by jason & dnozay are good enough. But in case someone is looking for an easy way, just use the JobFanIn plugin.
This diamond dependency build pipeline can be configured with the DepBuilder plugin. DepBuilder uses its own domain-specific language, which in this case would look like:
_BUILD {
// define the maximum duration of the build (4 hours)
maxDuration: 04:00
}
// define the build order of the existing Jenkins jobs
Build -> Deploy
Deploy -> "Functional Tests" -> Promote
Deploy -> "Performance Tests" -> Promote
After building the project, the build visualization is shown on the project dashboard page.
If any of the upstream jobs didn't succeed, the build will be automatically aborted. Abort behavior could be tweaked on a per job basis, for more info see the DepBuilder documentation.
