Jenkins pipeline top-level join gets triggered before sub-level join - jenkins

I've got a multi-level build pipeline with a "top-level" join (test_Join) and a "sub-level" join (test_Build1_Join) (see image below).
My test_Join job requires artifacts from both test_Build1_Join and test_Build2. I copy them by filtering on a parameter named PL_BUILD_NUMBER that's passed downstream from the test_Start job. That works (see this SO post).
My problem is
Sometimes the "top-level" join is triggered before the "sub-level" join, as in the image below. This means the build artifacts are not yet available and I get the following error:
Copied 1 artifact from "test_Build2" build number 33
Unable to find a build for artifact copy from: test_Build1_Join
Build step 'Copy artifacts from another project' marked build as failure
Notifying upstream projects of job completion
Finished: FAILURE
The joins are done using the "Join Trigger" plugin, with the "run post-build actions at join -> Trigger parameterized build on other projects" options.
How can I synchronise the whole pipeline better? What's the standard practice?

Related

Jenkins pipeline share information between jobs

We are trying to define a set of jobs on Jenkins that will do really specific actions. JobA1 will build a Maven project, while JobA2 will build .NET code; JobB will upload the result to Artifactory, JobC will download it from Artifactory, and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and always get some output; whatever happens in between is something I don't care about. On the other hand, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package (JobC) and then deploy to staging.
PipelineC will download the package (JobC) and then deploy to production based on some internal validations.
I have tried to get some variables from JobA1 (basic POM values such as ArtifactID or Version) injected into JobB, but the information does not seem to be transferred.
The same happens while downloading files: I call JobC, but the file stays in that job's workspace, unavailable to any other, and I'm afraid the "External Workspace Manager" plugin adds too much complexity.
Is there any way other than sharing the workspace to achieve my purpose? I understand that sharing the workspace will make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage ('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
stage ('TargetJob') {
    dir("/var/lib/jenkins/jobs/TargetJob/workspace/") {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This feature of the Pipeline plugin is better than artifact archiving, as it keeps the data only locally; the stash is deleted after the build finishes (which helps with data management).
You can also use Copy Artifact Plugin.
There are two things to consider when copying an artifact:
a) Archive the artifacts in the host project and assign permissions:
In the host job, enable 'Permission to copy artifact' → Projects to allow copy artifacts: *
Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
b) Copy the required artifacts from the host to the target project:
In the target job, create a Build step → Copy artifacts from another project → enter the host project name, which build (e.g. 'Latest successful build'), Artifacts to copy (the host project folder), and the Target directory (the local folder location).
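For reference, here is a rough Pipeline-syntax sketch of the same copy step; the project name 'host-project', the file pattern and the target folder are placeholder assumptions, not values from the original setup:
// Minimal sketch of the Copy Artifact plugin's Pipeline step:
// copy the latest successful artifacts of a hypothetical upstream job
// into a local folder of the current build's workspace.
copyArtifacts(
    projectName: 'host-project',   // the host project
    selector: lastSuccessful(),    // 'Latest successful build'
    filter: '**/*.jar',            // 'Artifacts to copy'
    target: 'copied-artifacts/'    // 'Target directory'
)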
For the first part of your question (passing variables between jobs), use the snippet below as a post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results. Similarly, the post-build action could be triggered on a specific build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by 'JobB' (provide the exact location of the job). Please note that a matching parameter must be defined in JobB.
Moreover, for sharing the workspace you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share and to define shared global variables.
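As a minimal sketch of that approach (the library name 'my-shared-lib' and the buildInfo global variable are hypothetical, assuming the library is configured under Manage Jenkins → Global Pipeline Libraries):
// vars/buildInfo.groovy in the shared library repository
def call() {
    // values every pipeline can read without passing them from job to job
    return [artifactId: 'demo-app', version: '1.0.0']
}

// Jenkinsfile of any pipeline that uses the library
@Library('my-shared-lib') _
def info = buildInfo()
echo "Deploying ${info.artifactId}:${info.version}"
Global variables defined in vars/ are shared by every pipeline that loads the library, which avoids pushing the same values around as job parameters.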

How to use a different set of parameters for release builds in jobs triggered via parameterized trigger plugin

I have a set of jobs that run shell scripts. Parameters for those scripts can be chosen via some choices defined in the build.
Now I want to use the release plugin to prevent people from accidentally choosing from a set of "release only" parameters.
So what I basically need is the ability to have one build with two distinct sets of parameter choices.
To achieve this i have configured the jobs as follows:
Master Job
normal build
- choiceParameter name:TEST values:"normal"
release build
- choiceParameter name:TEST values:"release"
Child Job
normal build
- choiceParameter name:TEST values:"normal"
release build
- choiceParameter name:TEST values:"release"
MasterJob triggers ChildJob via "Parameterized Build" plugin
When I execute a normal build everything works fine.
But when I trigger a Release Build on the MasterJob I get the following exception:
ERROR: Build step failed with exception
java.lang.IllegalArgumentException: Illegal choice for parameter TEST: release
at hudson.model.ChoiceParameterDefinition.checkValue(ChoiceParameterDefinition.java:75)
at hudson.model.ChoiceParameterDefinition.createValue(ChoiceParameterDefinition.java:87)
at hudson.model.ChoiceParameterDefinition.createValue(ChoiceParameterDefinition.java:19)
at hudson.plugins.parameterizedtrigger.ProjectSpecificParameterValuesActionTransform.convertToDefinedType(ProjectSpecificParameterValuesActionTransform.java:83)
at hudson.plugins.parameterizedtrigger.ProjectSpecificParameterValuesActionTransform.transformParametersAction(ProjectSpecificParameterValuesActionTransform.java:34)
at hudson.plugins.parameterizedtrigger.ProjectSpecificParametersActionFactory.getProjectSpecificBuildActions(ProjectSpecificParametersActionFactory.java:32)
at hudson.plugins.parameterizedtrigger.BuildTriggerConfig.getBuildActions(BuildTriggerConfig.java:290)
at hudson.plugins.parameterizedtrigger.BuildTriggerConfig.perform2(BuildTriggerConfig.java:336)
at hudson.plugins.parameterizedtrigger.BlockableBuildTriggerConfig.perform2(BlockableBuildTriggerConfig.java:57)
at hudson.plugins.parameterizedtrigger.TriggerBuilder.perform(TriggerBuilder.java:85)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:761)
at hudson.model.Build$BuildExecution.build(Build.java:203)
at hudson.model.Build$BuildExecution.doRun(Build.java:160)
at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:536)
at hudson.model.Run.execute(Run.java:1741)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:374)
Build step 'Trigger/call builds on other projects' marked build as failure
Finished: FAILURE
Fixing this error is easy. I just have to add the value "release" to the choices in the normal build. But this destroys the whole intention of this setup.
Is there a way to get this kind of setup to work?
If you want to restrict people from running arbitrary scripts on production boxes, you can use the Node Label plugin.
You can configure the job to select which node (box/machine) a user can run the job on; this way you can restrict users from running jobs in the prod environment.
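In Pipeline syntax, the same idea of pinning a job to particular nodes looks roughly like this (the 'prod' label is only an assumed example):
pipeline {
    // Only agents carrying the 'prod' label can run this job, so who can
    // execute on production boxes is controlled by how those nodes are labelled.
    agent { label 'prod' }
    stages {
        stage('Deploy') {
            steps {
                sh './deploy.sh'
            }
        }
    }
}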

Jenkins MultiJob Plugin does not aggregate downstream test results

I am using the Jenkins MultiJob plugin to execute a number of parallel builds in the same build phase, and I want to display the test results in the main multijob project, so I select the post-build action 'Aggregate downstream test results' with both options 'Automatically aggregate all downstream tests' and 'Include failed builds in results'. But when the jobs complete and I go into the main multijob project, it shows 'no tests' under the 'Latest Test Result' link...
Has anyone else encountered this issue? My downstream 'child' projects that run in parallel are multi-configuration projects.
As a previous poster indicated, this is an open issue in the Jenkins JIRA and does not work. There is a workaround to achieve what you're looking for: you're going to need the Copy Artifact Plugin, and you also need to archive the test result files as artifacts in the jobs that run the tests.
After you have installed this and configured your test run jobs properly, go to your Multijob and after all your test phases add a build step "Copy artifacts from another project" for each of the jobs you want the test results from. You can use "Specified by permalink" and use the "Last build" permalink to always retrieve the latest artifacts. Select the artifacts you want to copy (i.e. *.xml), and input your target directory as something like "job1". If you add multiple build steps to copy artifacts from another project, just name your target directories for the copied artifacts something similar like "job2", "job3", etc.
Then select a Post-build action in your Multijob as you would to Publish JUnit test result report (or whatever you prefer) and input **/job*/*.xml (or similar).
This is what I did, and it works just fine. It is a bit manual to set up, but it works great once it's configured.
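A rough Pipeline-syntax sketch of the same workaround, assuming hypothetical downstream jobs 'test-job-1' and 'test-job-2' and using the latest-successful-build selector instead of the permalink:
// Copy each downstream job's archived test reports into its own folder,
// then publish them all from this job.
copyArtifacts(projectName: 'test-job-1', selector: lastSuccessful(), filter: '**/*.xml', target: 'job1')
copyArtifacts(projectName: 'test-job-2', selector: lastSuccessful(), filter: '**/*.xml', target: 'job2')
junit '**/job*/*.xml'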

Copying artifacts from multiple upstream jobs at join in Jenkins

Is it possible to have a Jenkins job that has been triggered by the Join plugin copy artifacts from multiple upstream jobs?
I'm trying to set up a Jenkins configuration with a "diamond" of jobs: my-trigger runs and spawns two jobs, my-fork1 and my-fork2, that can run concurrently and take varying amounts of time, and the Join plugin sets off the job my-join once both forks have completed.
Each of my-trigger, my-fork1 and my-fork2 creates and fingerprints artifacts (say, text files).
I want to copy the artifacts from each of the upstream jobs in my-join using the "Copy artifacts from another project" tool, with the "Which build" parameter set to "Upstream build that triggered this job". However, I see output like this in the console of my-join:
Building remotely on build-machine in workspace /path/to/workspace/my-join
Copied 1 artifact from "my-trigger" build number 63
Copied 1 artifact from "my-fork1" build number 63
Unable to find a build for artifact copy from: my-fork2
and the job fails. In this case, my-fork2 finished first, so my-fork1 triggered the join step. I believe that that means that my-join only has record of my-fork1 and my-trigger as being upstream. If my-fork1 finishes first, then my-fork2 kicks off the join, and the job fails when trying to copy from my-fork1.
If I change the configuration to copy the artifact from the build "Latest successful build" then the build succeeds, but my-trigger may run many times in succession so there would be no guarantee that my-join is joining related artifacts.
How can I get the join step to copy artifacts from multiple forks upstream?
Note: the second point of this question seems to be asking the same thing, but the only answer there doesn't address it, and has been accepted.
Thanks
tensorproduct
If your builds are parameterized with a unique parameter for each run of the join-diamond, you can use that parameter in the CopyArtifact plugin to determine which build to copy from. You would want to specify "Latest successful build" and qualify it with the parameter and value.
We have a similar situation where I work; multiple simultaneous runs of a join-diamond. The parameter in the build allows the downstream jobs to get the correct artifacts from the upstream jobs.
Step-by-step settings for the solution provided by Jason Swager:
Project dependencies:
diamond->fork->diamond_ready
Project "fork":
String parameter "UNIQUE_ID" (only dummy not used inside)
(Creates an artifcat and Archive the artifacts)
Project "diamond_ready"
String parameter: UNIQUE_ID
Copy artifacts from another project
Project name: fork
Parameter filters: UNIQUE_ID=${UNIQUE_ID}
Project "diamond":
Trigger parameterized build on other project
Projects to build: fork
Predefined parameters: UNIQUE_ID=${BUILD_TAG}
Join Trigger:
Post-Join Actions:
Trigger parameterized build on other projects
Projects to build: diamond_ready
Predefined Generator parameters: UNIQUE_ID=${BUILD_TAG}
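If it helps, here is a rough Pipeline-syntax sketch of the two key pieces (the job names follow the example above; the exact option names are assumptions on my side):
// In 'diamond': pass a unique id down to the fork job
build job: 'fork', parameters: [string(name: 'UNIQUE_ID', value: env.BUILD_TAG)]

// In 'diamond_ready': copy only the fork build that carries the same id
copyArtifacts(
    projectName: 'fork',
    selector: lastSuccessful(),                  // 'Latest successful build'
    parameters: "UNIQUE_ID=${params.UNIQUE_ID}"  // 'Parameter filters'
)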

Jenkins Parallel Trigger and Wait

I have 4 jobs which needs to be executed in the following sequence
JOB A
|------> JOB B
|------> JOB C
|------> JOB D
In the above,
A should trigger B & C in parallel, and C in turn triggers D.
A should stay running until all three of them have completed.
I tried the following plugins and couldn't achieve what I am looking for:
Join Plugin
Multijob Plugin
Multi-Configuration Project
Parameterized Trigger Plugin
Is there any plugin I haven't tried that would help me resolve this? Or can this be achieved in a different way? Please advise.
Use a DSL script with the Build Flow plugin.
Try this example for your execution:
build("job A")
parallel
(
{build("job B")}
{build("job C")}
)
build("job D")
Try the Locks and Latches plugin.
This may not be the optimal way, but it should work. Use the Parameterized Trigger Plugin. In Job A, add a build step (NOT a Post-build Action) that starts both Jobs B and C in the same build step AND blocks until they finish. In Job C, add a build step (NOT a Post-build Action) that starts Job D AND blocks until it is finished. That should keep Job A running for the full duration.
This isn't really optimal though: Job A is held open waiting for B and C to finish. Then C is held open until D is finished.
Is there some reason that Job A needs to remain running for the duration? Another possibility is to have Job A terminate after B and C are started, but have a Promotion on Job A that will execute your final actions after jobs B, C and D are successful.
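For comparison, the same blocking shape in modern Pipeline syntax would look roughly like this (the job names come from the question, the rest is an assumed sketch):
// JOB A's pipeline: run B and (C then D) in parallel branches and wait for both
parallel(
    'B': {
        build job: 'JOB B', wait: true
    },
    'C then D': {
        build job: 'JOB C', wait: true
        build job: 'JOB D', wait: true
    }
)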
I am trying to build a similar system. I am building a certification pipeline where I need to run packager/build/deploy jobs and corresponding test jobs. When all of them are successful, I want to aggregate the test results and trigger the release job that can do an automated Maven release.
I selected the Build Pipeline plugin for visualization of the system. Initially I tried the Parameterized Trigger Plugin with blocking builds. I could not set up artifact archiving/fingerprinting and the downstream build relationship this way, since archiving artifacts only works in a post-build step. Then I put the Parameterized Trigger in the post-build activity. This way I was able to set up downstream builds, fingerprinting and aggregated test results, but build failures were not bubbling up the upstream job chain and the upstream jobs were non-blocking.
I was finally able to achieve this using these plugins:
Build Pipeline
MultiJob Plugin
FingerPrint Plugin
Copy Artifacts Plugin
Join Plugin
I'm using Jenkins 1.514
The system looks like this:
Trigger Job --> build (and deploy) Job (1..n) ---> Test Job (1..n)
Trigger Job -
Create it as a MultiJob and create a fingerprint file in a shell exec step:
echo $(date +%s) > fingerprint.txt
The trick is that the file needs to be archived during the build; to do that, execute this script:
ARCHIVEDIR=$JENKINS_HOME/jobs/$JOB_NAME/builds/$BUILD_ID/archive
mkdir $ARCHIVEDIR
cp fingerprint.txt $ARCHIVEDIR
Create a MultiJob Phase consisting of the build/deploy job.
The build/deploy job is itself a MultiJob:
Follow the same steps for creating the build/deploy job as above with regard to fingerprinting.
Copy the fingerprint.txt artifact from the upstream job.
Set up a MultiJob phase in the deploy job that triggers the test job.
Create a new fingerprint file and force-archive it as in the step above.
Collect the JUnit results in the final test job.
In the Trigger Job, use the Join Plugin to execute the Release Job by choosing 'Run Post-Build Actions at join', and execute the release project only on a stable build of the Trigger Job.
This way all the steps show up in the Build Pipeline view, and the Trigger Job blocks until all downstream builds finish, setting its status to the worst downstream result to give a decision point for the release job.
Multijob Plugin
If you'd like to stop the mess of downstream/upstream job chain definitions, or when you want to add a full hierarchy of Jenkins jobs that will be executed in sequence or in parallel, add context to your build flow by implementing parameter inheritance from the MultiJob down to all its Phases and Jobs. Phases are sequential, while the jobs inside each Phase are parallel.
https://wiki.jenkins-ci.org/display/JENKINS/Multijob+Plugin
