I have two Jenkins jobs.
Job 1: Uploads the build to Artifactory
Job 2: Downloads the build from Artifactory and does some stuff with it
Right now, Job 1 triggers Job 2 using the Parameterized Build Plugin. (Job 2 is shared amongst many teams at my company, so I don't want to change it too much; it's a parameterized job that takes an Artifactory URL.)
The problem is, the artifact doesn't always finish uploading to Artifactory before Job 2 is triggered, so Job 2 sometimes gets a 404 when it tries to download it. Is there some way to 1) prevent triggering Job 2 until the artifact has finished uploading, or 2) pass the artifact directly from Job 1 to Job 2 without needing to do an upload and a download? (The former would be preferable, since option 2 would require changing Job 2.)
Regarding your option 1, you can use the Naginator plugin to reschedule Job 2 if it fails.
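If Job 1 is (or could be wrapped in) a Pipeline, a similar retry-on-failure effect can also be had with the core retry step instead of Naginator. A minimal sketch, where the job name, parameter name, and URL are placeholders, not taken from the question:

    // Sketch only: re-trigger Job 2 up to 3 times from the upstream side.
    // 'job2', 'ARTIFACT_URL' and the URL are placeholder values.
    retry(3) {
        // give a slow Artifactory upload/indexing a moment before each attempt
        sleep time: 30, unit: 'SECONDS'
        build job: 'job2',
              parameters: [string(name: 'ARTIFACT_URL',
                                  value: 'https://artifactory.example.com/repo/my-app-1.0.jar')]
    }

Note this retries the trigger from Job 1's side, whereas Naginator reschedules Job 2 itself after it fails.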
Regarding option 2, you can use the Copy Artifact plugin. It will allow Job 2 to copy the artifacts from Job 1.
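If Job 2 were a Pipeline job, the Copy Artifact plugin also exposes a copyArtifacts step. A minimal sketch with placeholder job and file names:

    // Sketch only: copy archived artifacts from the last successful build of job1.
    // 'job1' and the filter pattern are placeholders.
    copyArtifacts projectName: 'job1',
                  selector: lastSuccessful(),
                  filter: 'build/libs/*.jar',
                  fingerprintArtifacts: true

This requires Job 1 to archive its artifacts in Jenkins (the "Archive the artifacts" post-build action), not only publish them to Artifactory.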
Personally, I prefer option 1. Artifactory is the right place to store binaries :)
There is a 3rd solution: use the quiet period setting on Job 2 to delay its start (see Jenkins: build one job after another with some delay).
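If the trigger is done from a Pipeline, the delay can also be set per trigger via the build step's quietPeriod parameter. A minimal sketch with placeholder names:

    // Sketch only: queue 'job2' but hold it in the queue for 120 seconds.
    // Job name, parameter name and URL are placeholders.
    build job: 'job2',
          quietPeriod: 120,
          wait: false,
          parameters: [string(name: 'ARTIFACT_URL',
                              value: 'https://artifactory.example.com/repo/my-app-1.0.jar')]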
Related
I have three jobs in my Jenkins pipeline. The requirement is to queue the pipeline: it should run as-is for the first build; if a second build is triggered before the first has completed, all three jobs of the first build should finish before the second build starts.
PS: Is this possible with the Build Blocker plugin? If so, how?
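For reference, the Build Blocker plugin is configured per job; here is a hedged Job DSL sketch of that configuration, where all job names and patterns are hypothetical and should be adapted to your setup:

    // Sketch only: block this job while any of the three pipeline jobs is
    // running or queued. Names are placeholders, not from the question.
    job('pipeline-trigger') {
        blockOn(['pipeline-job-1', 'pipeline-job-2', 'pipeline-job-3']) {
            blockLevel('GLOBAL')   // consider builds on every node
            scanQueueFor('ALL')    // also treat queued items as blocking
        }
        steps {
            shell('echo "runs only when the three pipeline jobs are idle"')
        }
    }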
I have a Jenkins pipeline that contains a series of jobs (for testing with Selenium and Cucumber BDD). Every time we run the pipeline, even when the functional tests pass (I call this the test status), it takes time to save the artifacts before the job is considered PASSED (I call this the job status). So for a simple test that takes only 1 minute to run, saving the artifacts from the Jenkins slave to the Jenkins master takes around the same time or more before the job is considered passed. In terms of fast feedback to the team while running these jobs, this slows down the whole flow.
So I wonder if there is a way to modify or configure the post-build actions so that the test status is sent to the pipeline right after the tests run (while still saving the artifacts)?
I just configured the post-build action:
Archive the artifacts - Files to archive: **
My expectation, basically, is that the test status (passed/failed) is passed to the pipeline build script right away, so that the pipeline script 'acknowledges' it much faster.
As per my understanding, the success or failure status can't be sent to the upstream job until the upload (artifact archiving) has completed.
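In Pipeline terms the same constraint shows up in the build step: it only returns once the downstream build, including its post-build archiving, has finished. A minimal sketch with a placeholder job name:

    // Sketch only: 'selenium-tests' is a placeholder downstream job name.
    // downstream.result becomes available only after the whole downstream
    // build, artifact archiving included, has completed.
    def downstream = build job: 'selenium-tests', wait: true, propagate: false
    echo "Test job finished with result: ${downstream.result}"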
There are two jobs on my Jenkins server:
job1: builds every 10 minutes to scan for events; if one occurs, it triggers the downstream job2
job2: a normal job, mostly run once in that case.
Problem:
Too many useless Jenkins builds of job1 show up in the UI, since it runs frequently.
It would be good if a build could be discarded when it doesn't trigger the downstream job.
Solution so far: using the Discard Old Builds plugin in a post-build action is one direction, but I have no clue how to get it to work nicely.
With the hints from @JamesD's comments, I can use several plugins to achieve this:
Archive Artifacts plugin: to archive the param.txt file, which is used to pass values to the downstream jobs
Groovy Postbuild plugin: add a Groovy script that checks whether param.txt exists; the build is set to Aborted if it doesn't (see the sketch below)
Discard Old Builds plugin: will discard the aborted builds
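A sketch of the Groovy Postbuild check described above, assuming param.txt is written to the root of the workspace; the direct result assignment is a common workaround and may require script approval:

    // Sketch only: mark the build ABORTED when param.txt was not produced,
    // i.e. no downstream job was triggered. File name/location are assumptions.
    def paramFile = manager.build.workspace.child('param.txt')
    if (!paramFile.exists()) {
        manager.addShortText('no event - aborted')
        manager.build.@result = hudson.model.Result.ABORTED
    }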
I am currently using the Promoted Builds Plugin to promote jobs into QA, Test, Production, etc. This plugin is really great for promoting a single job. However, I am looking for a way to promote the parent job and have that promotion also promote the rest of the downstream jobs with their corresponding builds.
Here is an example. Let's say I have 3 jobs that are all triggered from an upstream job:
Job 1 - Build 200 - Parent job kicks off Job 2 upon successful build
Job 2 - Build 400 - Job 2 kicks off Job 3 upon successful build
Job 3 - Build 300 - Job 3 builds after Job 2 is successful
After this is done, I want to be able to promote Job 1 build 200 and have that also promote Job 2 build 400 and Job 3 build 300, since these are the artifacts that were built together in the downstream relationship.
This could easily be done by actually rebuilding the jobs and automatically promoting them, but I do not want to rebuild, as the artifacts have already been created.
Any help is much appreciated!
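As a starting point, when fingerprinting is enabled between the jobs, Jenkins can already resolve which downstream builds belong to a given upstream build. A hedged script-console sketch (job name and build number are placeholders; it assumes freestyle jobs, whose builds expose getDownstreamBuilds()), which could then be extended to trigger the corresponding promotions:

    // Sketch only, for the Jenkins script console. 'Job 1' and 200 are
    // placeholders; the mapping below relies on artifact fingerprints.
    import jenkins.model.Jenkins

    def job1 = Jenkins.instance.getItemByFullName('Job 1', hudson.model.Job)
    def build200 = job1.getBuildByNumber(200)
    build200.getDownstreamBuilds().each { project, buildRange ->
        println "Downstream of Job 1 #200: ${project.fullName} builds ${buildRange}"
    }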
I have the following Jenkins workflow:
I'm using a Build Flow to orchestrate several jobs.
Job 1 -> publishes artifacts to Artifactory
Job 2 -> publishes artifacts to Artifactory
Job 3 -> uses artifacts from Artifactory
(I actually have several more jobs, with parallelization, which is why the Build Flow is particularly useful; a sketch of the flow is shown below.)
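A minimal sketch of such a flow in the Build Flow DSL, with placeholder job names:

    // Sketch only: run the two publishing jobs in parallel, then the consumer.
    parallel(
        { build("job1") },   // publishes its artifacts to Artifactory
        { build("job2") }    // publishes its artifacts to Artifactory
    )
    build("job3")            // retrieves and uses the artifacts from Artifactory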
Now Job 3 might have used the artifact from Job 1 Build 1 (J1B1) and from Job 2 Build 2 (J2B2).
I'd like to set up a promotion on Job 1 and Job 2.
For instance, after Job 3 has run, it should detect that it retrieved artifacts from Artifactory that came from J1B1 and J2B2, and therefore apply a promotion to these builds.
Is it possible to record such a link without explicitly recording fingerprints in Jenkins (relying only on the fingerprints of the artifacts published to / retrieved from Artifactory)?