Include a different job in a job's build steps in Jenkins

I am trying to set up a rather unusual build flow, and I haven't found a plugin or a way to do it with Jenkins yet.
There is one job called "JOB A" which is used by itself and creates a standalone installer.
Then there is "JOB B", which creates another installer, but it needs to include everything built in "JOB A" in addition to some other stuff. Now, I could just copy JOB A's build steps into JOB B, but I want to actually build JOB A, and maybe even use those artifacts later as well.
It cannot be a build trigger, because JOB B needs to continue building after JOB A has finished. I also cannot use something like the Build Flow plugin, because that creates a JOB C that only sequences the other jobs, and I would still need to go into A and B to get the artifacts.
Bonus points if, when building JOB B, it checked JOB A's source code in Git for any changes since its last build and decided whether JOB A needs to be built again.
I looked at many plugins and I can't seem to find one that would do this.
I hope my explanation was not confusing; sorry if it was, I can elaborate.

If I understand correctly what you want, then what you need is:
Custom (shared) workspace
Parameterized Trigger Plugin
For both JOB A and JOB B, set up the Custom Workspace to the same folder on the server. (You can even leave JOB A's workspace as is and just point JOB B's custom workspace to JOB A's workspace.) I am not at my work computer with Jenkins and can't provide screenshots, so I will borrow this great guide for more info on how to set up a custom workspace.
Then, whenever appropriate, have JOB A execute the build step Trigger/call builds on other projects, targeting JOB B. You can even pass it all the same parameters that JOB A had. By default, this will not wait for JOB B to complete: it will kick off JOB B, JOB A will finish running in the meantime, and JOB B will complete whenever it is done.
If needed, you can check Block until the triggered projects finish their builds, and then JOB A will wait for JOB B to finish before continuing.
So, the above will:
Share the workspace, and not do extra checkouts if the code didn't change
Let JOB A and JOB B exist independently, each with its own artifacts and each able to be triggered separately
Give JOB B everything from JOB A through the shared workspace and the passed parameters
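For illustration, here is a minimal Pipeline (Jenkinsfile) sketch of the same trigger-and-pass-parameters idea. The job names JOB-A/JOB-B and the VERSION parameter are made-up placeholders; the build step itself comes from the standard Pipeline: Build Step plugin.

```groovy
// Scripted Pipeline sketch for JOB A (hypothetical job names and parameter).
node {
    checkout scm

    // ... JOB A's own build steps go here ...

    // Trigger JOB B, passing along the same parameters JOB A received.
    // wait: false mimics the default fire-and-forget behavior; set
    // wait: true to block, like the "Block until the triggered projects
    // finish their builds" checkbox.
    build job: 'JOB-B',
          wait: false,
          parameters: [string(name: 'VERSION', value: params.VERSION ?: '1.0')]
}
```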

Related

Jenkins - Job Y is configured to run after Job X, but is not triggered

I have a Jenkins server on K8s (using Rancher).
Job Y is configured to run after job X (image is attached).
Job X runs successfully, but at its end, job Y is not triggered.
Both jobs are on the same Jenkins master.
This problem has never happened to me before on our on-prem Jenkins servers.
I have checked everything I could think of, including creating test jobs that basically do nothing, just to test whether triggering works for other jobs on this master; it does not.
Checking the logs: job X is in the log (image is also attached), but job Y is missing from the log completely.
I will mention that on another master on the same cluster, triggering is working. Any ideas?
Check the trigger condition for job Y.
Below are several ways you can trigger jobs in Jenkins.
Chain jobs together
Chain Jenkins jobs together using the “Build other projects” option in the “Post Build Actions” of a job. The downside to this is that it introduces a dependency between jobs. For example, I want to be able to run the database backup job without doing a deployment every time.
Parameterized Trigger Plugin
When you have the Parameterized Trigger Plugin installed:
Create a wrapper job for your sequential jobs
For each sequential job
Select Build->Add build step->Trigger/call builds on other projects
Enter the sequential job name
Check the ‘Block until the triggered projects finish their builds’ checkbox (this only appears when you have the Parameterized Trigger Plugin installed)
Now when you run the wrapper job, all the triggered jobs will run sequentially and in order. You also have the option of using the plugin's parameter-passing functionality.
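For comparison, if the wrapper job were a Pipeline job, the same blocking, sequential behavior is just consecutive build steps. 'job-1' and 'job-2' below are hypothetical job names.

```groovy
// Declarative Pipeline sketch of a "wrapper" job that runs jobs sequentially.
pipeline {
    agent any
    stages {
        stage('Run jobs in order') {
            steps {
                // wait: true is the default; each build step blocks until
                // the triggered job finishes, so the jobs run strictly
                // one after another.
                build job: 'job-1', wait: true
                build job: 'job-2', wait: true
            }
        }
    }
}
```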
Throttle Concurrent Builds Plugin
The docs for the Throttle Concurrent Builds Plugin seem to be a little lacking, but it works well. With it installed, a “Throttle Concurrent Builds” section shows up in the Jenkins config section (Jenkins -> Manage Jenkins -> Configure System). There, you create a “Multi-Project Throttle Category”.
Next, for each job that needs to be run sequentially, you
Check the ‘Throttle Concurrent Builds’ check box
Select the ‘Throttle this project as part of one or more categories’ radio button
Check the checkbox of the “Multi-Project Throttle Category” you created above
Save
Finally, you create a wrapper job that triggers all your to-be-run-sequentially jobs and when you run it, you should see your jobs running sequentially and in order.
Although a fairly complex option, this approach also allows you to run some jobs sequentially and others in parallel. For example, your wrapper job can run jobs A and B sequentially and jobs C and D sequentially, but trigger A and C in parallel. You can also combine this plugin with the Parameterized Trigger Plugin for maximum flexibility.
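In Pipeline terms, that sequential-plus-parallel layout might be sketched like this (job names A through D are placeholders):

```groovy
// Scripted Pipeline sketch: run A->B and C->D sequentially, with the
// two chains running in parallel (hypothetical job names).
node {
    parallel(
        'chain AB': {
            build job: 'A'   // blocks until A finishes
            build job: 'B'   // then runs B
        },
        'chain CD': {
            build job: 'C'
            build job: 'D'
        }
    )
}
```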
Join Plugin
Although I haven’t used the Join Plugin myself, it seems worth mentioning. Its docs state that
This plugin allows a job to be run after all the immediate downstream jobs have completed. In this way, the execution can branch out and perform many steps in parallel, and then run a final aggregation step just once after all the parallel work is finished.
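For what it's worth, in a Pipeline job this join behavior falls out naturally: any step written after a parallel block runs exactly once, after all branches finish. A rough sketch with hypothetical job names:

```groovy
// Fan out, then aggregate once all parallel branches are done.
node {
    parallel(
        'branch 1': { build job: 'downstream-1' },
        'branch 2': { build job: 'downstream-2' }
    )
    // Runs only after both branches above have completed, like the
    // Join Plugin's aggregation step.
    build job: 'aggregation-job'
}
```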
That’s it. Several options for running Jenkins jobs sequentially and in parallel. Choose the one that best suits your needs…

Jenkins: Let a job wait while another job is running

I have a set of components, one being the base component (A) and all others (B1 .. Bn) being dependent on the base component.
I have Jenkins jobs for each component, which are triggered by SCM changes in the respective repositories. Now I would like to configure things so that a Bx component's job waits while the base component A is building, in order to have the latest build of A incorporated in its own build.
I looked at the Throttle Concurrent Builds plugin, but I don't think I can use it in this case. I also found code snippets to detect whether another specific job is running. But then, how do I make the current job wait until the other one is finished?
Try the Build Blocker plugin; it does exactly what you need: you can specify the job name of the A component in the Blocking jobs section of every Bx component's job.
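If the Bx jobs ever become Pipeline jobs, the Lockable Resources plugin's lock step is another way to get this wait-while-A-builds behavior. This sketch assumes job A wraps its own build in the same lock; the resource name and build command are invented.

```groovy
// Sketch for a Bx component's Jenkinsfile, assuming the Lockable
// Resources plugin is installed and job A takes the same lock.
pipeline {
    agent any
    stages {
        stage('Build Bx') {
            steps {
                // Waits here while any other job holds the lock,
                // i.e. while component A is currently building.
                lock(resource: 'component-a-build') {
                    sh './build.sh'   // placeholder build command
                }
            }
        }
    }
}
```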

Upstream and downstream puzzle in Jenkins

Consider that I have jobs A,B,C,D in Jenkins.
Job A - contains my common code dependencies.
Jobs B, C, D - these are independent jobs that refer to the dependencies in Job A.
Please note that A and B, C, D have an upstream/downstream relationship. I have added build triggers in each of them such that each time code is merged in Git, it triggers the job.
The problem I face is that each time the build is triggered for B, it invokes A. Since A has C and D in its list of downstream jobs, it builds C and D too. I want to make sure that if B is triggered by a commit, only jobs A and B are built, and not C and D.
I think there is a wrong configuration that I have set; kindly suggest.
I use Jenkins version 1.643. Please let me know if there are plugins that can help me handle this.

How to pass the Kitver to the job using Jenkins build promotion?

I need to promote a deploy job, but I only have the Kitver, which is a parameter passed from the last downstream job.
So let us say we have the following jobs:
A compiles, builds, creates a versioned zip, and pushes it to an FTP server. I don't store artifacts in the job workspace.
B does the deployment by taking the Kitver passed on to it from A.
C does the testing.
All of the above is a build pipeline process, so all of it happens sequentially.
D is a Prod deploy job, not in the pipeline, but it needs to be connected to job A using a promotion process. How do I pass the Kitver to this job using promotion?
I am assuming that job A is the one that has the promotion process configured.
Perhaps you could store the Kitver in a file and archive it as a build artifact.
When the promotion process runs, it has a parameter, PROMOTED_BUILD, which is the build being promoted. If you pass that number to job D as a parameter, D can get the file and read the Kitver.
Alternatively you can get the file in the promotion process, read the file and pass Kitver as a parameter to D. Please note that this has to be done carefully. The promotion process runs in the same workspace as build A. The promotion process may be running at the same time as the next build of A, so they must take care not to touch or delete each other's files.
It would be safer to simply pass PROMOTED_BUILD to D and let D do all the complicated things because D will have its own workspace.
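A rough sketch of that safer variant, assuming job D is a Pipeline job, the Copy Artifact plugin is installed, and job A archived the Kitver in a file named kitver.txt (the job, file, and parameter names are placeholders):

```groovy
// Sketch of job D: fetch kitver.txt from exactly the promoted build
// of A, using the build number handed over by the promotion process.
pipeline {
    agent any
    parameters {
        string(name: 'PROMOTED_BUILD', defaultValue: '',
               description: 'Build number of job A being promoted')
    }
    stages {
        stage('Prod deploy') {
            steps {
                copyArtifacts projectName: 'A',
                              selector: specific(params.PROMOTED_BUILD),
                              filter: 'kitver.txt'
                script {
                    def kitver = readFile('kitver.txt').trim()
                    echo "Deploying Kitver ${kitver} to prod"
                    // ... actual deployment steps ...
                }
            }
        }
    }
}
```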

Hudson dependencies

I have set up my Hudson job A. Job A depends on jobs B and C. I have set them up with "Build other projects". This works well, although each job is in a separate directory in my workspace (the default structure). But I need jobs B and C in job A's workspace (root folder).
I have considered two approaches:
Change the workspace for job A and push that variable to the job via "Trigger parameterized build on other projects", then use an Ant build script to copy them to that location, since I couldn't find an option to change the folder where job B or C should go.
Trigger job B and then C from a build script as part of job A. This is done via remote calls (I found it somewhere on Stack Overflow), but that option is missing in my configuration and I couldn't find any plugin that would add it.
The ideal approach for me would be to use an Ant build script and trigger jobs B and C from there, with antsvn or something like that. But I can't find a solid example of this.
The reason why I want it this way is simple: job B is a CMS which is essential for job A, and job C has Python scripts that need to be executed before a new version can land on the production server (this is already done with py-ant).
Or maybe there is some better way to manage dependencies like this. Any help is appreciated.
I hope it makes sense.
Think of Jobs "B" and "C" as producing "artifacts" that Job "A" needs. Then, all you have to do is import the artifacts produced by Jobs "B" and "C" whenever you build Job "A".
Your jobs shouldn't share workspaces. Otherwise what happens if Job "A" is building when Job "B" or "C" is triggered? You'll have multiple builds going on at once. However, if you separate out what "A" needs from jobs "B" and "C", you can have Job "A" import those dependencies. There are two ways of doing this:
The hard but correct way: You should create a release repository where jobs can fetch the artifacts they need. If this sounds Maven-ish to you, well, it is. However, I've used Maven architectural stuff without Maven projects and it works fine. You can use something like Artifactory or Nexus as your release repository. Then use wget or curl to fetch items from the repository, and use Maven's deploy:deploy-file plugin to send the stuff over. You will need Maven (which is a Java process) to run deploy:deploy-file, but you don't need a Maven project, or even a Java project. The deploy:deploy-file plugin doesn't even require a Maven pom.xml file. Think of it more like a command-line utility to send stuff to your release repository.
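To make the deploy:deploy-file idea concrete, here is a sketch of the upload, wrapped in a modern Jenkins Pipeline sh step for context. The coordinates, file path, repository URL, and repository ID are all made-up examples.

```groovy
// Sketch: publish a job's output to a release repository using
// mvn deploy:deploy-file. All coordinates and URLs are hypothetical.
node {
    sh '''
        mvn deploy:deploy-file \
            -Dfile=dist/cms.zip \
            -DgroupId=com.example \
            -DartifactId=cms \
            -Dversion=1.0.${BUILD_NUMBER} \
            -Dpackaging=zip \
            -Durl=https://nexus.example.com/repository/releases \
            -DrepositoryId=releases
    '''
}
```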
The easy, but incorrect way: Hudson has a Copy Artifacts plugin that you can use to do this. The problem is that it's easy to set up, but hard to keep track of. Plus, it makes you dependent on a very specific tool: if you decide to move away from Hudson, you might not be able to duplicate this functionality.
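And for contrast, the Copy Artifacts approach in the same sketch style (today's Jenkins Pipeline syntax; job names and file patterns are hypothetical):

```groovy
// Sketch: Job A imports what it needs from Jobs B and C using the
// Copy Artifact plugin, instead of sharing workspaces.
node {
    checkout scm
    // Pull the latest successful artifacts from B and C into A's workspace.
    copyArtifacts projectName: 'B', selector: lastSuccessful(), filter: 'cms/**'
    copyArtifacts projectName: 'C', selector: lastSuccessful(), filter: 'scripts/*.py'
    // ... Job A's own build steps ...
}
```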
