I have two Jenkins jobs, A1 and A2, that retrieve the project from SVN, and a job B which builds it. Jobs A1 and A2 both retrieve the codebase, but they can be parameterized differently. Both reuse job B for building.
Job B copies the artifacts from the upstream job, but my problem is that you can only specify one upstream job. I need job B to retrieve artifacts from either A1 or A2, depending on which one triggered the downstream build. Any ideas?
This is a somewhat unusual way to structure your builds. Usually people prefer to check out the sources in the same job that builds them. It keeps things simple and is usually much faster than using Jenkins' artifact copying which, let's face it, isn't the fastest way to move stuff around.
But if you really think that's the proper way for you to do it, have you tried providing the job name as a parameter? When A1 triggers B, it should pass "A1" as a parameter and so on.
The built-in Jenkins post-build action can trigger other jobs, but it cannot pass parameters. You can install the Parameterized Trigger Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin), which can trigger other jobs with parameters.
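For illustration, here is a minimal sketch of the downstream side, assuming B were a Pipeline job with the Copy Artifact plugin installed and that A1/A2 pass their own name and build number as parameters. The parameter names UPSTREAM_JOB and UPSTREAM_BUILD are made up for this example:

    // Sketch only: assumes job B defines string parameters UPSTREAM_JOB and
    // UPSTREAM_BUILD, which A1/A2 fill in via their trigger step.
    node {
        // Copy the artifacts from exactly the upstream build that triggered us.
        copyArtifacts projectName: params.UPSTREAM_JOB,
                      selector: specific(params.UPSTREAM_BUILD)
        // ... build steps that use the copied files go here ...
    }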
I have a multi-module Maven project that installs a whole bunch of artifacts (with different classifiers) into the local Maven repository. I also have a second Maven project that uses the Maven Dependency Plugin to collect those artifacts into a number of different directories (for installer building purposes). And finally I have a Jenkins that I want to do all that for me.
There are a number of requirements I would like to see fulfilled:
1. Building the source code (and running the tests) and building the installers should be two separate jobs, Job A and Job B.
2. Job A needs to finish quickly; as it contains the tests, the developers should get feedback as fast as possible.
3. The artifacts of Job B take up a lot of space, but they need to be archived, so this job should only run when the results of Job A meet certain requirements (which are not part of this problem).
4. Job B needs to be connected to Job A. It must be possible to tell exactly which Job A instance created the files that were used in the build of Job B. (It is also possible that I need a run of Job B for a particular build of Job A from three weeks and 200 builds ago.)
5. And finally, both jobs should be able to be executed locally on a developer's machine, so I would love to keep most of the configuration within Maven and only relegate to Jenkins what's absolutely necessary. (Using the Copy Artifacts Plugin I can collect the artifacts from Job A into the required directories in Job B, but by removing the collection from the Maven project I also take away the developer's ability to do local builds.)
Parts of 3 and 4 can be achieved using the Promoted Builds plugin for Jenkins. However, I cannot seem to make sure that the files collected in Job B are exactly the files created by a certain run of Job A. During development all our version numbers of all involved projects are suffixed with “-SNAPSHOT” so that an external job has no way of knowing whether it actually got the correct file or whether it was given a newer file because another instance of Job A has been running concurrently. The version numbers are then increased directly before a release.
Here are some things I have tried and found to be unsatisfactory:
Use a local repository in the workspace directory of Job A (see the sketch after this list). This will, upon each build, download all of the dependencies from our Nexus. While this does not have a huge impact on disk space, it consumes far too much time.
Merge Job A and Job B into a single job. As Job B takes more time than Job A, developers have to wait longer for feedback, it still uses a lot of disk space, and it doesn't really solve the problem: there is still the possibility of another Job A+B running at the same time.
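For reference, the workspace-local repository attempt from the first item boils down to something like this (a Pipeline-flavoured sketch of the idea; the repository path is illustrative):

    // Workspace-local Maven repository: dependencies are resolved into the
    // job's own workspace instead of the machine-wide ~/.m2 repository,
    // so each build downloads them from Nexus.
    node {
        sh "mvn -Dmaven.repo.local=${env.WORKSPACE}/.m2/repository clean install"
    }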
Am I missing something obvious here? Are Maven or Jenkins or the combination of both unable to do what I want? What else could I try?
Consider that I have jobs A, B, C, and D in Jenkins.
Job A contains my common code dependencies.
Jobs B, C, and D are independent jobs which refer to the dependencies in Job A.
Please note that A and B, C, D have an upstream/downstream relationship. I have added build triggers to each of them so that each time code is merged in Git, the job is triggered.
The problem I face is that each time a build is triggered for B, it invokes A. Since A has C and D in its list of downstream jobs, it builds C and D too. I want to make sure that if B is triggered by a commit, only jobs A and B are built, not C and D.
I think I have set a wrong configuration somewhere; kindly suggest a fix.
I use Jenkins version 1.643. Please let me know if there are plugins which can help me handle this.
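In Pipeline terms (on a newer Jenkins than 1.643), the flow I want for job B would look roughly like this; the stage names and build script are only illustrative:

    // Intended flow for job B: build the shared dependency job A explicitly,
    // then run B's own steps, without ever triggering C or D.
    node {
        stage('Build shared dependencies') {
            build job: 'A', wait: true      // block until A has finished
        }
        stage('Build B') {
            sh './build-b.sh'               // placeholder for B's own build steps
        }
    }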
I need to promote a deploy job, but I only have the Kitver, which is a parameter passed from the last downstream job.
So let us say I have the following jobs:
Job A builds and compiles the code, creates a zip with the version, and pushes it to an FTP server. I don't store artifacts in the job workspace.
Job B does the deployment, taking the Kitver passed to it from A.
Job C does the testing.
All of the above is a build pipeline, so the jobs run sequentially.
Job D is a production deploy job, not in the pipeline, but it needs to be connected to Job A using a promotion process. How do I pass the Kitver to this job using promotion?
I am assuming that job A is the one that has the promotion process configured.
Perhaps you could store the Kitver in a file and archive it as a build artifact.
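A Pipeline-flavoured sketch of that idea (the kitver.txt file name is just an example, and KITVER is assumed to be available as a variable in job A):

    // In job A: write the Kitver to a small file and archive it, so every
    // build of A keeps a record of the Kitver it produced.
    node {
        writeFile file: 'kitver.txt', text: env.KITVER
        archiveArtifacts artifacts: 'kitver.txt'
    }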
When the promotion process runs, it has a parameter, PROMOTED_BUILD, which identifies the build being promoted. If you pass that number to job D as a parameter, D can get the file and read the Kitver.
Alternatively, you can get the file in the promotion process, read it, and pass the Kitver as a parameter to D. Please note that this has to be done carefully: the promotion process runs in the same workspace as build A, and it may be running at the same time as the next build of A, so they must take care not to touch or delete each other's files.
It would be safer to simply pass PROMOTED_BUILD to D and let D do all the complicated things, because D will have its own workspace.
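For example, job D could look roughly like this as a Pipeline job, assuming the promotion passes the promoted build number to D as a string parameter named PROMOTED_BUILD and that A archived a kitver.txt as sketched above:

    // In job D: fetch the archived Kitver file from exactly that build of A.
    node {
        copyArtifacts projectName: 'A',
                      selector: specific(params.PROMOTED_BUILD),
                      filter: 'kitver.txt'
        def kitver = readFile('kitver.txt').trim()
        echo "Deploying Kitver ${kitver} to production"
        // ... production deployment steps using kitver go here ...
    }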
I am trying to create a rather unusual build flow, and I haven't found a plugin or a way to do it with Jenkins yet.
There is one job called "JOB A" which is used by itself and creates a standalone installer.
Then there is "JOB B" which creates another installer but it needs to include everything built in "JOB A" in addition to some other stuff. Now I could just copy JOB A build steps into JOB B, but I want to actually build JOB A and maybe even use those artifacts later as well.
It cannot be a build trigger cause JOB B needs to continue building after JOB A has finished and I cannot use something like flow because that creates JOB C and only sequences other jobs and I would need to go into A and B to get the artifacts.
Bonus points would be if it checked JOB A source code in git for any changes since its last build when building JOB B and decide if it needs to build it again.
I looked at many plugins and I can't seem to find one that would do this.
I hope my explanation was not confusing. Sorry if it was; I can elaborate.
If I understand correctly what you want, then what you need is:
Custom (shared) workspace
Parameterized Trigger Plugin
For both JOB A and JOB B, set the Custom Workspace to the same folder on the server. (You can even leave JOB A's workspace as is, and just point JOB B's custom workspace at JOB A's workspace.) I am not at my work computer with Jenkins and can't provide screenshots, so I will borrow this great guide for more info on how to set up a custom workspace.
Then, whenever appropriate, have JOB A execute the build step "Trigger/call builds on other projects", targeting JOB B. You can even pass it all the same parameters that JOB A had. By default, this will not wait for JOB B to complete: it will kick off JOB B, JOB A will finish running in the meantime, and JOB B completes whenever it is done.
If needed, you can check-mark "Block until triggered projects finish their builds", and then JOB A will wait for JOB B to finish before continuing.
So, the above will:
Share workspace, and not do extra checkouts if code didn't change
Let JOB A and JOB B exist independently, each with its own artifacts, and each able to be triggered separately.
JOB B will get everything from JOB A through shared workspace and passed parameters.
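If you ever move these to Pipeline jobs, the same idea looks roughly like this (the shared workspace path is illustrative, and whether to wait is your choice):

    // In JOB A: run in a fixed shared workspace, then trigger JOB B.
    node {
        ws('/srv/jenkins/shared/jobA') {   // shared custom workspace
            // ... JOB A checkout and build steps ...
        }
        // Trigger JOB B. wait: false is the fire-and-forget behaviour
        // described above; wait: true blocks until JOB B finishes.
        build job: 'JOB B', wait: false
    }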
Is there a way to share variables across Jenkins jobs?
Job1 collects the required source code and labels it using Perl scripts.
Then a number of other jobs compile the code, since there are many versions. As of now I have made the other jobs depend on Job1, so that the same code can be collected from head, since it was labelled just beforehand in Job1. But this was not the case during the release: code was going into the repository at odd hours, so we had no control over it. So we thought it would be nice if we could find a way to sync the code using the Perforce label created in Job1, but I did not find any way to sync to a particular label that was created in a different job.
So I thought that if we could set an environment variable and then use it in the following jobs, the code would be in perfect sync. But it seems environment variables cannot be shared across jobs.
I would appreciate any ideas and help.
Can you use the "Use Upstream Project revision" option? It allows you to sync to the changeset of another project.
If you want to stick to the label idea, I think it's doable. I haven't tried this, but I think I would first have the first job create a new label based on the job name and the build number; both are available in the create label post-build action.
If you launch the downstream job using the Parameterized Trigger plugin, it will have access to the upstream job name and build number as environment variables. The 'P4 Label' field in the downstream job can then use parameter substitution to specify the correct label name to sync to.
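A rough sketch of that idea expressed as a Pipeline job rather than the Parameterized Trigger build step (the P4_LABEL parameter name and the 'compile-job' job name are made up):

    // In Job1: name the label after this job and build, then hand the label
    // name to the downstream compile job as a parameter.
    node {
        def labelName = "${env.JOB_NAME}-${env.BUILD_NUMBER}"
        // ... create the Perforce label here ...
        // The downstream job's 'P4 Label' field can then expand ${P4_LABEL}.
        build job: 'compile-job',
              parameters: [string(name: 'P4_LABEL', value: labelName)]
    }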
The Perforce plugin can help you here.
Look at the section "Sync multiple builds to the same changeset".