How to pass the Kitver to the job using Jenkins build promotion?

I need to promote a deploy job, but all I have is the Kitver, a parameter passed along from the last downstream job.
So let us say I have the following jobs:
A builds, compiles, and creates a zip with a version number, then pushes it to an FTP server. I don't store artifacts in the job workspace.
B does the deployment, taking the Kitver passed on to it from A.
C does the testing.
All of the above is a build pipeline, so the jobs run sequentially.
D is a production deploy job. It is not in the pipeline but needs to be connected to job A using a promotion process. How do I pass the Kitver to this job using promotion?

I am assuming job A is the one that has the promotion process configured.
You could store the Kitver in a file and archive it as a build artifact.
When the promotion process runs, it knows the number of the build being promoted (the Promoted Builds plugin exposes this as PROMOTED_NUMBER). If you pass that number to job D as a parameter, D can fetch the file and read the Kitver.
Alternatively, you can fetch the file in the promotion process itself, read it, and pass the Kitver as a parameter to D. Note that this has to be done carefully: the promotion process runs in the same workspace as job A, and it may run at the same time as the next build of A, so the two must take care not to touch or delete each other's files.
It would be safer to simply pass the promoted build number to D and let D do all the complicated things, because D will have its own workspace.
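As a minimal sketch of that safer variant (the job names, the file name, and the way job A computes the Kitver are all assumptions, not details from the original setup):

    # Job A, "Execute shell" build step: record the Kitver in a file.
    # KITVER stands for however job A derives the kit version (hypothetical).
    echo "$KITVER" > kitver.txt
    # Then archive kitver.txt with the "Archive the artifacts" post-build action.

    # Job D, "Execute shell" build step. PROMOTED_BUILD is a string
    # parameter the promotion process passes to D (fill it from the
    # plugin's PROMOTED_NUMBER variable). JENKINS_URL is the standard
    # Jenkins environment variable; add credentials to the curl call if
    # your Jenkins requires authentication.
    KITVER=$(curl -fsS "$JENKINS_URL/job/A/$PROMOTED_BUILD/artifact/kitver.txt")
    echo "Deploying kit version $KITVER"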

Related

Use Maven Artifacts Installed by Jenkins Job A in Job B

I have a multi-module Maven project that installs a whole bunch of artifacts (with different classifiers) into the local Maven repository. I also have a second Maven project that uses the Maven Dependency Plugin to collect those artifacts into a number of different directories (for installer building purposes). And finally I have a Jenkins that I want to do all that for me.
There are a number of requirements I would like to see fulfilled:
1. Building the source code (and running the tests) and building the installers should be two separate jobs, Job A and Job B.
2. Job A needs to finish quickly; as it contains the tests, the developers should get feedback as fast as possible.
3. The artifacts of Job B take up a lot of space, but they need to be archived, so this job should only run when the results of Job A meet certain requirements (which are not part of this problem).
4. Job B needs to be connected to Job A. It must be possible to tell exactly which Job A build created the files that were used in the build of Job B. (It is also possible that I need a run of Job B for a particular build of Job A from three weeks and 200 builds ago.)
5. And finally, both jobs should be able to be executed locally on a developer’s machine, so I would love to keep most of the configuration within Maven and only relegate to Jenkins what’s absolutely necessary. (Using the Copy Artifacts Plugin I can collect the artifacts from Job A into the required directories in Job B, but by removing the collection from the Maven project I also take away the developer’s ability to do local builds.)
Parts of requirements 3 and 4 can be achieved using the Promoted Builds plugin for Jenkins. However, I cannot seem to make sure that the files collected in Job B are exactly the files created by a certain run of Job A. During development, the version numbers of all involved projects are suffixed with “-SNAPSHOT”, so an external job has no way of knowing whether it actually got the correct file or whether it was given a newer file because another instance of Job A was running concurrently. The version numbers are increased directly before a release.
Here are some things I have tried and found to be unsatisfactory:
Use a local repository in the workspace directory of Job A. This will, upon each build, download all of the dependencies from our Nexus. While this does not have a huge impact on disk space, it consumes way too much time.
Merge Job A and Job B into a single job. As Job B takes more time than Job A, developers have to wait longer for feedback; it still uses a lot of disk space, and it doesn’t really solve the problem, as another Job A+B could still be running at the same time.
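For reference, the workspace-local repository from the first attempt boils down to a single Maven property (the property is standard Maven; the rest of the command line is just an assumed example):

    # Keep the repository inside Job A's workspace so concurrent builds
    # cannot overwrite each other's -SNAPSHOT artifacts. The cost is that
    # a clean workspace re-downloads every dependency from Nexus.
    mvn -Dmaven.repo.local="$WORKSPACE/.repository" clean install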
Am I missing something obvious here? Are Maven or Jenkins or the combination of both unable to do what I want? What else could I try?

Upstream and downstream puzzle in Jenkins

Consider that I have jobs A, B, C, and D in Jenkins.
Job A contains my common code dependencies.
Jobs B, C, D are independent jobs that refer to the dependencies in Job A.
Please note that A and B, C, D have an upstream/downstream relationship. I have added build triggers to each of them so that each time code is merged in Git, the corresponding job is triggered.
The problem I face is this: each time the build is triggered for B, it invokes A. Since A has C and D in its list of downstream jobs, it builds C and D too. I want to make sure that if B is triggered by a commit, only jobs A and B are built, not C and D.
I think there is a wrong configuration that I have set; kindly suggest.
I use Jenkins version 1.643. Please let me know if there are plugins that can help me handle this.

Include a different job in a job's build steps in Jenkins

I am trying to set up a rather unusual build flow and I haven't found a plugin or a way to do it with Jenkins yet.
There is one job called "JOB A" which is used by itself and creates a standalone installer.
Then there is "JOB B" which creates another installer, but it needs to include everything built in "JOB A" in addition to some other stuff. Now I could just copy JOB A's build steps into JOB B, but I want to actually build JOB A, and maybe even use those artifacts later as well.
It cannot be a build trigger, because JOB B needs to continue building after JOB A has finished, and I cannot use something like a flow job, because that creates a JOB C which only sequences other jobs, and I would still need to go into A and B to get the artifacts.
Bonus points if, when building JOB B, it checked JOB A's source code in Git for any changes since its last build and decided whether it needs to be built again.
I have looked at many plugins and I can't seem to find one that would do this.
I hope my explanation was not confusing. Sorry if it was; I could elaborate.
If I understand correctly what you want, then what you need is:
Custom (shared) workspace
Parameterized Trigger Plugin
For both JOB A and JOB B, set the Custom Workspace to the same folder on the server. (You can even leave JOB A's workspace as is, and just point JOB B's custom workspace at JOB A's workspace.) I am not at my work computer with Jenkins and can't provide screenshots, so I will defer to this great guide for more info on how to set up a custom workspace.
Then, whenever appropriate, have JOB A execute the build step "Trigger/call builds on other projects", targeting JOB B. You can even pass it all the same parameters that JOB A had. By default, this will not wait for JOB B to complete: it will kick off JOB B, JOB A will finish running in the meantime, and JOB B completes whenever it is done.
If needed, you can check-mark "Block until the triggered projects finish their builds", and then JOB A will wait for JOB B to finish before continuing.
So, the above will:
Share the workspace, and avoid extra checkouts if the code didn't change.
Let JOB A and JOB B exist independently, each with its own artifacts, and each able to be triggered separately.
JOB B will get everything from JOB A through the shared workspace and the passed parameters.
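For illustration, once both jobs point at the same custom workspace, JOB B's shell step can use JOB A's output in place (the directory names here are hypothetical):

    # JOB B, "Execute shell" build step. Because JOB B's custom workspace
    # is JOB A's workspace, JOB A's build output is already on disk here.
    mkdir -p full-installer
    cp -r standalone-installer/. full-installer/   # JOB A's output (assumed path)
    # ...add JOB B's extra pieces and build the combined installer...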

Jenkins copy artifacts from upstream job when there are multiple upstream jobs

I have two Jenkins jobs, A1 and A2, that retrieve the project from SVN, and a job B which builds it. Jobs A1 and A2 both retrieve the codebase, but you can parameterize them differently. Both reuse job B for building.
Job B copies the artifacts from the upstream job, but my problem is that you can only specify one upstream job. I need to specify that job B can retrieve artifacts from either A1 or A2, depending on which one triggered the downstream build. Any ideas?
This is a somewhat unusual way to structure your builds. Usually people prefer to check out the sources in the same job that builds the sources. It keeps things simple and is usually much faster than using Jenkins' artifact copying which, let's face it, isn't the fastest way to move stuff around.
But if you really think that's the proper way for you to do it, have you tried providing the job name as a parameter? When A1 triggers B, it should pass "A1" as a parameter and so on.
The built-in Jenkins post-build action can trigger other jobs, but it cannot pass parameters. You can install the Parameterized Trigger Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin), which can trigger other jobs with parameters.
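A sketch of how that could look on job B's side, assuming A1 and A2 use the plugin's "Predefined parameters" to pass UPSTREAM_JOB=A1 (or A2) and UPSTREAM_BUILD=$BUILD_NUMBER, and assuming a hypothetical artifact path:

    # Job B, "Execute shell" build step: fetch the artifact from whichever
    # job triggered this build, via Jenkins' standard artifact URL scheme.
    curl -fsS -O "$JENKINS_URL/job/$UPSTREAM_JOB/$UPSTREAM_BUILD/artifact/dist/project.zip"

The Copy Artifact plugin also expands variables in its "Project name" field, so ${UPSTREAM_JOB} should work there as well.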

Hudson dependencies

I have set up my Hudson job A. Job A depends on jobs B and C. I have set them up with "Build other projects". This works well, although each job is in a separate directory in my workspace (the default structure). But I need jobs B and C in job A's workspace (the root folder).
I have considered two approaches:
Change the workspace for job A and push that variable to the other jobs via "Trigger parameterized build on other projects", then use an Ant build script to copy them to that location, since I couldn't find an option to change the folder where job B or C should go.
Trigger job B and then C from a build script as part of job A. This is done via remote calls (I found it somewhere on Stack Overflow), but that option is missing in my configuration and I couldn't find any plugin that would add it.
The ideal approach for me would be to use an Ant build script and trigger jobs B and C from there, with antsvn or something like that. But I can't find a solid example of this.
The reason why I want it this way is simple: job B is a CMS which is essential for job A, and job C has Python scripts that need to be executed before a new version can land on the production server (this is already done with py-ant).
Or maybe there is some better way to manage dependencies like this. Any help is appreciated.
I hope it makes sense.
Think of Jobs "B" and "C" as producing "artifacts" that Job "A" needs. Then, all you have to do is import the artifacts produced by Jobs "B" and "C" whenever you build Job "A".
Your jobs shouldn't share workspaces. Otherwise what happens if Job "A" is building when Job "B" or "C" is triggered? You'll have multiple builds going on at once. However, if you separate out what "A" needs from jobs "B" and "C", you can have Job "A" import those dependencies. There are two ways of doing this:
The hard but correct way: You should create a release repository where jobs can fetch the artifacts they need. If this sounds Maven-ish to you, well, it is. However, I've used Maven architectural stuff without Maven projects and it works fine. You can use something like Artifactory or Nexus as your release repository. Then use wget or curl to fetch the items from the repository, and use Maven's deploy:deploy-file goal to send the stuff over. You will need Maven (which is a Java process) to run deploy:deploy-file, but you don't need a Maven project, or even a Java project. The deploy:deploy-file goal doesn't even require a Maven pom.xml file. Think of it more like a command-line utility to send stuff to your release repository. (See the sketch after this list.)
The easy, but incorrect way: Hudson has a Copy Artifacts plugin that you can use to do this. The problem is that it's easy to set up, but hard to track afterwards. Plus, it makes you dependent upon a very specific tool. If you decide to move away from Hudson, you might not be able to duplicate this functionality.
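A sketch of the hard-but-correct way (the coordinates, repository URL, and file names are assumptions):

    # In job B or C: publish the build output to the release repository.
    # deploy:deploy-file needs no pom.xml; it is effectively a command-line
    # uploader. Credentials for the "releases" server come from settings.xml.
    VERSION="1.0.$BUILD_NUMBER"
    mvn deploy:deploy-file \
        -DgroupId=com.example -DartifactId=cms \
        -Dversion="$VERSION" -Dpackaging=zip -Dfile=cms.zip \
        -Durl=https://nexus.example.com/repository/releases \
        -DrepositoryId=releases

    # In job A (with VERSION passed in, e.g. as a build parameter): fetch
    # the artifact back, following the standard Maven repository layout.
    curl -fsS -O "https://nexus.example.com/repository/releases/com/example/cms/$VERSION/cms-$VERSION.zip"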
