Problems with Jenkins Build Pipeline and parallel jobs

Background
I am using Jenkins with the Build Pipeline plugin to build some fairly complicated projects that require multiple compilation steps:
1. Build the source RPM.
2. Build the binary RPM (this is performed twice, once for each platform).
3. Deploy to the YUM repository.
My strategy for solving build requirements involves splitting the common work into parameterized jobs that can be reused across projects and branches, with each job representing one stage in the pipeline. Each stage is triggered with parameters, and build artifacts are passed along to the next job in the pipeline. However, I'm having some trouble with this strategy, and could really use some tips on how to go about solving this problem in the most elegant and flexible way possible.
To be more specific, there are two common libraries, which are shared by other projects (but not all projects). The libraries are built differently from the dependent projects, but it should not be necessary to specify in Jenkins what the dependent projects are.
There are multiple branches, the master branch (rebuilt nightly), the develop branch (polled for changes), feature branches (also polled), and release branches (polled, but built for release). The branches are built the same way across multiple projects.
We create multiple repositories every month, and whilst it is feasible to expect a little setup for a new project, generally we want this to be as simple and automated as possible.
The Problems
I have many projects with multiple branches, and I do not wish to build all branches or even all projects in the same way. Because most of the build steps are similar, I can turn these common steps into parameterized build jobs and get each job to trigger the next in the chain, passing parameters and build artifacts along the chain. However, this falls apart if one of the steps needs to be skipped, because I don't know of a way to conditionally skip a build step. This implies I would need to copy the build jobs so that I can customise them for each pipeline, resulting in a very large number of build jobs. I could use a combination of plugins to create a job generator (e.g. Build Flow DSL, Job DSL, etc.), and hide as much as possible from the users, but what's the most elegant Jenkins solution to this? Are there any plugins or examples that I might have missed? What's your experience of doing this?
Because step 2 can be split into two jobs that can be run in parallel, this introduces a complexity that is causing me problems with my pipeline. My first attempt would trigger a parameterized build job twice with different parameters, and then join the jobs afterwards using the join plugin, but it was beginning to look like it would be complicated to copy in the build artifacts from the two upstream jobs. This is significant, because I need the build artifacts from both jobs for stage 3. What's the most elegant solution to join parallel jobs and copy artifacts from them all? Are there any examples that I might have missed?
I need to combine test results generated from both of the jobs in stage 2, and copy them to the job that triggers the build. What's the best way to handle this?
I'm happy to read articles, presentations, technical articles, reference documentation, write scripts and whatever else necessary to make this work nicely, but I'm not a Jenkins expert. If anyone can give me some advice on these 3 problems then that would be helpful. Additionally, I would appreciate any constructive advice on how to get the best out of pipeline CI builds in Jenkins, if relevant.

For the first point, I wrote the Job Generator plugin to address exactly this use case. You can find more info on the wiki page of Job Generator.
There is also a plugin of the same type with a different approach (a job generator as a build step), called Jobcopy Builder.
The other approaches you mentioned require some kind of DSL and can be a good choice too.
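These answers predate Pipeline-as-code; in modern Jenkins the whole chain can live in a single Jenkinsfile, where a `when` condition replaces copied jobs for conditional skipping, and `stash`/`unstash` joins parallel branches and carries their artifacts forward. Below is a minimal sketch of the three stages above, under assumptions: the agent labels, shell scripts, stash names, and the `DEPLOY` parameter are all illustrative, not from the original question.

```groovy
// Sketch only: labels, scripts, and parameter names are hypothetical.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'DEPLOY', defaultValue: true,
                     description: 'Skip the YUM deployment stage when false')
    }
    stages {
        stage('Build source RPM') {
            steps {
                sh 'make srpm'
                stash name: 'srpm', includes: '*.src.rpm'
            }
        }
        stage('Build binary RPMs') {
            parallel {
                stage('Platform A') {
                    agent { label 'platform-a' }
                    steps {
                        unstash 'srpm'
                        sh 'rpmbuild --rebuild *.src.rpm'
                        stash name: 'rpm-a', includes: 'RPMS/**'
                    }
                }
                stage('Platform B') {
                    agent { label 'platform-b' }
                    steps {
                        unstash 'srpm'
                        sh 'rpmbuild --rebuild *.src.rpm'
                        stash name: 'rpm-b', includes: 'RPMS/**'
                    }
                }
            }
        }
        stage('Deploy to YUM repo') {
            when { expression { params.DEPLOY } }  // conditional skip
            steps {
                // join point: collect artifacts from both parallel branches
                unstash 'rpm-a'
                unstash 'rpm-b'
                sh './publish-to-yum.sh'
            }
        }
    }
}
```

This also addresses the artifact-joining problem: after the `parallel` block completes, both stashes are available to the deploy stage, with no copy-artifact plumbing between separate jobs.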

Related

Add Promotion after jenkins pipeline

I know Jenkins free-style jobs well, but I'm new to pipelines.
I'm reworking some validation pipelines to add a manual step/pause before the final publication/diffusion step.
I've read about the input directive, but it's not what we want.
Our need is to be able to run the same pipeline several (10) times with different parameters, afterwards manually check some or all of the runs and possibly update some reports, and then go on with downstream operations (basically officially publish the reports, for the QA team or others).
I used to do such things easily with free-style jobs and manual promotions (restricted to authorized users). It's safe because all the build artifacts are saved at the end of the free-style job and can be post-processed later.
Is there a way to achieve such a thing in a pipeline (adding properties / promotions)?
A key point for us is that the same pipeline job should be run several times, so each time the artifacts should be stored in a different location/workspace.
I used the input directive, but it expects an interactive input.
If I launch the same job again, I'm afraid it will reuse the same workspace.
I added a free-style job, triggered after the pipeline job; in this job, I retrieve the artifacts from the first job. The promotion does the job, but it's quite an ugly implementation.
I tried to add a promotion using properties + promotions, but I'm not sure I can do the same thing as a manual promotion in a free-style job. That's what I would like to do, to keep everything in the pipeline.
I have searched for some time, but I did not find much about it.
I have read some things like
https://issues.jenkins.io/browse/JENKINS-36089
or
https://www.jenkins.io/projects/gsoc/2019/artifact-promotion-plugin-for-jenkins-pipeline/
which say it's not possible.
But I'm sure some people have the same need, so some solutions should exist.
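The asker rules out `input`, but for completeness: a stage-level `input` directive restricted with `submitter`, combined with `archiveArtifacts` before the pause, is the closest native pipeline equivalent of a manual promotion. Since every run gets its own build number, artifacts archived by repeated runs do not overwrite each other. A sketch under assumptions - the stage names, scripts, and the `qa-leads` submitter group are illustrative:

```groovy
// Sketch only: scripts and the submitter group are hypothetical.
pipeline {
    agent any
    stages {
        stage('Validate') {
            steps {
                sh './run-validation.sh'
                // archive with the build so results survive the pause
                // and remain distinct per build number
                archiveArtifacts artifacts: 'reports/**'
            }
        }
        stage('Publish') {
            input {
                message 'Publish the validated reports?'
                submitter 'qa-leads'   // only these users may approve
            }
            steps {
                sh './publish-reports.sh'
            }
        }
    }
}
```

The main gap versus free-style promotions is that the approval must happen while the build is still running, whereas a promotion can be granted long after the build has finished - which is likely why the asker found `input` insufficient.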

Jenkins pipeline Continuous Integration

I have 20-30 projects that I am working on, each with its own git repo, and each repo has several branches that do not depend on the other projects. I am wondering whether there's some way to come up with a Jenkins pipeline to accommodate all the projects in a CI/CD ecosystem, or whether I need to create a separate pipeline for each repo.
Is there a way I can use one Jenkinsfile for all these projects?
How do you share data between pipelines if module 3 is dependent on data coming from modules 1 & 2?
Do I need to create 30 hooks/tokens if I have 30 projects?
I was able to create dependent build triggers between the first three, such that if A & B build then C will build, using the SCM polling option and build triggers.
Thanks in advance. Appreciate any help, feedback or suggestions.
You can use shared libraries in Jenkins pipelines. It is a fairly involved process that requires writing the libraries in Groovy.
As Pipeline is adopted for more and more projects in an organization, common patterns are likely to emerge. Oftentimes it is useful to share parts of Pipelines between various projects to reduce redundancies and keep code "DRY".
Pipeline has support for creating "Shared Libraries" which can be defined in external source control repositories and loaded into existing Pipelines.
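A shared library also answers the one-Jenkinsfile-for-all-repos question: put the common pipeline in a custom step and keep each repo's Jenkinsfile down to one call. A sketch under assumptions - the library name `my-shared-lib`, the step name `standardBuild`, and the default commands are all hypothetical:

```groovy
// Hypothetical custom step: vars/standardBuild.groovy in a library repo
// configured under "Global Pipeline Libraries" in Jenkins.
def call(Map config = [:]) {
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps { sh config.buildCmd ?: 'make' }
            }
            stage('Test') {
                steps { sh config.testCmd ?: 'make test' }
            }
        }
    }
}
```

Each project's Jenkinsfile then shrinks to:

```groovy
@Library('my-shared-lib') _
standardBuild(buildCmd: 'mvn package', testCmd: 'mvn test')
```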

Jenkins CI workflow with separate build and automated test both in source control

I am trying to improve our continuous integration process using Jenkins and our source control system (currently svn, but git soon).
Maybe I am thinking about this overly complicated, or maybe I have not yet seen the right hints.
The process I envisioned has three steps and associated roles:
one or more developers would do their job and ultimately submit the code changes for the actual software ("main software") as well as unit tests into source control (git, or something else). Jenkins shall build the software, run unit tests and perhaps some other steps (e.g. static code analysis). If none of this fails, the work of the developers is done. As part of the build, the build number is baked into the main software itself as part of the version number.
one or more test engineers will subsequently pick up the build and perform tests. Some of them may be manual, but most are desired to be automated/scripted tests. These shall ultimately be submitted into source control as well and be executed through the build server. However, this shall not trigger a new build of the main software (since nothing in it has changed). If none of this fails, the test engineers are done. Note that our automated tests currently take several hours to complete.
As a last step, a project manager authorizes release of the software, which executes whatever delivery/deployment steps are needed. Also, the source of the main software, unit tests, and automated test scripts, the jenkins build script - and ideally all build artifacts ("binaries") - are archived (tagged) in the source control system.
Ideally, developers are able to also manually trigger execution of the automated tests to "preview" the outcome of their build.
I have been unable to figure out how to do this with Jenkins and Git - or any other source control system.
Jenkins pipelines seem to assume that all steps are carried out in sequence automatically. They also seem to assume that committing code into source control starts the process at the beginning (which I believe should not be true if the commit was "merely" automated test scripts). Triggering an unnecessary build of the main software really hurts our process, as it basically invalidates all manual testing and documentation, since it results in a new build number baked into the software.
If my approach is so uncommon, please direct me how to do this correctly. Otherwise I would appreciate pointers how to get this done (conceptually).
I will try to reply with some points. This is indeed a conceptual approach; there are a lot of details and different approaches too, and this is only one.
You need git :)
You need to set up a git branching strategy which will allow multiple developers to work simultaneously, pushing code and validating it against the static code analysis. I would suggest that you start with Git Flow; it is widely used and can be adapted to whatever reality you have - you do not need to use it in its pure state, so give some thought to how to adapt it. Fundamentally, it will allow for each feature to be tested. Then, each developer can merge it on the develop branch - from this point on, you have your features validated and you can start to deploy and test.
Have a look at multibranch pipelines. This will allow you to test the several feature branches that you might have and to have different flows for the other branches - develop, release and master - depending on your deployment needs. So, when you have a merge on develop branch, you can trigger testing or just use it to run static code analysis.
To overcome the problem that you mention in your second point, there are ways to read your change sets in the pipeline, and in case the changes only touch testing scripts, you should not build your SW - check here how to read changes, and here an example of how to read changes and also prevent your pipeline from building all the stages according to the changes being pushed to git.
In case you still have manual testing taking place, pipelines are pausable, which means that you can pause the pipeline asking for approval to proceed. Before approving, testers do whatever they have to, and whenever they are ready, they just approve the build to proceed to the next steps.
Regarding deployments authorization, it is done the same way that I mention on the last point, with the approvals, but in this case, you can specify which users/roles are allowed to approve that step.
Whatever you need to keep from your builds, Jenkins has an archive artifacts utility. Let me just note that ideally you would look into a proper artifact repository such as Nexus.
To trigger a set of tests manually... you can have a manually triggered job in Jenkins, apart from your CI/CD pipeline, that only executes the automated tests. You can even trigger this same job as one pipeline stage - how to trigger other jobs
Lastly let me say that the branching strategy is the starting point.
Think about your big picture: what SDLC flows you need to have, and set up those flows in your multibranch pipeline. Different git branches will facilitate whatever flows you need within the same Jenkinsfile - your pipeline definition. It really depends on how many environments you have to deploy to and what kind of steps you need.
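The change-set idea from the earlier point can be sketched with Declarative Pipeline's `changeset` condition: run the main build only when source files changed, so a commit that only touches test scripts leaves the build stage skipped and the build number untouched. The `src/` path glob and the scripts are assumptions about the repo layout (note that `changeset` matches nothing on a branch's very first build):

```groovy
// Sketch only: path globs and scripts are hypothetical.
stage('Build main software') {
    // runs only when something under src/ changed; a commit touching
    // only test scripts leaves this stage skipped
    when { changeset 'src/**' }
    steps {
        sh 'make all'
    }
}
stage('Automated tests') {
    steps {
        sh './run-tests.sh'
    }
}
```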

How to trigger jenkins multi-branch pipeline build from a gitlab merge request from a fork?

Folks,
I'm actually having loads of fun since switching to multibranch pipelines in Jenkins which i use in combination with GitLab.
But something I still cannot wrap my head around is how to build merge requests that originate from a fork - the ones coming from the same remote trigger a build, but not the ones from my fork!
I'd be really happy to hear any idea about this.
Thanks a lot community !
Maybe you could try a different approach:
A multibranch pipeline is for continuous build (with only compilation and testing). This is basically quick feedback on your current work.
Building an MR is another business altogether. It is supposed to provide all the necessary elements for the reviewer to determine whether an MR can be merged into the master branch or not. Therefore, it could imply code quality analysis, security analysis, gating, dashboards...
With this respect, it should be separated in different jobs.
Having two separate jobs for the two functionalities is not only cleaner but, I believe, would also solve your trigger-from-fork problem.

Jenkins: a heavily branched chain of build jobs

We would like to set up Continuous Integration and Continuous Deployment processes on the base of Jenkins ecosystem. Currently we're trying to put together all the Jenkins build jobs we have (from sources to several endpoint processes launched on the testing server). There are three kinds of build/deployment processes in our case:
1. Building deb packages from C++ projects (some of them are dependents, others are dependencies);
2. Building images from Docker containers;
3. Launching some processes on the endpoint.
As you can see, we are faced with a heavily branched chain of jobs triggered by each other. Every update of any of the upstream projects must go through the chain of jobs and trigger the final job (process I). So it would be nice to use some kind of Jenkins plugin that will:
Control such a complicated structure of jobs (I tried the Build Pipeline Plugin and got the impression that this tool is only suitable for a "linear" job chain);
Provide clean way of passing the parameters between job environments.
As #slav mentioned, the Workflow plugin should be able to handle this kind of complex control flow, including parallel handling of subtasks, simple handling of variables throughout the process (would just be Groovy local variables), and Docker support.
You could of course arrange this whole process in a single build.gradle (or Makefile for that matter). That would be suitable if you did not mind running all the steps on the same Jenkins slave, and did not otherwise need to interact with or report to Jenkins in any particular way in the middle of the build.
Well, for passing parameters, you should be using Parameterized Trigger Plugin.
For a more asynchronous passing of parameters, you can use the EnvInject plugin (it's extremely useful and flexible for all sorts of things, and considering your complexity, it might prove useful regardless of whether you use it for passing parameters or not).
As for control, look into the Workflow plugin. It allows you to write the whole execution flow as its own Groovy script, with fine-grained control. More links:
Official - https://jenkins-ci.org/content/workflow-plugin-10
Tutorial - https://github.com/jenkinsci/workflow-plugin/blob/c15589f/TUTORIAL.md#pausing-flyweight-vs-heavyweight-executors
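A sketch of what such a branched flow looks like in the Workflow plugin's successor, scripted Pipeline: plain Groovy variables replace parameter-passing plugins, and `parallel` handles the branching. Node labels, image names, and scripts are illustrative assumptions:

```groovy
// Sketch only: labels, directory layout, and scripts are hypothetical.
node('build') {
    checkout scm
    // a plain Groovy variable carries state between stages,
    // replacing the Parameterized Trigger / EnvInject plumbing
    def version = sh(script: 'git describe --tags', returnStdout: true).trim()

    stage('Build deb packages') {
        sh "make deb VERSION=${version}"
    }
    stage('Build Docker images') {
        parallel(
            app: { sh "docker build -t app:${version} app/" },
            db:  { sh "docker build -t db:${version} db/" }
        )
    }
    stage('Launch on test server') {
        sh "./deploy.sh ${version}"
    }
}
```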