Jenkins: trigger another job for a specific branch

I have a pipeline with some jobs leading up to the deploy, like this:
(unit test) -> (integration test) -> (package) -> (deploy)
I'd like to run the (unit test) and (integration test) jobs on all branches, but run the (package) and (deploy) jobs only on changes to the master branch.
How can I do this? Is there a conditional trigger plugin?

If you have a fixed number of branches, it's easiest to have separate Jenkins jobs for each branch. If you use a single job to build multiple branches, it will be hard to tell which branch a particular build ran on when you look at the build history.
You can create the jobs for different branches by copying from an existing job via the UI, or you could look at the Job DSL plugin if you want to automate job creation and make it easy to create jobs for new branches.
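For illustration, here is a minimal Job DSL sketch (run from a seed job) that stamps out one job per branch; the branch list, repository URL, and build step are placeholders:

// Seed-job script: create one freestyle job per branch (list is illustrative).
['master', 'develop'].each { branchName ->
    job("unit-test-${branchName}") {
        scm {
            git {
                remote { url('https://example.com/repo.git') } // placeholder URL
                branch(branchName)
            }
        }
        steps {
            shell('make test') // placeholder build step
        }
    }
}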

UPDATE 2018:
The Flexible Publish plugin (on GitHub) can do this now.
You can specify run conditions (e.g., based on a regex match of GIT_BRANCH) and a flexible set of actions to take when the conditions match.
I am using this to trigger downstream deployment jobs only for the develop branch of one of our repositories.
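In Pipeline terms, the same idea can be sketched in a scripted Jenkinsfile: match GIT_BRANCH against a regex and trigger the downstream job only when it matches. The downstream job name here is hypothetical:

// Scripted-Pipeline analogue of the run condition described above.
node {
    checkout scm
    // env.GIT_BRANCH is set by the Git plugin, e.g. 'origin/develop'.
    if (env.GIT_BRANCH ==~ /.*\/develop/) {
        build job: 'deploy-downstream' // hypothetical downstream deployment job
    }
}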

Related

Testing changes in a generated Jenkins job

I have a series of Jenkins jobs, generated via Job DSL, that depend on one another.
For example:
Jobs:
Build package
Test package
Deploy package
I'd like to test a change in one of those jobs, e.g. the Test Package job. I have a branch and a PR in the seed repo.
My only means of testing this right now is to edit the job and build it manually in the UI, hoping it doesn't break anything; alternatively, I could merge the PR without testing it and risk breaking the production workflow.
Neither of those options is good. I'd like to isolate my changes so that I can test the job from my branch or PR. Should I switch to using a Pipeline, or is there a way to use Job DSL from a PR or branch (like develop)?

Make a Jenkinsfile that uses the Generic Webhook Trigger plugin and acts like a regular multibranch pipeline job in how it handles jobs from branches and PRs.

With a Jenkins multibranch pipeline job, Jenkins creates a separate job for each branch and each PR. How can I set up a multibranch pipeline job that uses a Jenkinsfile with the Generic Webhook Trigger plugin and have it behave the same way in how it processes each branch and each PR?
A common issue I keep coming across is that all the branches and PRs get processed under the same job, because the trigger criteria for that job allow all of them through.
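One way to keep branches and PRs from collapsing into a single job is to give each job a regex filter on the pushed ref. Below is a hedged sketch using the Generic Webhook Trigger plugin's declarative trigger; the token value and the GitHub-style '$.ref' payload path are assumptions:

pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [[key: 'ref', value: '$.ref']], // extract the pushed ref from the webhook payload
            token: 'my-repo-develop',                         // hypothetical per-job token
            regexpFilterText: '$ref',
            regexpFilterExpression: 'refs/heads/develop'      // only this ref triggers this job
        )
    }
    stages {
        stage('Build') {
            steps { sh 'make build' } // placeholder
        }
    }
}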

Use one Jenkinsfile or multiple Jenkinsfiles

We are currently using Windows / Jenkins 2.107.1 (no pipeline), and I am researching a move to pipeline. We have a nightly build job that fetches from repositories, then submits and waits on other jobs. I see 9 jobs running on the same master node (we only have a master) at the same time. I am not clear on whether we should have one Jenkinsfile or multiple Jenkinsfiles. It will not be a multibranch pipeline, as we do not create test branches and then merge back to a master. In the repository we have a product1.0 branch, a product2.0 branch, etc., and build only one branch (the latest one). While I do like the Blue Ocean editor, it is only for MultiBranch pipelines.
Do I combine all the jobs into one Jenkinsfile, or create multiple Jenkinsfiles for each of the existing jobs (JenkinsfileStart, JenkinsfileFetchCVS, JenkinsfileFetchGit, JenkinsfileNextJob, etc.) and have one call the other? Do I recreate all the old jobs as Jenkinsfiles, or as scripts executed by the one master Jenkinsfile? Do I do this in Declarative or Scripted syntax?
I have set up a Jenkins pipeline on a test VM, but am not clear on which way to go yet.
I am looking for directions and/or examples. Is there documentation on how to convert existing non-pipeline Jenkins systems?
I found this after making the initial post: https://wiki.jenkins.io/display/JENKINS/Convert+To+Pipeline+Plugin.
It helps a little in that it gives you some converted steps, but it cannot convert all the steps, and it leaves comments in the pipeline script ("//Unable to convert a build step referring to...please verify and convert manually if required."). There is an option "Recursively convert downstream jobs if any"; if you select it, it appears to add all the downstream jobs to the same pipeline script, and it really confuses the job parameters. There is also an option to "Commit JenkinsFile". I will play with this some more, but it is not the be-all and end-all of converting to pipeline, and I am still not sure whether I should have one or more scripts.
Added 07/26/19 -
Let’s see if I have my research to date correct…
A Declarative pipeline ("Pipeline script from SCM") is stored in a Jenkinsfile in the repository. Every time the Jenkins job is executed, a fetch from the repository is done (to get the latest version of the Jenkinsfile).
A Pipeline script is stored as part of the config.xml file in the Jenkins\Jobs folder (it is not stored in the repository, or in a separate Jenkinsfile in the jobs folder). There is a fetch from the repository only if the job requires it (you do not need a repository fetch to get the Pipeline script).
Besides our nightly product build, we also have other jobs. I could create a separate declarative Jenkinsfile for each of them (JenkinsfileA, JenkinsfileB, etc.) and store them in the repository as well (in the same branch as the main Jenkinsfile), but that would mean every one of those additional jobs would also need to do a repository fetch just to get its particular Jenkinsfile (basically fetching/cloning the repository branch for each job, and unnecessarily downloading multiple copies of the repository branch into the workspace of each job).
That does not make sense to me (unless my understanding to date is incorrect). Because the main product build does require a fetch every time it is run (to pick up any developer check-ins), I do not see a problem using a declarative Jenkinsfile for that job. The other jobs (if we do not leave them in the classic, non-pipeline format for the time being) will be Pipeline scripts.
Is there any way (or are there plans) to do a declarative pipeline without having to store it in the repository and fetch it every time (lessening the need to become a Groovy developer)? The Blue Ocean script editor appears to be an easier tool for creating pipeline scripts, but it is only for MultiBranch pipelines (which we don't do).
Serialization (restarting a job): is that only for when a node goes down, or can you restart a pipeline job (Declarative or Scripted) from any point if it fails?
I see that there are places to look to see which Jenkins plugins have been ported to pipeline, but is there anything that can be run against your existing classic jobs to determine up front which ones will have problems being converted to pipeline?
08/02/19...
Studying and playing with pipelines. I see that you can use Declarative syntax in the Pipeline script window, but it still stores it in the config.xml file. And I have played with combining Declarative and non-Declarative code in the same script.
I am trying to understand the Blue Ocean interface; the word "MultiBranch" is throwing me a little. We do not create test branches and then merge them back into the master. In the repository, we have branches for each release of the product, and we rarely go back to previous branches/versions. So, if I am working on branchV9 right now, do I also need a Jenkinsfile in the Master branch, or in any of the previous version branches?
I have been playing with Blue Ocean (which only does MultiBranch pipelines). I am on a Windows system, Jenkins 2.176.2, and have all the latest Blue Ocean plugins as of today (1.18.0). I am accessing a local Git repository (not GitHub), and am running into the following...
If I try to use "c:\GitRepos\Pipelines1.git", I get "not a valid name"...
Why does it do this?
If you have a single job that would be executed on multiple branches (with possibly optional stages, depending on the branch name, tag, or other criteria), then you could still use a multibranch pipeline.
In general, I would say the paradigm shift focuses mainly on converting the old jobs into stages in order to automate your build process. A semi- or fully-automated CI/CD flow could look like this:
A multibranch pipeline project (all branches) with the following stages (first Jenkinsfile; see the sketch after this list):
build (all branches)
unit tests (all branches), publishing a report
publish artifacts (master and release branches)
build and publish docker (master and release branches)
deploy to test (master and release branches)
run integration tests (master and release branches)
deploy to staging (master and release branches), possibly ending with a manual step to confirm the deployment went as expected
deploy to production (release branches)
A Pipeline job for nightly tests (a second Jenkinsfile). Ask yourself what the result means here: would a failure break the CI/CD flow?
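A minimal declarative sketch of the branch gating described above; stage bodies, branch-name patterns, and shell commands are illustrative:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps { sh 'make build' } // all branches
        }
        stage('Unit tests') {
            steps { sh 'make test' } // all branches; publish the report here
        }
        stage('Publish artifacts') {
            when { anyOf { branch 'master'; branch 'release/*' } }
            steps { sh 'make publish' } // placeholder
        }
        stage('Deploy to production') {
            when { branch 'release/*' }
            steps { sh 'make deploy-prod' } // placeholder
        }
    }
}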

Single Jenkins job with multiple deployments

How can we configure a Jenkins job for different environments? We are using a Git repository with different branches, like master, test, and devl, and these environments are deployed to different servers. I have configured the build-with-parameters option, so I can build any of the branches from this job using the radio buttons.
When I choose the devl branch, the job needs to take the latest code from the devl branch, build it, and then deploy it to the devl server. If we choose test, it needs to deploy to the test server. How do we configure this multiple deployment within the same job?
You can use the Pipeline Multibranch plugin: https://wiki.jenkins.io/display/JENKINS/Pipeline+Multibranch+Plugin.
This plugin will watch all the branches of your repo and build each branch based on its Jenkinsfile.
In the Jenkinsfile you can use a when expression:
stage('Deploy devl') {
    when {
        expression { env.BRANCH_NAME == 'devl' }
    }
    steps {
        sh 'deploy devl'
    }
}
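If you'd rather keep the single parameterized job described in the question, here is a hedged declarative sketch; the parameter name, branch choices, repository URL, server map, and deploy command are all illustrative:

// Map each branch to its target server (values are placeholders).
def servers = [devl: 'devl.example.com', test: 'test.example.com', master: 'prod.example.com']

pipeline {
    agent any
    parameters {
        choice(name: 'BRANCH', choices: ['devl', 'test', 'master'], description: 'Branch to build and deploy')
    }
    stages {
        stage('Checkout') {
            steps { git branch: params.BRANCH, url: 'https://example.com/repo.git' } // placeholder URL
        }
        stage('Build and deploy') {
            steps { sh "deploy ${servers[params.BRANCH]}" } // placeholder deploy command
        }
    }
}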
Why did you choose to do it in 1 job, and not different jobs?
What tool do you use for deployment?
What I can suggest, without getting too deep into the details (as I don't have them):
Have a job for each of the different environments, connect each job to the relevant Git repository and branch, and whenever the job is built it will perform an SCM checkout (you can also trigger it on changes using hooks) and deploy to the relevant environment.
If you answer #2, I may be able to suggest other options for managing this (possibly under the same job, if you require that for some reason).

Delete jobs that have not run recently from Jenkins

As part of our testing setup, we build the artifacts needed and then copy a template job, setting its name so it is recognizable:
Build artifact -> copy test template -> ending with a job for each test case
That means I end up with lots of jobs named Test_Client${BRANCHNAME}_Server${BRANCHNAME}.
I run through these jobs a lot while testing that branch, but as soon as it's merged it's not going to be touched again, which is why I would like to create a job of sorts that simply deletes the jobs that haven't been run for 14 days or so.
Does anyone know a way of doing this, and not just of cleaning out the workspace?
Thanks!
It may be a big change, but this is an ideal case for the Multibranch pipeline.
On one project, we have a master branch and version branches. Developers create short-lived feature branches and branches for other purposes. As branches are created and pushed to GitHub, the multibranch job picks them up and starts building them, so developers get quick feedback that their changes will pass the build. When they merge a branch to master or to a version branch and then delete it, the corresponding multibranch job goes away.
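For the cleanup the question actually asks about, here is a hedged sketch that could run from the Jenkins script console (or a scheduled system Groovy step): it deletes generated jobs whose last build is older than 14 days. The Test_Client name-prefix check is illustrative:

import jenkins.model.Jenkins

def cutoff = System.currentTimeMillis() - 14L * 24 * 60 * 60 * 1000

Jenkins.instance.getAllItems(hudson.model.Job).each { job ->
    // Only consider the generated test jobs (prefix is illustrative).
    if (!job.name.startsWith('Test_Client')) return
    def lastBuild = job.lastBuild
    // Delete jobs that were never built or not built within the last 14 days.
    if (lastBuild == null || lastBuild.getTimeInMillis() < cutoff) {
        println "Deleting ${job.fullName}"
        job.delete()
    }
}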
