Delivery Pipeline View for Jenkins Pipelines in Job DSL

How can I create a delivery pipeline view for a Jenkins pipeline using Job DSL?
All I could find was deliveryPipelineView, which isn't the same view. Any information on this would be useful.

Delivery Pipeline views for Jenkins pipelines do not seem to be supported by Job DSL at the moment (1.64).
The Job DSL class DeliveryPipelineView only supports traditional jobs with upstream/downstream dependencies. The reason is that the Delivery Pipeline plugin uses different views and data models under the hood to render pipeline views for upstream/downstream jobs and for Jenkins pipelines, largely because of the different nature of the underlying data models in Jenkins.
The traditional view, which Job DSL supports, generates a se.diabol.jenkins.pipeline.DeliveryPipelineView configuration, while views supporting Jenkins pipelines are modelled by the se.diabol.jenkins.workflow.WorkflowPipelineView class in the Delivery Pipeline plugin.
Current DeliveryPipelineView template in Job DSL: https://github.com/jenkinsci/job-dsl-plugin/blob/master/job-dsl-core/src/main/resources/javaposse/jobdsl/dsl/views/DeliveryPipelineView-template.xml#L2
If you append /config.xml to the URL of a view that is based on Jenkins pipelines, you will notice the XML is of the type se.diabol.jenkins.workflow.WorkflowPipelineView.
The solution at the moment is to handcraft the necessary config.xml and feed it to Jenkins yourself, for example as sketched below.
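A minimal sketch of that workaround, assuming you copy the XML of an existing WorkflowPipelineView (via /config.xml) into a local file and adjust it; the URL, view name, file name and credentials below are placeholders:

    // Posts a handcrafted view config.xml to Jenkins' createView endpoint.
    // The XML body is expected to have se.diabol.jenkins.workflow.WorkflowPipelineView
    // as its root element, e.g. copied from an existing view's /config.xml.
    def jenkinsUrl = 'http://localhost:8080'            // placeholder
    def viewName   = 'my-workflow-pipeline-view'        // placeholder
    def configXml  = new File('workflow-view.xml').text // handcrafted config.xml

    def conn = new URL("${jenkinsUrl}/createView?name=${URLEncoder.encode(viewName, 'UTF-8')}").openConnection()
    conn.requestMethod = 'POST'
    conn.doOutput = true
    conn.setRequestProperty('Content-Type', 'application/xml')
    conn.setRequestProperty('Authorization', 'Basic ' + 'user:api-token'.bytes.encodeBase64().toString())
    conn.outputStream.withWriter('UTF-8') { it << configXml }
    println "createView returned HTTP ${conn.responseCode}"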

For me, the deliveryPipelineView method creates exactly a delivery pipeline view.
Here is an example:
deliveryPipelineView('name-pipeline') {
    description('description-pipeline')
    pipelineInstances(1)
    showAggregatedPipeline()
    columns(1)
    sorting(Sorting.TITLE)
    updateInterval(2)
    enableStartBuild()
    enableManualTriggers()
    showAvatars()
    showChangeLog()
    pipelines {
        component('name', 'init-job')
    }
}
See the docs on GitHub for more details: https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-DSL-Commands

Related

Extract Freestyle Jobs and create pipeline Jobs in another Jenkins instance

I have two Jenkins instances (jenkins1 and jenkins2).
Jenkins1 contains freestyle jobs (all based on a specific template).
I need to extract all the jobs from jenkins1 and create them as pipeline jobs in jenkins2.
I know that simply copying the jobs doesn't work (because freestyle and pipeline are two different job types).
How can I do this efficiently with a groovy/shell script?
Every job has a config.xml in which all the job steps are listed as XML.
Parse that file, extract all the information, and then convert it into a pipeline job definition; a minimal sketch follows below the links.
I think groovy/shell scripts are a perfect way to achieve this, just use the config.xml as the source of information.
The resources below can help:
https://jenkinsworld20162017.sched.com/event/Bk3r/auto-convert-your-freestyle-jenkins-jobs-to-coded-pipeline?iframe=no&w=100%&sidebar=yes&bg=no
https://github.com/visualphoenix/jenkins-xml-to-jobdsl
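As an illustration only (not what the linked resources do), here is a hedged Groovy sketch. It assumes the freestyle jobs use nothing but "Execute shell" build steps and that their config.xml files have been copied into a local configs/ folder; it writes one scripted-pipeline Jenkinsfile per job:

    // For each exported config.xml, pull out the shell build steps and emit a
    // Jenkinsfile that replays them inside a single stage. Commands containing
    // ''' would need extra escaping; this sketch ignores that case.
    new File('configs').eachFileMatch(~/.*\.xml/) { file ->
        def config   = new XmlSlurper().parse(file)
        def commands = config.builders.'hudson.tasks.Shell'.command*.text()
        def steps    = commands.collect { cmd -> "        sh '''\n${cmd.trim()}\n'''" }.join('\n')

        new File("Jenkinsfile.${file.name - '.xml'}").text = """node {
        stage('Migrated from freestyle') {
    ${steps}
        }
    }
    """
        println "Converted ${file.name}"
    }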

Can jenkins Pipeline force storing everything in VCS?

I have used Job DSL. Now I have started a new project and am considering using Pipeline instead of Job DSL.
When using Job DSL there was a seed job, and everybody was forced to store every job in version control in order to not have it overwritten.
I cannot find a way to enforce the same thing with Pipeline.
I liked this approach, because in my opinion it really helps to store everything in VCS.
When using Pipeline, you need to create the job configuration manually, just like the Job DSL seed job.
You can use a mixed approach: use Job DSL to create the Pipeline jobs and keep the pipeline definition in a Jenkinsfile next to your project's code.
pipelineJob('example') {
    definition {
        cpsScm {
            scm {
                git('https://github.com/jenkinsci/job-dsl-plugin.git')
            }
            scriptPath('Jenkinsfile')
        }
    }
}
See https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob for details.
Also check out the advanced Pipeline job types like Multibranch and Organization Folder, which provide a dynamic job setup out of the box. See https://jenkins.io/doc/book/pipeline/multibranch/. These job types are also supported by Job DSL.

Can a single seed job process DSLs from multiple repos?

I recently managed to convert several manually-created jobs to DSL scripts (inlined into temporary 'seed' jobs), and was pleasantly surprised how straightforward it was. Now I'd like to get rid of the multiple seed jobs and try to structure things more cleanly.
To that end, I created a new jenkins-ci repo and committed all the Groovy DSL scripts to it. Then I created a job-generator Jenkins job that pulls from the jenkins-ci repo and has a single Process Job DSLs step. This step has the Look on Filesystem box ticked, with the DSL Scripts field set to jobs/*.groovy. With global push notifications already in place, this works more-or-less as intended: if I make a change to the jenkins-ci repo, the job-generator job automatically runs and regenerates all the jobs—awesome!
What I don't like about this solution is that it has poor locality of reference: the DSL scripts for the job live in a completely separate repository from the code. What I'd really like is to keep the job DSL scripts in each individual code repository, in a jenkins subfolder, and have a single seed job that processes them all. That way, changes to CI setup could be code-reviewed right alongside the code. To me, that just feels like an ideal setup.
Unfortunately, I don't have a clear idea about how to make this happen. If I could figure out a way to make the seed job watch multiple repos, such that a commit to any one of them would trigger it, perhaps I could inject another build step before the Process Job DSLs step and (somehow) script my way to victory, but... I'm unsure how to even get to that point. (I certainly don't want to do full clones of each repo in the generator job just to pull in the DSL scripts!)
I suspect I'm not the first person to wish they could put the Job DSL scripts alongside the code, though perhaps I'm over-estimating the benefits. Any advice on this topic would be much appreciated—thanks!
Unfortunately there is no direct way of solving this. Several feature requests have been opened (JENKINS-33275, JENKINS-37220), but AFAIK no one is working on any of them.
As a workaround you can use the Pipeline Multibranch Plugin and create a multibranch project for each of your repositories. You must then add a simple Jenkinsfile to each repo/branch and use the Jenkinsfile to execute your Job DSL scripts. See Use Job DSL in Pipeline scripts for details. This would require minimal coding, but I think each repo must be cloned for this to work because the Job DSL files must be available on the file system.
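A minimal Jenkinsfile along those lines could look like this (assuming the Job DSL scripts live in a jobs/ folder of each repository):

    // Jenkinsfile committed to each repo/branch: check out the repo so the DSL
    // files are on the file system, then run them with the jobDsl step that the
    // Job DSL plugin contributes to Pipeline.
    node {
        checkout scm
        jobDsl targets: 'jobs/**/*.groovy'
    }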
You can use Job DSL to create the multibranch jobs, see multibranchPipelineJob in the API viewer. This would be your "root" seed job.
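A sketch of such a root seed job, with placeholder repository names and URLs:

    // Creates one multibranch pipeline project per repository (names/URLs are placeholders).
    ['repo-a', 'repo-b'].each { repo ->
        multibranchPipelineJob(repo) {
            branchSources {
                git {
                    remote("https://example.com/git/${repo}.git")
                }
            }
        }
    }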
If your repos are hosted on GitHub, you can also check out the GitHub Organization Folder Plugin. With that plugin you only have to create one job per organization instead of multiple multibranch jobs.

How to redirect to a particular pipeline instance in a delivery pipeline view in Jenkins?

I am using the Delivery Pipeline plugin for grouping my jobs into stages.
Currently, there seems to be no mechanism to go directly to a particular older pipeline instance in the view.
I wanted to provide a mechanism (possibly a link) in the initial job (JobI) of the pipeline so that whenever I click on a particular build of JobI, it redirects me directly to that particular pipeline instance in the pipeline view.
I also tried to achieve the above behavior via some other pipeline plugins, e.g. Build Pipeline, but found no solution.
The idea is to make the view easy to use, so that a user does not have to scroll through all the instances to get to a particular one.
I want to replicate the behavior of the builds in the jobs for the versions of the view, or something similar.
Any help/suggestions would be great.
Try the Build Graph View Plugin to view the executions of downstream builds.
However, this plugin does not group jobs into stages; it is a plain flow graph.

Delivery Pipeline Plugin and its `columns` property does not work

What is the purpose of columns(int number) in the Delivery Pipeline Plugin? According to the Job DSL Plugin and its view reference documentation, it specifies the number of columns. I have tried changing this setting to different values and I do not see its effect.
My delivery pipeline has 3 stages with 3-4 jobs each. What should I expect?
EDIT Open issue JENKINS-29324
Jenkins v1.619
Delivery Pipeline Plugin v0.9.4
Build Pipeline Plugin v1.4.7
The columns(int number) method configures the "Number of Columns" option in the Delivery Pipeline View settings.
But changing the value does not seem to have any effect. You should consider reporting an issue for the Delivery Pipeline plugin in the Jenkins Issue Tracker.
The purpose is that if you have multiple components defined in the view, they can be shown in the number of columns that you configure. It will not have any effect if you only have one component defined.
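For illustration, a hedged Job DSL sketch with placeholder view and job names: with two components defined, columns(2) lays them out side by side, whereas with a single component the setting has no visible effect:

    // Two components rendered in two columns (view and job names are placeholders).
    deliveryPipelineView('multi-component-view') {
        columns(2)
        pipelines {
            component('Component A', 'component-a-build')
            component('Component B', 'component-b-build')
        }
    }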
