What is difference between Jenkins Shared Libraries and Jenkins pipeline templates - jenkins

I am trying to understand the exact difference between Jenkins Shared Libraries and Jenkins pipeline templates.
Shared libraries, as I understand them, are used for keeping common code and making it accessible to multiple pipelines.
What I don't understand is how they differ from Jenkins pipeline templates. Also, what is the use of templates created with the templating engine? Is it somehow similar to a shared library?

Maintainer of the Jenkins Templating Engine here.
Shared Libraries
Focused on reusing pipeline code. A Jenkinsfile is still required for each individual application and those individual pipelines have to import the libraries.
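For example, an individual application's Jenkinsfile that pulls in a shared library might look like this (a minimal sketch; the library name my-shared-lib and the step buildApp are placeholders, not something from this answer):

@Library('my-shared-lib') _   // load the shared library configured in Jenkins

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                buildApp()   // custom step defined in the library's vars/buildApp.groovy
            }
        }
    }
}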
Jenkins Templating Engine
A pipeline development framework that enables tool-agnostic pipeline templates.
Rather than creating individual Jenkinsfiles, you can create a centralized set of pipeline templates.
These templates invoke steps such as:
build()
unit_test()
deploy_to dev
This common template can be applied across teams, regardless of the technology they're using.
The build, unit_test, and deploy_to steps would come from libraries.
There may be multiple libraries that implement the build step, such as npm, gradle, maven, etc.
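To make that concrete, a JTE pipeline template is just a Groovy script that calls those steps (a minimal sketch; in a real setup the dev argument to deploy_to would be an application environment defined in the pipeline configuration):

// pipeline template shared across teams, regardless of tech stack
build()          // implemented by whichever library a team loads (npm, gradle, maven, ...)
unit_test()
deploy_to dev    // deploy to the dev application environment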
Rather than have each team define an entire pipeline, they can now just declare the tools that should be used to "hydrate" the template via a pipeline configuration file:
libraries {
    npm // contributes the build step
}
Feel free to check out this CDF Webinar: Pipeline Templating with the Jenkins Templating Engine.

Related

Is there some way to pull in Jenkins plugins from a shared pipeline library?

Jenkins publishes plugins which contain primarily, or only, additional pipeline steps.
For example, pipeline-utility-steps would be nice to have, because it has the tar step.
But getting additional plugins installed in the Jenkins instance comes with some difficulty: only one or two devs have access to do so, and they're reluctant to add more plugins, because more plugins installed means more work overall; they have to be tested for cross-compatibility, kept up to date, tracked for vulnerabilities, etc.
Jenkins lets us maintain our own shared pipeline libraries, which we are already using as a place to put all our own custom steps.
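For context, those custom steps are ordinary vars/*.groovy files in the library; for example, a hand-rolled stand-in for the plugin's tar step might look like this (a minimal sketch; the step name tarDir is made up):

// vars/tarDir.groovy in our shared pipeline library
def call(String dir, String archive) {
    // shell out to tar instead of using pipeline-utility-steps' tar step
    sh "tar -czf ${archive} -C ${dir} ."
}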
Is there some way we can make use of the shared pipeline library feature to also gain access to existing Jenkins plugins without installing them in Jenkins?
For example, can we directly import a Jenkins plugin into a pipeline somehow to use its steps?
Or can we do something in our shared pipeline library to make it also pull in additional vars from some other Jenkins plugin?
Similar questions:
"Is there a way to include a jar in a Jenkins Pipeline folder-level shared library?" asks about one specific way this might be possible.

How to run Jenkins pipeline for multi module projects

In my project I have multiple modules (each in its own directory, with its own Dockerfile and pom.xml). I have created a single Jenkins pipeline for the project, but every time the pipeline runs, every module gets built and deployed, which wastes a lot of time. Is there a way to build and deploy only a particular module through my Jenkins pipeline, so that it saves time?
As of now I manually comment out the modules I don't want to run in my Jenkins pipeline. Is there a way to avoid that and run only the module I need? I don't want to create a separate pipeline for each module, because there are a lot of modules.
I also don't want to use a multibranch pipeline, since all the module code lives in one branch, and splitting modules across branches would change my code structure.
Is there any plugin or mechanism through which I can build only the particular module I want?
Take a look at when and changeset on this page: https://www.jenkins.io/doc/book/pipeline/syntax/
You could then run a module's build stage only when changes to that module's code have been committed.
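A minimal sketch of that idea (the module directories module-a and module-b and the Maven commands are assumptions about your layout):

pipeline {
    agent any
    stages {
        stage('Build module-a') {
            when { changeset "module-a/**" }   // run only if files under module-a/ changed
            steps {
                sh 'mvn -f module-a/pom.xml clean deploy'
            }
        }
        stage('Build module-b') {
            when { changeset "module-b/**" }
            steps {
                sh 'mvn -f module-b/pom.xml clean deploy'
            }
        }
    }
}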

Reuse jenkins pipeline or groovy script?

We have Jenkins pipelines that are reused, and some pipelines that use the same functions.
Now my question is: what is the right approach to reuse them?
I use a shared library, but I don't know whether I should add plain Groovy scripts or full pipelines.
The Groovy scripts seem to be executed from the root instead of my Jenkins workspace, which is a big issue.
What is the right way to handle this?
Using a shared library is the right approach. You only have to add Groovy scripts to your library and use them in your pipeline.
Have a look at an example: Pipeline and the associated library.
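For illustration, a library step and the pipeline that calls it could look like this (a minimal sketch; the step name deployApp and its contents are assumptions):

// In the shared library: vars/deployApp.groovy
def call(Map args = [:]) {
    // sh and other pipeline steps called from a vars/ step run in the calling job's workspace
    sh "echo Deploying ${args.name} to ${args.env}"
}

// In a consuming Jenkinsfile
@Library('my-shared-lib') _
pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                deployApp(name: 'my-service', env: 'dev')
            }
        }
    }
}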

Jenkins with Shared jobs

I am working with Jenkins, and we have quite a few projects that all run the same tasks: set a few variables, change the version, restore packages, start SonarQube, build the solution, run unit/integration tests, stop SonarQube, etc. The only difference is something like {Solution_Name}; everything else is exactly the same.
My question is: is there a way to create one 'shared' job that does all that work, while the job for building each project passes its variables down to that shared worker job? What I'm looking for is the ability to not have to recreate all the tasks for every one of our services/components. It would be really nice if each service/component needed only two tasks: one to set the variables and another to run the shared job.
Is this possible?
Thanks in advance.
You could potentially benefit from looking into the pipeline-as-code feature.
https://jenkins.io/doc/book/pipeline/
Using this pattern, you define your build pipeline in a Groovy script rather than in the Jenkins UI. This script is then kept in the codebase of the project it builds, in a file called Jenkinsfile.
By checking this pipeline into a Git repository, you can create a minimal configuration on the Jenkins side and simply tell it to look at a specific repo and do what the pipeline says.
There are a few benefits to this approach if it works for your setup. The big one is that your build pipeline is fully versioned, just like the project it builds. The repository also becomes portable: it can be built on any Jenkins installation, across as many jobs as you like, as long as the pipeline plugins are installed.
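Combined with a shared library, the repeated tasks can be wrapped in a single step that takes the solution name as a parameter (a minimal sketch; the step name standardBuild and the stage contents are assumptions, not a prescribed layout):

// In a shared library: vars/standardBuild.groovy
def call(String solutionName) {
    node {
        stage('Build') { sh "dotnet build ${solutionName}" }
        stage('Test')  { sh "dotnet test ${solutionName}" }
        // restoring packages, starting/stopping SonarQube, etc. would slot in here
    }
}

// Each project's Jenkinsfile then shrinks to:
@Library('my-shared-lib') _
standardBuild('MySolution.sln')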

Jenkins Job DSL Plugin - Include another Jenkinsfile

I want to build a common Jenkinsfile for a couple of build jobs in different languages, and then add a specific Jenkinsfile that depends on some parameters.
For example, the common file should contain information about Docker Hub and the Nexus repository, which is always the same, while the specific file should contain language-specific build steps.
Is it possible to "include" another file?
Using the Pipeline Shared Groovy Libraries Plugin, it is possible to define your own DSL. This section of the plugin's manual explains how to do this.
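For instance, the common Docker Hub/Nexus settings could live in a library step, and each language-specific Jenkinsfile would only add its own build commands (a minimal sketch; the step name withCommonConfig and the endpoint values are assumptions):

// In the shared library: vars/withCommonConfig.groovy
def call(Closure body) {
    // common settings every job needs, e.g. registry and repository endpoints
    withEnv(['DOCKER_REGISTRY=registry.hub.docker.com',
             'NEXUS_URL=https://nexus.example.com']) {
        body()
    }
}

// A language-specific Jenkinsfile
@Library('my-shared-lib') _
node {
    withCommonConfig {
        sh 'mvn clean deploy'   // language-specific build step
    }
}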
