Jenkins Job DSL Plugin - Include another Jenkinsfile - jenkins

I want to build a common Jenkinsfile for a couple of build jobs in different languages, and then add a specific Jenkinsfile that depends on some parameters.
For example: the common file should contain information about Docker Hub and Nexus Repository. It's always the same. And the specific file should contain language specific build steps.
Is it possible to "include" another file?

Using the Pipeline Shared Groovy Libraries plugin, it is possible to define your own DSL and share common pipeline code between jobs. This section of the plugin's manual explains how to do this.
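For illustration, a minimal sketch of that approach (library name, URLs, and credential IDs below are placeholders, and it assumes the Credentials Binding plugin is available): the common Docker Hub and Nexus settings live in a shared-library step, and each language-specific Jenkinsfile only passes in its own build command:
// vars/commonBuild.groovy in the shared library (all names here are illustrative)
def call(Map args = [:]) {
    // settings shared by every job; URLs and credential IDs are placeholders
    def nexusUrl = 'https://nexus.example.com/repository/maven-releases'
    def dockerHubCredentialsId = 'docker-hub-credentials'

    node {
        checkout scm
        // expose the common Nexus URL to whatever language-specific command the job passes in
        withEnv(["NEXUS_URL=${nexusUrl}"]) {
            sh args.buildCommand
        }
        withCredentials([usernamePassword(credentialsId: dockerHubCredentialsId,
                                          usernameVariable: 'DOCKER_USER',
                                          passwordVariable: 'DOCKER_PASS')]) {
            // single-quoted so the shell, not Groovy, expands the credential variables
            sh 'echo "$DOCKER_PASS" | docker login -u "$DOCKER_USER" --password-stdin'
            sh "docker push ${args.image}"
        }
    }
}

// Jenkinsfile of one Maven-based job: only the language-specific parts remain
@Library('my-shared-library') _
commonBuild(buildCommand: 'mvn -B clean deploy', image: 'myorg/my-service:latest')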

Related

Is there some way to pull in Jenkins plugins from a shared pipeline library?

Jenkins publishes plugins that contain primarily, or only, additional pipeline steps.
For example, pipeline-utility-steps would be nice to have, because it provides a tar step.
But getting additional plugins installed in the Jenkins instance comes with some difficulty - only one or two devs have access to do so, and they're reluctant to add more plugins, because more plugins installed means more work overall - they have to be tested for cross-compatibility, kept up to date, tracked for vulnerabilities, etc.
Jenkins lets us maintain our own shared pipeline libraries, which we are already using as a place to put all our own custom steps.
Is there some way we can make use of the shared pipeline library feature, to also gain access to existing Jenkins plugins without installing them in Jenkins?
For example, can we directly import a Jenkins plugin into a pipeline somehow to use its steps?
Or can we do something in our shared pipeline library to make it also pull in additional vars from some other Jenkins plugin?
Similar questions:
Is there a way to include a jar in a Jenkins Pipeline folder-level shared library? asks about one specific way this might be possible.

How to run Jenkins pipeline for multi module projects

In my project I have multiple modules (each module is its own directory with its own Dockerfile and pom.xml). I have created a single Jenkins pipeline for the project, but every time I run the pipeline, every module gets built and deployed, which wastes a lot of time. Is there any way to build and deploy only a particular module through my Jenkins pipeline, so that it saves time?
Right now I manually comment out the modules I don't want to run in my Jenkins pipeline. Is there a way to avoid commenting them out and instead run only the module I need? I don't want to create a separate Jenkins pipeline for each module, because there are lots of modules.
I don't want to use a multibranch pipeline, since all the module code lives in one single branch, and placing modules in different branches would change my code structure.
Is there any plugin or mechanism through which I can build only the particular module I want?
Take a look at the when directive and its changeset condition on this page: https://www.jenkins.io/doc/book/pipeline/syntax/
You could then run a module's build stages only when changes to that module's code have been committed.
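A rough sketch of that idea in a declarative pipeline (module directory names and build commands are placeholders); each stage runs only when the build's changes touch that module's directory:
pipeline {
    agent any
    stages {
        stage('Build module-a') {
            // skipped unless the commits in this build touched files under module-a/
            when { changeset 'module-a/**' }
            steps {
                dir('module-a') {
                    sh 'mvn -B clean deploy'
                }
            }
        }
        stage('Build module-b') {
            when { changeset 'module-b/**' }
            steps {
                dir('module-b') {
                    sh 'mvn -B clean deploy'
                }
            }
        }
    }
}
Note that changeset is evaluated against the changes Jenkins recorded for the build, so the very first build of a job (with no recorded changes) may skip every stage.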

What is difference between Jenkins Shared Libraries and Jenkins pipeline templates

I am trying to understand the exact difference between Jenkins Shared Libraries and Jenkins pipeline templates.
Shared libraries, as I understand them, are used for keeping common code and making it accessible to multiple pipelines.
What I can't work out is how that differs from a Jenkins pipeline template. Also, what is the use of Jenkins templates created with the templating engine? Are they somehow similar to a shared library?
Maintainer of the Jenkins Templating Engine here.
Shared Libraries
Focused on reusing pipeline code. A Jenkinsfile is still required for each individual application and those individual pipelines have to import the libraries.
Jenkins Templating Engine
A pipeline development framework that enables tool-agnostic pipeline templates.
Rather than creating individual Jenkinsfiles, you can create a centralized set of pipeline templates.
These templates invoke steps such as:
build()
unit_test()
deploy_to dev
This common template can be applied across teams, regardless of the technology they're using.
The build, unit_test, and deploy_to steps would come from libraries.
There may be multiple libraries that implement the build step, such as npm, gradle, maven, etc.
Rather than have each team define an entire pipeline, they can now just declare the tools that should be used to "hydrate" the template via a pipeline configuration file:
libraries {
    npm // contributes the build step
}
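As a rough sketch of what the npm library might contribute (file layout and commands are illustrative; check the JTE documentation for the exact library structure), its build step is just a Groovy file with a call method:
// build.groovy inside the npm library: supplies the template's build() step
void call() {
    stage('Build: npm') {
        node {
            sh 'npm ci'        // install dependencies from the lockfile
            sh 'npm run build' // run the project's build script
        }
    }
}
A gradle or maven library could implement build() the same way with its own commands, so switching tools becomes a one-line change in the pipeline configuration.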
Feel free to check out this CDF Webinar: Pipeline Templating with the Jenkins Templating Engine.

jenkins-as-code: purpose of jobs

I want to use Jenkins and store the configuration and the pipeline in my SCM (e.g. git). To do so, I created a directory, let's say "jobs", in the root of my project, where I will store .groovy job files written with the Job DSL plugin.
Should I do all the things in a single job file, like fetching the source code, testing it, maybe building Docker images if necessary, then deploying on AWS cloud? Or for each operation, should I create different jobs? If so, then how can I create a pipeline using these job files?
Take a look at the Jenkins Configuration as Code plugin. The following link may be helpful:
https://github.com/tomasbjerre/jenkins-configuration-as-code-sandbox
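For context, one common pattern (an illustrative sketch, not part of the answer above; the job name, repository URL, and branch are placeholders) is to keep each jobs/*.groovy file thin: it only declares a pipeline job and points at a Jenkinsfile in SCM, and the fetch/test/build/deploy stages live in that Jenkinsfile rather than in many separate jobs:
pipelineJob('my-project-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://github.com/example/my-project.git') }
                    branch('*/main')
                }
            }
            // the actual stages (checkout, test, docker build, deploy) live in this file
            scriptPath('Jenkinsfile')
        }
    }
}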

Adding Jenkins Pipelines on Build

Does anyone know if it's possible to add a Jenkins pipeline build into a Jenkins Docker image? For example, I may have a Jenkinsfile that defines my pipeline in Groovy, and would like to ADD it into my image when building from the Jenkins base image.
Something like:
FROM jenkins:latest
ADD ./jobs/Jenkinsfile-pipeline-example $JENKINS_HOME/${someplace}
And have that pipeline ready to go when I run it.
Thanks.
It's a lot cleaner to keep the Jenkinsfile in the project's repository instead. That way, as your repositories develop, you can change the build process without needing to rebuild and redeploy your Jenkins instance every time (less work, and less CI downtime). Also, having the Jenkinsfile in source control allows simpler decoupling.
If you have any questions about extending Jenkins on Docker further to handle building NodeJS, Ruby or something else, I go into how to do all that in an article.
You can create any job in Jenkins by passing in an XML file that describes the job. See https://support.cloudbees.com/hc/en-us/articles/220857567-How-to-create-a-job-using-the-REST-API-and-cURL
The way I've done this is to manually create the job I want in Jenkins, then append config.xml to the URL and it shows you the XML content needed to generate the pipeline job. Save that XML and you can deliver it to your newly deployed Jenkins instance.
I use a system similar to this to generate several hundred jobs based on our external build specifications.
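For reference, the REST call from the linked article boils down to posting that config.xml to the createItem endpoint (host, credentials, and job name below are placeholders; depending on the instance's CSRF settings a crumb header may also be required):
curl -X POST 'https://jenkins.example.com/createItem?name=my-pipeline' \
  --user admin:API_TOKEN \
  --header 'Content-Type: application/xml' \
  --data-binary @config.xml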
