I want to set up a Jenkins instance from code to
Create one initial pipeline
Create the Job DSL seed job and execute it to configure the jobs used in the pipeline
Configure Jenkins settings
Locales - set the locale to EN
Access control - lock down the system
I read many tutorials and questions and found the following ideas:
Using the Jenkins CLI
Some Job DSL interface for setting up a job, as described here at the bottom
Using the Jenkins (jenkinsci) API from within a Groovy file located in init.groovy.d - see below
For testing I use Docker and have the following sample already running.
Dockerfile
# https://github.com/jenkinsci/docker/blob/master/README.md
FROM jenkins/jenkins:lts
USER root
COPY groovy/* /usr/share/jenkins/ref/init.groovy.d/
USER jenkins
EXPOSE 8080
ENTRYPOINT ["/bin/tini", "--", "/usr/local/bin/jenkins.sh"]
groovy/jobs/test1-basic.groovy
#!/usr/bin/env groovy
import hudson.model.FreeStyleProject
import hudson.tasks.Shell
import jenkins.model.Jenkins

// Create a freestyle job with a single shell build step.
def job = Jenkins.instance.createProject(FreeStyleProject, 'test1-basic')
job.buildersList.add(new Shell('echo hello world'))
job.save()
The sample sadly lacks the
configuration part, as I do not know how to access the Locale plugin from within the Groovy code
Job DSL integration, i.e. how to read the seed job and execute it once (see the sketch below)
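To make the gap concrete, here is an untested sketch of the kind of init.groovy.d script I am after. The Locale plugin properties and the Job DSL ExecuteDslScripts setters are assumptions about those plugins' APIs (they may differ between plugin versions), and the script path is a placeholder:
#!/usr/bin/env groovy
import hudson.model.FreeStyleProject
import javaposse.jobdsl.plugin.ExecuteDslScripts
import jenkins.model.Jenkins

// 1) Locale: force English, ignoring the browser's Accept-Language header.
//    Assumes the Locale plugin exposes these properties.
def locale = Jenkins.instance.getPlugin('locale')
if (locale != null) {
    locale.systemLocale = 'en'
    locale.ignoreAcceptLanguage = true
    locale.save()
}

// 2) Seed job: a freestyle job with a Job DSL build step, triggered once
//    so that the pipeline jobs get generated on startup.
def seed = Jenkins.instance.createProject(FreeStyleProject, 'seed')
def dslStep = new ExecuteDslScripts()
dslStep.useScriptText = true
dslStep.scriptText = new File('/usr/share/jenkins/ref/jobdsl/pipeline.groovy').text
seed.buildersList.add(dslStep)
seed.save()
seed.scheduleBuild2(0)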
I really did intensive research and could not find much about this initial setup part. It seems many people do this manually, or the legacy way by copying XML files. Could you help me solve this and turn it into "best practice" documentation for others?
If you are familiar with a configuration management tool like Chef, you can use it to configure your Jenkins instance. There is a Jenkins community cookbook which can be utilized to write a wrapper to suit your needs.
The jenkins_job resource in this cookbook lets you create any type of job, be it pipeline, freestyle, etc.; you just need to supply the required job configuration. You can template this with variables, so the job will be created according to what you supplied. And not just jobs: with Chef you can configure almost everything you would otherwise do manually, using the resource corresponding to each task.
One of the best parts of using Chef is that you can keep the configuration in source control and update it as requirements change at any point in time.
If you are not planning to use a configuration management tool, you can check out the discussion here on how to achieve job creation with plugins.
Related
I am trying to write test cases to validate the Jenkinsfile, but the loadScript function is not working: it expects a file extension to be provided and throws a ResourceException for loadScript("Jenkinsfile").
Is there a better way to test the Jenkinsfile?
The problem is that there are not enough tools for pipeline development. Pipeline is a DSL, and that imposes restrictions.
There is an interesting approach using flags: for example, a test flag defined outside the pipeline (in the job). If test=true, the pipeline switches some "production" logic to "test" logic - it selects another agent, uploads artifacts to another repository, runs different commands, and so on.
But recently the Pipeline Unit Testing Framework appeared. It allows you to unit test Pipelines and Shared Libraries before running them in full. It provides a mock execution environment where real Pipeline steps are replaced with mock objects that you can use to check for expected behavior.
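As a rough sketch, a minimal JenkinsPipelineUnit test might look like the following (the class name and the mocked sh step are illustrative; runScript, available in newer framework versions, accepts extensionless files such as Jenkinsfile):
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class JenkinsfileTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Mock the steps the Jenkinsfile calls, e.g. sh
        helper.registerAllowedMethod('sh', [String]) { cmd -> println "mocked sh: $cmd" }
    }

    @Test
    void jenkinsfileRunsSuccessfully() {
        runScript('Jenkinsfile')   // parses and executes the pipeline with mocks
        printCallStack()           // dumps the recorded step invocations
        assertJobStatusSuccess()
    }
}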
Useful links:
Jenkins World 2017: JenkinsPipelineUnit: Test your Continuous Delivery Pipeline
Pipeline Development Tools
You can validate your Declarative Pipeline locally thanks to Jenkins' built-in features. This can be done using a Jenkins CLI command (declarative-linter) or by making an HTTP POST request with the appropriate parameters.
The command is the following:
curl -s -X POST -F "jenkinsfile=<YourJenkinsfile" \
https://user:password@jenkins.example.com/pipeline-model-converter/validate
For a practical example follow this guide:
https://pillsfromtheweb.blogspot.com/2020/10/validate-jenkinsfile.html
I want to create a DSL extension for my Jenkins plugin (built using Maven), just like in the example of the Docker plugin for Jenkins. I see that the Groovy file Docker.groovy is in: src/main/resources/org/jenkinsci/plugins/docker/workflow/Docker.groovy
Does this Groovy file have to be within org/jenkinsci/plugins/docker/workflow, or can I just put it directly inside resources? What is the difference?
Also, if I define my DSL extension within the Groovy file in this manner, is the DSL extension available to call implicitly in the Pipeline file?
In order to make a step available in the Pipeline DSL through your plugin, you need to define a subclass of Step that performs the needed task. This can be done completely in Java and is the preferred method for expanding the Pipeline DSL within a Jenkins plugin.
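As a rough illustration (written here in Groovy for brevity; the greet step name and the class shapes are hypothetical, following the usual workflow-step-api pattern):
import hudson.Extension
import hudson.model.TaskListener
import org.jenkinsci.plugins.workflow.steps.Step
import org.jenkinsci.plugins.workflow.steps.StepContext
import org.jenkinsci.plugins.workflow.steps.StepDescriptor
import org.jenkinsci.plugins.workflow.steps.StepExecution
import org.jenkinsci.plugins.workflow.steps.SynchronousStepExecution
import org.kohsuke.stapler.DataBoundConstructor

// Hypothetical step: greet 'world' prints a message to the build log.
class GreetStep extends Step {

    final String name

    @DataBoundConstructor
    GreetStep(String name) { this.name = name }

    @Override
    StepExecution start(StepContext context) {
        return new Execution(name, context)
    }

    // Synchronous execution is suitable for quick, non-blocking work.
    static class Execution extends SynchronousStepExecution<Void> {
        private final String name

        Execution(String name, StepContext context) {
            super(context)
            this.name = name
        }

        @Override
        protected Void run() throws Exception {
            context.get(TaskListener).logger.println("Hello, ${name}!")
            return null
        }
    }

    @Extension
    static class DescriptorImpl extends StepDescriptor {
        @Override
        String getFunctionName() { 'greet' } // the name used in Jenkinsfiles

        @Override
        Set<? extends Class<?>> getRequiredContext() { [TaskListener] as Set }
    }
}
With that in place, a Jenkinsfile could call greet 'world' like any built-in step.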
The Docker example you linked is unusual in this instance and doesn't define a typical Pipeline DSL step (the docker directive in Pipeline functions like a cross between an agent, a step, and a context block). Furthermore, it includes a Java class that loads the Groovy script dynamically and acts as the entry point into the directive.
Groovy can be used to expand the Pipeline DSL; however, this is done within the context of a shared library, which is meant more as a boilerplate-reducing tool to be used internally.
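For comparison, a shared-library extension is just a script under vars/; assuming a hypothetical vars/greet.groovy in the library:
// vars/greet.groovy
def call(String name) {
    echo "Hello, ${name}!"
}
Once the library is loaded, any Jenkinsfile can call greet 'world' implicitly, without the plugin machinery above.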
How can I add/edit new code to my Jenkins instance that would be accessible in a DSL script? Context follows.
I've inherited a Jenkins instance. Part of this inheritance includes spending the night in a haunted house writing some new automation in Groovy via the Job DSL plugin. Since I'm fearful of ruining our Jenkins instance, my first step is setting up a local development instance.
I'm having trouble running one of our existing DSL scripts on my local development instance -- my builds on the local server fail with the following in the Jenkins error console.
Processing DSL script jobs.groovy
ERROR: startup failed:
jobs.groovy: 1: unable to resolve class thecompanysname.jenkins.extensions
The script in question starts off like this.
import thecompanysname.jenkins.extensions

use(extensions) {
    def org = 'project-name'
    def project = 'test-jenkins-repo'
    def _email = 'foo@example.com'
So, as near as I can tell, it seems a predecessor has written some custom Groovy code that they're importing:
import thecompanysname.jenkins.extensions
What's not clear to me is
Where this code lives
How I can find it in our real Jenkins instance
How I can add it to my local instance
Specific answers are welcome, as are "here's how you can learn to fish" answers.
While there may be other ways to accomplish this, after a bit of poking around I discovered:
The Jenkins instance I've installed has an older version of the Job DSL plugin installed.
This version of the Job DSL plugin allowed you to set an additional classpath in the Process Job DSLs build step that pointed to additional jar files.
These jar files can give you access to additional classes in your Groovy scripts (i.e. thecompanysname.jenkins.extensions).
Unfortunately, more recent versions of the Job DSL plugin have removed this option, and it's not clear if it's possible to add it back. That, however, is another question.
Configure Global Security -> uncheck "Enable script security for Job DSL scripts".
Works for me.
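If you provision Jenkins from code as discussed above, the same checkbox can presumably be toggled from an init.groovy.d script. A hedged sketch, assuming the Job DSL plugin's GlobalJobDslSecurityConfiguration class and its useScriptSecurity flag (both may vary by plugin version):
import javaposse.jobdsl.plugin.GlobalJobDslSecurityConfiguration
import jenkins.model.GlobalConfiguration

// Assumption: this global configuration backs the
// "Enable script security for Job DSL scripts" checkbox.
def dslSecurity = GlobalConfiguration.all().get(GlobalJobDslSecurityConfiguration)
dslSecurity.useScriptSecurity = false
dslSecurity.save()
Note that disabling script security trades safety for convenience, so it is best reserved for locked-down instances.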
Every Jenkins pipeline does pretty much the same thing - at least in a small team with multiple projects:
Build (from the same source code repo) --> run tests --> publish artifacts (to the same artifact repo)
We are creating many new projects, and they all have a very similar lifecycle. Is it possible to create a template pipeline from which I can create concrete pipelines and make the necessary changes to the jobs?
There are a couple of approaches that I use that work well for me and my team.
Part 1) is to identify which orchestration plugins suit you best in Jenkins.
Plugins and approaches that worked well for me were:
a) Use http://ci.openstack.org/jenkins-job-builder/
It abstracts the job definitions and flows using a higher-level library. It allows you to define jobs in YAML, which is fairly simple, and it supports most of the common use cases (jobs, templates, flows).
These YAML files can then be consumed by the jenkins-job-builder Python CLI tool through an orchestration tool such as Ansible, Puppet, or Chef.
You can use YAML anchors to replace blocks that are common to multiple jobs, or even generate them from a template engine (ERB, Jinja2).
b) Use the workflow-plugin, https://github.com/jenkinsci/workflow-plugin
The workflow plugin allows you to have a single workflow in groovy, instead of a set of jobs that chain together.
"For example, to check out and build several repositories in parallel, each on its own slave:
parallel repos.collectEntries { repo -> [/* thread label */ repo, {
    node {
        dir('sources') { // switch to subdir
            git url: "https://github.com/user/${repo}"
            sh 'make all -Dtarget=../build'
        }
    }
}]}
"
If you build these workflow definitions from a template engine (ERB, Jinja2) and integrate them with a configuration management tool (again: Ansible, Chef, or Puppet), it becomes a lot easier to make small and large changes that affect one or all of the jobs.
For example, you can template that some jenkins boxes compile, publish and deploy the artifacts into a development environment, while others simply deploy the artifacts into a QA environment.
This can all be achieved from the same template, using if/then statements and macros in jinja2/erb.
Ex (an abstraction):
if ($environment == dev) then compile, publish, deploy($environment)
elif ($environment == qa) then deploy($environment)
Part 2) is to make sure all the Jenkins configuration for all the jobs and flows is kept in source control, and that a change to a job definition in source control is automatically propagated to the Jenkins server(s) (again via Ansible, Puppet, or Chef).
You can even have a Jenkins job that monitors its own repo of job definitions and automatically updates itself, as sketched below.
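A self-updating seed job along those lines might look like this in Job DSL (the repo URL, polling schedule, and script paths are placeholders):
// Seed job that polls its own job-definitions repo and regenerates
// all jobs from the Job DSL scripts found there.
job('seed') {
    scm {
        git('https://example.com/ci/job-definitions.git')
    }
    triggers {
        scm('H/15 * * * *') // poll the definitions repo every ~15 minutes
    }
    steps {
        dsl {
            external('jobs/**/*.groovy')
            removeAction('DELETE') // drop jobs whose definitions were removed
        }
    }
}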
When you achieve #1 and #2, you should be in a position where you can, with some confidence, allow all your team members to make changes to their jobs/projects, know who changed what and when, and roll back changes easily from change control when things go wrong.
It's pretty much about getting Jenkins to deploy code from a series of templated jobs that were themselves defined in code.
Another approach we've been following is managing jobs via Ansible templates. We started way before the jenkins_job module became available, and we use the url module to talk to Jenkins, but the overall approach is the same:
j2 templates are created for the different jobs
a loop goes over the project definitions and updates jobs and views in Jenkins
by default, the common definition is used, and only a very minimal description is required:
default_project:
  jobs:
    Build:
      template: build.xml.j2
    Release: ...
projects:
  DefaultProject1:
    properties:
      repository: git://../..
  CustomProject2:
    properties:
      a: b
      c: d
    jobs:
      Custom-Build:
        template: custom.j2
I am using Jenkins to run a bunch of my scripts on a regular basis, and the Execute shell section of my job looks like:
runner.py my_script A.config run
The problem is that I have a bunch of configs: A.config, B.config, ..., S.config. I am currently creating a separate job for each config and so have a bunch of jobs. Is there any plugin you would recommend so that I can just have "runner.py my_script" and pass in A.config or B.config... using a simpler option than having a bunch of jobs like I have right now?
You can achieve this with a multi-configuration job in Jenkins. Select it as the radio button entry when naming the job.
Here is more information: Jenkins and multi-configuration (matrix) jobs
It is a standard Jenkins feature, so it doesn't need a plugin, and you can add your own labels.
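If you define jobs from code as discussed earlier on this page, the same idea can be expressed in Job DSL. A sketch with a hypothetical CONFIG axis (the job name and config values are placeholders):
// One matrix job replaces the per-config jobs: Jenkins runs the
// shell step once for each value of the CONFIG axis.
matrixJob('run-my-script') {
    axes {
        text('CONFIG', 'A.config', 'B.config', 'S.config')
    }
    steps {
        shell('runner.py my_script $CONFIG run')
    }
}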