Jenkins: Automated job configuration using Seed Jobs and Jenkinsfile

I am trying to understand how to best deploy an instance of Jenkins, complete with plugins, users and jobs using Chef. I am currently using the Chef Jenkins Supermarket cookbook.
I am attempting to achieve automated deployment of our pipelines as part of the project. From what I have gathered, the best way to go about this is to have Chef configure a seed job during Jenkins' initial setup and configuration.
The seed job should specify, among other things, the git repository from which to find and use a Jenkinsfile for a given job. I've found this resource by Daniel Spilker to be helpful in explaining seed jobs.
So the seed Jenkins job would be run, generating the Jenkins job we have just scripted with it (in this case, the seed job would pull the Jenkinsfile from source control and configure a new Jenkins job, our pipeline, with the details from the Jenkinsfile).
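For illustration, a seed job's Job DSL script along these lines could generate such a pipeline job (the job name and repository URL are hypothetical):

// Job DSL script executed by the seed job (names and URLs are examples)
pipelineJob('my-app-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://bitbucket.org/example/my-app.git')
                    }
                    branch('*/master')
                }
            }
            // path to the Jenkinsfile within the repository
            scriptPath('Jenkinsfile')
        }
    }
}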
Am I understanding this correctly: is this the proper way not only to automate Jenkins job configuration, but also to keep the configuration of any given job up to date in the event that it changes?
If we used a seed job to setup our pipeline, what are some possible solutions to having the initial seed job run automatically once Jenkins is fully configured by Chef?
As for job configuration changes that may occur over time, would we need to set up the seed job to poll source control periodically in case the Jenkinsfile has been modified? (It may be helpful to note that we are currently using BitBucket for source control.)
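Worth noting: once the seed job has generated the pipeline job, the pipeline job itself checks out the Jenkinsfile from source control on every run, so Jenkinsfile edits are picked up automatically; the seed job only needs to re-run when the job definitions themselves change. A declarative pipeline can also poll BitBucket so that changes trigger builds. A minimal sketch (schedule and build script are illustrative):

// Jenkinsfile: pollSCM makes Jenkins check BitBucket on a schedule
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // poll roughly every five minutes
    }
    stages {
        stage('Build') {
            steps {
                sh './build.sh'   // hypothetical build script
            }
        }
    }
}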
Just getting started with pipeline as code. Thanks to everybody in advance for their patience and guidance.

I've mentioned this a bit in your other questions, but the least painful approach is to treat Jenkins as a database, not a web service. Have Chef do the basic install, but then configure the initial bits by hand. For disaster recovery, rely on your backups rather than Chef.

Related

Auto create Jenkins jobs and update configs from source code repo

I wanted to set up a single source of truth for my Jenkins instances running in different DCs (data centers), so I converted all my Jenkins jobs to pipeline jobs, with the Jenkinsfile taken from a GitHub repo.
I'm looking for a method to automatically create/delete/update Jenkins jobs across the multiple Jenkins instances running in the different DCs. That is, I am looking to auto create/delete jobs in all Jenkins instances upon updating the job configurations in the GitHub repository.
Any recommendations or help for this workflow would be appreciated.
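One possible approach (a sketch, not from the original thread): run a seed pipeline on each Jenkins instance that checks out the GitHub repo and applies the Job DSL scripts with the jobDsl step, whose removedJobAction option deletes jobs whose definitions were removed from the repo. The paths and schedule below are assumptions:

// Jenkinsfile for a seed job deployed to every Jenkins instance
pipeline {
    agent any
    triggers {
        pollSCM('H/15 * * * *')   // or use a GitHub webhook instead of polling
    }
    stages {
        stage('Sync jobs') {
            steps {
                checkout scm
                // create/update jobs from DSL scripts; delete jobs removed from the repo
                jobDsl targets: 'jobs/**/*.groovy',
                       removedJobAction: 'DELETE'
            }
        }
    }
}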

Why do declarative pipelines need to run on the master if there are build executors available?

I'm using the recent Jenkins version 2.286, and since this update there is a security hint: "You should set up distributed builds. Building on the controller node can be a security issue. See the documentation."
But I'm already doing so with three Jenkins nodes and I also fully understand the security implications.
The problem here is that there are two jobs that need to run on the master, since they are the jobs that deploy those Jenkins nodes. That means I cannot reduce the build executors to 0.
I've also tried using the Job Restrictions plugin to restrict which jobs can run on the master. The problem here is that all my jobs wait for the master queue to have a free slot available. I wonder why, because they are all declarative pipelines and define something like:
agent {
    label 'some-different-node-label'
}
This means they aren't really executed on the master node.
Questions here are:
Is it intentional that all jobs require the master node before switching to the agent?
Is there any configuration option to change that?
Is there a way to execute the deploy jobs on the master, even if there aren't any executors defined (to bypass that behavior)?
Thanks.
With declarative pipelines, the lightweight code checkout is done on the Master node to get a Jenkinsfile for that job. While this doesn't use an executor on the Master, perhaps the Job Restrictions plugin is still blocking it (I haven't used it before, so I cannot comment).
Also, certain pipeline actions are delegated back to the Master node as well (e.g. the withAWSParameterStore step).
If you look at the console output for a declarative pipeline job, you will see lots of output (mainly around library checkouts or git checkouts) before you see the start of the pipeline ([Pipeline] Start of Pipeline). All of that is done on the Master.
Unfortunately this cannot be changed, as the Master needs to do this work to find out which agent type to delegate the job to.
Depending on how you are running your agents, you could use something like the EC2 Cloud Plugin to generate your agent nodes, which wouldn't require a job to do it.

Run a Jenkins job on another Jenkins instance from a Jenkins job

I want to create a Jenkins job that starts other Jenkins jobs. That would be quite easy, because the Jenkins Template Project Plugin allows us to create a build step of type "use builders from another project". However, what makes my situation harder is that I have to start Jenkins jobs on other machines. Is there any standard way to do that?
In case you only want to trigger a new build of the job, you have plenty of ways to accomplish it.
You can use the remote access API and trigger a request to build the target job from the source job:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
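For instance, a pipeline step could call the remote access API with curl (the server URL, job name, and credentials ID below are placeholders):

// minimal sketch: trigger a build on another Jenkins over the remote access API
withCredentials([usernamePassword(credentialsId: 'remote-jenkins-creds',
                                  usernameVariable: 'USER',
                                  passwordVariable: 'TOKEN')]) {
    sh 'curl -X POST "https://remote-jenkins.example.com/job/target-job/build" --user "$USER:$TOKEN"'
}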
Or you can use https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Remote+Trigger+Plugin
which is handy for handling server details and other stuff. You should ensure SSH keys are shared by both servers.

Save all Jenkins jobs to a repository

I have 15 Jenkins jobs configured in order to implement a specific flow. I am improving and editing these jobs as time goes by.
Is there a way to save all these jobs and their configurations to a repository, or at least to export the jobs, save them, and import them when needed?
There are two plugins that will help you save Jenkins jobs, "SCM Sync Configuration" and "Job Config History", both documented on the wiki.jenkins-ci.org website:
SCM Sync Configuration Plugin (which keeps the config in an SCM repository)
or
Job Config History Plugin (which saves copies of all job and system configurations)
The Job DSL Plugin allows you to define jobs in a DSL and store the DSL scripts in an SCM repo. The DSL is considerably more readable than the XML config format.
For an intro, see the slides and video from the Configuration as Code: The Job DSL Plugin talk at the Jenkins User Conference 2015 in London.
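As a sketch of the readability gain, a freestyle job that checks out a repo, polls it, and runs a Maven goal takes only a few lines of DSL (all names here are examples; the equivalent config.xml would be several times longer):

// Job DSL script stored in the SCM repo
job('example-maven-build') {
    scm {
        git('https://example.com/scm/my-app.git')   // hypothetical repository
    }
    triggers {
        scm('H/15 * * * *')   // poll SCM every ~15 minutes
    }
    steps {
        maven('clean verify')
    }
}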
You can move/copy jobs to another destination simply by copying the job's directory (the default path for job directories is /var/lib/jenkins/jobs).
You can get more info here - https://wiki.jenkins-ci.org/display/JENKINS/Administering+Jenkins
The Workflow feature may let you write your entire process as one (Groovy) script, which you can then maintain in your version control system alongside other sources.
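A minimal Workflow (scripted pipeline) script of that kind, assuming a Maven project, might look like this:

// whole-process script kept in version control alongside the sources
node {
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        sh 'mvn -B clean verify'
    }
}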

Can I store Jenkins configuration in the project repo (like Travis CI)?

How do you maintain the Jenkins job configuration in SCM along side the source code?
As source code evolves, so does the job configuration. It would be ideal to be able to keep the job configuration in SCM, for the following benefits:
easy to see a history of the changes, including the author and the description
able to rebuild an old branch/tag by checking out that revision and having the build just work
not having to scroll through the UI to find the appropriate section and make changes
I see there is a Jenkins Job Builder plugin. I prefer a solution along the lines of Travis CI, where the job configuration is maintained in a YAML file (.travis.yml). Any good suggestions?
Note: Most of our projects are using Java & Maven.
Update 2016: Jenkins now provides a Jenkinsfile, which does exactly this. It is supported by the core Jenkins developers and actively developed.
Benefits:
Creating a Jenkinsfile, which is checked into source control, provides a number of immediate benefits:
Code review/iteration on the Pipeline
Audit trail for the Pipeline
Single source of truth for the Pipeline, which can be viewed and edited by multiple members of the project.
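For a Maven project like those mentioned in the question, a minimal Jenkinsfile checked into the repo root could look like this (the stage layout and goals are just an example):

pipeline {
    agent any
    stages {
        stage('Build & Test') {
            steps {
                sh 'mvn -B clean verify'   // runs the whole Maven build
            }
        }
    }
}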
I've written a plugin that does this!
Other than my plugin, you have some (limited) options with existing Jenkins plugins:
Use a single test script
If you configure your Jenkins to simply run:
$ bash run_tests.sh
You can then check a run_tests.sh file into your SCM repo, and you're now tracking changes for how you run tests. However, this won't track the configuration of any plugins.
Similarly, if you're using Maven, the Maven Project Plugin simply runs a specified goal for your repo.
The Literate Plugin does allow Jenkins to run the commands in your README.md, but it hasn't yet been released.
Track changes to Jenkins configuration
You can use the SCM Sync configuration plugin to write configuration changes to SCM, so you at least have a persistent record. This is global, across all projects on your Jenkins instance.
There's also the job config history plugin, which stores config history on the filesystem.
Write Jenkins configuration from SCM
The Jenkins job builder project you mentioned lets you check config changes into SCM and have them applied to your Jenkins instance. Again, this is across all projects on your Jenkins instance.
Write Jenkins configuration from another job
You can use the Job DSL Plugin with a repo of Groovy scripts. Jenkins then polls that repo and executes the Groovy scripts, which create the job configurations.
Discussions
Issue 996 (now closed) discusses this, and it has also been discussed on the mailing list: 'Keeping track of Hudson's configuration changes', and 'save hudson config in svn'.
You can do all of this and a lot more with the Workflow plugin. Workflow is one of the most advanced techniques for using Jenkins, and it has very strong support.
It is based on a Groovy DSL and allows you to keep the whole configuration in the SCM of your choice (e.g. Git, SVN, ...).
