Ephemeral Jenkins Pipeline Jobs from GitHub and Jenkinsfile

I have automated Jenkins master and slave deployment and redeployment successfully.
I know how to manually create pipeline jobs and add GitHub repos so that their Jenkinsfiles are used for the steps.
My issue is how to automate adding the pipeline jobs to Jenkins after it has been destroyed and redeployed, without having to manually create the pipeline jobs and point them at a Jenkinsfile each time.
I have seen this done before in a container environment with Chef and Docker: when redeployed or updated, it re-adds all the pipelines automatically.
I want to avoid the UI entirely, using it only to check job status and progress and to verify settings.

I would recommend looking at the Job DSL Plugin to create jobs, using a seed job to create them on initial Jenkins startup. The Jenkins Configuration as Code plugin can be used to set up any other configuration outside the jobs.
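As a minimal sketch of that seed-job approach (the job name, repository URL, branch and credentials ID below are placeholders), the Job DSL script run by the seed job can recreate each pipeline job so that it reads its steps from the Jenkinsfile in GitHub:

```groovy
// Job DSL script executed by a seed job: recreates a pipeline job whose
// steps come from the Jenkinsfile stored in the referenced GitHub repository.
// Job name, repository URL, branch and credentials ID are placeholders.
pipelineJob('my-app-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://github.com/example-org/my-app.git')
                        credentials('github-credentials-id')
                    }
                    branch('*/main')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
```

If the seed job itself is bootstrapped at startup (for example through the Configuration as Code plugin's job definitions or an init Groovy script), the whole job set comes back automatically every time Jenkins is destroyed and redeployed.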

Related

How to decouple Jenkins CI and gitlab CI pipelines?

I've only been working with Jenkins so far. We have configured a Multibranch Pipeline job to automatically build and test software. The tasks are written in Groovy and stored as a Jenkinsfile in the root directory of our Git repository.
Recently, we decided to add another mechanism to automatically generate documentation. The generation of documentation (but this could be any other task) has been implemented using GitLab CI.
Both pipelines are practically independent - and both are triggered by a git commit/push. What I do not understand is: why and how is the Jenkins pipeline execution associated with the GitLab CI pipeline? In the following screenshot a new column "External" appears - representing the Jenkins pipeline job.
That's not really a big issue. But as both pipelines should be independent, the results of the runs should not influence each other. However, it seems that when the Jenkins job, i.e. "External", fails, the GitLab CI pipeline also fails.
Is there a way to better decouple those pipelines, i.e. let them fail or succeed individually?
This is because the GitLab Branch Source Plugin automatically notifies GitLab about the Jenkins pipeline status. This allows you to see the result of a build directly in GitLab. If you want to have only the result of the GitLab CI pipeline in GitLab, you can disable this feature:
Additional Traits:
These traits can be selected by selecting Add in
the Behaviours section.
[...]
Skip pipeline status notifications - Disable notifying GitLab server
about the pipeline status.
[...]
So in your GitLab group, just go to Configure > Projects > GitLab Group > Add and select Skip pipeline status notifications.
why and how is the Jenkins pipeline execution associated with the GitLab CI pipeline? In the following screenshot a new column "External" appears - representing the Jenkins pipeline job.
In general, "External" statuses are created using the commit build status API -- Jenkins uses this API to report the Jenkins pipeline build status to GitLab CI.
This external status for Jenkins appears in your GitLab pipeline because you have configured your Jenkins server/project to report build statuses to GitLab or you have setup a webhook integration with Jenkins in GitLab (note these may be set at the group level or by an administrator, not necessarily the project level)
To remove this from your pipeline, you should disable any existing integration configurations and setup your Jenkins project independently of any GitLab integration. e.g. using git polling to trigger jenkins builds and remove any updateGitlabCommitStatus calls in your groovy scripts / build stages.
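As an illustrative sketch (the polling schedule and build command are placeholders), a declarative Jenkinsfile that relies on SCM polling and never calls updateGitlabCommitStatus will not push any "External" status back to the GitLab pipeline:

```groovy
// Declarative pipeline sketch: polls Git instead of relying on a GitLab
// webhook, and contains no updateGitlabCommitStatus calls, so no "External"
// status is reported to GitLab. Schedule and build command are placeholders.
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // poll the repository roughly every five minutes
    }
    stages {
        stage('Build') {
            steps {
                sh './build.sh'
            }
        }
    }
}
```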

Auto create Jenkins jobs and update configs from source code repo - GitHub

I wanted to set up a single source of truth for my Jenkins instances running in different DCs (data centers), so I converted all my Jenkins jobs to pipeline jobs, with the Jenkinsfile taken from a GitHub repo.
I'm looking for a method to automatically create/delete/update Jenkins jobs in the UI across the multiple Jenkins instances running in different DCs, i.e. to auto create/delete jobs in all Jenkins instances upon updating the job configurations in the GitHub repository.
Any recommendations or help for this workflow would be appreciated.
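One sketch of how this can work, reusing the seed-job idea from the answer above (owner, repository and credentials ID are placeholders), is to keep Job DSL definitions in the GitHub repository and have a seed job on every Jenkins instance apply them, so all DCs converge on the same job set:

```groovy
// Job DSL definition kept in the source repository and applied by a seed job
// on every Jenkins instance. Owner, repository and credentials ID are placeholders.
multibranchPipelineJob('my-app') {
    branchSources {
        github {
            id('my-app')                 // stable identifier for the branch source
            repoOwner('example-org')
            repository('my-app')
            scanCredentialsId('github-token')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(5)                 // drop jobs for branches that no longer exist
        }
    }
}
```

A webhook (or a periodic run of the seed job) then propagates job creation, updates and deletion to every instance whenever the definitions change in GitHub.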

Kubernetes Cron Job to trigger a Jenkins pipeline

My team wants to create a cron job in Kubernetes/OpenShift that will trigger the Jenkins pipeline we have set up. We tried the triggers{} syntax and the "Build periodically" option in the Jenkins UI, but these are unreliable for us since those build triggers get removed whenever Jenkins restarts.
There are two approaches you could use here:
Make your Jenkins stateless with the Configuration as Code plugin
Use a curlimages/curl container to trigger the job via the Jenkins REST API (see the sketch below)
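A minimal sketch of the second approach follows; the Jenkins URL, job name, schedule and the jenkins-trigger secret are placeholders, and authentication uses a user API token so the request can pass Jenkins' CSRF protection:

```yaml
# Sketch of a Kubernetes CronJob that triggers a Jenkins job over the REST API.
# URL, job name, schedule and secret names are placeholders.
apiVersion: batch/v1
kind: CronJob
metadata:
  name: trigger-jenkins-pipeline
spec:
  schedule: "0 2 * * *"                 # every day at 02:00
  jobTemplate:
    spec:
      template:
        spec:
          restartPolicy: Never
          containers:
            - name: trigger
              image: curlimages/curl:latest
              args:
                - "-fsS"
                - "-X"
                - "POST"
                - "--user"
                - "$(JENKINS_USER):$(JENKINS_TOKEN)"
                - "https://jenkins.example.com/job/my-pipeline/build"
              env:
                - name: JENKINS_USER
                  valueFrom:
                    secretKeyRef:
                      name: jenkins-trigger
                      key: user
                - name: JENKINS_TOKEN
                  valueFrom:
                    secretKeyRef:
                      name: jenkins-trigger
                      key: token
```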

Continuous deployment branch wise using Spinnaker

I'm using a Jenkins multibranch pipeline for the CI process and Spinnaker for CD.
I've gone through almost all the documents, support channels, etc. from Spinnaker looking for "how to create a Spinnaker multibranch pipeline similar to Jenkins", but didn't find anything.
After integrating Jenkins with Spinnaker, the drop-down list of Jenkins jobs in the Spinnaker pipeline configuration shows all multibranch jobs separately. Hence for each branch I'd need to go to Spinnaker and create a pipeline manually.
To solve this, I'm considering the following: while running the Jenkins multibranch pipeline job, create the Spinnaker pipeline (if it does not exist) using the spin CLI with the required parameters (branch, version, trigger using the Jenkins job of this running branch, etc.), and then trigger that Spinnaker pipeline after the Jenkins job has executed.
Please advise if there is any other better way to accomplish this.
Thanks.
I am not super familiar with the multibranch plugin, but you can make this simpler by doing [ triggers ] -> [ pipeline stage calling the same pipeline ] rather than calling the entire pipeline via the spin-cli.
Alternatively, if the list of jobs generated is small or well known, you could just update the list of triggers for the same pipeline programmatically as part of your release process.
i.e., in your Jenkins job:
add this job to the list of triggers
run the rest of the Jenkins job
when the job finishes, the Spinnaker pipeline triggers

Run script before removing job in Jenkins Pipelines

I'm setting up a development environment where I have Jenkins as the CI server (using pipelines), and the last build step in the Jenkinsfile is a deployment to staging. The idea is to have a staging environment for each branch that is pushed.
Whenever someone deletes a branch (sometimes after merging), Jenkins automatically removes its respective job.
I wonder if there is a way to run a custom script before the automatic job removal; then I would be able to connect to the staging server and stop or remove all services that are running for the job that is going to be deleted.
The plugin multibranch-action-triggers-plugin might be worth a look.
This plugin enables building/triggering other jobs when a Pipeline job is created or deleted, or when a Run (also known as Build) is deleted by a Multi Branch Pipeline Job.
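For illustration only, the triggered cleanup job could be a small parameterized pipeline that tears down the branch's staging environment. The parameter name, staging host and teardown command are hypothetical, and how the plugin passes the deleted job's details depends on its configuration:

```groovy
// Hypothetical cleanup pipeline, meant to be triggered when a branch job is
// deleted. Parameter name, staging host and teardown command are placeholders;
// the exact values passed by the triggering plugin depend on its configuration.
pipeline {
    agent any
    parameters {
        string(name: 'DELETED_BRANCH', defaultValue: '',
               description: 'Branch whose staging environment should be removed')
    }
    stages {
        stage('Tear down staging') {
            steps {
                sh "ssh deploy@staging.example.com 'docker compose -p ${params.DELETED_BRANCH} down'"
            }
        }
    }
}
```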
