How to build multiple projects at the same time using a Jenkins pipeline

I already wrote an example Jenkinsfile to check out, build, and deploy a single project. Is there a way to do all of this for multiple projects in different Git repos at the same time, using just one Jenkinsfile? I know I can set these projects up as independent jobs and use a Jenkinsfile to call them, but I'm wondering if I can do this without independent jobs.
Thanks.

You can make use of the Job DSL Plugin to achieve this.
The Jenkins Job DSL API reference will help you write DSL scripts; it documents all the built-in DSL methods needed to construct jobs.
Example pipeline script:
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                // Pipeline job. Triple double quotes are used so that the
                // pipeline's $job1 string parameter is interpolated into the DSL text.
                jobDsl scriptText: """pipelineJob("$job1") {
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        name('origin')
                                        url('https://github.com/satta19/user-node.git')
                                        credentials('git2-cred')
                                    }
                                    branch('master')
                                }
                            }
                            scriptPath('Jenkinsfile')
                        }
                    }
                }"""
            }
        }
        stage('Job2') {
            steps {
                // Freestyle job
                jobDsl scriptText: """job("$job2") {
                    steps {
                        shell('echo Hello World!')
                    }
                }"""
            }
        }
    }
}
Note: I have taken the job names as string parameters, i.e. $job1 and $job2, in the example pipeline script above; the triple double quotes let the pipeline interpolate them into the DSL text.
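For completeness, a minimal sketch of how those string parameters might be declared in the same pipeline; the default values here are placeholders, not from the original answer:

// Hypothetical parameter declarations for the pipeline above;
// the default job names are placeholders.
parameters {
    string(name: 'job1', defaultValue: 'user-node-pipeline', description: 'Name of the generated pipeline job')
    string(name: 'job2', defaultValue: 'hello-freestyle', description: 'Name of the generated freestyle job')
}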

Related

How to invoke a job from one slave (server) to another slave (server) in Jenkins

Please, can you advise, as I am planning to invoke a job that is on a different server? For example:
Slave1, server1: deploy job
Slave2, server2: build job
I want the build job to trigger the deploy job. Any suggestion, please?
If you are trying to run two jobs on the same Jenkins server, your pipeline should look something like the one below. Here the build job calls the deploy job.
Build Job
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                build job: 'DeployJobName'
            }
        }
    }
}
Deploy Job
pipeline {
    agent { label 'server1' }
    stages {
        stage('Deploy') {
            steps {
                // Deploy something
            }
        }
    }
}
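If the deploy job needs context from the build (a version, a target environment), the build step also accepts parameters and a wait flag. A minimal sketch, where the VERSION parameter name is an assumption:

// Assumed variant: pass a parameter to the downstream job and don't block on it.
build job: 'DeployJobName',
      wait: false,
      parameters: [string(name: 'VERSION', value: env.BUILD_NUMBER)]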
Update
If you want to trigger a job on a different Jenkins server, you can use a plugin like RemoteTriggerPlugin, or simply use the Jenkins API:
curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN
pipeline {
    agent { label 'server2' }
    stages {
        stage('Build') {
            steps {
                sh 'curl http://server1:8080/job/DeployJobName/build?token=YOUR_REMOTE_TRIGGER_TOKEN'
            }
        }
    }
}
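To avoid hard-coding the trigger token in the Jenkinsfile, it could be pulled from the Jenkins credentials store instead. A sketch assuming a secret-text credential with the hypothetical id remote-trigger-token:

// Assumes a secret-text credential 'remote-trigger-token' holding the remote trigger token.
steps {
    withCredentials([string(credentialsId: 'remote-trigger-token', variable: 'TOKEN')]) {
        // Single quotes so the shell, not Groovy, expands $TOKEN and the secret stays masked.
        sh 'curl "http://server1:8080/job/DeployJobName/build?token=$TOKEN"'
    }
}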

Load multiple Jenkins pipeline scripts from Git via Job DSL seed script

My use case: I want to set up a Jenkins configuration via the Jenkins Helm chart, using the JCasC plugin. I would also like to define a number of jobs via the Pipeline plugin in a series of Jenkinsfiles, so that the entire setup is configured in code, with a clean, complete installation able to be performed just by running helm install.
However, I'm having trouble loading my Pipeline scripts. In my JCasC, I have defined a Job DSL seed script as follows:
job('seedJob') {
    scm {
        git {
            remote {
                url 'ssh://git@foo/bar.git'
                credentials 'creds'
            }
        }
    }
    steps {
        dsl {
            external 'jenkins/jobs/*.groovy'
        }
    }
}
This successfully pulls the scripts from the repo, an example of which is:
pipeline {
    // hello.groovy
    // Do stuff
}
However, the job fails when parsing the Pipeline script with the following error:
ERROR: (hello.groovy, line 1) No signature of method: hello.pipeline() is applicable for argument types: (hello$_run_closure1) values: [hello$_run_closure1#4c6f43b6]
Possible solutions: pipelineJob(java.lang.String), pipelineJob(java.lang.String, groovy.lang.Closure)
Finished: FAILURE
My suspicion is that Pipeline scripts can't be read this way via Job DSL. If this is the case, is there a way I can achieve the loading of multiple Pipeline scripts from a single seed job?
Seed jobs should look like:
pipelineJob('job_name_here') {
    definition {
        cpsScm {
            scm {
                git {
                    branches('*/master')
                    branches('*/release')
                    remote {
                        credentials('credentials_id_from_jenkins_here')
                        name('name')
                        url('git@gitlab_repo_here.git')
                    }
                }
            }
        }
    }
    triggers {
        gitlab {
            ciSkip(true)
            triggerOnPush(true)
            triggerOnMergeRequest(false)
            triggerOnClosedMergeRequest(true)
            branchFilterType('RegexBasedFilter')
            targetBranchRegex('(.*master.*|.*release.*)')
            secretToken('paste_secret_token_for_webhook_here')
        }
    }
}
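Since the question asks about loading pipelines for several repositories from a single seed job, the seed script can simply loop and emit one pipelineJob per repo. A sketch, where the repo list and URL scheme are assumptions borrowed from the question:

// jenkins/jobs/pipelines.groovy — hypothetical seed script; the repo names and
// the ssh://git@foo/... URL scheme are placeholders.
def repos = ['bar', 'baz']

repos.each { repo ->
    pipelineJob("${repo}-pipeline") {
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url("ssh://git@foo/${repo}.git")
                            credentials('creds')
                        }
                        branch('*/master')
                    }
                }
                scriptPath('Jenkinsfile') // each repo keeps its own Jenkinsfile
            }
        }
    }
}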

Jenkins pipeline job started by a hook on a release tag

I have a Bitbucket repo, and I want to start my Jenkins pipeline job only after a commit with a tag like "release-1.0.*".
So, I set my job up with this pipeline script:
pipeline {
    agent any
    stages {
        stage('Prepare') {
            when {
                tag "release*"
            }
            steps {
                git branch: 'tag1', url: 'git@bitbucket.org:m*********ny/tests.git'
            }
        }
        stage('Deploy') {
            steps {
                sshPublisher(publishers: [sshPublisherDesc(configName: "JenkinsSrv", transfers: [sshTransfer(execCommand: 'pwd')])])
            }
        }
    }
    post {
        always {
            echo 'This is post action!!!'
        }
    }
}
I also turned on the Bitbucket webhook plugin, so that my repo notifies Jenkins about new changes.
But my solution doesn't work. Help me resolve this case.
According to the official documentation for a Jenkins pipeline, the option you are looking for is the changelog condition inside the when directive. For example:
when { changelog 'release*' }
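Note that the question's original tag condition can also work, but only when the build itself is a tag build, which in practice means a multibranch pipeline job with tag discovery enabled. A hedged sketch of that variant:

// Assumes a multibranch pipeline with tag discovery enabled, so that pushing a
// tag like release-1.0.3 creates a tag build in which TAG_NAME is set.
pipeline {
    agent any
    stages {
        stage('Deploy') {
            when { tag 'release-1.0.*' }
            steps {
                echo "Deploying ${env.TAG_NAME}"
            }
        }
    }
}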

Create a Build Pipeline View based on Pipeline job using Job DSL plugin in Jenkins

Since there's a limitation in Jenkins Pipeline that you cannot add a manual build step without hanging the build (see, for example, this Stack Overflow question), I'm experimenting with a combination of Jenkins Pipeline and the Build Pipeline Plugin, using the Job DSL plugin.
My plan was to create a Job DSL script that first executes the Jenkins pipeline (defined in a Jenkinsfile) and then creates a downstream job that deploys to production (this is the manual step). I've created this Job DSL script as a test:
pipelineJob("${REPO_NAME} jobs") {
logRotator(-1, 10)
def repo = "https://path-to-repo/${REPO_NAME}.git"
triggers {
scm('* * * * *')
}
description("Pipeline for $repo")
definition {
cpsScm {
scm {
git {
remote { url(repo) }
branches('master')
scriptPath('Jenkinsfile')
extensions { } // required as otherwise it may try to tag the repo, which you may not want
}
}
}
}
publishers {
buildPipelineTrigger("${REPO_NAME} deploy to prod") {
parameters {
currentBuild()
}
}
}
}
freeStyleJob("${REPO_NAME} deploy to prod") {
}
buildPipelineView("$REPO_NAME Build Pipeline") {
selectedJob("${REPO_NAME} jobs")
}
where REPO_NAME is defined as an environment variable. The Jenkinsfile looks like this:
node {
    stage('build') {
        echo "building"
    }
    stage('run tests') {
        echo "running tests"
    }
    stage('package Docker') {
        echo "packaging"
    }
    stage('Deploy to Test') {
        echo "Deploying to Test"
    }
}
The problem is that the selectedJob points to "${REPO_NAME} jobs" which doesn't seem to be a valid option as "Initial Job" in the Build Pipeline Plugin view (you can't select it manually either).
Is there a workaround for this? I.e. how can I use a Jenkins Pipeline as the "Initial Job" for the Build Pipeline Plugin?
From the documentation at yourDomain.com/plugin/job-dsl/api-viewer/index.html#method/javaposse.jobdsl.dsl.views.NestedViewsContext.envDashboardView,
it appears that buildPipelineView can only be used within a View block which is inside a Folder block:
Folder {
    View {
        buildPipelineView {
        }
    }
}
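In concrete Job DSL terms, that nesting might look like the following sketch; the folder name, view name, and job name are assumptions standing in for the question's ${REPO_NAME} values:

// Hypothetical names; wraps the build pipeline view in a folder so the
// generated pipeline job can be selected as the initial job.
folder('myrepo') {
    views {
        buildPipelineView('myrepo Build Pipeline') {
            selectedJob('myrepo jobs')
            displayedBuilds(5)
        }
    }
}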

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to declarative pipelines (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are libraries, but it does not seem like I could include the stage definitions using them.
You will have to create a shared library to implement what I am about to suggest. For shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (a kind of template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with a vars directory in it, you just have to create a Groovy file (let's say commonPipeline.groovy) inside the vars directory.
Here's an example that works; I have used it earlier in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy

// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you probably already have an existing Jenkinsfile in each project (if not, create one), just replace its content with the following:
$ cat Jenkinsfile

// Assuming you have named your shared library `my-shared-lib` and set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [:]
params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file. So all your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all those projects. Also note that I have used jenkins_var above; it can be any name. It's not actually used in the example, but a map argument is required for the pipeline to run. Some Groovy expert can perhaps clarify that part.
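For illustration, a minimal sketch of how the map passed from the Jenkinsfile could actually be consumed inside call; the jenkins_var key is the one from the example above, and the stage is hypothetical:

// vars/commonPipeline.groovy (excerpt) — reading the map passed by the caller.
// A hedged sketch; the original example accepts the map but does not use it.
def call(Map config) {
    pipeline {
        agent any
        stages {
            stage('Show config') {
                steps {
                    // config.jenkins_var was set to env.JOB_BASE_NAME by the caller
                    echo "Running shared pipeline for: ${config.jenkins_var}"
                }
            }
        }
    }
}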
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
