Please bear with me; the description is a bit long, but it should give a clear picture of the intent and the issue.
I have used the Job DSL plugin to create a seeder job, which in turn creates two new jobs. I have two separate repositories:
- one for maintaining the Jenkins pipeline scripts
- one for the actual code to build
First I created a pipeline job in Jenkins which in turn creates the view and the two jobs. Config shown below:
The Jenkinsfile given below uses the Job DSL plugin API; it reads the Groovy script and creates the two required jobs.
node('master') {
    checkout scm
    jobDsl targets: ['dsl/seedJobBuilder.groovy'].join('\n'),
           removedJobAction: 'IGNORE',
           removedViewAction: 'IGNORE',
           lookupStrategy: 'SEED_JOB'
}
seedJobBuilder.groovy creates a DSL pipeline job whose task is to build the actual codebase.
listView('Build Pipelines') {
    description('All build and deploy jobs')
    jobs {
        names(
            'build',
            'deploy',
        )
    }
    columns {
        status()
        weather()
        name()
        lastSuccess()
        lastFailure()
        lastDuration()
        buildButton()
    }
}
def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}") // url of actual code repo, not the jenkins script repo
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}") // jenkins script repo url
                        credentials("somecredentials")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
                scriptPath('pipelines/pipelineBuildEveryDay.groovy')
                lightweight(false)
            }
        }
    }
    triggers {
        githubPush()
    }
}
Config of the above job created by Job DSL:
This job reads the pipelineBuildEveryDay Groovy script, checks out the actual codebase, and builds and deploys it.
Where I am struggling is how to trigger a build of this second job through a GitHub hook or through ghprb, since I don't want to manipulate the second job manually, and the Git URL of the job is the script repo URL, not the codebase URL. Is this even possible? If yes, what am I missing?
I have the webhook configured.
pipelineBuildEveryDay.groovy
pipeline {
    libraries {
        lib("shared-library@${params.JENKINS_SCRIPT_BRANCH}")
    }
    agent {
        node {
            label 'master'
        }
    }
    options {
        skipDefaultCheckout(true) // No more 'Declarative: Checkout' stage
    }
    stages {
        stage('Crazy Build Pipeline') {
            tools {
                jdk 'java11'
            }
            stages {
                stage('Prepare build name') {
                    steps {
                        script {
                            currentBuild.displayName = "${currentBuild.number}-build"
                        }
                    }
                }
                stage('Checkout') {
                    steps {
                        cleanWs()
                        script {
                            checkoutRepository("${projectDir}", "${params.PROJECT_TAG}", "${params.PROJECT_REPO}")
                        }
                    }
                }
                stage('Run Tests') {
                    steps {
                        echo "Running tests coming soon..."
                    }
                }
            }
        }
    }
    // post build actions
    post {
        success {
            echo "success"
        }
        failure {
            echo "failure"
        }
    }
}
Well, the suffering comes to an end. Posting this answer for anyone struggling with a similar sort of issue.
Make sure you uncheck all other trigger types; the only one checked should be the pull request builder.
The part that tripped me up was the Project URL. In my case, the GitHub URL in the SCM section was the Jenkins-scripts repository URL, not the URL of the codebase I want to build, so I used my codebase repository URL in the GitHub Project URL textbox.
But the real problem was that I was using the repository URL in the format 'https://code-base-repo-url.git' when it should have been 'https://code-base-repo-url'. Sounds stupid? Yeah, I know!
Finally, the complete job config pipeline script, if it helps:
def pipelineRepo = 'https://jenkins-script-repo'
def projectRepo = 'https://code-base-repo-url'
def projectTag = '${GIT_BRANCH}'
def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}")
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}")
                        credentials("use-your-own-user-pass-cred")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
                scriptPath('pipelines/pipelineBuildEveryDay.groovy')
                lightweight(false)
            }
        }
    }
    triggers {
        githubPullRequest {
            admin('use_your_own_admin')
            triggerPhrase('build please')
            useGitHubHooks()
            permitAll()
            displayBuildErrorsOnDownstreamBuilds()
            extensions {
                commitStatus {
                    context('Jenkins')
                    completedStatus('SUCCESS', 'All is well')
                    completedStatus('FAILURE', 'Something went wrong. Investigate!')
                    completedStatus('ERROR', 'Something went really wrong. Investigate!')
                }
            }
        }
    }
}
When using a Jenkins pipeline on an ephemeral node (e.g. Fargate):
pipeline {
    agent {
        label 'build-swarm'
    }
    stages {
        ...
    }
    post {
        always {
            cleanWs()
        }
    }
}
the ws-cleanup [plugin][1] will try to remove the workspace on the ephemeral node, which is pointless.
In an ideal world we would use a lightweight checkout on the controller, but for various reasons that is not possible here, so we have a fairly large repo checkout that is never cleaned up.
This is the best thing I've managed to come up with:
pipeline {
    ...
}

node('master') {
    folder = JOB_NAME.split('/')[0]
    job = JOB_NAME.split('/')[1]
    ws("${JENKINS_HOME}/jobs/${folder}/jobs/${job}/workspace@script") {
        stage('clean up ws') {
            cleanWs()
        }
    }
}
which seems to work, but feels very fragile. Am I missing something obvious?
[1]: https://plugins.jenkins.io/ws-cleanup/
https://www.jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#node-allocate-node
If I understood you right, you need to run the cleanWs() step before the pipeline runs. Use:
pipeline {
    agent any
    options {
        // This is required if you want to clean before build
        skipDefaultCheckout(true)
    }
    stages {
        stage('Build') {
            steps {
                // Clean before build
                cleanWs()
                // We need to explicitly checkout from SCM here
                checkout scm
                echo "Building ${env.JOB_NAME}..."
            }
        }
    }
}
I have a remote agent and multiple local agents on my Jenkins server. I have a script that I want to run only after the tests that are built on the remote agent. Is that possible somehow?
Thanks
Using a Jenkins pipeline you have the possibility of running actions according to the result of the build. Take a look here:
https://jenkins.io/doc/book/pipeline/syntax/#post
You can even separate your build into stages and run actions according to the result of each stage using the same method.
On the entire pipeline, on a specific stage, or even in your post actions, you can choose which node does the job.
Assuming you run a stage on a specific node, you could:
pipeline {
    agent none // required top-level agent section; each stage picks its own agent below
    stages {
        stage('Build') {
            agent { label "SLAVE1" }
            steps {
                // Stuff to do
            }
            post {
                always {
                    // stuff
                }
            }
        }
    }
}
Or at the end of your pipeline in a post block:
pipeline {
    agent none // required top-level agent section; the stage picks its own agent
    stages {
        stage("Build") {
            agent { label "SLAVE" }
            steps {
                // stuff
            }
        }
    }
    post {
        // Or failure, unstable, success...
        always {
            node('SLAVE1') {
                // stuff
            }
        }
    }
}
I am having issues using the Job DSL plugin to automate the creation of a multibranch pipeline job.
The piece I am having issues with is how to set the path to the Jenkinsfile in the repo. I have looked online for documentation but found nothing that helps. I have even tried to find example scripts, but multibranch Job DSL scripts are rare on the internet; in fact, I could not find any that set the Jenkinsfile path.
jobs.groovy
folderName = "${JENKINS_PATH}"
folder(folderName)
multibranchPipelineJob("${folderName}/jenkins_multibranch_devops") {
    branchSources {
        git {
            remote("https://gitlab.com/${REPO_PATH}")
            credentialsId('gitlab_credentials')
            includes('*')
        }
    }
    configure { project ->
        project / factory {
            scriptPath('jenkins/Jenkinsfile')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(14)
        }
    }
}
Here is what I have; it's failing because I am obviously missing something, which is why I'm looking for help.
What am I missing, and where can I find documentation if I plan on adding more and more to this jobs.groovy file? I'd like to know how to figure out what to add, because the current doc page doesn't help at all.
You can set it using this:
multibranchPipelineJob("${folderName}/jenkins_multibranch_devops") {
    branchSources {
        git {
            remote("https://gitlab.com/${REPO_PATH}")
            credentialsId('gitlab_credentials')
            includes('*')
        }
    }
    factory {
        workflowBranchProjectFactory {
            scriptPath('jenkins/Jenkinsfile')
        }
    }
    orphanedItemStrategy {
        discardOldItems {
            numToKeep(14)
        }
    }
}
Documentation is available through the Job DSL API viewer in your Jenkins installation: https://{your-jenkins}/plugin/job-dsl/api-viewer/index.html
I have a Jenkins pipeline job that (among other things) creates another pipelineJob (to clean everything up afterwards) using the Job DSL plugin.
pipeline {
    agent { label 'Deployment' }
    stages {
        stage('Clean working directory and Checkout') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        // Complex logic omitted
        stage('Generate cleanup job') {
            steps {
                build job: 'cleanup-job-template',
                    parameters: [
                        string(name: 'REGION', value: "${REGION}"),
                        string(name: 'DEPLOYMENT_TYPE', value: "${DEPLOYMENT_TYPE}")
                    ]
            }
        }
    }
}
The thing is that I need this newly generated job to be built only once and then, if the build was successful, the job should be deleted.
pipeline {
    stages {
        stage('Cleanup afterwards') {
            // cleanup logic
        }
    }
    post {
        success {
            // delete this job?
        }
    }
}
I thought this could be done using a pipeline post action, but unfortunately I couldn't find any out-of-the-box solution for it.
Is it possible to achieve this at all?
You can achieve this using the post section, in which you will need to write some Groovy code to delete the job:
#!/usr/bin/env groovy
import hudson.model.*

pipeline {
    agent none
    stages {
        stage('Cleanup afterwards') {
            // cleanup logic
            steps {
                node('worker') {
                    sh 'ls -la'
                }
            }
        }
    }
    post {
        success {
            script {
                jobsToDelete = ["<JOB_TO_DELETE>"]
                deleteJob(Hudson.instance.items, jobsToDelete)
            }
        }
    }
}

def deleteJob(items, jobsToDelete) {
    items.each { item ->
        if (item.class.canonicalName != 'com.cloudbees.hudson.plugins.folder.Folder') {
            if (jobsToDelete.contains(item.fullName)) {
                manager.listener.logger.println(item.fullName)
                item.delete()
            }
        }
    }
}
Tested both cases; they work on Jenkins 2.89.4.
You should do all of that in one job instead of creating and deleting jobs. Use multiple stages for that, e.g. deploy the test system, run the tests / wait for them to finish, then undeploy, as in the sketch below. There is no need for extra jobs. An example is posted here: Can a Jenkins pipeline have an optional input step?
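For illustration, such a single-job lifecycle pipeline might look like the following minimal sketch; the echo steps are placeholders for your actual deploy, test, and teardown logic:
pipeline {
    agent any
    stages {
        stage('Deploy test system') {
            steps {
                echo 'deploying test system...' // placeholder for your deploy logic
            }
        }
        stage('Run tests') {
            steps {
                echo 'running tests...' // placeholder; block here until the tests have finished
            }
        }
        stage('Undeploy') {
            steps {
                echo 'undeploying...' // placeholder for your teardown logic
            }
        }
    }
}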
I have installed the Pipeline plugin, which used to be called the Workflow plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use the Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
    definition {
        cps {
            script('logic-here')
            sandbox()
        }
    }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo 'logic'
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
or load it from a file located in the workspace:
pipelineJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
            sandbox()
        }
    }
}
Example:
Seed-job file structure:
jobs
\- productJob.groovy
logic
\- productPipeline.jenkinsfile
then productJob.groovy content:
pipelineJob('product-job') {
    definition {
        cps {
            script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
            sandbox()
        }
    }
}
I believe this question is asking how to use the Job DSL to create a pipeline job that references the project's Jenkinsfile, rather than combining job creation with the detailed step definitions as the answers to date have done. This makes sense: Jenkins job creation and metadata configuration (description, triggers, etc.) could belong to Jenkins admins, but the dev team should have control over what the job actually does.
#meallhour, is the below what you're after? (Works as of Job DSL 1.64.)
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'
    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    scriptPath('misc/Jenkinsfile.v2')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it
                // only covers the 'master' branch and may not give you
                // enough control.
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
        }
    }
}
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The example above worked for me. Here's another example, based on what worked for me:
pipelineJob('Your App Pipeline') {
    def repo = 'https://github.com/user/yourApp.git'
    def sshRepo = 'git@git.company.com:user/yourApp.git'
    description("Your App Pipeline")
    keepDependencies(false)
    properties {
        githubProjectUrl(repo)
        rebuild {
            autoRebuild(false)
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(sshRepo) }
                    branches('master')
                    scriptPath('Jenkinsfile')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
            }
        }
    }
}
If you build the pipeline first through the UI, you can use the resulting config.xml file together with the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
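For the parts of a UI-generated config.xml that the DSL has no method for, Job DSL's configure block lets you reproduce the raw XML. A minimal sketch, assuming a hypothetical quietPeriod element spotted in such a config.xml; the job name and inline script are placeholders:
pipelineJob('example-from-config-xml') {
    definition {
        cps {
            script('echo "hello"') // placeholder inline pipeline
            sandbox()
        }
    }
    configure { project ->
        // 'project' is the root node of the job's config.xml; elements seen in a
        // UI-created config.xml can be reproduced here when the DSL lacks a method
        (project / 'quietPeriod').setValue(5) // hypothetical element, for illustration only
    }
}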
In Job DSL, pipeline is still called workflow; see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
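For what it's worth, a minimal sketch of the workflowJob syntax (the job name and inline script are placeholders; later Job DSL releases renamed this to pipelineJob):
workflowJob('job-name') {
    definition {
        cps {
            script('echo "hello from workflow"') // inline pipeline script
        }
    }
}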
First you need to install the Job DSL plugin, then create a freestyle project in Jenkins and select 'Process Job DSLs' from the dropdown in the build section.
Select 'Use the provided DSL script' and provide the following script:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage name 1') {
                            steps {
                                // your logic here
                            }
                        }
                        stage('Stage name 2') {
                            steps {
                                // your logic here
                            }
                        }
                    }
                }
            ''')
        }
    }
}
Or you can create your job by pointing to the Jenkinsfile located in a remote Git repository:
pipelineJob("job-name") {
definition {
cpsScm {
scm {
git {
remote {
url("<REPO_URL>")
credentials("<CREDENTIAL_ID>")
}
branch('<BRANCH>')
}
}
scriptPath("<JENKINS_FILE_PATH>")
}
}
}
If you are using a Git repo, add a file called Jenkinsfile at the root directory of your repo. This should contain your job DSL.
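For reference, a minimal declarative Jenkinsfile, i.e. the file that <JENKINS_FILE_PATH> above would point to, might look like the following sketch; the stage and step are placeholders:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'build logic goes here' // placeholder
            }
        }
    }
}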