I'm writing a pipelineJob which looks like this:
pipelineJob('TESTING') {
    description('TEST')
    definition {
        cpsScm {
            scm {
                git {
                    branch("\$deploy_branch")
                    remote {
                        credentials("CREDENTIALS")
                        url('REPO')
                    }
                    browser {
                        bitbucketWeb {
                            repoUrl("REPO")
                        }
                    }
                    extensions {
                        wipeWorkspace()
                    }
                }
            }
            scriptPath("Jenkinsfile.deploy_seeds.build")
        }
    }
    parameters {
        gitParam('deploy_branch') {
            type('BRANCH')
            defaultValue('origin/master')
        }
    }
}
Inside this pipelineJob I call a script from the repo, which looks like this:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                echo('Hello World!')
                dsl {
                    external('deploy_SEEDS.groovy')
                    additionalClasspath('src')
                }
                echo("End dsl block")
            }
        }
    }
}
Unfortunately, after running this job I got:
java.lang.NoSuchMethodError: No such DSL method 'dsl' found among steps [ansiColor, ansiblePlaybook etc....
I have the Job DSL plugin installed. Could somebody tell me what I'm doing wrong?
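For what it's worth, the pipeline step registered by the Job DSL plugin is called jobDsl, not dsl; the dsl { external(...) } form is the "Process Job DSLs" build step for freestyle jobs and is not available inside a pipeline. A minimal sketch of the same stage using the jobDsl step, assuming deploy_SEEDS.groovy and src/ exist in the workspace:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                echo 'Hello World!'
                // jobDsl is the step name the Job DSL plugin exposes to pipelines;
                // targets points at the DSL script(s) in the workspace
                jobDsl targets: 'deploy_SEEDS.groovy',
                       additionalClasspath: 'src'
                echo 'End dsl block'
            }
        }
    }
}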
This should be fairly basic, but when I research it I keep ending up at things like Gerrit triggers, which seem way too complicated for something this simple.
I would like to do something like either this in the Job DSL script:
pipelineJob('deploy-game') {
    definition {
        environmentVariables {
            env('ENVIRONMENT', "${ENVIRONMENT}")
            keepBuildVariables(true)
        }
        cpsScm {
            scm {
                git {
                    remote {
                        url('https://blabla.git')
                        credentials('gitlab-credentials')
                    }
                    branches('${gitlabsourcebranch}')
                }
            }
            scriptPath('path/to/this.jenkinsfile')
        }
        triggers {
            gitlabPush {
                buildOnMergeRequestEvents(true)
                if ($gitlabMergeRequestState == 'merged') // this part
            }
        }
    }
}
Or, trigger on all MR events, and then filter out in the pipeline script:
pipeline {
    agent none
    environment {
        ENVIRONMENT = "${ENVIRONMENT}"
    }
    triggers {
        $gitlabMergeRequestState == 'merged' // this one
    }
    stages {
        stage('do-stuff') {
            agent {
                label 'agent'
            }
            steps {
                sh 'some commands ...'
            }
        }
    }
}
How do I do this?
This is how it should look; I hope it is what you are looking for.
pipelineJob('Job_Name') {
    triggers {
        gitlabPush {
            buildOnMergeRequestEvents(true) // triggers a build when an MR is opened
            buildOnPushEvents(true)
            commentTrigger('retry a build') // writing this comment on the MR in GitLab also triggers a build
            enableCiSkip(true)
            rebuildOpenMergeRequest('source')
            skipWorkInProgressMergeRequest(false)
            targetBranchRegex('.*master.*|.*release.*') // only pushes to master or release trigger a build; feature-branch pushes do not trigger one until an MR is opened
        }
    }
    configure {
        it / triggers / 'com.dabsquared.gitlabjenkins.GitLabPushTrigger' << secretToken('ADD_TOKEN_FROM_JENKINS_JOB')
    }
    definition {
        cpsScm {
            lightweight(true)
            scm {
                git {
                    remote {
                        credentials('ID')
                        url('git@URL.git')
                    }
                    branch('refs/heads/master')
                }
            }
            scriptPath('jenkinsfile')
        }
    }
}
In my Jenkins pipeline script (Jenkinsfile) I am creating pipelineJobs using the jobDsl step, like this:
pipeline {
    agent any
    stages {
        stage('create pipelines') {
            steps {
                jobDsl scriptText: """
                    pipelineJob('myfolder/myname') {
                        definition {
                            cps {
                                script(readFileFromWorkspace('Jenkinsfile.subfolder'))
                            }
                        }
                    }
                """
            }
        }
    }
}
The above code works fine; however, I believe those jobs would be better off getting their Jenkinsfile from SCM instead.
There is the cpsScm variant, but how could I reuse the SCM info from the current pipeline so that I don't have to specify each parameter individually?
I would like something along the lines of:
// ...
// Note: this does not work, sadly :)
jobDsl scriptText: """
    pipelineJob('myfolder/myname') {
        definition {
            cpsScm = ${scm}
            scriptPath('Jenkinsfile.subfolder')
        }
    }
"""
// ...
So far I've come up with:
cpsScm {
    scm {
        git {
            remote {
                url('${scm.getRepositories()[0].getURIs()[0].toString()}')
                credentials('bitbucket-jenkins')
            }
            branch('${env.BRANCH_NAME}')
        }
    }
    scriptPath('${pipelineFile}')
}
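Putting that together, here is a sketch of the full jobDsl call, assuming a Git-based (e.g. multibranch) pipeline where the global scm variable is a GitSCM instance; scm.userRemoteConfigs[0].url and .credentialsId read the URL and credentials ID from the current build's checkout configuration (accessing them may require script approval when running in the Groovy sandbox), and pipelineFile is a hypothetical variable holding the script path:
jobDsl scriptText: """
    pipelineJob('myfolder/myname') {
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            // interpolated from the current build's SCM configuration
                            url('${scm.userRemoteConfigs[0].url}')
                            credentials('${scm.userRemoteConfigs[0].credentialsId}')
                        }
                        branch('${env.BRANCH_NAME}')
                    }
                }
                scriptPath('${pipelineFile}')
            }
        }
    }
"""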
I have something like this:
stages {
    stage('createTemplate') {
        parallel {
            stage('template_a') {
                // creating template a
            }
            stage('template_b') {
                // creating template b
            }
        }
    }
    stage('deployVm') {
        parallel {
            stage('deploy_a') {
                // deploy vm a
            }
            stage('deploy_b') {
                // deploy vm b
            }
        }
    }
}
How can I make sure that the deployVm stages run when, and only when, their respective createTemplate stages were successful?
You may want to run a single parallel block with sequential stages inside, like this:
parallel {
    stage('a') {
        stages {
            stage('template_a') { ... }
            stage('deploy_a') { ... }
        }
    }
    stage('b') {
        stages {
            stage('template_b') { ... }
            stage('deploy_b') { ... }
        }
    }
}
This makes sure that each deploy stage runs only after its corresponding template stage has succeeded; a fleshed-out sketch of the pattern follows.
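For completeness, a minimal runnable sketch of this sequential-stages-inside-parallel pattern (the echo steps are placeholders; Declarative Pipeline has supported nested sequential stages since version 1.3 of the plugin):
pipeline {
    agent any
    stages {
        stage('build-and-deploy') {
            parallel {
                stage('a') {
                    stages {
                        stage('template_a') { steps { echo 'creating template a' } }
                        stage('deploy_a')   { steps { echo 'deploying vm a' } }
                    }
                }
                stage('b') {
                    stages {
                        stage('template_b') { steps { echo 'creating template b' } }
                        stage('deploy_b')   { steps { echo 'deploying vm b' } }
                    }
                }
            }
        }
    }
}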
I have 3 different jobs (Build, Undeploy and Deploy); I want to execute Build and Undeploy in parallel, and after that Deploy.
From searching I learned that the Build Flow Plugin is deprecated.
Please suggest a plugin.
You can write a Jenkinsfile in the following format:
pipeline {
    agent { node { label 'master' } }
    stages {
        stage('Build/Undeploy') {
            parallel {
                stage('Build') {
                    agent { node { label 'Build' } }
                    steps {
                        script {
                            // Call your build script
                        }
                    }
                }
                stage('Undeploy') {
                    agent { node { label 'Undeploy' } }
                    steps {
                        script {
                            // Call your undeploy script
                        }
                    }
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    // Call your deploy script
                }
            }
        }
    }
}
Below is the skeleton of my Jenkinsfile. The post directive is executed on success but not in case of failure. Is this the expected behavior of Jenkins?
Thanks
#!/usr/bin/env groovy
pipeline {
    agent {
        node { label 'ent_linux_node' }
    }
    stages {
        stage('Prepare') {
            steps {
                //some steps
            }
        }
        stage('Build') {
            steps {
                //Fails at this stage
            }
        }
        stage('ArtifactoryUploads') {
            steps {
                //skips since previous stage failed
            }
        }
    }
    post {
        always {
            //Doesn't get executed but I am expecting it to execute
        }
    }
}
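For reference, this is not the documented behavior: in Declarative Pipeline the always condition of a post section runs regardless of the build result. A minimal sketch of the post conditions involved (the echo steps are placeholders):
post {
    always {
        echo 'Runs regardless of the build result'
    }
    success {
        echo 'Runs only when the build succeeds'
    }
    failure {
        echo 'Runs only when the build fails'
    }
}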