The configure block for the Job DSL scan-by-webhook trigger is not working - Jenkins

I am using the configure block below for the scan-by-webhook functionality in Jenkins Job DSL.
Environment: Jenkins and Bitbucket
traits << 'com.igalg.jenkins.plugins.mswt.trigger.ComputedFolderWebHookTrigger' {
token("TEST_HOOK")
}
The above block is not working.
But the periodic trigger syntax below works without any issues.
it / 'triggers' << 'com.cloudbees.hudson.plugins.folder.computed.PeriodicFolderTrigger'{
    spec '* * * * *'
    interval "60000"
}
As we want to use the scan-by-webhook functionality, kindly correct my scan-by-webhook syntax.

Job DSL provides some declarative parts for triggering a job either on changes or on a schedule.
A different case is the Multibranch Pipeline, where each branch is configured by the included Jenkinsfile. In order to create jobs for a new branch, the task "Scan Multibranch Pipeline Now" has to be executed, either manually or on a schedule. The scheduled scan can be set up programmatically via the configure block inside the multibranchPipelineJob:
multibranchPipelineJob("JobName") {
...
configure { node ->
def periodicFolderTrigger = node / triggers / 'com.cloudbees.hudson.plugins.folder.computed.PeriodicFolderTrigger' {
spec('H H * * *')
//4 hours (60000(Milliseconds)*60(Minutes)*4(hours)
interval(60000*60*4)
}
}
}
The webhook solution is more elegant, but it requires the Jenkins plugin https://plugins.jenkins.io/multibranch-scan-webhook-trigger/ to be installed. Programmatically you can activate it in the following way:
multibranchPipelineJob("JobName") {
...
configure { node ->
def webhookTrigger = node / triggers / 'com.igalg.jenkins.plugins.mswt.trigger.ComputedFolderWebHookTrigger' {
spec('')
token("TESTTOKEN")
}
}
}
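Once the trigger is in place, the scan is started by calling the plugin's webhook endpoint with the configured token (according to the plugin page, roughly JENKINS_URL/multibranch-webhook-trigger/invoke?token=TESTTOKEN). For completeness, a rough sketch of a full multibranchPipelineJob with a Git branch source plus the webhook trigger could look like the following; the job name, repository URL and credentials id are placeholders, and the exact branch-source syntax is best checked in your Job DSL API viewer:
multibranchPipelineJob("example-repo") {
    branchSources {
        git {
            id("example-repo")                                   // unique id for the branch source
            remote("git@bitbucket.org:myteam/example-repo.git")  // placeholder repository
            credentials("bitbucket-ssh-key")                     // placeholder credentials id
        }
    }
    configure { node ->
        node / triggers / 'com.igalg.jenkins.plugins.mswt.trigger.ComputedFolderWebHookTrigger' {
            spec('')
            token("TEST_HOOK")
        }
    }
}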

Related

Jenkins pipeline with a conditional trigger

My Jenkins pipeline is as follows:
pipeline {
    triggers {
        cron('H */5 * * *')
    }
    stages {
        stage('Foo') {
            ...
        }
    }
}
The repository is part of a Github Organization on Jenkins - every branch or PR pushed results in a Jenkins job being created for that branch or PR.
I would like the trigger to run only on the "main" branch, because we don't need all branches and PRs to run on a cron schedule; we only need them to run on new commits, which they already do.
Is it possible?
Yes, it's possible. To schedule the cron trigger only for a specific branch, you can do it like this in your Jenkinsfile:
String cron_string = (scm.branches[0].name == "main") ? 'H */5 * * *' : ''
pipeline {
    triggers {
        cron(cron_string)
    }
    // whatever other code, options, stages etc. is in your pipeline ...
}
What it does:
Initializes a variable based on the branch name. For the main branch it sets the requested cron configuration; otherwise there is no scheduling (an empty string is set).
Uses this variable within the pipeline.
Further comments:
It's also possible to use this with parameterizedCron, in case you want or need to (a sketch follows at the end of this answer).
You can also use other variables to get the branch name, e.g. env.BRANCH_NAME instead of scm.branches[0].name. Whatever fits your needs...
This topic and solution are also discussed in the Jenkins community: https://issues.jenkins.io/browse/JENKINS-42643?focusedCommentId=293221#comment-293221
EDIT: actually a similar question that leads to the same configuration, here on Stack: "Build Periodically" with a Multi-branch Pipeline in Jenkins
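For illustration, a rough sketch of the parameterizedCron variant mentioned above; it needs the Parameterized Scheduler plugin, and the parameter name and values are made up:
String param_cron = (env.BRANCH_NAME == 'main') ? 'H */5 * * * % DEPLOY_ENV=staging' : ''
pipeline {
    agent any
    parameters {
        string(name: 'DEPLOY_ENV', defaultValue: 'dev', description: 'example parameter')
    }
    triggers {
        // the cron expression and the parameter assignment are separated by %
        parameterizedCron(param_cron)
    }
    stages {
        stage('Build') {
            steps {
                echo "Deploy environment: ${params.DEPLOY_ENV}"
            }
        }
    }
}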
You can simply add a when condition to your pipeline:
when { branch 'main' }
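For context, a minimal sketch of that condition next to the cron trigger in a declarative Jenkinsfile (the stage name and step are placeholders); the trigger itself still fires on every branch, the when condition only decides whether the stage runs:
pipeline {
    agent any
    triggers {
        cron('H */5 * * *')
    }
    stages {
        stage('Foo') {
            when { branch 'main' }   // only the main branch runs this stage
            steps {
                echo 'running the scheduled work on main'   // placeholder step
            }
        }
    }
}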

Jenkins doesn't automatically trigger the pipeline when a new tag is pushed

I have a Jenkins pipeline and I simply want to trigger it whenever a tag is created. Currently Jenkins detects the newly created tag, but I still need to build it manually!
My pipeline looks like the following:
pipeline {
    agent {
        node {
            label 'some-label'
        }
    }
    stages {
        stage('core pipeline') {
            when {
                tag "*"
            }
            // do something
        }
    }
}
I am using GitLab as the source code repository.
Is there anything I am missing here?
Have you enabled the correct triggers (in this case, polling)?
triggers { pollSCM('H */4 * * 1-5') }
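Purely as a sketch, the trigger block sits next to the agent in the original pipeline; the label and the polling schedule are just the placeholders already used above:
pipeline {
    agent {
        node {
            label 'some-label'
        }
    }
    triggers {
        // poll the repository so Jenkins notices newly pushed tags
        pollSCM('H */4 * * 1-5')
    }
    stages {
        stage('core pipeline') {
            when {
                tag "*"
            }
            steps {
                echo 'building the pushed tag'   // placeholder step
            }
        }
    }
}
Note that triggers defined in the Jenkinsfile only take effect after the pipeline has run at least once.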

Jenkins build frequency based on previous build status

I have a Jenkins pipeline which is scheduled to trigger every 4 hours. However, my requirement is that once the build fails, I want builds to happen more frequently and to keep sending constant reminders that the build is broken. In short, the build schedule must depend on the status of the previous build.
Is that possible in Jenkins?
Thanks,
In a Scripted Pipeline, you can do something like this:
def triggers = []
// the first build has no previous build, hence the null-safe operator
if (currentBuild.getPreviousBuild()?.result != 'SUCCESS') {
    triggers << cron('0 */1 * * *') // every hour
} else {
    triggers << cron('0 */4 * * *') // every 4 hours
}
properties([
    pipelineTriggers(triggers)
])
node {
    ...
}
I can't think of a direct way, but you can use a workaround. You can have a replica of the same job (let's call it job 'B') and trigger it when the build of the first job (let's call it job 'A') fails. If B fails again, you can retrigger it (adding some wait time) and send a notification after it fails, and keep doing that until it passes. This is much easier to do if you are using a scripted Jenkins pipeline; a rough sketch follows. Hope this answer helps you in some way.
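A rough scripted sketch of that workaround, assuming a hypothetical reminder job named 'B' (the job name and the build steps are placeholders):
node {
    try {
        // ... the normal build steps of job A ...
    } finally {
        if (currentBuild.currentResult == 'FAILURE') {
            // fire the reminder/replica job asynchronously so job A can finish
            build job: 'B', wait: false
        }
    }
}
Job 'B' itself could then wait a while, rebuild, notify on failure, and trigger itself again until it passes.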

How to limit Jenkins concurrent multibranch pipeline builds?

I am looking at limiting the number of concurrent builds to a specific number in Jenkins, leveraging the multibranch pipeline workflow, but I haven't found any good way to do this in the docs or on Google.
Some docs say this can be accomplished using concurrency in the stage step of a Jenkinsfile, but I've also read elsewhere that that is a deprecated way of doing it.
It looks like there was something released fairly recently for limiting concurrency via job properties, but I couldn't find documentation for it and I'm having trouble following the code. The only thing I found was a PR that shows the following:
properties([concurrentBuilds(false)])
But I am having trouble getting it working.
Does anybody know or have a good example of how to limit the number of concurrent builds for a given multibranch project? Maybe a Jenkinsfile snippet that shows how to limit or cap the number of concurrent multibranch builds?
Found what I was looking for. You can limit the concurrent builds using the following block in your Jenkinsfile.
node {
    // This limits build concurrency to 1 per branch
    properties([disableConcurrentBuilds()])
    // do stuff
    ...
}
The same can be achieved with a declarative syntax:
pipeline {
    options {
        disableConcurrentBuilds()
    }
}
Limiting concurrent builds or stages is possible with the Lockable Resources Plugin (GitHub). I always use this mechanism to ensure that no publishing/release step is executed at the same time, while normal stages can be built concurrently.
echo 'Starting'
lock('my-resource-name') {
    echo 'Do something here that requires unique access to the resource'
    // any other build will wait until the one locking the resource leaves this block
}
echo 'Finish'
As @VadminKotov indicated, it is possible to disable concurrent builds using Jenkins declarative pipelines as well:
pipeline {
    agent any
    options { disableConcurrentBuilds() }
    stages {
        stage('Build') {
            steps {
                echo 'Hello Jenkins Declarative Pipeline'
            }
        }
    }
}
disableConcurrentBuilds
Disallow concurrent executions of the Pipeline. Can be useful for
preventing simultaneous accesses to shared resources, etc. For
example: options { disableConcurrentBuilds() }
Thanks Jazzschmidt, I was looking to lock all stages easily; this works for me (source):
pipeline {
    agent any
    options {
        lock('shared_resource_lock')
    }
    ...
    ...
}
I got the solution for multibranch locking too, with the lockable-resources plugin and shared libraries. Here it is:
Jenkinsfile:
@Library('my_pipeline_lib@master') _
myLockablePipeline()
myLockablePipeline.groovy:
def call(Map config) {
    def jobIdentifier = env.JOB_NAME.tokenize('/') as String[]
    def projectName = jobIdentifier[0]
    def repoName = jobIdentifier[1]
    def branchName = jobIdentifier[2]
    // now you can use either part of the jobIdentifier to lock or limit the concurrent builds
    // here I choose to lock any concurrent build for PRs, but you can choose all branches
    if (branchName.startsWith("PR-")) {
        lock(projectName + "/" + repoName) {
            yourTruePipelineFromYourSharedLib(config)
        }
    } else {
        // other branches can freely build concurrently
        yourTruePipelineFromYourSharedLib(config)
    }
}
To lock for all branches, just do this in myLockablePipeline.groovy:
def call(Map config) {
    def jobIdentifier = env.JOB_NAME.tokenize('/') as String[]
    def projectName = jobIdentifier[0]
    def repoName = jobIdentifier[1]
    def branchName = jobIdentifier[2]
    lock(projectName + "/" + repoName) {
        yourTruePipelineFromYourSharedLib(config)
    }
}

Multibranch Pipeline plugin: load multiple Jenkinsfiles per branch

I am able to load a Jenkinsfile automatically through the Multibranch Pipeline plugin, with the limitation of only one Jenkinsfile per branch.
I have multiple Jenkinsfiles per branch which I want to load. I have tried the method below, creating a master Jenkinsfile and loading the specific files, but this merges 1.Jenkinsfile and 2.Jenkinsfile into one pipeline.
node {
git url: 'git#bitbucket.org:xxxxxxxxx/pipeline.git', branch: 'B1P1'
sh "ls -latr"
load '1.Jenkinsfile'
load '2.Jenkinsfile'
}
Is there a way I can load multiple Jenkins pipelines separately from one branch?
I did this by writing a shared library (ref: https://jenkins.io/doc/book/pipeline/shared-libraries/) containing the following file (in vars/generateJobsForJenkinsfiles.groovy):
/**
 * Creates Jenkins pipeline jobs from pipeline script files.
 * @param gitRepoName  name of the GitHub repo, e.g. <organisation>/<repository>
 * @param filepattern  Ant-style pattern for the pipeline script files for which we want to create jobs
 * @param jobPath      closure of type (relativePathToPipelineScript -> jobPath) where jobPath is a string
 *                     formatted as '<foldername>/../<jobname>' (i.e. the Jenkins job path)
 */
def call(String gitRepoName, String filepattern, def jobPath) {
    def pipelineJobs = []
    def base = env.WORKSPACE
    def pipelineFiles = new FileNameFinder().getFileNames(base, filepattern)
    for (pipelineFil in pipelineFiles) {
        def relativeScriptPath = (pipelineFil - base).substring(1)
        def _jobPath = jobPath(relativeScriptPath).split('/')
        def jobfolderpath = _jobPath[0..-2]
        def jobname = _jobPath[-1]
        echo "Create jenkins job ${jobfolderpath.join('/')}:${jobname} for $pipelineFil"
        def dslScript = []
        // create the folders
        for (int i = 0; i < jobfolderpath.size(); i++)
            dslScript << "folder('${jobfolderpath[0..i].join('/')}')"
        // create the job
        dslScript << """
            pipelineJob('${jobfolderpath.join('/')}/${jobname}') {
                definition {
                    cpsScm {
                        scm {
                            git {
                                remote {
                                    github('$gitRepoName', 'https')
                                    credentials('github-credentials')
                                }
                                branch('master')
                            }
                        }
                        scriptPath("$relativeScriptPath")
                    }
                }
                configure { d ->
                    d / definition / lightweight(true)
                }
            }
        """
        pipelineJobs << dslScript.join('\n')
        //println dslScript
    }
    if (!pipelineJobs.empty)
        jobDsl sandbox: true, scriptText: pipelineJobs.join('\n'), removedJobAction: 'DELETE', removedViewAction: 'DELETE'
}
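For illustration, a seed Jenkinsfile could invoke that step roughly like this; the library name, repository slug, file pattern and folder naming are assumptions:
@Library('my-shared-lib') _

node {
    checkout scm
    // create one pipelineJob per *.Jenkinsfile found in the workspace,
    // grouped under a hypothetical 'generated' folder
    generateJobsForJenkinsfiles(
        'myorg/myrepo',
        '**/*.Jenkinsfile',
        { scriptPath -> "generated/${scriptPath.replace('/', '_').replace('.Jenkinsfile', '')}" }
    )
}
The jobDsl step used inside the library requires the Job DSL plugin to be installed.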
Most likely you want to map old (pre-pipeline) Jenkins jobs that operated on a single branch of some project to a single multibranch pipeline. The appropriate approach would be to create stages that are input-dependent (for example, asking the user whether to deploy to staging or live).
Alternatively, you could just create a new, separate Pipeline Jenkins job that references your project's SCM and points to your other Jenkinsfile (one Pipeline job per additional Jenkinsfile).
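As a rough sketch of that alternative, a Job DSL seed pointing a second job at 2.Jenkinsfile on the same branch might look like the following (the job name and credentials id are placeholders; the repository URL is the one from the question):
pipelineJob('B1P1-second-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url('git@bitbucket.org:xxxxxxxxx/pipeline.git')   // repository from the question
                        credentials('bitbucket-ssh-key')                  // placeholder credentials id
                    }
                    branch('B1P1')
                }
            }
            scriptPath('2.Jenkinsfile')   // point this job at the second pipeline script
        }
    }
}
The same settings can of course be entered manually in a normal Pipeline job's "Pipeline script from SCM" configuration instead.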
