How to configure basic branch build strategies plugin using job dsl? - jenkins

The multibranch pipeline plugin, awesome as it is, doesn't build tags out of the box. The basic-branch-build-strategies plugin is required to enable tag discovery and building.
My question is directly related to: Is there a way to automatically build tags using the Multibranch Pipeline Jenkins plugin?
This plugin works great in the UI but doesn't appear to be easily configurable using the Jenkins Job DSL. Does anyone have any examples of how to set the branch build strategies using the DSL (or a DSL configure block) so that tags will be discovered and built?
Having examined the delta between the config.xml files when the settings are changed via the UI, it looks like I need to be able to add this trait:
<org.jenkinsci.plugins.github__branch__source.TagDiscoveryTrait />
and this section under build strategies:
<buildStrategies>
  <jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl plugin="basic-branch-build-strategies@1.1.1">
    <atLeastMillis>-1</atLeastMillis>
    <atMostMillis>172800000</atMostMillis>
  </jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl>
</buildStrategies>

Something like
multibranchPipelineJob('pipline') {
    ...
    branchSources {
        branchSource {
            source {
                github {
                    ...
                    traits {
                        ...
                        gitTagDiscovery()
                    }
                }
                buildStrategies {
                    buildTags {
                        atLeastDays '-1'
                        atMostDays '20'
                    }
                }
            }
        }
    }
}
is what I've been working with. It's not documented in the plugin, but that doesn't stop the job-dsl plugin from dynamically generating the API calls for it.
You can see what the API for your specific Jenkins installation is by going to {your_jenkins_url}/plugin/job-dsl/api-viewer/index.html.
Sometimes things won't appear there because a plugin lacks support for job-dsl.
In that case you can still generate the XML with a configure block.
However, this is pretty clumsy to use.
Edit: At least if I use gitHubTagDiscovery() as suggested by the dynamically generated API, Jenkins will crash. Instead, the configure block has to be used to get all the discovery methods for github.
configure {
    def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
    traits << 'org.jenkinsci.plugins.github__branch__source.BranchDiscoveryTrait' {
        strategyId(1)
    }
    traits << 'org.jenkinsci.plugins.github__branch__source.OriginPullRequestDiscoveryTrait' {
        strategyId(1)
    }
    traits << 'org.jenkinsci.plugins.github__branch__source.TagDiscoveryTrait'()
}
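The configure block above only adds the discovery traits. To also get the buildStrategies section from the config.xml delta shown in the question, the same pattern could in principle be extended as below. This is an untested sketch: the element names are copied from the question's XML and the exact path may vary between plugin versions.
configure {
    def branchSource = it / sources / data / 'jenkins.branch.BranchSource'
    // append the tag build strategy shown in the question's config.xml delta
    branchSource << buildStrategies {
        'jenkins.branch.buildstrategies.basic.TagBuildStrategyImpl' {
            atLeastMillis(-1)
            atMostMillis(172800000)   // 48 hours
        }
    }
}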

Related

Disable or auto approve Script Approval for scripts executed in Job Dsl (Active Choice Parameters)?

Running Jenkins 2.289.1.
I have this pipelineJob Job Dsl setting up Active Choice parameters:
https://plugins.jenkins.io/uno-choice/
pipelineJob("test") {
parameters {
activeChoiceParam('CHOICE-1') {
description('Allows user choose from multiple choices')
filterable()
choiceType('SINGLE_SELECT')
groovyScript {
script('return ["choice1", "choice2", "choice3"];')
fallbackScript('"fallback choice"')
}
}
}
definition {
cpsScm {
scm {
git {
remote {
credentials("${creds}")
url("${gitUrl}")
}
branch("${gitBranch}")
}
}
scriptPath("${pathToFile}")
}
}
}
To make sure I can run the Job DSL in the first place without having to manually approve it, I have added the following to JCasC:
jenkins:
  security:
    globalJobDslSecurityConfiguration:
      useScriptSecurity: false
But that is not enough. Before I can run the generated pipeline based on the above Job DSL, I still need to manually approve the script.
How do I configure Job DSL, JCasC, or something else to either disable script approval for anything that goes on in a Job DSL script, or automatically approve any script created inside one?
Hopefully I don't have to hack my way around it as suggested here:
https://stackoverflow.com/a/64364086/363603
I am aware that there is a reason for this feature, but this is a local-only Jenkins that I use for experimenting, and this is currently killing my productivity. Related:
https://issues.jenkins.io/browse/JENKINS-28178?focusedCommentId=376405&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-376405
What worked for me:
Manage Jenkins > Configure Global Security > CSRF Protection (the section header, oddly enough) > "Enable script security for Job DSL scripts" (the name of the option that I disabled).
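If you prefer to keep everything in code, the same option can presumably also be flipped from a Groovy init script (init.groovy.d) or the script console instead of the UI. This is an untested sketch: it assumes the Job DSL plugin's GlobalJobDslSecurityConfiguration exposes the useScriptSecurity property that the JCasC snippet above sets, so verify it against your plugin version.
import javaposse.jobdsl.plugin.GlobalJobDslSecurityConfiguration
import jenkins.model.GlobalConfiguration

// assumption: this is the same flag as "Enable script security for Job DSL scripts"
def dslSecurity = GlobalConfiguration.all().get(GlobalJobDslSecurityConfiguration)
dslSecurity.useScriptSecurity = false
dslSecurity.save()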

Can we use a single Jenkinsfile for a multibranch pipeline in Jenkins using shared libraries?

I am trying to write a Jenkinsfile which will take the data from shared libraries in Jenkins for a multibranch pipeline, something like below:
@Library('Template') _
if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}
so that the pipeline picks the shared library depending on the if condition: if the branch is master the first call should run, otherwise the second.
Yes, that's possible. Actually, we're using a multibranch project to test changes to our shared library that way.
You have to use the library step to load the library instead of the @Library annotation, like:
if (condition) {
    library("someLib@${env.BRANCH_NAME}")
} else {
    library('someOtherLib')
}
See https://jenkins.io/doc/pipeline/steps/workflow-cps-global-lib/#library-load-a-shared-library-on-the-fly for all details.
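Putting the question and the answer together, a rough, untested sketch of the asker's Jenkinsfile using the library step could look like this (jenkins1/jenkins2 are the asker's custom steps from the 'Template' library):
// load the shared library on the fly instead of via the @Library annotation
library('Template')

if (env.BRANCH_NAME == 'master') {
    jenkins1(PROJECTNAME: 'test', GITURL: 'http://test/test.git')
} else {
    jenkins2(PROJECTNAME: 'test1', GITURL: 'http://test/test.git')
}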
By the way: in case you're planning to do pull requests, the following post might be useful to you as well: https://stackoverflow.com/a/51915362/4279361

Jenkins DSL script - Test Failure - Found multiple extensions which provide method lastCompleted

Trying to create multijobs in Jenkins with DSL scripting.
There are multiple jobs in a phase and I want to create a consolidated report for the multijob from downstream jobs.
I am using Copy Artifact to copy the results of the downstream jobs into the multijob's target directory, using the lastCompleted() selector.
However, I am getting an error saying that multiple extensions provide the method, and the tests fail. lastCompleted() is apparently present in both the copyArtifact and multijob plugins, and in this case I need both.
Here is my script:
multiJob('dailyMultiJob') {
    concurrentBuild(true)
    logRotator(-1, 10, -1, 10)
    triggers {
        cron('H H(0-4) * * 0-6')
    }
    steps {
        phase('Smoke Tests') {
            phaseJob('JobA')
            phaseJob('JobB')
            phaseJob('JobC')
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobA')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobB')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
        copyArtifacts {
            selector {
                lastCompleted()
            }
            projectName('JobC')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
            flatten(false)
        }
    }
    publishers {
        allure {
            results {
                resultsConfig {
                    path('target/allure-results')
                }
            }
        }
        archiveArtifacts {
            pattern('target/reports/**/*.*')
            pattern('target/allure-results/**/*.*')
            allowEmpty(true)
        }
    }
}
I get the error below after running the Gradle tests:
Caused by: javaposse.jobdsl.dsl.DslException: Found multiple extensions which provide method lastCompleted with arguments []: [[hudson.plugins.copyartifact.LastCompletedBuildSelector, com.tikal.jenkins.plugins.multijob.MultiJobBuildSelector]]
I am not sure whether there is a way to indicate which plugin's lastCompleted method should be used.
I've been stuck on this for quite some time. Any help is highly appreciated. Thank you in advance!
I came across the same issue a few months back.
There are two possible solutions:
1 - Keep only one of the plugins, which avoids the conflict. (Not recommended, as it might break other jobs.)
2 - Use a configure block to modify the generated XML, which avoids the conflict and lets you keep multiple plugins that provide the same extension. (Recommended solution; see the sketch below.)
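A rough, untested sketch of option 2: drop the ambiguous lastCompleted() call from the DSL and set the selector class in the generated XML via a configure block. Only the JobA step is shown, and the element names are taken from a manually configured job's config.xml, so they may differ between plugin versions.
multiJob('dailyMultiJob') {
    steps {
        copyArtifacts {
            // selector is set via the configure block below, not here
            projectName('JobA')
            filter('target/allure-results/*.*')
            target('/path/to/this/multijob/workspace')
        }
    }
    configure { project ->
        // Point the Copy Artifact build step at the Copy Artifact plugin's own
        // "last completed build" selector, so job-dsl never has to resolve the
        // ambiguous lastCompleted() extension method.
        def selector = project / 'builders' / 'hudson.plugins.copyartifact.CopyArtifact' / 'selector'
        selector.attributes()['class'] = 'hudson.plugins.copyartifact.LastCompletedBuildSelector'
    }
}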
Thanks,
Late update:
What I had to do in the end was switch to scripted pipeline jobs instead.
Configure blocks are not allowed on all the methods you want to use, and they are limited by design. I believe some plugins also don't allow them for security reasons.
Better to use Pipelines.

How to enable SCM polling with the Jenkins DSL plugin

I'd like to enable SCM polling in Jenkins via DSL code. It's easy to set up manually (without DSL) and works perfectly, but I'm looking for DSL code to enable it -- see the attached image for reference.
I already checked the link below, but there is no solution there.
https://jenkinsci.github.io/job-dsl-plugin/#method/javaposse.jobdsl.dsl.helpers.triggers.TriggerContext.scm
GitHub hook trigger for GITScm polling
and
Poll SCM
I'm not using Jenkins Pipeline.
Finally I got a solution for this. The following DSL code enables SCM polling:
triggers {
    configure {
        it / 'triggers' << 'com.cloudbees.jenkins.GitHubPushTrigger' {
            spec('')
        }
        scm('')
    }
}
I have tested it and it works perfectly.
Another solution:
job('myjob') {
    configure { it / 'triggers' / 'com.cloudbees.jenkins.GitHubPushTrigger' / 'spec' }
}
I had a similar situation when trying to enable scm polling for a pipeline.
I am configuring pipelines via job-dsl and CasC, and I specifically wanted to enable SCM polling.
So here's what I have working; I'm within the pipelineJob context, but I believe the solution is the same for the job context:
pipelineJob('myPipelineName') {
    environmentVariables {
        ...
    }
    definition {
        ...
    }
    configure { project ->
        project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.PipelineTriggersJobProperty' / 'triggers' / 'hudson.triggers.SCMTrigger' {
            'spec'('* * * * *')
        }
    }
}
The way I landed on this was manually changing a pipeline config (in the UI) to enable polling, then going and looking at the job's .xml on disk.
The slash-delimited parts you see in the code above represent the XML tag path to the value I want to change.

Jenkins, how to check regressions against another job

When you set up a Jenkins job various test result plugins will show regressions if the latest build is worse than the previous one.
We have many jobs for many projects on our Jenkins and we wanted to avoid a 'job per branch' setup. So currently we are using a parameterized build to build e.g. different development branches with a single job.
But that means when I build a new branch any regressions are measured against the previous build, which may be for a different branch. What I really want is to measure regressions in a feature branch against the latest build of the master branch.
I thought we should probably set up a separate 'master' build alongside the parameterized 'branches' build. But I still can't see how I would compare results between jobs. Is there any plugin that can help?
UPDATE
I have started experimenting in the Script Console to see if I could write a post-build script... I have managed to get the latest build of master branch in my parameterized job... I can't work out how to get to the test results from the build object though.
The data I need is available in JSON at
http://<jenkins server>/job/<job name>/<build number>/testReport/api/json?pretty=true
...if I could just get at this data structure it would be great!
I tried using JsonSlurper to load the json via HTTP but I get 403, I guess because my script has no auth session.
I guess I could load the xml test results from disk and parse them in my script, it just seems a bit stupid when Jenkins has already done this.
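For what it's worth, the 403 on the HTTP route can usually be avoided by authenticating with a user name and API token. A rough sketch (user and apiToken are placeholders), although the in-process approach in the answer below avoids HTTP entirely:
// read the testReport JSON with basic auth (user + API token) to avoid the 403
def url = new URL('http://<jenkins server>/job/<job name>/<build number>/testReport/api/json')
def conn = url.openConnection()
conn.setRequestProperty('Authorization',
    'Basic ' + 'user:apiToken'.bytes.encodeBase64().toString())
def report = new groovy.json.JsonSlurper().parse(conn.inputStream)
println report.failCount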
I eventually managed to achieve everything I wanted, using a Groovy script in the Groovy Postbuild Plugin.
I did a lot of exploring using the script console (http://<jenkins>/script); the Jenkins API class docs are also handy.
Everyone's use is going to be a bit different, as you have to dig down into the build plugins to get the info you need, but here are some bits of my code which may help.
First get the build you want:
def getProject(projectName) {
    // in a postbuild action use `manager.hudson`
    // in the script web console use `Jenkins.instance`
    def project = manager.hudson.getItemByFullName(projectName)
    if (!project) {
        throw new RuntimeException("Project not found: $projectName")
    }
    project
}
// CloudBees folder plugin is supported, you can use natural paths:
project = getProject('MyFolder/TestJob')
build = project.getLastCompletedBuild()
The main test results (JUnit etc.) seem to be available directly on the build as:
result = build.getTestResultAction()
// eg
failedTestNames = result.getFailedTests().collect { test ->
    test.getFullName()
}
To get the more specialised results from eg Violations plugin or Cobertura code coverage you have to look for a specific build action.
// have a look what's available:
build.getActions()
You'll see a list of stuff like:
[hudson.plugins.git.GitTagAction@2b4b8a1c,
 hudson.scm.SCMRevisionState$None@40d6dce2,
 hudson.tasks.junit.TestResultAction@39c99826,
 jenkins.plugins.show_build_parameters.ShowParametersBuildAction@4291d1a5]
These are instances; the part in front of the @ sign is the class name, so I used that to write this method for getting a specific action:
def final VIOLATIONS_ACTION = hudson.plugins.violations.ViolationsBuildAction
def final COVERAGE_ACTION = hudson.plugins.cobertura.CoberturaBuildAction

def getAction(build, actionCls) {
    def action = build.getActions().findResult { act ->
        actionCls.isInstance(act) ? act : null
    }
    if (!action) {
        throw new RuntimeException("Action not found in ${build.getFullDisplayName()}: ${actionCls.getSimpleName()}")
    }
    action
}
violations = getAction(build, VIOLATIONS_ACTION)
// you have to explore a bit more to find what you're interested in:
pylint_count = violations?.getReport()?.getViolations()?."pylint"
coverage = getAction(build, COVERAGE_ACTION)?.getResults()
// if you println it looks like a map but it's really an Enum of Ratio objects
// convert to something nicer to work with:
coverage_map = coverage.collectEntries { key, val -> [key.name(), val.getPercentageFloat()] }
With these building blocks I was able to put together a post-build script which compared the results for two 'unrelated' build jobs, then using the Groovy Postbuild plugin's helper methods to set the build status.
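For illustration, a condensed, untested sketch of such a comparison; the baseline job name 'MyFolder/MasterJob' is a placeholder, and getProject()/getTestResultAction() are the helpers shown above:
// compare this build's JUnit failures against the latest completed master build
def thisResult   = manager.build.getTestResultAction()
def masterBuild  = getProject('MyFolder/MasterJob').getLastCompletedBuild()
def masterResult = masterBuild?.getTestResultAction()

if (thisResult && masterResult && thisResult.getFailCount() > masterResult.getFailCount()) {
    manager.listener.logger.println(
        "Regression vs master: ${thisResult.getFailCount()} failures, master has ${masterResult.getFailCount()}")
    manager.addWarningBadge('More test failures than the master build')
    manager.buildUnstable()
}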
Hope this helps someone else.
