Last successful build's revision for an upstream MultiBranch Job in Jenkins Declarative Pipeline - jenkins

I'd like to get the build revisions of the last successful builds of upstream jobs. The upstream jobs are multibranch jobs.
So far I'm generating a list of upstream job names to use as triggers, but I can't seem to find the right method to call.
import jenkins.model.Jenkins

def upstreamPackages = ['foo', 'bar']
def upstreamJobs = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

pipeline {
    agent none
    triggers {
        upstream(upstreamProjects: upstreamJobs,
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('test') {
            steps {
                script {
                    upstreamJobs.each {
                        println it
                        job = Jenkins.instance.getItem(it)
                        job.getLastSuccessfulBuild()
                        revision = job.getLastSuccessfulBuild().changeset[0].revision
                        println revision
                    }
                }
            }
        }
    }
}
This results in a null object for the item. What's the correct way to do this?
UPDATE 1
After discovering the Jenkins Script Console and this comment, I managed to come up with the following:
import jenkins.model.Jenkins
import hudson.plugins.git.util.BuildData

def upstreamPackages = ['foo', 'bar']
def upstreamJobsList = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

@NonCPS
def resolveRequirementsIn(packages) {
    BASE_URL = 'git@github.com:myorg'
    requirementsIn = ''
    packages.each { pkg ->
        revision = getLastSuccessfulBuildRevision("${pkg}-multibranch")
        requirementsIn <<= "-e git+${BASE_URL}/${pkg}.git@${revision}#egg=${pkg}\n"
    }
    println requirementsIn
    return requirementsIn
}

@NonCPS
def getLastSuccessfulBuildRevision(jobName) {
    project = Jenkins.instance.getItem(jobName)
    masterJob = project.getAllItems().find { job -> job.getName() == 'master' }
    build = masterJob.getLastSuccessfulBuild()
    return build.getAction(BuildData.class).getLastBuiltRevision().sha1String
}
pipeline {
    agent { label 'ci_agent' }
    triggers {
        upstream(upstreamProjects: upstreamJobsList,
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Get artifacts') {
            steps {
                script {
                    requirementsIn = resolveRequirementsIn upstreamPackages
                    writeFile file: 'requirements.in', text: requirementsIn
                }
            }
        }
    }
}
It's throwing an error:
an exception which occurred:
in field org.jenkinsci.plugins.pipeline.modeldefinition.withscript.WithScriptScript.script
in object org.jenkinsci.plugins.pipeline.modeldefinition.agent.impl.LabelScript@56d1724
in field groovy.lang.Closure.delegate
in object org.jenkinsci.plugins.workflow.cps.CpsClosure2@27378d57
in field groovy.lang.Closure.delegate
in object org.jenkinsci.plugins.workflow.cps.CpsClosure2@6e6c3c4e
in field org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.closures
in object org.jenkinsci.plugins.workflow.cps.CpsThreadGroup@5d0ffef3
in object org.jenkinsci.plugins.workflow.cps.CpsThreadGroup@5d0ffef3
Caused: java.io.NotSerializableException: org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject

The problem was that Jenkins' Pipeline DSL requires all assigned objects to be Serializable.
Jenkins.instance.getItem(jobName) returns a WorkflowMultiBranchProject, which is not Serializable. Neither is Jenkins.instance.getItem(jobName).getItem('master'), which is a WorkflowJob object.
So I worked down the call chain to the value I actually needed, replacing variable assignments with chained method calls, and came up with the following solution.
import jenkins.model.Jenkins
import hudson.plugins.git.util.BuildData

def upstreamPackages = ['foo', 'bar']
def upstreamJobsList = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

String requirementsInFrom(packages) {
    final BASE_URL = 'git@github.com:myorg'
    requirementsIn = ''
    packages.each { pkg ->
        // chain the calls so no non-Serializable object is ever assigned to a variable
        revision = Jenkins.instance.getItem("${pkg}-multibranch")
                                   .getItem('master')
                                   .getLastSuccessfulBuild()
                                   .getAction(BuildData.class)
                                   .getLastBuiltRevision()
                                   .sha1String
        requirementsIn <<= "-e git+${BASE_URL}/${pkg}.git@${revision}#egg=${pkg}\n"
    }
    return requirementsIn.toString()
}
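For completeness, another way to avoid the NotSerializableException while keeping intermediate variables is to do the whole lookup inside a @NonCPS method, since locals in a @NonCPS method are never CPS-serialized and only the String result crosses back into pipeline code. A minimal sketch assuming the same job layout as above (getRevisionOf is a hypothetical helper name):

import jenkins.model.Jenkins
import hudson.plugins.git.util.BuildData

@NonCPS
String getRevisionOf(String pkg) {
    // these locals hold non-Serializable objects, but a @NonCPS method runs
    // outside the CPS interpreter, so they are never persisted
    def project = Jenkins.instance.getItem("${pkg}-multibranch")
    def masterJob = project.getItem('master')
    return masterJob.getLastSuccessfulBuild()
                    .getAction(BuildData.class)
                    .getLastBuiltRevision()
                    .sha1String
}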

Related

Getting java.io.NotSerializableException: org.jenkinsci.plugins.workflow.job.WorkflowJob error in jenkins

I am using a declarative pipeline, and when I build it I get a java.io.NotSerializableException: org.jenkinsci.plugins.workflow.job.WorkflowJob error.
These are the two methods I am using:
@NonCPS
def getJob(name) {
    def hi = Hudson.instance
    return hi.getItemByFullName(name, Job)
}

@NonCPS
def getParam(WorkflowJob job, String paramName) {
    def prop = job.getProperty(ParametersDefinitionProperty.class)
    for (param in prop.getParameterDefinitions()) {
        if (param.name == paramName) {
            return param
        }
    }
    return null
}
And below is the part of my code where I am getting this error.
stages {
    stage("A") {
        steps {
            script {
                def job = getJob(JOB_NAME)
                def param = getParam(job, "AWS Ser")
                def service_name = ("${SERVICE_NAME}".replace('AWS Ser:', '')).toString().tokenize(',[]')
                if (service_name != 'All') {
                    def regions = "${REGIONS}".toString()
                    regions.split('\n').each() {
                        service_name.each() {
                            sh '''
                                echo "Welcome"
                            '''
                        }
                    }
                }
            }
        }
    }
}
When I include the sh step I get this error, and when I remove it the error goes away.
I tried to troubleshoot, and the problem seems to be in the two methods mentioned above.
Don't return the WorkflowJob object to the Pipeline step. Refactor your functions as shown below.
@NonCPS
def getJob(name) {
    def hi = Hudson.instance
    return hi.getItemByFullName(name, Job)
}

@NonCPS
def getParam(String jobName, String paramName) {
    def job = getJob(jobName)
    def prop = job.getProperty(ParametersDefinitionProperty.class)
    for (param in prop.getParameterDefinitions()) {
        if (param.name == paramName) {
            return param
        }
    }
    return null
}
Then, in the Pipeline stage, call getParam as:
def param = getParam(JOB_NAME, "AWS Ser")

How to enable existing job in Jenkins using job DSL or other means?

I use a Jenkins Job DSL script to create other jobs. Now I want a separate script that will enable/disable the jobs that I created with the DSL script.
Here is my enable/disable script:
job("cronjob/${JOB_TYPE}_${ENVRIONMENT}_CRONJOB") {
if ( ACTION == "enable" ) {
disabled(false)
} else if ( ACTION == "disable" ) {
disabled(true)
}
}
It does enable/disable the job, but it also empties the job's SCM, schedule, and parameter setup.
How do I enable/disable an existing job in Jenkins without losing the job's configuration? Not manually!
Thanks!
You can use the following script to do this.
def jobToDisable = "Sample"
Jenkins.instance.getAllItems(Job.class).each { jobitem ->
    def jobName = jobitem.name
    def jobInfo = Jenkins.instance.getItem(jobName)
    if (jobName.equals(jobToDisable)) {
        jobInfo.setDisabled(true) // false to enable
    }
}
Full pipeline
node {
    stage('Stage one') {
        script {
            disableJob("folder1/Sample3")
        }
    }
    stage('Final Step') {
        echo "Result"
    }
}

def disableJob(name) {
    def jobToDisable = name
    Jenkins.instance.getAllItems(Job.class).each { jobitem ->
        def jobName = jobitem.getFullName()
        def jobInfo = Jenkins.instance.getItemByFullName(jobName)
        if (jobName.equals(jobToDisable)) {
            println("Disabling Job!!")
            jobInfo.setDisabled(true)
        }
    }
}
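Iterating over every job on the controller works, but if the full path of the job is already known, a direct lookup does the same thing without the loop. A minimal sketch under that assumption (same "folder1/Sample3" path as above):

import jenkins.model.Jenkins
import hudson.model.Job

def disableJob(String fullName) {
    // getItemByFullName resolves folder paths like "folder1/Sample3" directly
    def job = Jenkins.instance.getItemByFullName(fullName, Job.class)
    job?.setDisabled(true) // pass false to re-enable
}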

How to execute a Groovy script written in a Jenkins parameter on a slave node

My small piece of code:
def proc ='./test.py'.execute()
proc.waitFor()
def output = proc.in.text
def exitcode = proc.exitValue()
def error = proc.err.text
return output.tokenize()
The above Groovy script is executed by an Active Choices Reactive Reference Parameter in my Jenkins pipeline. Is there any way to execute it on a different slave? I don't know whether a Groovy script written in a parameter can run on another slave at all.
Could someone help me achieve this?
You can try this:
pipeline {
    agent {
        node { label "name-of-slave-jenkins" }
    }
    stages {
        stage('stage 1') {
            steps {
                script {
                    def proc = './test.py'.execute()
                    proc.waitFor()
                }
            }
        }
        stage('stage 2') {
            steps {
                script {
                    def output = proc.in.text
                    def exitcode = proc.exitValue()
                    def error = proc.err.text
                    return output.tokenize()
                }
            }
        }
    }
}
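Two caveats with this answer: proc is declared with def inside the first script block, so stage 2 cannot see it, and String.execute() spawns the process inside the controller's JVM, not on the agent. If the goal is to run test.py on a specific slave, a sh step scoped to that agent is the usual approach. A minimal sketch, assuming test.py sits in the workspace:

pipeline {
    agent { label 'name-of-slave-jenkins' }
    stages {
        stage('run on agent') {
            steps {
                script {
                    // sh executes on the stage's agent, unlike String.execute(),
                    // which runs in the controller JVM
                    def output = sh(script: './test.py', returnStdout: true).trim()
                    echo output.tokenize().toString()
                }
            }
        }
    }
}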

Update build parameter in declarative pipeline

I need to update a build parameter in my declarative pipeline script. I tried:
pipeline {
    stages {
        stage {
            steps {
                script {
                    def reporter = build.buildVariableResolver.resolve("reporter")
                    if (reporter != null) {
                        reporter = reporter.tokenize(',').find { item -> item.contains('displayName') }.tokenize('=')[1]
                    } else {
                        reporter = ""
                    }
                    def reporterParameter = new StringParameterValue("reporter", "\${reporter}")
                    build.addOrReplaceAction(new ParametersAction(reporterParameter))
                }
            }
        }
    }
}
but I get the error hudson.remoting.ProxyException: groovy.lang.MissingPropertyException: No such property: build for class: WorkflowScript.
How can I run this Groovy script in a declarative pipeline, or update my build parameters in another way (but declaratively)?
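For reference, pipeline scripts do not expose a build variable; parameters are read via the params map, and the underlying build is reachable as currentBuild.rawBuild (which requires script approval when sandboxed). A minimal sketch of the same replacement under those assumptions, not from the original thread:

import hudson.model.StringParameterValue
import hudson.model.ParametersAction

// read the parameter the declarative way, defaulting to empty
def reporter = params.reporter ?: ''
// rawBuild needs sandbox approval or a trusted shared library
currentBuild.rawBuild.addOrReplaceAction(
    new ParametersAction(new StringParameterValue('reporter', reporter)))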

How to trigger multiple downstream jobs in Jenkins dynamically based on an input parameter

Scenario: I want to trigger a few downstream jobs (Job A, Job B, ...) dynamically based on the input parameter received by the current job.
import hudson.model.*

def values = "${configname}".split(',')
def currentBuild = Thread.currentThread().executable
println "${configname}"
println "${sourceBranch}"
values.eachWithIndex { item, index ->
    println item
    println index
    def job = hudson.model.Hudson.instance.getJob(item)
    def params = new StringParameterValue('upstream_job', "${sourceBranch}")
    def paramsAction = new ParametersAction(params)
    def cause = new hudson.model.Cause.UpstreamCause(currentBuild)
    def causeAction = new hudson.model.CauseAction(cause)
    hudson.model.Hudson.instance.queue.schedule(job, 0, causeAction, paramsAction)
}
How about something like this? I was getting a comma-separated list from the upstream system, so I split it into individual strings, each of which is a job name, and made a call for each one.
This Jenkinsfile would do that:
#!/usr/bin/env groovy
pipeline {
    agent { label 'docker' }
    parameters {
        string(name: 'myHotParam', defaultValue: '', description: 'What is your param, sir?')
    }
    stages {
        stage('build') {
            steps {
                script {
                    if (params.myHotParam == 'buildEverything') {
                        build 'mydir/jobA'
                        build 'mydir/jobB'
                    }
                }
            }
        }
    }
}
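To make this dynamic like the Hudson-API script in the question, the same split-and-iterate pattern works with the build step. A sketch, assuming configname and sourceBranch are string parameters of the current job:

script {
    // trigger one downstream build per comma-separated entry,
    // forwarding the source branch as a parameter
    params.configname.split(',').each { jobName ->
        build job: jobName.trim(),
              parameters: [string(name: 'upstream_job', value: params.sourceBranch)]
    }
}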
