Not serializable error for jenkins declarative pipeline - jenkins

I am trying to trigger my email promotion job from my pipeline, which extracts the repo name from the Jenkins log messages, but I am not able to resolve the NotSerializableException for this block. Any help is greatly appreciated.
post {
    success {
        script {
            @NonCPS
            //upstream_job_name = null
            def manager = manager.getLogMatcher('.*Obtained Jenkinsfile from git (.*)$')
            if (manager.matches()) {
                def gitMsg = manager.group(1)
                gitrepo = "${gitMsg}"
                echo gitrepo
                def upstream_job_name = gitrepo.split("/")[4].replace(".git", "")
                println upstream_job_name
            }
            build job: 'job-approval', parameters: [[$class: 'StringParameterValue', name: 'upstream_job_name', value: upstream_job_name]]
        }
    }
}
Below is the error message I am receiving:
[Pipeline] // script
Error when executing success post condition:
java.io.NotSerializableException: java.util.regex.Matcher
at org.jboss.marshalling.river.RiverMarshaller.doWriteObject(RiverMarshaller.java:926)
at org.jboss.marshalling.river.BlockMarshaller.doWriteObject(BlockMarshaller.java:65)
at org.jboss.marshalling.river.BlockMarshaller.writeObject(BlockMarshaller.java:56)
at org.jboss.marshalling.MarshallerObjectOutputStream.writeObjectOverride(MarshallerObjectOutputStream.java:50)
at org.jboss.marshalling.river.RiverObjectOutputStream.writeObjectOverride(RiverObjectOutputStream.java:179)

You need to release manager immediately after using it; more detail can be found in this post.
script {
    def upstream_job_name = null
    def manager = manager.getLogMatcher('.*Obtained Jenkinsfile from git (.*)$')
    if (manager.matches()) {
        def gitMsg = manager.group(1)
        gitrepo = "${gitMsg}"
        echo gitrepo
        upstream_job_name = gitrepo.split("/")[4].replace(".git", "")
        println upstream_job_name
    }
    manager = null
    build job: 'job-approval',
          parameters: [
              [$class: 'StringParameterValue', name: 'upstream_job_name', value: upstream_job_name]
          ]
}
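Alternatively, if you would rather not think about when the CPS engine serializes locals, a common pattern is to keep the non-serializable java.util.regex.Matcher entirely inside an @NonCPS helper and return only the extracted String. This is only a sketch under the question's assumptions (the groovy-postbuild manager object, the same log message format, and the same repo URL layout):

// Sketch only: define this helper at the top level of the Jenkinsfile, outside
// the pipeline {} block. The Matcher never leaves the @NonCPS method, so the
// CPS engine never has to serialize it.
@NonCPS
def extractUpstreamJobName() {
    def matcher = manager.getLogMatcher('.*Obtained Jenkinsfile from git (.*)$')
    if (!matcher || !matcher.matches()) {
        return null
    }
    // index 4 and the ".git" strip mirror the split in the question; adjust for your URL layout
    return matcher.group(1).split("/")[4].replace(".git", "")
}

The post block then only ever handles a plain String:

post {
    success {
        script {
            def upstream_job_name = extractUpstreamJobName()
            if (upstream_job_name) {
                build job: 'job-approval',
                      parameters: [string(name: 'upstream_job_name', value: upstream_job_name)]
            }
        }
    }
}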

Related

Jenkins pipeline stuck on build job

I recently created a new Jenkins pipeline that mainly relies on other build jobs. The strange thing is that the job in the first stage gets triggered, runs successfully, and finishes in the "SUCCESS" state, but the pipeline keeps loading forever after "Scheduling project: run-operation".
Any idea what mistake I made below?
UPDATE 1: replaced params with hard-coded advertiser & query values
pipeline {
    agent {
        node {
            label 'slave'
        }
    }
    stages {
        stage('1') {
            steps {
                script {
                    def buildResult = build job: 'run-operation', parameters: [
                        string(name: 'ADVERTISER', value: 'car'),
                        string(name: 'START_DATE', value: '2019-12-29'),
                        string(name: 'END_DATE', value: '2020-01-11'),
                        string(name: 'QUERY', value: 'weekly')
                    ]
                    def envVar = buildResult.getBuildVariables();
                }
            }
        }
        stage('2') {
            steps {
                script {
                    echo 'track the query operations from above job'
                    def trackResult = build job: 'track-operation', parameters: [
                        string(name: 'OPERATION_NAMES', value: envVar.operationList),
                    ]
                }
            }
        }
        stage('3') {
            steps {
                echo 'move flag'
            }
        }
        stage('callback') {
            steps {
                echo 'for each operation call back url'
            }
        }
    }
}
Console log (even though the job was running, the pipeline doesn't seem to know; see the log):
Started by user reusable
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/jobs/etl-pipeline/workspace
[Pipeline] {
[Pipeline] stage
[Pipeline] { (1)
[Pipeline] build (Building run-operation)
Scheduling project: run-operation
...

How do I set up a Jenkins job that calls another Jenkins job in parallel but with an array as a parameter

I want to build a Jenkins job, say Job A. I have another Jenkins deploy job, say Job B.
I have an AWS CLI command that gets me the names of the ECS services under a particular cluster, and I put those in an array.
Now, for each element of the array, I want to call Job B with that element as a parameter.
That is, I want Job A to invoke Job B in parallel, once per array element, passing the element as a parameter.
I'm new to Jenkins, so I tried using the "Multi-Job Plugin" and the "Parameterized Plugin".
pipeline {
    agent any
    stages {
        stage('Run JobB') {
            steps {
                script {
                    def ecs_services = ['service1', 'service2', 'service3']
                    for (int i = 0; i < ecs_services.size(); i++) {
                        def service = ecs_services[i]
                        ecs["${service}"] = build job: 'JobB' , parameters: [name: 'foo', value: 'bar']
                    }
                    failFast: true
                    parallel ecs
                }
            }
        }
    }
}
Even something as simple as the following also errors out:
pipeline {
    agent any
    stages {
        stage('Stage 1') {
            steps {
                echo 'Hello world!'
            }
        }
    }
}
Started by user ops
[BFA] Scanning build for known causes...
[BFA] No failure causes found
[BFA] Done. 0s
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 2: Invalid argument for agent - '${any}' - must be map of config options or bare none. @ line 2, column 11.
agent any
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:603)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:581)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:558)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:298)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:268)
at groovy.lang.GroovyShell.parseClass(GroovyShell.java:688)
at groovy.lang.GroovyShell.parse(GroovyShell.java:700)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:67)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:410)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:373)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:213)
at hudson.model.ResourceController.execute(ResourceController.java:97)
at hudson.model.Executor.run(Executor.java:429)
Finished: FAILURE
This is how I would do it in the Jenkins pipeline script of JobA:
pipeline {
    agent {
        node {
            label "master" //change this as per your agent label
        }
    }
    stages {
        stage('Run JobB') {
            steps {
                script {
                    def ecs = [:]
                    def ecs_services = ['service1', 'service2', 'service3']
                    for (int i = 0; i < ecs_services.size(); i++) {
                        def service = ecs_services[i]
                        // wrap the downstream build in a closure so that 'parallel'
                        // runs the builds concurrently instead of the loop running them one by one
                        ecs["${service}"] = {
                            build job: 'JobB', parameters: [string(name: 'string1', value: 'foo')]
                        }
                    }
                    ecs.failFast = true
                    parallel ecs
                }
            }
        }
    }
}
You can pass the ECS service name as a parameter if you need to, as shown below.
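For example (a sketch, assuming JobB declares a string parameter named SERVICE_NAME), the loop body could hand each service name to its own downstream build:

ecs["${service}"] = {
    // SERVICE_NAME is hypothetical; use whatever parameter name JobB actually defines
    build job: 'JobB', parameters: [string(name: 'SERVICE_NAME', value: service)]
}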

How to get the logs from previous build in Jenkins pipeline?

In my Jenkins pipeline, the first stage executes a freestyle job. Therefore, I need to get the output/logs from that job, because it returns a string of IPs which will be used in my second stage.
def getEc2ListResult = []
pipeline {
    agent {
        label 'master'
    }
    stages {
        stage('Get EC2 List') {
            steps {
                script {
                    def getEc2List = build job: 'get-ec2-by-tag', parameters: [
                        string(name: 'envTagValue', value: "${envTagValue}"),
                        string(name: 'OS', value: "${OS}")
                    ]
                    getEc2ListResult = getEc2List.getPreviousBuild().getLog(20)
                }
            }
        }
    }
}
This is the error that I'm getting:
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper.getLog() is applicable for argument types: (java.lang.Integer) values: [20]
Possible solutions: getId(), getAt(java.lang.String), getClass()
getEc2List is of type RunWrapper, and so is getEc2List.getPreviousBuild().
RunWrapper doesn't supply a getLog() API; that is supplied by the raw build.
You can get getEc2List's raw build by calling getEc2List.rawBuild or getEc2List.getRawBuild().
But getRawBuild() is not @Whitelisted on RunWrapper, so you will get the following message in the Jenkins log:
Scripts not permitted to use method
org.jenkinsci.plugins.workflow.support.steps.build.RunWrapper
getRawBuild. Administrators can decide whether to approve or reject
this signature.
One option is to ask a Jenkins admin to approve that signature in Script Approval.
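If the admin does approve it, the raw build's log can then be read directly; a minimal sketch (getLog(int) here is the hudson.model.Run method, which returns the last lines as a List<String>):

// works only after getRawBuild()/getLog() are approved in Script Approval
getEc2ListResult = getEc2List.getRawBuild().getLog(20)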
Another option is to do the following:
stage('') {
    environment {
        JENKINS_AUTH = credentials('<credential id of jenkins auth>')
        // add a credential with a jenkins user:password before using it here
    }
    steps {
        script {
            def getEc2List = build job: 'get-ec2-by-tag', parameters: [
                string(name: 'envTagValue', value: "${envTagValue}"),
                string(name: 'OS', value: "${OS}")
            ]
            logUrl = getEc2List.absoluteUrl + 'consoleText'
            log = sh(script: 'curl -u $JENKINS_AUTH -k ' + logUrl,
                     returnStdout: true).trim()
            // parse the log to extract the ip
            ...
        }
    }
}
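The final parsing step depends on how 'get-ec2-by-tag' formats its output. As a rough sketch (assuming the IPs appear as plain dotted-quad addresses somewhere in the console text), a small @NonCPS helper can collect them without leaking a Matcher into CPS-serialized state:

// Sketch: extract anything that looks like an IPv4 address from the fetched console text.
// Adjust the regex to the actual output format of the freestyle job.
@NonCPS
def extractIps(String consoleText) {
    return (consoleText =~ /\b\d{1,3}(?:\.\d{1,3}){3}\b/).collect { it }
}

// ...then, in place of the "parse the log" comment above:
// getEc2ListResult = extractIps(log)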

How to trigger a Jenkins Job on multiple nodes from a Pipeline (only one job executing)

I have a Jenkins Job, configured as a Scripted Jenkins Pipeline, which:
Checks out the code from GitHub
merges in developer changes
builds a debug image
It is then supposed to split into 3 separate parallel processes, one of which builds the release version of the code and unit tests it.
The other 2 processes are supposed to be identical, with the debug image being flashed onto a target and various tests running.
The targets are identified in Jenkins as slave_1 and slave_2 and are both allocated the label 131_ci_targets
I am using 'parallel' to trigger the release build, and the multiple instances of the test job. I will post a (slightly redacted) copy of my Scripted pipeline below for full reference, but for the question I have tried all 3 of the following options.
Option 1: Using a single build call with LabelParameterValue and allNodesMatchingLabel set to true. In this case TEST_TARGETS is the label 131_ci_targets.
parallel_steps = [:]
parallel_steps["release"] = { // Release build and test steps
}
parallel_steps["${TEST_TARGETS}"] = {
stage("${TEST_TARGETS}") {
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'LabelParameterValue',
name: 'RUN_NODE', label: "${TEST_TARGETS}",
allNodesMatchingLabel: true,
nodeEligibility: [$class: 'AllNodeEligibility']]]
}
} // ${TEST_TARGETS}
stage('Parallel'){
parallel parallel_steps
} // Parallel
Option 2: Using a single build call with NodeParameterValue and a list of all nodes. Here TEST_TARGETS is again the label, while test_nodes is a list of 2 strings: [slave_1, slave_2].
parallel_steps = [:]
parallel_steps["release"] = { // Release build and test steps
}
test_nodes = hostNames("${TEST_TARGETS}")
parallel_steps["${TEST_TARGETS}"] = {
stage("${TEST_TARGETS}") {
echo "test_nodes: ${test_nodes}"
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'NodeParameterValue',
name: 'RUN_NODE', labels: test_nodes,
nodeEligibility: [$class: 'AllNodeEligibility']]]
}
} // ${TEST_TARGETS}
stage('Parallel'){
parallel parallel_steps
} // Parallel
Option 3: Using multiple stages, each with a single build call with NodeParameterValue and a list containing only one slave id.
test_nodes is again the list of strings [slave_1, slave_2]; the first call passes slave_1 and the second slave_2.
for ( tn in test_nodes ) {
parallel_steps["${tn}"] = {
stage("${tn}") {
echo "test_nodes: ${test_nodes}"
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'NodeParameterValue',
name: 'RUN_NODE', labels: [tn],
nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
wait: false
}
} // ${tn}
}
All of the above will trigger only a single run of the 'Trial_Test_Pipe' on slave_2 assuming that both slave_1 and slave_2 are defined, online and have available executors.
The Trial_Test_Pipe job is another Jenkins Pipeline job, and has the checkbox "Do not allow concurrent builds" unchecked.
Any thoughts on:
Why the job will only trigger one of the runs, not both?
What the correct solution may be?
For reference, here is my full(ish) scripted Jenkins job:
import hudson.model.*
import hudson.EnvVars
import groovy.json.JsonSlurperClassic
import groovy.json.JsonBuilder
import groovy.json.JsonOutput
import java.net.URL
def BUILD_SLAVE=""
// clean the workspace before starting the build process
def clean_before_build() {
bat label:'',
script: '''cd %GITHUB_REPO_PATH%
git status
git clean -x -d -f
'''
}
// Routine to build the firmware
// Can build Debug or Release depending on the environment variables
def build_the_firmware() {
return
def batch_script = """
REM *** Build script here
echo "... Build script here ..."
"""
bat label:'',
script: batch_script
}
// Copy the hex files out of the Build folder and into the Jenkins workspace
def copy_hex_files_to_workspace() {
return
def batch_script = """
REM *** Copy HEX file to workspace:
echo "... Copy HEX file to workspace ..."
"""
bat label:'',
script: batch_script
}
// Updated from stackOverflow answer: https://stackoverflow.com/a/54145233/1589770
@NonCPS
def hostNames(label) {
nodes = []
jenkins.model.Jenkins.instance.computers.each { c ->
if ( c.isOnline() ){
labels = c.node.labelString
labels.split(' ').each { l ->
if (l == label) {
nodes.add(c.node.selfLabel.name)
}
}
}
}
return nodes
}
try {
node('Build_Slave') {
BUILD_SLAVE = "${env.NODE_NAME}"
echo "build_slave=${BUILD_SLAVE}"
stage('Checkout Repo') {
// Set a description on the build history to make for easy identification
currentBuild.setDescription("Pull Request: ${PULL_REQUEST_NUMBER} \n${TARGET_BRANCH}")
echo "... checking out dev code from our repo ..."
} // Checkout Repo
stage ('Merge PR') {
// Merge the base branch into the target for test
echo "... Merge the base branch into the target for test ..."
} // Merge PR
stage('Build Debug') {
withEnv(['LIB_MODE=Debug', 'IMG_MODE=Debug', 'OUT_FOLDER=Debug']){
clean_before_build()
build_the_firmware()
copy_hex_files_to_workspace()
archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
}
} // Build Debug
stage('Post Build') {
if (currentBuild.resultIsWorseOrEqualTo("UNSTABLE")) {
echo "... Send a mail to the Admins and the Devs ..."
}
} // Post Merge
} // node
parallel_steps = [:]
parallel_steps["release"] = {
node("${BUILD_SLAVE}") {
stage('Build Release') {
withEnv(['LIB_MODE=Release', 'IMG_MODE=Release', 'OUT_FOLDER=build\\Release']){
clean_before_build()
build_the_firmware()
copy_hex_files_to_workspace()
archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
}
} // Build Release
stage('Unit Tests') {
echo "... do Unit Tests here ..."
}
}
} // release
test_nodes = hostNames("${TEST_TARGETS}")
if (true) {
parallel_steps["${TEST_TARGETS}"] = {
stage("${TEST_TARGETS}") {
echo "test_nodes: ${test_nodes}"
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'LabelParameterValue',
name: 'RUN_NODE', label: "${TEST_TARGETS}",
allNodesMatchingLabel: true,
nodeEligibility: [$class: 'AllNodeEligibility']]]
}
} // ${TEST_TARGETS}
} else if ( false ) {
parallel_steps["${TEST_TARGETS}"] = {
stage("${TEST_TARGETS}") {
echo "test_nodes: ${test_nodes}"
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'NodeParameterValue',
name: 'RUN_NODE', labels: test_nodes,
nodeEligibility: [$class: 'AllNodeEligibility']]]
}
} // ${TEST_TARGETS}
} else {
for ( tn in test_nodes ) {
parallel_steps["${tn}"] = {
stage("${tn}") {
echo "test_nodes: ${test_nodes}"
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: "${TARGET_BRANCH}"),
string(name: 'FRAMEWORK_VERSION', value: "${FRAMEWORK_VERSION}"),
[$class: 'NodeParameterValue',
name: 'RUN_NODE', labels: [tn],
nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
wait: false
}
} // ${tn}
}
}
stage('Parallel'){
parallel parallel_steps
} // Parallel
} // try
catch (Exception ex) {
if ( manager.logContains(".*Merge conflict in .*") ) {
manager.addWarningBadge("Pull Request ${PULL_REQUEST_NUMBER} Experienced Git Merge Conflicts.")
manager.createSummary("warning.gif").appendText("<h2>Experienced Git Merge Conflicts!</h2>", false, false, false, "red")
}
echo "... Send a mail to the Admins and the Devs ..."
throw ex
}
So ... I have a solution for this ... as in, I understand what to do, and why one of the above solutions wasn't working.
The winner is Option 3 ... the reason it wasn't working is that the code inside the closure (the stage part) isn't evaluated until the stage is actually run. As a result the strings aren't expanded until then and, since tn is fixed at slave_2 by that point, that's the value used on both parallel streams.
In the Jenkins examples here (https://jenkins.io/doc/pipeline/examples/#parallel-from-grep), the closures are returned from a function transformIntoStep, and by doing this I was able to force early evaluation of the strings and so get parallel steps running on both slaves.
If you're here looking for answers, I hope this helps. If you are, and it has, please feel free to give me an uptick. Cheers :)
My final scripted jenkinsfile looks something like this:
import hudson.model.*
import hudson.EnvVars
import groovy.json.JsonSlurperClassic
import groovy.json.JsonBuilder
import groovy.json.JsonOutput
import java.net.URL
BUILD_SLAVE=""
parallel_steps = [:]
// clean the workspace before starting the build process
def clean_before_build() {
bat label:'',
script: '''cd %GITHUB_REPO_PATH%
git status
git clean -x -d -f
'''
}
// Routine to build the firmware
// Can build Debug or Release depending on the environment variables
def build_the_firmware() {
def batch_script = """
REM *** Build script here
echo "... Build script here ..."
"""
bat label:'',
script: batch_script
}
// Copy the hex files out of the Build folder and into the Jenkins workspace
def copy_hex_files_to_workspace() {
def batch_script = """
REM *** Copy HEX file to workspace:
echo "... Copy HEX file to workspace ..."
"""
bat label:'',
script: batch_script
}
// Updated from stackOverflow answer: https://stackoverflow.com/a/54145233/1589770
@NonCPS
def hostNames(label) {
nodes = []
jenkins.model.Jenkins.instance.computers.each { c ->
if ( c.isOnline() ){
labels = c.node.labelString
labels.split(' ').each { l ->
if (l == label) {
nodes.add(c.node.selfLabel.name)
}
}
}
}
return nodes
}
def transformTestStep(nodeId) {
return {
stage(nodeId) {
build job: 'Trial_Test_Pipe',
parameters: [string(name: 'TARGET_BRANCH', value: TARGET_BRANCH),
string(name: 'FRAMEWORK_VERSION', value: FRAMEWORK_VERSION),
[$class: 'NodeParameterValue',
name: 'RUN_NODE', labels: [nodeId],
nodeEligibility: [$class: 'IgnoreOfflineNodeEligibility']]],
wait: false
}
}
}
def transformReleaseStep(build_slave) {
return {
node(build_slave) {
stage('Build Release') {
withEnv(['LIB_MODE=Release', 'IMG_MODE=Release', 'OUT_FOLDER=build\\Release']){
clean_before_build()
build_the_firmware()
copy_hex_files_to_workspace()
archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
}
} // Build Release
stage('Unit Tests') {
echo "... do Unit Tests here ..."
}
}
}
}
try {
node('Build_Slave') {
BUILD_SLAVE = "${env.NODE_NAME}"
echo "build_slave=${BUILD_SLAVE}"
parallel_steps["release"] = transformReleaseStep(BUILD_SLAVE)
test_nodes = hostNames("${TEST_TARGETS}")
for ( tn in test_nodes ) {
parallel_steps[tn] = transformTestStep(tn)
}
stage('Checkout Repo') {
// Set a description on the build history to make for easy identification
currentBuild.setDescription("Pull Request: ${PULL_REQUEST_NUMBER} \n${TARGET_BRANCH}")
echo "... checking out dev code from our repo ..."
} // Checkout Repo
stage ('Merge PR') {
// Merge the base branch into the target for test
echo "... Merge the base branch into the target for test ..."
} // Merge PR
stage('Build Debug') {
withEnv(['LIB_MODE=Debug', 'IMG_MODE=Debug', 'OUT_FOLDER=Debug']){
clean_before_build()
build_the_firmware()
copy_hex_files_to_workspace()
archiveArtifacts "${LIB_MODE}\\*.hex, ${LIB_MODE}\\*.map"
}
} // Build Debug
stage('Post Build') {
if (currentBuild.resultIsWorseOrEqualTo("UNSTABLE")) {
echo "... Send a mail to the Admins and the Devs ..."
}
} // Post Merge
} // node
stage('Parallel'){
parallel parallel_steps
} // Parallel
} // try
catch (Exception ex) {
if ( manager.logContains(".*Merge conflict in .*") ) {
manager.addWarningBadge("Pull Request ${PULL_REQUEST_NUMBER} Experienced Git Merge Conflicts.")
manager.createSummary("warning.gif").appendText("<h2>Experienced Git Merge Conflicts!</h2>", false, false, false, "red")
}
echo "... Send a mail to the Admins and the Devs ..."
throw ex
}

Jenkins pipeline - No such DSL method 'build'

I am using the below Groovy script in a Jenkins pipeline to call a freestyle job, but it ends up with a "No such DSL method 'build'" error.
node {
    def branches = [:]
    List rows = ["Test2", "Test1"]
    for (int i = 0; i < rows.size(); i++) {
        def index = i
        String db = rows[i]
        branches["branch${i}"] = {
            build job: 'CopyFile', parameters: [
                [$class: 'StringParameterValue', name: 'DatabaseName', value: db],
                [$class: 'StringParameterValue', name: 'dummy', value: "${index}"]
            ]
        }
    }
    parallel branches
}
Installing "Pipeline Build Step Plugin" resolved this issue
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Build+Step+Plugin
