append array and ArrayList to ArrayList - jenkins

We use Jenkins pipeline with our own shared build script. All of our projects also have a rakefile, which is what we use for most of the build steps. A typical Jenkins build executes 3 rake tasks, but we have some exceptions, such as when the project includes an Angular website that we need to build as well.
I've configured my pipeline like this:
buildGitProject {
    repository = 'https://anonymous.visualstudio.com/Project/_git/my-csharp-project-with-angular'
    branchName = 'master'
    solutionName = 'MyCSharpSolution.sln'
    emailTo = 'someone@aol.com'
    preRakeCommands = ['install_npm_dependencies', 'ng_build']
}
that relies on our build script which is this:
def call(body) {
    def args = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = args
    body()
    def agentName = "windows && ${args.branchName}"
    def remoteConfig = org.pg.RemoteConfigFactory.create(args.repository)
    pipeline {
        agent none
        options {
            buildDiscarder(logRotator(numToKeepStr: org.pg.Settings.BUILDS_TO_KEEP))
            skipStagesAfterUnstable()
            timestamps()
        }
        stages {
            stage("checkout") {
                agent any
                steps {
                    checkoutFromGit(remoteConfig, args.branchName)
                }
            }
            stage('build') {
                agent { node { label agentName as String } }
                steps {
                    buildSolution(args.solutionName, args.get('preRakeCommands', []), args.get('postRakeCommands', []))
                }
            }
            stage('test') {
                agent { node { label agentName as String } }
                steps {
                    testSolution(args.solutionName)
                }
            }
        }
    }
}
which fails in the build stage.
buildSolution.groovy
def call(String solutionName, ArrayList preRakeCommands, ArrayList postRakeCommands) {
    unstash 'ws'
    String[] rakeCommands = [
        "build_solution[${solutionName}, Release, Any CPU]",
        "copy_to_deployment_folder",
        "execute_dev_dropkick"
    ]
    String[] combinedRakeCommand = (preRakeCommands.plus(rakeCommands).plus(postRakeCommands)) as String[]
    executeRake(combinedRakeCommand)
    stash name: 'deployment', includes: 'deployment/**/*'
}
executeRake.groovy
def call(String... rakeTasks) {
    def safeRakeTasks = rakeTasks.collect { "\"$it\"" }.join(' ')
    bat script: "rake ${safeRakeTasks}"
}
in the jenkins build log it says:
08:43:09 C:\jenkins_repos\Project\my-csharp-project-with-angular>rake "install_npm_dependencies" "ng_build" "[Ljava.lang.String;@11bd466"
I have no idea why it is printing what looks like an object reference, because I thought that plus concatenated arrays and ArrayLists... Plus it is running in Jenkins, so it is a pain to test.

That token is not a pointer: plus appends a String[] argument as a single element instead of flattening it. Cast the array to a List first and the concatenation works as expected:
List a = ['a1','a2','a3']
String[] s = ['s1','s2','s3']
List b = ['b1','b2','b3']
println a.plus(s as List).plus(b)
output:
[a1, a2, a3, s1, s2, s3, b1, b2, b3]
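As for where the odd token in the log comes from: a Java array's default toString() is its type descriptor plus an identity hash, so when the un-flattened array element is stringified you see that verbatim. A minimal sketch:
String[] s = ['s1','s2','s3']
// calling Java's toString() on the array directly (rather than letting
// Groovy pretty-print it) produces the token seen in the build log
println s.toString()
// => [Ljava.lang.String;@11bd466 (the identity hash will vary)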

Another approach:
List a = ['a1','a2','a3']
String[] s = ['s1','s2','s3']
List b = ['b1','b2','b3']
println([*a, *s, *b])
alternatively
println a + [*s] + b
which should perform better
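Applied to the buildSolution step from the question, a minimal sketch of the fix, keeping everything as Lists until the final cast (same step names as above assumed):
def call(String solutionName, List preRakeCommands, List postRakeCommands) {
    unstash 'ws'
    List rakeCommands = [
        "build_solution[${solutionName}, Release, Any CPU]",
        "copy_to_deployment_folder",
        "execute_dev_dropkick"
    ]
    // List + List concatenates element-wise, so nothing gets nested
    List combined = preRakeCommands + rakeCommands + postRakeCommands
    executeRake(combined as String[])
    stash name: 'deployment', includes: 'deployment/**/*'
}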

Related

How to pass parameters and variables from a file to jenkinsfile?

I'm trying to convert my Jenkins pipeline to a shared library, since it can be reused across most of our applications. As part of that I have created a groovy file in the vars folder, kept the pipeline in a Jenkinsfile in GitHub, and I am able to call it from Jenkins successfully.
To improve this, I want to pass params, variables, and node labels through a file, so that we never have to touch the Jenkins pipeline; if we want to modify any vars or params, we can do that in the git repo itself.
pipeline {
    agent {
        node {
            label 'jks_deployment'
        }
    }
    environment {
        ENV_CONFIG_ID = 'jenkins-prod'
        ENV_CONFIG_FILE = 'test.groovy'
        ENV_PLAYBOOK_NAME = 'test.tar.gz'
    }
    parameters {
        string(
            defaultValue: 'test.x86_64',
            description: 'Enter app version',
            name: 'app_version'
        )
        choice(
            choices: ['10.0.0.1','10.0.0.2','10.0.0.3'],
            description: 'Select a host to be deployed',
            name: 'host'
        )
    }
    stages {
        stage("reading properties from properties file") {
            steps {
                // Use a script block to do custom scripting
                script {
                    def props = readProperties file: 'extravars.properties'
                    env.var1 = props.var1
                    env.var2 = props.var2
                }
                echo "The variable 1 value is $var1"
                echo "The variable 2 value is $var2"
            }
        }
    }
}
In the above code I used the Pipeline Utility Steps plugin and was able to read variables from the extravars.properties file. Can we do the same for Jenkins parameters? Or is there a better way to pass these parameters via a file in the git repo?
Also, is it possible to pass a variable for the node label?
=====================================================================
Below are the improvements I have made in this project:
Used the Node Label plugin to pass the node name as a variable.
Below is my vars/sayHello.groovy file content
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent {
            node {
                label "${pipelineParams.slaveName}"
            }
        }
        stages {
            stage("reading properties from properties file") {
                steps {
                    // Use a script block to do custom scripting
                    script {
                        readProperties(file: 'extravars.properties').each { key, value -> env[key] = value }
                    }
                    echo "The variable 1 value is $var1"
                    echo "The variable 2 value is $var2"
                }
            }
            stage('stage2') {
                steps {
                    sh "echo ${var1}"
                    sh "echo ${var2}"
                    sh "echo ${pipelineParams.appVersion}"
                    sh "echo ${pipelineParams.hostIp}"
                }
            }
        }
    }
}
Below is my vars/params.groovy file
properties([
    parameters([
        choice(choices: ['10.80.66.171','10.80.67.6','10.80.67.200'], description: 'Select a host to be deployed', name: 'host'),
        string(defaultValue: 'fxxxxx.x86_64', description: 'Enter app version', name: 'app_version')
    ])
])
Below is my jenkinsfile
def _hostIp = params.host
def _appVersion = params.app_version
sayHello {
    slaveName = 'master'
    hostIp = _hostIp
    appVersion = _appVersion
}
Is there anything else we can still improve here? Any suggestions, let me know.
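One further improvement sketch: keep the parameter values themselves in a file in the repo. This assumes a hypothetical pipeline-config.yaml committed alongside the Jenkinsfile, and the readYaml step from the Pipeline Utility Steps plugin:
// Jenkinsfile: read the config on a node, then hand it to the shared library.
// pipeline-config.yaml (hypothetical) might contain:
//   slaveName: master
//   hostIp: 10.80.66.171
//   appVersion: fxxxxx.x86_64
def config
node {
    checkout scm
    config = readYaml file: 'pipeline-config.yaml'
}
sayHello {
    slaveName = config.slaveName
    hostIp = config.hostIp
    appVersion = config.appVersion
}
This way changing a host or version is a commit to the repo, not an edit to the job or the library.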

Scripted Jenkinsfile parallel builders are not working

I'm attempting to use parallel builders in my scripted Jenkinsfile. When I run the code, Jenkins ignores the node labels and just chooses the first available node. What am I doing wrong?
Here is the code:
node {
    withCredentials([
        string(credentialsId: 'some ID', variable: 'some variable')
    ]) {
        stage('Initialize') {
            setup()
        }
    }
}

def setup_worker() {
    def labels = ['label2', 'label1']
    def builders = [:]
    for (x in labels) {
        def label = x
        builders[label] = {
            node(label) {
                stage('Setup') {
                    step1
                    checkout scm
                    login()
                    write_config()
                }
            }
        }
    }
    parallel builders
}
I would highly recommend using Jenkins Declarative Pipelines for handling parallel stages on different nodes. The syntax is simpler and well documented:
https://jenkins.io/blog/2017/09/25/declarative-1/
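For example, a minimal declarative sketch that pins parallel setup stages to the two labels from the question (the step bodies are placeholders):
pipeline {
    agent none
    stages {
        stage('Setup') {
            parallel {
                stage('setup on label1') {
                    agent { label 'label1' }
                    steps {
                        checkout scm
                        echo 'replace with login() / write_config()'
                    }
                }
                stage('setup on label2') {
                    agent { label 'label2' }
                    steps {
                        checkout scm
                        echo 'replace with login() / write_config()'
                    }
                }
            }
        }
    }
}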

Last successful build's revision for an upstream MultiBranch Job in Jenkins Declarative Pipeline

I'd like to get the build revisions of the last successful builds of Upstream jobs. The upstream jobs are multibranch jobs.
So far I'm generating a list of upstream jobs' names as triggers. But I can't seem to find the right method to call.
import jenkins.model.Jenkins

def upstreamPackages = ['foo', 'bar']
def upstreamJobs = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

pipeline {
    agent none
    triggers {
        upstream(upstreamProjects: upstreamJobs,
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('test') {
            steps {
                script {
                    upstreamJobs.each {
                        println it
                        job = Jenkins.instance.getItem(it)
                        job.getLastSuccessfulBuild()
                        revision = job.getLastSuccessfulBuild().changeset[0].revision
                        println revision
                    }
                }
            }
        }
    }
}
This results in a null object for item. What's the correct way to do this?
UPDATE 1
After discovering the Jenkins Script Console and this comment, I managed to come up with the following:
import jenkins.model.Jenkins
import hudson.plugins.git.util.BuildData

def upstreamPackages = ['foo', 'bar']
def upstreamJobsList = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

@NonCPS
def resolveRequirementsIn(packages) {
    BASE_URL = 'git@github.com:myorg'
    requirementsIn = ''
    packages.each { pkg ->
        revision = getLastSuccessfulBuildRevision("${pkg}-multibranch")
        requirementsIn <<= "-e git+${BASE_URL}/${pkg}.git@${revision}#egg=${pkg}\n"
    }
    println requirementsIn
    return requirementsIn
}

@NonCPS
def getLastSuccessfulBuildRevision(jobName) {
    project = Jenkins.instance.getItem(jobName)
    masterJob = project.getAllItems().find { job -> job.getName() == 'master' }
    build = masterJob.getLastSuccessfulBuild()
    return build.getAction(BuildData.class).getLastBuiltRevision().sha1String
}

pipeline {
    agent { label 'ci_agent' }
    triggers {
        upstream(upstreamProjects: upstreamJobsList,
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Get artifacts') {
            steps {
                script {
                    requirementsIn = resolveRequirementsIn upstreamPackages
                    writeFile file: 'requirements.in', text: requirementsIn
                }
            }
        }
    }
}
It's throwing an error:
an exception which occurred:
in field org.jenkinsci.plugins.pipeline.modeldefinition.withscript.WithScriptScript.script
in object org.jenkinsci.plugins.pipeline.modeldefinition.agent.impl.LabelScript@56d1724
in field groovy.lang.Closure.delegate
in object org.jenkinsci.plugins.workflow.cps.CpsClosure2@27378d57
in field groovy.lang.Closure.delegate
in object org.jenkinsci.plugins.workflow.cps.CpsClosure2@6e6c3c4e
in field org.jenkinsci.plugins.workflow.cps.CpsThreadGroup.closures
in object org.jenkinsci.plugins.workflow.cps.CpsThreadGroup@5d0ffef3
in object org.jenkinsci.plugins.workflow.cps.CpsThreadGroup@5d0ffef3
Caused: java.io.NotSerializableException:
org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject
The problem was that Jenkins' Pipeline DSL requires all assigned objects to be Serializable.
Jenkins.instance.getItem(jobName) returns a WorkflowMultiBranchProject which is not Serializable. Neither is Jenkins.instance.getItem(jobName).getItem('master') which is a WorkflowJob object.
So I went down the call chain to what I needed, replacing variable assignments with chained method calls, and came up with the following solution.
import jenkins.model.Jenkins
import hudson.plugins.git.util.BuildData

def upstreamPackages = ['foo', 'bar']
def upstreamJobsList = upstreamPackages.collect { "${it}-multibranch/master" }.join(',')

String requirementsInFrom(packages) {
    final BASE_URL = 'git@github.com:myorg'
    requirementsIn = ''
    packages.each { pkg ->
        revision = Jenkins.instance.getItem("${pkg}-multibranch")
                .getItem('master')
                .getLastSuccessfulBuild()
                .getAction(BuildData.class)
                .getLastBuiltRevision()
                .sha1String
        requirementsIn <<= "-e git+${BASE_URL}/${pkg}.git@${revision}#egg=${pkg}\n"
    }
    return requirementsIn.toString()
}
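For completeness, a usage sketch wiring the helper into the declarative pipeline from UPDATE 1 (stage and label names carried over, not prescriptive):
pipeline {
    agent { label 'ci_agent' }
    triggers {
        upstream(upstreamProjects: upstreamJobsList,
                 threshold: hudson.model.Result.SUCCESS)
    }
    stages {
        stage('Get artifacts') {
            steps {
                script {
                    // the chained-call result feeds writeFile directly, so no
                    // non-Serializable intermediate object is ever assigned
                    writeFile file: 'requirements.in', text: requirementsInFrom(upstreamPackages)
                }
            }
        }
    }
}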

Jenkins parallel script in loop using wrong variables

I'm trying to build a dynamic group of steps to run in parallel. The following example is what I came up with (based on examples at https://devops.stackexchange.com/questions/3073/how-to-properly-achieve-dynamic-parallel-action-with-a-declarative-pipeline). But I'm having trouble getting it to use the expected variables: the result always uses the values from the last iteration of the loop.
In the following example the echo output is always bdir2 for both tests:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        rolePath = new File(f).getParentFile()
                        roleName = rolePath.toString().split('/')[1]
                        tests[roleName] = {
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
I'm expecting one of the tests to output adir2 and another to be bdir2. What am I missing here?
Just move the tests assignment a little higher, so the computation happens inside each closure, and it will work:
pipeline {
    agent any
    stages {
        stage('Test') {
            steps {
                script {
                    def tests = [:]
                    def files
                    files = ['adir1/adir2/adir3','bdir1/bdir2/bdir3']
                    files.each { f ->
                        tests[f] = {
                            rolePath = new File(f).getParentFile()
                            roleName = rolePath.toString().split('/')[1]
                            echo roleName
                        }
                    }
                    parallel tests
                }
            }
        }
    }
}
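The root cause is that rolePath and roleName are never declared with def, so they are single script-level variables shared by every closure; by the time the parallel branches run, both closures read the values from the last loop iteration. Declaring a per-iteration local is an alternative sketch of the fix:
def tests = [:]
['adir1/adir2/adir3', 'bdir1/bdir2/bdir3'].each { f ->
    // def creates a fresh local on each iteration, so each closure
    // captures its own roleName rather than a shared binding
    def roleName = f.split('/')[1]
    tests[roleName] = { echo roleName }
}
parallel tests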

How to trigger multiple downstream jobs in Jenkins dynamically based on some input parameter

Scenario: I want to trigger a few downstream jobs (Job A, Job B, ...) dynamically, based on the input parameter received by the current job.
import hudson.model.*

// system Groovy script: configname and sourceBranch are build parameters
// exposed as script variables (the ${...} syntax only works inside strings)
def values = configname.split(',')
def currentBuild = Thread.currentThread().executable
println configname
println sourceBranch
values.eachWithIndex { item, index ->
    println item
    println index
    def job = hudson.model.Hudson.instance.getJob(item)
    def params = new StringParameterValue('upstream_job', sourceBranch)
    def paramsAction = new ParametersAction(params)
    def cause = new hudson.model.Cause.UpstreamCause(currentBuild)
    def causeAction = new hudson.model.CauseAction(cause)
    hudson.model.Hudson.instance.queue.schedule(job, 0, causeAction, paramsAction)
}
How about something like this? I was getting a comma-separated list from the upstream system, and I split it into individual strings, which are the job names, making a call for each individual string.
This Jenkinsfile would do that:
#!/usr/bin/env groovy
pipeline {
    agent { label 'docker' }
    parameters {
        string(name: 'myHotParam', defaultValue: '', description: 'What is your param, sir?')
    }
    stages {
        stage('build') {
            steps {
                script {
                    if (params.myHotParam == 'buildEverything') {
                        build 'mydir/jobA'
                        build 'mydir/jobB'
                    }
                }
            }
        }
    }
}
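To keep the dynamic, comma-separated behaviour from the question, a sketch using the build step in a loop; the parameter names configname, sourceBranch, and upstream_job are assumptions carried over from the question:
pipeline {
    agent { label 'docker' }
    parameters {
        string(name: 'configname', defaultValue: '', description: 'Comma-separated list of downstream job names')
        string(name: 'sourceBranch', defaultValue: 'master', description: 'Branch to pass downstream')
    }
    stages {
        stage('trigger downstream') {
            steps {
                script {
                    params.configname.split(',').each { jobName ->
                        // the build step queues the downstream job and records
                        // this build as its upstream cause automatically
                        build job: jobName.trim(), parameters: [
                            string(name: 'upstream_job', value: params.sourceBranch)
                        ]
                    }
                }
            }
        }
    }
}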
