I'm integrating the Squish automation tool with a Jenkins pipeline, and everything went smoothly. Now I need to send an email report after the job is done. I have a Groovy file as the pre-send script, but when this script runs it throws an exception:
java.lang.NullPointerException: Cannot invoke method getRootDir() on null object
I figured out that the build object in my Groovy script is null, but I am not sure why. Note that if I use the built-in Squish plugin and Editable Email Notification on Jenkins, everything works; the problem only happens since I moved to Pipeline.
This is my Groovy script:
List getJenkinsTestResultFiles() {
    File squishResultsPath = new File(build.getRootDir(), "squishResults")
    if (!squishResultsPath.exists() || !squishResultsPath.isDirectory()) {
        throw new GroovyRuntimeException("Squish results path does not exist at: " + squishResultsPath.getAbsolutePath())
    }
    File summaryFile = new File(squishResultsPath, "summary.xml")
    if (!summaryFile.exists() || !summaryFile.isFile()) {
        throw new GroovyRuntimeException("Squish summary file does not exist at: " + summaryFile.getAbsolutePath())
    }
    List resultFiles = []
    def summaries = new XmlSlurper().parse(summaryFile)
    summaries.summary.each {
        resultFiles.push(new File(squishResultsPath, it.xmlFileName.text()))
    }
    return resultFiles
}
This is my Pipeline script:
node('Slave_10.133.88.151') {
    stage('Squish Test') {
        step([$class: 'SquishBuilder',
              abortBuildOnError: false,
              extraOptions: '',
              host: '',
              port: '',
              resultFolder: "${WORKSPACE}\\Squish_Report",
              skipTestCases: false,
              snoozeFactor: '1',
              squishConfig: 'Default',
              testCase: '',
              testSuite: "${WORKSPACE}\\${TEST_SUITE}"])
    }
    stage('Send Email') {
        emailext body: 'Test',
                 postsendScript: '${SCRIPT, template="SquishSummary.groovy"}',
                 subject: 'Pipeline',
                 to: 'hoang@local.com'
    }
}
The build object is a hudson.model.Build object, and since you are calling it from a shared library you'll have to import the Build class in your Groovy script:
import hudson.model.Build
at the top of your shared library.
If you have already imported it, then the issue could be that you haven't initialized it inside of your shared library.
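For illustration, a defensive guard in the template script makes that failure mode explicit. A minimal sketch, assuming the template's variables are exposed through the script binding (the guard and its error message are illustrative additions, not part of the original script):

import hudson.model.Build

List getJenkinsTestResultFiles() {
    // 'build' is bound for freestyle builds; it can be absent or null when the
    // template is rendered from a Pipeline job, which matches the NPE above
    if (!binding.hasVariable('build') || build == null) {
        throw new GroovyRuntimeException('No build object bound; rendered from a Pipeline run?')
    }
    File squishResultsPath = new File(build.getRootDir(), "squishResults")
    // ... continue as in the original script
}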
I am creating jobs for use with Terraform. There are several environments, and the number is growing all the time. Rather than update both the pipeline file and the Job DSL file as the parameters change, I started working from the standpoint of scanning the repo for environment files and updating the pipeline and Job DSL file as needed.
My jobdsl script:
@Library('mylib') _

params = [
    "serviceName": "infrastructure-${repo}",
    "repoUrl": "${repoUrl}",
    "sshCredentials": 'git-readonly',
    "environment": "${env.Environment}",
    "configParams": getTFConfigs(
        repoUrl,
        "env/${env.AccountName}/${env.AWSRegion}/${env.Environment}")
]

template = libraryResource('dslTemplates/infra.groovy')
jobDsl scriptText: helpers.renderTemplate(template, params)
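The helpers.renderTemplate step is not shown in the post; a minimal sketch of what such a helper might look like, assuming Groovy's StreamingTemplateEngine (this reconstruction is a guess, not the poster's code):

// vars/helpers.groovy (hypothetical reconstruction)
import groovy.text.StreamingTemplateEngine

def renderTemplate(String templateText, Map bindings) {
    // Expands <% %>, <%= %> and ${} placeholders against the supplied bindings
    new StreamingTemplateEngine().createTemplate(templateText).make(bindings).toString()
}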
shared library method: getTFConfigs
#!/usr/bin/env groovy
import com.cloudbees.groovy.cps.NonCPS
import java.util.zip.ZipEntry
import java.util.zip.ZipInputStream

@NonCPS
def call(String repoUrl, String filter = "") {
    // Stream the repo as a zip archive and collect entry names matching the filter
    def gitProc = new ProcessBuilder(
        "git",
        "archive",
        "--format=zip",
        "--remote=${repoUrl}",
        "main").start()
    def zipIn = new ZipInputStream(gitProc.inputStream)
    def zipMembers = []
    while (true) {
        ZipEntry entry = zipIn.getNextEntry()
        if (entry == null) break
        if (entry.getName().contains(filter)) {
            def entryName = entry.getName()
            zipMembers.push("${entryName}")
        }
    }
    println zipMembers
    return zipMembers
}
dslTemplates/infra.groovy template
pipelineJob("${serviceName}") {
    description("Apply TF for ${serviceName} to all environment configurations")
    definition {
        parameters {
            <% configParams.each { %>
            booleanParam(name: "<%= "${it}" %>", defaultValue: true, description: "<%= "${it}" %>")
            <% } %>
        }
        logRotator {
            numToKeep(20)
        }
        cpsScm {
            scm {
                git {
                    remote {
                        url("${repoUrl}")
                        credentials("${sshCredentials}")
                    }
                    branch('*/main')
                }
            }
            scriptPath('infra.groovy')
        }
    }
}
Template result
...
definition {
    parameters {
        booleanParam(name: env1.tfvars, defaultValue: true, description: env1.tfvars)
        booleanParam(name: env2.tfvars, defaultValue: true, description: env2.tfvars)
    }
...
When the seed job runs and executes the code, the parameters should be updated with a checkbox for each environment. However, the Job DSL step fails with this:
ERROR: (script, line 6) No signature of method: javaposse.jobdsl.dsl.helpers.BuildParametersContext.booleanParam() is applicable for argument types: (java.util.LinkedHashMap) values: [[name:env1.tfvars, defaultValue:true, ...]]
Possible solutions: booleanParam(java.lang.String), booleanParam(java.lang.String, boolean), booleanParam(java.lang.String, boolean, java.lang.String)
Finished: FAILURE
I have tried applying toString() at various steps and cannot seem to find a solution.
I have also tried writing the entire Job DSL script to a file and reading it back in using jobDsl targets: filename, and got the same result!
Banging my head, as it were!
Thanks
It looks like you used Pipeline syntax for the parameters in your DSL script. To define a parameter in a Job DSL script, do not use name, defaultValue and description as named arguments; booleanParam takes positional arguments (see the Job DSL plugin documentation):
booleanParam('BOOL_PARAM', true, 'This is a boolean param')
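Applied to the template above, that would look roughly like this. Note the quotes around the generated values; the template result shows the names and descriptions are currently emitted unquoted, which would break even with the positional form:

parameters {
    <% configParams.each { %>
    booleanParam("<%= it %>", true, "<%= it %>")
    <% } %>
}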
Question: why are some functions disallowed if called in a Jenkinsfile, but allowed if called in a shared library that is imported by that same Jenkinsfile?
This question is not specific to directory-creation, but I will use it as an example, since that is the context in which I discovered this behavior:
The following Jenkins pipeline succeeds in creating a directory:
@Library('my-shared-libs') _
pipeline {
    agent any
    stages {
        stage("1") {
            steps {
                script {
                    utils.MkDir("/home/user/workspace/prj/foo")
                }
            }
        }
    }
}
// vars/utils.groovy
import java.io.File

def MkDir(the_dir) {
    def f = new File(the_dir)
    if (!f.mkdirs()) { echo "Failed creating ${the_dir}" }
    else { echo "Succeeded creating ${the_dir}" }
}
But the following pipeline:
pipeline {
    agent any
    stages {
        stage("1") {
            steps {
                script {
                    def the_dir = "/home/user/workspace/prj/bar"
                    def f = new File(the_dir)
                    if (!f.mkdirs()) { echo "Failed creating ${the_dir}" }
                    else { echo "Succeeded creating ${the_dir}" }
                }
            }
        }
    }
}
...fails with this error:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.io.File java.lang.String
Why does the directory creation fail when called from the Jenkinsfile, but succeed when called from the shared library imported by that same Jenkinsfile?
The broader question this raises: what is the underlying distinction between a Jenkinsfile and the shared libraries it uses? There is some kind of delineation between Jenkinsfile declarative-syntax scripts, Groovy, and shared libraries that isn't quite gelling in my mind. I'd be grateful if someone could help me understand.
Following @injecteer's suggestion, I tried the following modification to the second Jenkinsfile:
def the_dir = "/home/user/workspace/prj/bar"
def u = new URL( "file://${the_dir}" ).toURI()
def f = new File(u)
if ( ! f.mkdirs() ) { echo "Failed creating ${the_dir}" }
else { echo "Succeeded creating ${the_dir}" }
...which resulted in this error:
Scripts not permitted to use method java.net.URL toURI. Administrators can decide whether to approve or reject this signature.
It's not an option for me to do (or have done) this administrative approval, so this suggestion unfortunately can't work for me.
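For reference, one sandbox-friendly workaround is to delegate the directory creation to a regular pipeline step instead of java.io.File, since shell steps need no signature approval. A minimal sketch:

pipeline {
    agent any
    stages {
        stage("1") {
            steps {
                // Runs on the agent's filesystem and requires no script approval
                sh 'mkdir -p /home/user/workspace/prj/bar'
            }
        }
    }
}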
I'm trying to convert my Jenkins pipeline to a shared library, since it can be reused across most of our applications. As part of that, I created a Groovy file in the vars folder, kept the pipeline in a Jenkinsfile in GitHub, and am able to call it successfully from Jenkins.
As part of improving this, I want to pass params, variables and node labels through a file, so that we never have to touch the Jenkins pipeline itself; if we want to modify any vars or params, we do that in the git repo.
pipeline {
    agent {
        node {
            label 'jks_deployment'
        }
    }
    environment {
        ENV_CONFIG_ID = 'jenkins-prod'
        ENV_CONFIG_FILE = 'test.groovy'
        ENV_PLAYBOOK_NAME = 'test.tar.gz'
    }
    parameters {
        string(
            defaultValue: 'test.x86_64',
            description: 'Enter app version',
            name: 'app_version'
        )
        choice(
            choices: ['10.0.0.1', '10.0.0.2', '10.0.0.3'],
            description: 'Select a host to be deployed',
            name: 'host'
        )
    }
    stages {
        stage("reading properties from properties file") {
            steps {
                // Use a script block to do custom scripting
                script {
                    def props = readProperties file: 'extravars.properties'
                    env.var1 = props.var1
                    env.var2 = props.var2
                }
                echo "The variable 1 value is $var1"
                echo "The variable 2 value is $var2"
            }
        }
    }
}
In the above code I used the Pipeline Utility Steps plugin and was able to read variables from the extravars.properties file. Can the same be done for Jenkins parameters? Or is there a more suitable method to pass these parameters via a file from the git repo?
Also, is it possible to pass a variable for the node label?
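For reference, parameters can also be (re)defined at runtime with the properties step, so the same file-driven approach could extend to them. A sketch, in which the appVersion and hosts keys and the comma-separated host list are assumptions about the properties file, not something from the post:

// Hypothetical keys; adjust to whatever the properties file actually contains
def props = readProperties file: 'extravars.properties'
properties([
    parameters([
        string(name: 'app_version', defaultValue: props.appVersion ?: '', description: 'Enter app version'),
        choice(name: 'host', choices: (props.hosts ?: '').tokenize(','), description: 'Select a host to be deployed')
    ])
])

Parameters rebound this way take effect on the next build, after the step has run once.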
=====================================================================
Below are the improvements I have made in this project.
Used the node label plugin to pass the node name as a variable.
Below is my vars/sayHello.groovy file content:
def call(body) {
    // Evaluate the body block and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()

    pipeline {
        agent {
            node {
                label "${pipelineParams.slaveName}"
            }
        }
        stages {
            stage("reading properties from properties file") {
                steps {
                    // Use a script block to do custom scripting
                    script {
                        readProperties(file: 'extravars.properties').each { key, value -> env[key] = value }
                    }
                    echo "The variable 1 value is $var1"
                    echo "The variable 2 value is $var2"
                }
            }
            stage('stage2') {
                steps {
                    sh "echo ${var1}"
                    sh "echo ${var2}"
                    sh "echo ${pipelineParams.appVersion}"
                    sh "echo ${pipelineParams.hostIp}"
                }
            }
        }
    }
}
Below is my vars/params.groovy file
properties([
    parameters([
        choice(choices: ['10.80.66.171', '10.80.67.6', '10.80.67.200'], description: 'Select a host to be deployed', name: 'host'),
        string(defaultValue: 'fxxxxx.x86_64', description: 'Enter app version', name: 'app_version')
    ])
])
Below is my Jenkinsfile:
def _hostIp = params.host
def _appVersion = params.app_version
sayHello {
slaveName = 'master'
hostIp = _hostIp
appVersion = _appVersion
}
Is there anything we can still improve? Any suggestions, let me know.
Good day!
I would like to call a function that sets a choice parameter with the list of folders in the branch. I am getting a null value at the end.
My Jenkinsfile looks like the below:
def findCollectionDirs() {
    directories = sh(
        script: "find . -path './[^.]*/*/*' -prune -type d",
        returnStdout: true
    )
    return directories
}

def directories

pipeline {
    agent {
        label 'slave'
    }
    triggers {
        cron(cron_string)
    }
    parameters {
        choice(choices: "\n" + directories, description: 'Please select a directory', name: 'directory')
    }
    stages {
        stage("Checkout scm") {
            steps {
                deleteDir()
                checkout scm
                //sh 'directories = findCollectionDirs()'
                findCollectionDirs()
            }
        }
    }
}
I tried calling it as below:
sh 'directories = findCollectionDirs()'
or
findCollectionDirs()
or
directories.findCollectionDirs()
but the value is still null.
Can someone help me call the function correctly so I get the right values in the choice parameter?
Thanks in advance
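For reference, a pattern that is known to work is computing the list in a scripted prelude and rebinding the parameter with the properties step; the refreshed choices then show up on the next run. A sketch (the label and parameter names are taken from the post; error handling omitted):

node('slave') {
    deleteDir()
    checkout scm
    // Compute the directory list on the agent, then rebind the choice parameter
    def directories = sh(
        script: "find . -path './[^.]*/*/*' -prune -type d",
        returnStdout: true
    ).trim()
    properties([
        parameters([
            choice(name: 'directory', choices: "\n" + directories, description: 'Please select a directory')
        ])
    ])
}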
I need to check for the existence of a certain .exe file in my workspace as part of my pipeline build job. I tried to use the Groovy script below from my Jenkinsfile to do this, but I think the File class by default looks for the workspace directory on the Jenkins master, and fails.
import static groovy.io.FileType.FILES

@com.cloudbees.groovy.cps.NonCPS
def checkJacoco(isJacocoEnabled) {
    new File(pwd()).eachFileRecurse(FILES) { it ->
        if (it.name == 'jacoco.exec' || it.name == 'Jacoco.exec')
            isJacocoEnabled = true
    }
}
How do I access the file system on the slave using Groovy from inside the Jenkinsfile?
I also tried the code below, but I am getting a No such property: build for class: groovy.lang.Binding error. I also tried to use the manager object instead, but got the same error.
import hudson.FilePath
import static groovy.io.FileType.FILES

@com.cloudbees.groovy.cps.NonCPS
def checkJacoco(isJacocoEnabled) {
    channel = build.workspace.channel
    rootDirRemote = new FilePath(channel, pwd())
    println "rootDirRemote::$rootDirRemote"
    rootDirRemote.eachFileRecurse(FILES) { it ->
        if (it.name == 'jacoco.exec' || it.name == 'Jacoco.exec') {
            println "Jacoco Exists:: ${it.path}"
            isJacocoEnabled = true
        }
    }
}
Had the same problem, found this solution:
import hudson.FilePath
import jenkins.model.Jenkins

node("aSlave") {
    writeFile file: 'a.txt', text: 'Hello World!'
    listFiles(createFilePath(pwd()))
}

def createFilePath(path) {
    if (env['NODE_NAME'] == null) {
        error "envvar NODE_NAME is not set, probably not inside a node {} or running an older version of Jenkins!"
    } else if (env['NODE_NAME'].equals("master")) {
        // On the master the path is local; FilePath expects a java.io.File here
        return new FilePath(new File(path))
    } else {
        return new FilePath(Jenkins.getInstance().getComputer(env['NODE_NAME']).getChannel(), path)
    }
}
@NonCPS
def listFiles(rootPath) {
    print "Files in ${rootPath}:"
    for (subPath in rootPath.list()) {
        echo "  ${subPath.getName()}"
    }
}
The important thing here is that createFilePath() isn't annotated with @NonCPS, since it needs access to the env variable. Using @NonCPS removes access to the "Pipeline goodness", but on the other hand it doesn't require all local variables to be serializable.
You should then be able to do the search for the file inside the listFiles() method.
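Adapting that answer to the original jacoco.exec check: FilePath can also scan recursively with Ant-style include patterns, which avoids walking entries manually. A sketch, assuming FilePath.list(String) with a comma-separated pattern covering both spellings from the question:

node("aSlave") {
    // list(String) scans the remote directory recursively using Ant-style globs
    def matches = createFilePath(pwd()).list('**/jacoco.exec,**/Jacoco.exec')
    def isJacocoEnabled = matches.length > 0
    echo "Jacoco enabled: ${isJacocoEnabled}"
}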