Use variable in nexusArtifactUploader/Jenkinsfile - jenkins

I have this Jenkins DSL script, which reads the version from the pom.xml file and uses it in nexusArtifactUploader. It is currently not working: I get the error "groovy.lang.MissingPropertyException: No such property: version for class: groovy.lang.Binding". I am new to Groovy/Jenkins DSL and don't know how to make it work.
stage('Nexus') {
    steps {
        script {
            def pom = readMavenPom file: 'pom.xml'
            echo pom.version
            nexusArtifactUploader(
                nexusVersion: 'nexus3',
                protocol: 'http',
                nexusUrl: 'nexus.example.com:8080/nexus',
                groupId: 'com.example',
                version: "${pom.version}",
                repository: 'example',
                credentialsId: 'ciuser',
                artifacts: [
                    [artifactId: 'com.example',
                     file: 'com.example-' + version + '.jar',
                     type: 'jar']
                ]
            )
        }
    }
}

I think your error is actually on this line:
file: 'com.example-' + version + '.jar',
Try replacing version with pom.version.
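For example (a sketch based on the stage above, reusing the question's placeholder URL, repository and credentials), with pom.version used in both places:
script {
    def pom = readMavenPom file: 'pom.xml'
    nexusArtifactUploader(
        nexusVersion: 'nexus3',
        protocol: 'http',
        nexusUrl: 'nexus.example.com:8080/nexus',
        groupId: 'com.example',
        version: pom.version,
        repository: 'example',
        credentialsId: 'ciuser',
        artifacts: [
            [artifactId: 'com.example',
             // was 'com.example-' + version + '.jar' -- the bare `version` is not bound
             file: "com.example-${pom.version}.jar",
             type: 'jar']
        ]
    )
}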

Related

groovy: java.lang.IllegalArgumentException: Expected named arguments but got

I have a Jenkins pipeline as follows, and it is failing with the error below. I suspect there is something funky going on with Java HashMaps, but I am not sure at all. Can someone help me with that?
pipeline {
    agent { label 'master' }
    parameters {
        string(defaultValue: '123456789.ngrok.io/app-name:v1', description: '', name: 'docker_image', trim: true)
        password(defaultValue: 'app_db_password', description: '', name: 'app_db_password')
    }
    environment {
        .....
    }
    stages {
        stage('DeployAWS') {
            steps {
                script {
                    withEnv(["ENV_APP_DB_PASSWORD=${params.app_db_password}"]) {
                        env.artifacts = sh(
                            returnStdout: true,
                            script: """
                                set +x
                                python3 some_script.py --app_db_password='${ENV_APP_DB_PASSWORD}'
                                set -x
                            """
                        )
                        def encrypted_key_value_map = readJSON text: env.artifacts
                        ansiblePlaybook credentialsId: 'dev-server', disableHostKeyChecking: true, "-e \"docker_image=${env.docker_image} fernet_key=${encrypted_key_value_map["fernet_key"]} app_db_password=${encrypted_key_value_map["app_db_password"]}\"", inventory: 'playbooks/dvmt30/dev.inv', playbook: "playbooks/dvmt30/deploy-docker.yml"
                    }
                }
            }
        }
    }
}
ERROR
java.lang.IllegalArgumentException: Expected named arguments but got [{credentialsId=dev-server, disableHostKeyChecking=true, inventory=playbooks/dvmt30/dev.inv, playbook=playbooks/dvmt30/deploy-docker.yml}, -e "docker_image=123456789.ngrok.io/app-name:v1 fernet_key=$$$$$$$$$ app_db_password=*********"]
See the plugin documentation: the ansiblePlaybook step receives named arguments (a map), and the arguments you passed are malformed because the middle value is missing its key.
To fix the issue, pass that value with the extras key:
ansiblePlaybook credentialsId: 'dev-server',
disableHostKeyChecking: true,
extras: "-e \"docker_image=${env.docker_image} fernet_key=${encrypted_key_value_map["fernet_key"]} app_db_password=${encrypted_key_value_map["app_db_password"]}\"",
inventory: 'playbooks/dvmt30/dev.inv',
playbook: "playbooks/dvmt30/deploy-docker.yml"

Jenkins pipeline send email error "java.lang.NullPointerException"

I'm integrating the Squish automation tool with a Jenkins pipeline. Everything went smoothly. Now I need to send an email report after the job is done. I have a Groovy file in the pre-send script, but when this script runs, it throws an exception:
java.lang.NullPointerException: Cannot invoke method getRootDir() on null object
I figured out that the build object in my Groovy script is null, but I am not sure why. Note that if I use the built-in Squish plugin and Editable Email on Jenkins, everything works fine. The problem only happens since I moved to Pipeline.
This is my Groovy script:
List getJenkinsTestResultFiles() {
    File squishResultsPath = new File( build.getRootDir(), "squishResults" )
    if ( !squishResultsPath.exists() || !squishResultsPath.isDirectory() ) {
        throw new GroovyRuntimeException( "Squish results path does not exist at: " + squishResultsPath.getAbsolutePath() )
    }
    File summaryFile = new File( squishResultsPath, "summary.xml" )
    if ( !summaryFile.exists() || !summaryFile.isFile() ) {
        throw new GroovyRuntimeException( "Squish summary file does not exist at: " + summaryFile.getAbsolutePath() )
    }
    List resultFiles = []
    def summaries = new XmlSlurper().parse( summaryFile )
    summaries.summary.each {
        resultFiles.push( new File( squishResultsPath, it.xmlFileName.text() ) )
    }
    return resultFiles
}
This is my Pipeline script:
node('Slave_10.133.88.151') {
    stage('Squish Test') {
        step([$class: 'SquishBuilder',
              abortBuildOnError: false,
              extraOptions: '',
              host: '',
              port: '',
              resultFolder: "${WORKSPACE}\\Squish_Report",
              skipTestCases: false,
              snoozeFactor: '1',
              squishConfig: 'Default',
              testCase: '',
              testSuite: "${WORKSPACE}\\${TEST_SUITE}"])
    }
    stage('Send Email') {
        emailext body: 'Test',
                 postsendScript: '${SCRIPT, template="SquishSummary.groovy"}',
                 subject: 'Pipeline',
                 to: 'hoang@local.com'
    }
}
The build object is a hudson.model.Build object, and since you are calling a shared library you'll have to import the Build class in your Groovy script:
import hudson.model.Build
at the top of your shared library.
If you have already imported it, the issue could be that you haven't initialized it inside your shared library.
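A minimal sketch of one workaround (an assumption, not something from the plugin documentation): pass the build into the helper explicitly instead of relying on a build binding variable that may not be set in the pipeline context:
import hudson.model.Run

// Sketch: take the build as a parameter rather than relying on a `build`
// binding variable, which is null here.
List getJenkinsTestResultFiles(Run build) {
    File squishResultsPath = new File(build.getRootDir(), "squishResults")
    if (!squishResultsPath.isDirectory()) {
        throw new GroovyRuntimeException("Squish results path does not exist at: " + squishResultsPath.getAbsolutePath())
    }
    def summaries = new XmlSlurper().parse(new File(squishResultsPath, "summary.xml"))
    return summaries.summary.collect { new File(squishResultsPath, it.xmlFileName.text()) }
}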

Jenkins Shared Library: failing to reference a second Groovy file

I'm failing to reference a second Groovy file in the src folder of my repo.
My setup is this: the library is named pipeline-library-demo and is hosted on GitHub.
I have added a second Groovy file, app_config.groovy, to the src folder:
#!/usr/bin/groovy
def bob(opt) {
    sh "docker run --rm " +
       '--env APP_PATH="`pwd`" ' +
       '--env RELEASE=${RELEASE} ' +
       "-v \"`pwd`:`pwd`\" " +
       "-v /var/run/docker.sock:/var/run/docker.sock " +
       "docker_repo/bob:1.4.0-8 ${opt}"
}

def test(name) {
    echo "Hello ${name}"
}
The Jenkins file I am using is:
pipeline {
    Library('pipeline-library-demo') _
    agent {
        node {
            label params.SLAVE
            config = new app_config()
        }
    }
    parameters {
        string(name: 'SLAVE', defaultValue: 'so_slave')
    }
    stages {
        stage('Demo') {
            steps {
                echo 'Hello World'
                sayHello 'Dave'
            }
        }
        stage('bob') {
            steps {
                config.test 'bob'
                config.bob '--help'
            }
        }
    }
}
I think I am not referencing app_config.groovy correctly, so it is not being found.
The library call should come at the start of the Jenkinsfile; please follow the examples below.
If you have added the library configuration in the Jenkins configuration, then the call should look like this:
@Library('pipeline-library-demo') _
If you want to load the library dynamically, call it like this:
library identifier: 'custom-lib@master',
        retriever: modernSCM([$class: 'GitSCMSource',
                              remote: 'git@git.mycorp.com:my-jenkins-utils.git',
                              credentialsId: 'my-private-key'])
See the Shared Libraries documentation for details.
Also, please define a package in your app_config.groovy (e.g. package com.cleverbuilder).
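Putting that together, a sketch of what the Jenkinsfile could look like (assuming app_config.groovy is moved to src/com/cleverbuilder/ and declares package com.cleverbuilder; a class from src also has to be instantiated inside a script block, not in the agent section):
@Library('pipeline-library-demo') _
import com.cleverbuilder.app_config

pipeline {
    agent { node { label params.SLAVE } }
    parameters {
        string(name: 'SLAVE', defaultValue: 'so_slave')
    }
    stages {
        stage('Demo') {
            steps {
                echo 'Hello World'
                sayHello 'Dave'
            }
        }
        stage('bob') {
            steps {
                script {
                    def config = new app_config()   // instantiate the src class here
                    config.test 'bob'
                    config.bob '--help'
                }
            }
        }
    }
}
Note that pipeline steps such as sh and echo are not automatically available inside src classes; the pipeline script object usually has to be handed to the class (for example by passing this into it), so app_config may need a small change as well.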

Wrong variable format when filling YAML with a Jenkinsfile

I have a YAML file that I need to fill in from Jenkins:
global:
  name: 'my_name'
  code: 'my_code'
So, I define Jenkins params:
string(name: 'NAME', defaultValue: 'Nightly Valid', description: 'Nightly Valid Name')
string(name: 'CODE', defaultValue: 'NIGHTLY', description: '')
And further in my Jenkinsfile, I have:
script {
    def filename = "configuration.yml"
    def yaml = readYaml file: filename
    // General data
    yaml.global.name = "${params.NAME}"
    yaml.global.code = "${params.CODE}"
    // ...
    sh "rm $filename"
    writeYaml file: filename, data: yaml
}
When I do that, I get:
global:
  name: '''my_name'''
  code: '''my_code'''
How can I get just:
global:
  name: 'my_name'
  code: 'my_code'
"${params.NAME}" is a GStringImpl; convert it to a plain String directly: "${params.NAME}".toString()
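Applied to the script above, either of these avoids the extra quoting when the map is written back out:
// Either drop the GString wrapper (params.NAME is already a plain String)...
yaml.global.name = params.NAME
// ...or convert the GString to a java.lang.String explicitly.
yaml.global.code = "${params.CODE}".toString()
writeYaml file: filename, data: yaml
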
Creating XML file using StreamingMarkupBuilder() in Jenkins

I have a Groovy method that creates XML files. I have verified it using the groovyConsole, but when I use this snippet in my Jenkinsfile, the XML file is not created in the workspace, although the job completes successfully.
Question: how do I make sure that the XML file is generated in the workspace? I will be using this XML in subsequent stages of the Jenkinsfile.
Here is what the Jenkinsfile looks like:
import groovy.xml.*

node('master') {
    deleteDir()
    stage('Checkout') {
        // checks out the code
    }
    generateXML("deploy.xml") // This calls the method to generate the XML file
    //stage for packaging
    //Stage to Publish
    //Stage to Deploy
}

@NonCPS
def generateXML(file1) {
    println "Generating the manifest XML........"
    def workflows = [
        [ name: 'A', file: 'fileA', objectName: 'wf_A', objectType: 'workflow', sourceRepository: 'DEV2', folderNames: [ multifolder: '{{multifolderTST}}', multifolder2: '{{multifolderTST2}}' ]],
        [ name: 'B', file: 'fileB', objectName: 'wf_B', objectType: 'workflow', sourceRepository: 'DEV2', folderNames: [ multifolder3: '{{multifolderTST3}}', multifolder4: '{{multifolderTST4}}' ]]
    ]
    def builder = new StreamingMarkupBuilder()
    builder.encoding = 'UTF-8'
    new File(file1).newWriter() << builder.bind {
        mkp.xmlDeclaration()
        mkp.declareNamespace(udm: 'http://www.w3.org/2001/XMLSchema')
        mkp.declareNamespace(powercenter: 'http://www.w3.org/2001/XMLSchema')
        delegate.udm.DeploymentPackage(version: '$BUILD_NUMBER', application: "informaticaApp") {
            delegate.deployables {
                workflows.each { item ->
                    delegate.powercenter.PowercenterXml(name: item.name, file: item.file) {
                        delegate.scanPlaceholders(true)
                        delegate.sourceRepository(item.sourceRepository)
                        delegate.folderNameMap {
                            item.folderNames.each { name, value ->
                                it.entry(key: name, value)
                            }
                        }
                        delegate.objectNames {
                            delegate.value(item.objectName)
                        }
                        delegate.objectTypes {
                            delegate.value(item.objectType)
                        }
                    }
                }
            }
            delegate.dependencyResolution('LATEST')
            delegate.undeployDependencies(false)
        }
    }
}
I found the file in the / directory, since I hadn't given any path to the file writer.
UPDATE:
This is not the right solution for a distributed environment. It appears that Java file I/O operations only work on the master and not on the agent machines.
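A sketch of a workspace-safe alternative (an assumption, not the original setup): have the @NonCPS method return the XML as a String and write it with the writeFile step, which writes relative to the current node's workspace:
node('master') {
    // writeFile is a pipeline step, so the file lands in the workspace
    // of whichever node runs this block, agents included.
    writeFile file: 'deploy.xml', text: generateXML()
}

@NonCPS
def generateXML() {
    def builder = new groovy.xml.StreamingMarkupBuilder()
    builder.encoding = 'UTF-8'
    def xml = builder.bind {
        mkp.xmlDeclaration()
        // ... same markup as in the method above ...
    }
    return xml.toString()
}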
