I'm getting the same issue listed as fixed here: https://issues.jenkins.io/browse/JENKINS-58643
We are using Jenkins 2.190.3.2.
stage('upload artefactory') {
    steps {
        sh "touch /tmp/blabla"
        sh "gzip /tmp/blabla"
        script {
            server = Artifactory.server('myid')
            server.credentialsId = 'my-cred'
            def uploadSpec = """{
                "files": [
                    {
                        "pattern": "/tmp/blabla.gz",
                        "target": "pkg/com/myentreprise/mystuff/scm/dumps/solr/"
                    }
                ]
            }"""
            server.upload spec: uploadSpec, failNoOp: true
        }
    }
}
[Pipeline] artifactoryUpload
expected to call org.jfrog.hudson.pipeline.common.types.ArtifactoryServer.upload but wound up catching artifactoryUpload; see: https://jenkins.io/redirect/pipeline-cps-method-mismatches/
This was fixed in Pipeline: Groovy 2.75.
We have 2.74, which is why we still have this bug.
The only solution seems to be to upgrade.
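To confirm which version you are on before upgrading, a quick Script Console check (a sketch, assuming admin access and that the Pipeline: Groovy plugin's short name is workflow-cps):

// Run in Manage Jenkins > Script Console; prints the installed
// Pipeline: Groovy (workflow-cps) plugin version, e.g. "2.74".
println Jenkins.instance.pluginManager.getPlugin('workflow-cps')?.version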
I have the below Jenkins pipeline and I am trying to echo out the SolutionName and TargetVersion values. I tried different approaches, and each either gave me an error or not the result I wanted.
If I use echo "Solution Name: $solution['SolutionName']", the result is Solution Name: SolutionA={SolutionName=SolutionA, TargetVersion=1.0.0.0}['SolutionName'], which is the map entry itself with ['SolutionName'] appended.
If I use echo "Solution Name: ${solution.SolutionName}", it throws org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: No such field found: field java.util.AbstractMap$SimpleImmutableEntry SolutionName
def NodeLabel = 'windows'

// Solution
def SolutionMap = [
    SolutionA: [
        SolutionName: 'SolutionA',
        TargetVersion: '1.0.0.0'
    ],
    SolutionB: [
        SolutionName: 'SolutionB',
        TargetVersion: '2.1.0.0'
    ]
]
pipeline {
    agent { node { label "${NodeLabel}" } }
    stages {
        stage('Test') {
            steps {
                script {
                    SolutionMap.each { solution ->
                        stage(solution.key) {
                            echo "Solution Name: ${solution['SolutionName']}"
                            echo "Target Version: ${solution['TargetVersion']}"
                        }
                    }
                }
            }
        }
    }
}
I figured it out; apparently I need to call:
echo "Solution Name: ${solution.value.SolutionName}"
So referencing $solution gives the whole map entry, not its value; I need $solution.value to get the value, and from there .SolutionName to reach the nested value.
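Applied to the pipeline above, the loop then becomes:

SolutionMap.each { solution ->
    stage(solution.key) {
        // `solution` is a Map.Entry, so go through .value to reach the inner map
        echo "Solution Name: ${solution.value.SolutionName}"
        echo "Target Version: ${solution.value.TargetVersion}"
    }
}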
So I am downloading multiple artifacts with JFrog:
rtDownload(
    serverId: 'Artifactory-1',
    spec: '''{
        "files": [
            {
                "pattern": "bazinga-repo/froggy-files/",
                "target": "bazinga/"
            }
        ]
    }''',
    // Optional - Associate the downloaded files with the following custom build name
    // and build number, as build dependencies.
    // If not set, the files will be associated with the default build name and build
    // number (i.e., the Jenkins job name and number).
    buildName: 'holyFrog',
    buildNumber: '42'
)
This works, but it runs asynchronously and I need to use the results as soon as it finishes. How do I wait for each rtDownload in pipeline syntax?
The below is working for me when downloading two artifacts:
def target = params.BuildInfo.trim()
def downloadSpec = """{
    "files": [{
        "pattern": "${artifactory}.zip",
        "target": "./${target}.zip"
    }]
}"""
def buildInfo = server.download spec: downloadSpec
def files = findFiles(glob: "**/${target}.zip") // Define `files` here
if (files) { // And verify it here, so it'll wait
    ...
}
Personally, I ended up implementing Khoa's idea this way:
try {
    // attempt to retrieve the results from Artifactory
    rtDownload(
        serverId: 'my_arti_server',
        spec: """{
            "files": [
                {
                    "pattern": "somepath/run_*.zip",
                    "target": "run/"
                }
            ]
        }""",
        failNoOp: false // no failure if no file was found
    )
    // rtDownload is async, we have to wait for download completion
    def count = 5
    while (count > 0) {
        sh script: """#!/bin/bash +e
            chmod 777 run/run_*.zip
        """
        def files = findFiles glob: "run/run_*.zip"
        if (files.length > 0) {
            break
        }
        sleep(5)
        count--
    }
} catch (Exception e) {
    echo 'Exception occurred: ' + e.toString()
}
def files = findFiles glob: "run/run_*.zip"
if (files.length == 0) {
    error("files couldn't be found")
}
This is not perfect, but it waits for some files to be present. If you have only one file it should work, but if you have several files, it may continue as soon as the first file is downloaded. I haven't checked, but with this I assume that:
- a file can be found once its download is completed (no file size changing regularly)
- all files are downloaded "at the same time"
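If several files are expected, one way to harden the loop (an untested sketch, using the same findFiles step as above) is to wait until the file count stops changing between two polls instead of breaking on the first match:

def previousCount = -1
def attempts = 10 // give up after 10 polls
while (attempts > 0) {
    def files = findFiles(glob: 'run/run_*.zip')
    // Stop only once at least one file exists and the count was
    // identical on the previous poll, i.e. no new files appeared.
    if (files.length > 0 && files.length == previousCount) {
        break
    }
    previousCount = files.length
    sleep(5) // seconds
    attempts--
}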
I have a Jenkins Job DSL seed job that calls out to a couple of pipeline jobs, e.g.:
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
pipelineJob("job2") {
definition {
cps {
script(readFileFromWorkspace('job2.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
}
}
}
job1.groovy and job2.groovy are standard Jenkinsfile-style pipelines.
I want to pass a couple of common maps into these jobs. These contain things that may vary between environments, e.g. target servers and credential names.
Something like:
def SERVERS_MAP = [
    'prod': [
        'prod-server1',
        'prod-server2',
    ],
    'dev': [
        'dev-server1',
        'dev-server2',
    ],
]
Can I define a map in my seed job that I can then pass and access as a map in my pipeline jobs?
I've come up with a hacky workaround using the pipeline-utility-steps plugin.
Essentially I pass my data maps around as JSON.
So my seed job might contain:
def SERVERS_MAP = '''
{
    "prod": [
        "prod-server1",
        "prod-server2"
    ],
    "dev": [
        "dev-server1",
        "dev-server2"
    ]
}
'''
pipelineJob("job1") {
definition {
cps {
script(readFileFromWorkspace('job1.groovy'))
}
parameters {
choiceParam('ENV', ['dev', 'prod'], 'Build Environment')
stringParam('SERVERS_MAP', "${SERVERS_MAP}", "")
}
}
}
and my pipeline would contain something like:
def serversMap = readJSON text: SERVERS_MAP
def targetServers = serversMap["${ENV}"]
targetServers.each { server ->
    echo server
}
I could also extract these variables into a JSON file and read them from there.
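For example (a sketch, assuming a servers.json file committed next to the Jenkinsfile; readJSON comes from the same pipeline-utility-steps plugin):

// Read the map from a versioned JSON file instead of a string parameter
def serversMap = readJSON file: 'servers.json'
def targetServers = serversMap["${ENV}"]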
Although it works, it feels wrong somehow.
You can use a string parameter to pass the map value; the downstream job reads it as JSON.
UPSTREAM PIPELINE
timestamps {
    node("sse_lab_CI_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage("-- regression execute --") {
            def test_map = """
            {
                "gerrit_patchset_commit": "aad5fce",
                "build_cpu_x86_ubuntu": [
                    "centos_compatible_build_test",
                    "gdb_compatible_build_test",
                    "visual_profiler_compatible_build_test"
                ]
            }
            """
            build(job: 'tops_regression_down',
                parameters: [
                    string(name: 'UPSTREAM_JOB_NAME', value: "${env.JOB_BASE_NAME}"),
                    string(name: 'UPSTREAM_BUILD_NUM', value: "${env.BUILD_NUMBER}"),
                    string(name: 'MAP_PARAM', value: "${test_map}")
                ],
                propagate: true,
                wait: true)
        }
    }
}
DOWNSTREAM PIPELINE
timestamps {
    node("sse_lab_inspur_076") { // ${execNode}
        currentBuild.description = "${env.NODE_NAME};"
        stage('--in precondition--') {
            dir('./') {
                cleanWs()
                println("hello world")
                println("${env.MAP_PARAM}")
                Map result_json = readJSON(text: "${env.MAP_PARAM}")
                println(result_json)
            }
        }
    }
}
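Note that env.MAP_PARAM is only populated if the downstream job declares it as a parameter. A minimal sketch of declaring the parameters at the top of the downstream scripted pipeline (names match the upstream build() call above):

properties([
    parameters([
        string(name: 'UPSTREAM_JOB_NAME', defaultValue: ''),
        string(name: 'UPSTREAM_BUILD_NUM', defaultValue: ''),
        string(name: 'MAP_PARAM', defaultValue: '{}') // JSON payload from upstream
    ])
])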
We have a Jenkins job that uses a declarative pipeline.
This job can be triggered by different other builds.
In the declarative pipeline how can I find out which build has triggered the pipeline?
Code sample below:
pipeline {
    agent any
    stages {
        stage('find upstream job') {
            steps {
                script {
                    // Note: currentBuild.rawBuild requires script approval
                    // when running in the Groovy sandbox
                    def causes = currentBuild.rawBuild.getCauses()
                    for (cause in causes) {
                        if (cause.class.toString().contains("UpstreamCause")) {
                            println "This job was caused by job " + cause.upstreamProject
                        } else {
                            println "Root cause : " + cause.toString()
                        }
                    }
                }
            }
        }
    }
}
You can check the job's REST API to get extra information, like below:
{
    "_class" : "org.jenkinsci.plugins.workflow.job.WorkflowRun",
    "actions" : [
        {
            "_class" : "hudson.model.ParametersAction",
            "parameters" : []
        },
        {
            "_class" : "hudson.model.CauseAction",
            "causes" : [
                {
                    "_class" : "hudson.model.Cause$UpstreamCause",
                    "shortDescription" : "Started by upstream project \"larrycai-sto-46908390\" build number 7",
                    "upstreamBuild" : 7,
                    "upstreamProject" : "larrycai-sto-46908390",
                    "upstreamUrl" : "job/larrycai-sto-46908390/"
                }
            ]
        },
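For example, a sketch of fetching just the causes over HTTP from a pipeline (server URL and credentials are placeholders; the tree query parameter trims the response to the fields shown above):

sh '''curl -s -u USER:API_TOKEN \
    "https://JENKINS_URL/job/larrycai-sto-46908390/7/api/json?tree=actions[causes[shortDescription,upstreamBuild,upstreamProject,upstreamUrl]]"'''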
Reference:
https://jenkins.io/doc/pipeline/examples/#get-build-cause
Get Jenkins upstream jobs
I realize that this is a couple of years old, but the previous response required some additional security setup in my Jenkins instance. After a bit of research, I found a feature request completed in 11/2018 that addresses this need and exposes build causes in currentBuild. Here is a little lib I wrote that returns the cause, with the string "JOB/" prepended if the build was triggered by another build:
def call(body) {
    if (body == null) { body = { DEBUG = false } }
    def myParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = myParams
    body()

    def causes = currentBuild.getBuildCauses()
    if (myParams.DEBUG) {
        echo "causes count: " + causes.size().toString()
        echo "causes text : " + causes.toString()
    }
    for (cause in causes) {
        // echo cause
        if (cause._class.toString().contains("UpstreamCause")) {
            return "JOB/" + cause.upstreamProject
        } else {
            return cause.toString()
        }
    }
}
To use this, I place it in a library in a file named "buildCause.groovy". Then I reference the library at the top of my Jenkinsfile:
library identifier: 'lib@master', retriever: modernSCM(
    [$class: 'GitSCMSource',
     remote: '<LIBRARY_REPO_URL>',
     credentialsId: '<LIBRARY_REPO_CRED_ID>',
     includes: '*'])
Then I can call it as needed within my pipeline:
def cause = buildCause()
echo cause
if (!cause.contains('JOB/')) {
    echo "started by user"
} else {
    echo "triggered by job"
}
Larry's answer didn't quite work for me.
But after I modified it slightly with the help of these docs, this version works:
def causes = currentBuild.getBuildCauses()
for (cause in causes) {
    if (cause._class.toString().contains("UpstreamCause")) {
        println "This job was caused by job " + cause.upstreamProject
    } else {
        println "Root cause : " + cause.toString()
    }
}
P.S. Actually, Daniel's answer mentions this method, but there's so much clutter that I only noticed it after I wrote my solution.
I'm having trouble downloading a build from my Artifactory server to my Windows Jenkins slave node using the Jenkins Pipeline plugin. It all appears to be going fine, but it doesn't actually download the file. Am I doing something wrong?
I don't see any requests in my Artifactory system logs to download, just to upload.
(2017-04-25 18:39:48,096 [http-nio-8081-exec-2] [INFO ] (o.a.e.UploadServiceImpl:516) - Deploy to 'BUILDS:windows/5840/build.tar.gz' Content-Length: 278600525)
I've been using this as a reference: https://wiki.jenkins-ci.org/pages/viewpage.action?pageId=99910084
Here's the output from my Jenkins pipeline:
For pattern: build.tar.gz 1 artifacts were found.
Deploying artifact: http://myartifactory:8081/artifactory/BUILDS/windows/5840/build.tar.gz
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] timeout
Timeout set to expire in 3 min 0 sec
[Pipeline] {
[Pipeline] node
Running on test-windows-0 in C:/jenkinsroot/workspace/test-windows
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
{
"files": [
{
"pattern": "BUILDS/windows/5840/build.tar.gz",
"target": "download/",
}
]
}
[Pipeline] echo
Artifactory Download: BUILDS/windows/5840/build.tar.gz -> download/
The file exists on artifactory.
Here's my Jenkins code:
@NonCPS
def downloadArtifactory(String localPath, String repository, String remotePath) {
    def downloadSpec = """{
        "files": [
            {
                "pattern": "${repository}/${remotePath}",
                "target": "${localPath}",
            }
        ]
    }"""
    echo "${downloadSpec}"
    echo "Artifactory Download: ${repository}/${remotePath} -> ${localPath}"
    def server = Artifactory.server("MYARTIFACTORYSERVER")
    def buildInfo = server.download spec: downloadSpec
    return buildInfo
}
Called with:
downloadArtifactory("download/", "BUILDS", "windows/5840/build.tar.gz")
Removing the @NonCPS annotation should solve the problem.
As you can see in this Jenkins issue, the Artifactory Jenkins plugin does not support @NonCPS.
Please remove the , (comma) from the end of the line "target": "${localPath}" and it works. Make it:
def downloadArtifactory(String localPath, String repository, String remotePath) {
    def downloadSpec = """{
        "files": [
            {
                "pattern": "${repository}/${remotePath}",
                "target": "${localPath}"
            }
        ]
    }"""
    echo "${downloadSpec}"
    echo "Artifactory Download: ${repository}/${remotePath} -> ${localPath}"
    def server = Artifactory.server("MYARTIFACTORYSERVER")
    def buildInfo = server.download spec: downloadSpec
    return buildInfo
}
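It can then be called exactly as before:

def buildInfo = downloadArtifactory("download/", "BUILDS", "windows/5840/build.tar.gz")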