I'm trying to access the file system from a Jenkinsfile using Groovy. I'm following the suggestion in this SO thread:
Recursive listing of all files matching a certain filetype in Groovy
I've already given access to
new java.io.File java.lang.String
staticField groovy.io.FileType FILES
under Script Approvals.
However, this line fails without any error message:
new File('.').eachFileRecurse(FILES) {
Here's the broader code block in question:
stage("Install") {
print "here a"
new File('.').eachFileRecurse(FILES) {
print "here c"
if(it.name.endsWith(".sh")) {
print "here d"
println it
}
}
print "here b"
Here's the console output from that section:
[Pipeline] stage
[Pipeline] { (Install)
[Pipeline] echo
here a
[Pipeline] }
[Pipeline] // stage
[Pipeline] echo
Email Recipients: opike99@gmail.com, opike@yahoo.com
[Pipeline] emailext
messageContentType = text/html; charset=UTF-8
Adding recipients from project recipient list
Adding recipients from trigger recipient list
Setting In-Reply-To since last build was not successful
Better to use the findFiles step instead of the Java file API to read files; it is whitelisted by script security.
each loops don't work in Pipeline because of Jenkins' CPS transformation. Instead (if you really prefer them over good old for loops), wrap the loop in a method annotated with @NonCPS.
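For example, a minimal sketch using findFiles from the Pipeline Utility Steps plugin (the glob pattern for shell scripts is my assumption):
node {
    stage("Install") {
        // findFiles is whitelisted and returns an array of file
        // descriptors with name/path properties
        def scripts = findFiles(glob: '**/*.sh')
        // a plain for loop survives the CPS transformation
        for (def f : scripts) {
            println f.path
        }
    }
}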
Please help. env is null.
Jenkinsfile:
node {
println "Your ENV:"
println env.dump
println "Your params:"
println params.dump
}
Jenkins output:
[Pipeline] properties
[Pipeline] node
Running on foobarquux in c:\workspace\123abc
[Pipeline] {
[Pipeline] echo
Your ENV:
[Pipeline] echo
null
[Pipeline] echo
Your params:
[Pipeline] echo
null
I expect my environment variables not to be null: env.dump should not be null, and I should see something beyond Your ENV: when println env.dump executes.
After reading very helpful comments from @mkobit, I realized I needed parentheses for dump(), and even with them Jenkins throws a security exception.
${WORKSPACE} only works if it is used in an agent (node)! Otherwise it comes out as null.
I have agent none at the top of my pipeline because I have a few input steps that I don't want to use heavyweight executors for. I was also setting an environment variable in the top-level environment {} block that used ${WORKSPACE}, and for the life of me I couldn't figure out why it was being set to null. Some other thread mentioned the workspace on an agent, so I moved that definition into a step on an agent, and lo and behold, when you set a variable with WORKSPACE while running on an agent, it all works as expected.
The sidebar here is that if you are using a top-level agent none, the environment block (and presumably other pre-stage blocks) is not running in an agent, so anything that relies on an agent will behave unexpectedly.
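As a rough sketch of that layout (the stage names and the agent label are placeholders, not from my original pipeline):
pipeline {
    agent none                            // keep the input steps off an executor
    stages {
        stage('Confirm') {
            steps {
                input message: 'Deploy?'  // runs without an agent
            }
        }
        stage('Build') {
            agent { label 'linux' }       // placeholder label
            environment {
                // WORKSPACE resolves here because we are on an agent
                BUILD_DIR = "${WORKSPACE}/build"
            }
            steps {
                echo "Building in ${env.BUILD_DIR}"
            }
        }
    }
}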
Groovy's optional parentheses require at least one argument, which is different from Ruby.
Method calls can omit the parentheses if there is at least one parameter and there is no ambiguity:
So, to call the dump() method you would do env.dump() or params.dump(). However, this method will not be whitelisted and you will get a security exception (if you are running in the sandbox or using any sort of Jenkins security) because this would print out all fields of the object.
Thanks to StephenKing for pointing this out; I checked again with a fresh Jenkins instance. See the comments inside.
Assuming the job has 2 parameters [str1=val1, bool1=true] :
node {
// Print the value of a job parameter named "str1"
// output: val1
println "${params.str1}"
// Calling the dump() function to print all job parameters (keys/vals)
// NOTE: calling this method should be approved by Jenkins admin
// output: .... m=[str1:val1, bool1:true] ...
println params.dump()
// Same as the above.
// output: .... m=[str1:val1, bool1:true] ...
println "${params.dump()}"
// SYNTAX ERROR, the '$' is not expected here by the parser
//println ${params.dump()};
// This appears in the question, but it seems like this is not
// what the author meant. It tries to find a param named "dump"
// which is not available
// output: null
println params.dump
}
In a @NonCPS-annotated function, only code up to the very first Jenkins build step is executed. Does anyone have the same problem? Am I missing something? I am using Jenkins LTS (2.73.2).
This is my code:
@NonCPS
def hello() {
    println 'Output "hello":'
    sh 'echo Hello'
    println 'Output "World":'
    sh 'echo World'
}

node {
    stage('Test') {
        hello()
    }
}
I would expect this code to run properly, but the output is the following:
[Pipeline] node
Running on Jenkins in /var/lib/jenkins/workspace/Sandbox/pipeline-test
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Test)
[Pipeline] echo
Output "hello":
[Pipeline] sh
[pipeline-test] Running shell script
+ echo Hello
Hello
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
You cannot run build steps inside @NonCPS methods. Pipeline scripts are considered "serializable", allowing them to be durable across system failures etc. Only a subset of the Groovy capabilities used by pipeline scripts is serializable; for anything that is not, you use @NonCPS to execute it.
Essentially, your @NonCPS method needs to do its business and return data back to the "safe", serialized execution stack.
In your particular example code I see no reason why hello() has to be @NonCPS at all; I can only assume your real function is doing something more complex.
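For example, here is a rough sketch of that split; the helper and what it parses are made up for illustration:
// Collect data with a build step first, then hand the plain string to the
// @NonCPS helper, which returns serializable data back to the pipeline.
@NonCPS
def shellScriptNames(String listing) {
    return listing.readLines().findAll { it.endsWith('.sh') }
}

node {
    stage('Test') {
        def listing = sh(script: 'ls', returnStdout: true)
        for (def name : shellScriptNames(listing)) {
            sh "echo ${name}"   // build steps stay outside the @NonCPS method
        }
    }
}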
(Edit)
Having just looked at your question history and the original script: I don't know if this is still the case with the latest versions, but when I was writing our scripts ~6 months ago, each { thing -> ... } iteration was not serializable.
Is there a way to use the Jenkins "Execute system Groovy script" step from a pipeline file which is committed to SCM?
If yes, how would I access the predefined variables (like build) in it?
If no, would I be able to replicate the functionality otherwise, for example using the Shared Library Plugin?
Thanks!
You can put Groovy code in a pipeline in an (always-source-controlled) Jenkinsfile, like this:
pipeline {
    agent { label 'docker' }
    stages {
        stage ('build') {
            steps {
                script {
                    // I used a script block because you can jam arbitrary Groovy in here
                    // without being constrained by the declarative Jenkinsfile DSL
                    def awesomeVar = 'so_true'
                    print "look at this: ${awesomeVar}"

                    // accessing a predefined variable:
                    echo "currentBuild.number: ${currentBuild.number}"
                }
            }
        }
    }
}
Produces console log:
[Pipeline] echo
look at this: so_true
[Pipeline] echo
currentBuild.number: 84
Click on the "Pipeline Syntax" link in the left navigation of any pipeline job to get a bunch of examples of things you can access in the "Global Variables Reference."
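If you really do need the equivalent of the system-Groovy build object, the closest thing I'm aware of is currentBuild.rawBuild, which returns the underlying hudson.model.Run; this is an assumption about what you need, and it requires in-process script approval when the sandbox is enabled. Inside a script block it would look roughly like:
script {
    // needs script approval; hands back the hudson.model.Run for this build
    def run = currentBuild.rawBuild
    echo "Underlying run: ${run.fullDisplayName}"
}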
I'm trying to mask a password in my Jenkins build.
I have been trying the mask-passwords plugin.
However, this doesn't seem to work with my Jenkins pipeline script, because if I define the password PASSWD1 and then I use it in the script like this ${PASSWD1}, I am getting:
No such DSL method '$' found among steps [addToClasspath, ansiColor, ansiblePlaybook, ....]
If I use env.PASSWD1, then its value will be resolved to null.
So how should I mask a password in a Jenkins pipeline script?
The simplest way would be to use the Credentials Plugin.
There you can define different types of credential, whether it's a single password ("secret text"), or a file, or a username/password combination. Plus other plugins can contribute other types of credentials.
When you create a credential (via the Credentials link on the main Jenkins page), make sure you set an "ID". In the example below, I've called it my-pass. If you don't set it, it will still work; Jenkins will just allocate an opaque UUID for you instead.
In any case, you can easily generate the required syntax with the snippet generator.
withCredentials([string(credentialsId: 'my-pass', variable: 'PW1')]) {
echo "My password is '${PW1}'!"
}
This will make the password available in the given variable only within this block. If you attempt to print the password, like I do here, it will be masked.
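As a follow-up, when you hand the secret to a shell step, prefer single quotes so the shell expands $PW1 from the environment instead of Groovy interpolating the secret into the command (the CLI name below is just a placeholder):
withCredentials([string(credentialsId: 'my-pass', variable: 'PW1')]) {
    // single-quoted: Groovy does not interpolate, the shell reads $PW1
    sh 'some-cli --password "$PW1"'
}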
Looking at this issue, https://issues.jenkins-ci.org/browse/JENKINS-27392, you should be able to do the following:
node {
    wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[password: '123ADS', var: 'SECRET']]]) {
        echo env['SECRET'];
    }
}
However, if you look at the last comments in that issue, it doesn't work; it seems like a bug. On the other hand, if you know the secret and accidentally print it in the logs, then it is hidden, like this:
node {
    wrap([$class: 'MaskPasswordsBuildWrapper', varPasswordPairs: [[password: '123ADS', var: 'SECRET']]]) {
        echo "123ADS";
    }
}
This produces:
[Pipeline] node
Running on master in workspace/pl
[Pipeline] {
[Pipeline] wrap
[Pipeline] {
[Pipeline] echo
********
[Pipeline] }
[Pipeline] // wrap
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Regarding the error you are getting, No such DSL method '$' found among steps ...: I'm just guessing, but you are probably using ${VAR} directly in the pipeline script; ${...} is only meaningful inside strings in Groovy.
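A tiny sketch of the difference (someVar is just a placeholder):
def someVar = 'value'
echo "here it is: ${someVar}"   // fine: ${...} inside a GString
// ${someVar}                   // invalid as a standalone statement; the
//                              // parser looks for a step/method named '$'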
EDIT:
Or you can use the Credentails Plugin and pipeline step withCredentials:
// Credential d389273c-03a0-45af-a847-166092b77bda is set to a string secret in Jenkins config.
node {
    withCredentials([string(credentialsId: 'd389273c-03a0-45af-a847-166092b77bda', variable: 'SECRET')]) {
        bat """
        if ["${SECRET}"] == ["123ASD"] echo "Equal!"
        """;
    }
}
This results in:
[Pipeline] node
Running on master in workspace/pl
[Pipeline] {
[Pipeline] withCredentials
[Pipeline] {
[Pipeline] bat
[pl] Running batch script
workspace/pl>if ["****"] == ["****"] echo "Equal!"
"Equal!"
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Note that this plugin binds the variable directly within the closure and not to the environment like the other one, i.e. I can use the variable SECRET directly.
I had a script that was being executed as a Jenkins pipeline and it was working fine. I wanted to reuse it for multiple environments, so I moved all the code into functions and load them from multiple files.
Library file - Healthcheck:
#!groovy
@NonCPS
def check(type) {
    stage "prepare"
    echo "TEST1"
    props = readProperties file: 'build.properties'
    echo "TEST2"
    stage "queues"
    checkQueues()
}

@NonCPS
def checkQueues() {
    txt = "http://ltxl0207.sgdcelab.sabre.com:8161/api/jolokia/read/org.apache.activemq:brokerName=localhost,destinationName=!/tss!/trip_source_updates,destinationType=Queue,type=Broker/QueueSize".toURL().getText(requestProperties: [Authorization: "Basic " + "admin:admin".getBytes().encodeBase64().toString()])
    json = new groovy.json.JsonSlurper().parseText(txt)
    echo "Got response: " + txt
}

return this;
File that uses it - Healthcheck-dev:
#!groovy
node {
    checkout scm
    healthcheck = load 'Healthcheck'
    healthcheck.check('DEV')
}
And the trouble is that the script doesn't get past readProperties and the prepare stage; it just stops there, ignoring the queues stage:
[Pipeline] load
[Pipeline] { (Healthcheck)
[Pipeline] }
[Pipeline] // load
[Pipeline] stage (prepare)
Entering stage prepare
Proceeding
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
What am I doing wrong? When I move the code into a single file it works correctly.
Have you tried to execute your pipeline with your 2 files ("Healthcheck" & "Healthcheck-dev") in the same repository? When you load a script from another SCM repo, as you seem to be doing, Jenkins actually creates another directory, @script or suchlike, for your loaded script.
You might need to do something like this to load your general script from the correct workspace :
node {
    checkout scm
    dir("${projectWorkspace}@script") {
        healthcheck = load 'Healthcheck'
        healthcheck.check('DEV')
    }
}
with ${projectWorkspace} being the original workspace for your build.