I am executing a cURL command using the sh step with no issue:
public uploadArtifact(String user, String password, String file, String location) {
    def cred = "${user}:${password}"
    def cmd = "curl -v -u ${cred} --upload-file ${file} ${location}"
    sh cmd
}
However, when I try to execute the same command using a Process object, I get an error:
public uploadArtifact(String user, String password, String file, String location) {
    def cred = "${user}:${password}"
    def cmd = "curl -v -u ${cred} --upload-file ${file} ${location}"
    try {
        def sout = new StringBuffer(), serr = new StringBuffer()
        def proc = cmd.execute()
        proc.consumeProcessOutput(sout, serr)
        proc.waitForOrKill(1000)
        println sout
    } catch (Exception e) {
        throw new RuntimeException("Cannot execute curl, exception: [${e.getClass().getName()} - '${e.getMessage()}']")
    }
}
The error that I see is:
java.lang.RuntimeException: Cannot execute curl, exception: [java.lang.RuntimeException - 'Error running; stdout='', stderr='curl: Can't open 'Folder/artifact/file.zip'!
curl: try 'curl --help' or 'curl --manual' for more information
'']
What is it about Process.execute() that does not work? Am I missing something?
I ran into a similar issue half a year ago. As I found out, the curl request that you run using the sh step is executed on the agent where the build runs:
sh cmd // this runs on the agent and hence finds the ${file}
However, the second piece of code runs on the master:
def proc = cmd.execute() // this runs on the master and hence cannot find the ${file}
When you use Groovy classes directly, the code is executed on the master node by design, because the Groovy engine that Jenkins uses lives on the master.
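If the goal is to run curl on the agent while still capturing its output in Groovy, the sh step can return stdout directly instead of going through Process. A minimal sketch, assuming the same parameters as the function above:
def uploadArtifact(String user, String password, String file, String location) {
    // runs on the agent (where the workspace and ${file} live) and returns curl's output;
    // note that interpolating credentials into the command is insecure, as discussed
    // in the secret interpolation warning further down this page
    def output = sh(
        script: "curl -v -u ${user}:${password} --upload-file ${file} ${location}",
        returnStdout: true
    )
    echo output
}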
Related
I want to get a line from a file in my workspace. I am using this script:
stage('Test') {
steps {
script {
outputJenkins = 'output-jenkins.log'
sh "cd invoker && mvn clean install && mvn exec:java -Dexec.mainClass=\"com.JenkinsRunner\" -Dexec.args=\"qal ${GIT_COMMIT_HASH}\" > ../${outputJenkins}"
logFile = readFile(outputJenkins)
echo logFile
adminRepoLogLine = sh "echo logFile | grep \"Admin repo url is :::\""
echo adminRepoLogLine
}
}
}
But I am getting this error:
+ echo logFile
+ grep Admin repo url is :::
script returned exit code 1
The script works fine in my shell when I try it locally. Are there any constraints around doing it in a Jenkinsfile?
If we apply various fixes and improvements to the code in the question to achieve the desired functionality, then it will succeed:
stage('Test') {
    steps {
        script {
            def output
            dir('invoker') {
                sh(label: 'Maven Clean Install', script: 'mvn clean install')
                // assign maven output to variable
                output = sh(label: 'Maven Git Log', script: "mvn exec:java -Dexec.mainClass=\"com.JenkinsRunner\" -Dexec.args=\"qal ${GIT_COMMIT_HASH}\"", returnStdout: true)
            }
            // assign regex return to variable
            def adminRepoLogLine = output =~ /(.*Admin repo url is :::.*)/
            // print extracted string from return
            print adminRepoLogLine[0][1]
        }
    }
}
Note that GIT_COMMIT_HASH is neither an intrinsic Jenkins environment variable nor defined in the pipeline code in the question, so it will need to be defined at pipeline scope elsewhere in your code.
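For example, it could be declared at pipeline scope roughly like this (the COMMIT_HASH parameter is only an assumption for illustration; use whatever actually provides the hash in your setup):
pipeline {
    agent any
    parameters {
        // hypothetical parameter, used here only to illustrate where the hash could come from
        string(name: 'COMMIT_HASH', defaultValue: '', description: 'Commit hash to test against')
    }
    environment {
        GIT_COMMIT_HASH = "${params.COMMIT_HASH}"
    }
    stages {
        stage('Test') {
            steps {
                echo "Commit hash is ${GIT_COMMIT_HASH}"
            }
        }
    }
}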
This is because the string literal logFile does not contain the string Admin repo url is :::. If there's no such match, then grep will exit with status 1.
You probably want to use
cat ${outputJenkins} | grep \"Admin repo url is :::\"
instead, or, even simpler:
grep \"Admin repo url is :::\" ${outputJenkins}
Append || true (or || :) to the command if you want to avoid an error when the log line does not appear.
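Put together, a minimal sketch of the corrected step (it assumes the output-jenkins.log file from the question sits in the workspace root, and uses returnStdout so the matched line ends up in the Groovy variable rather than just an exit status):
adminRepoLogLine = sh(
    returnStdout: true,
    script: "grep 'Admin repo url is :::' ${outputJenkins} || true"
).trim()
echo adminRepoLogLine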
I'm running a curl command in my Jenkinsfile.
post {
success {
script {
sh '''
|SCORE=+1
|GERRIT_COMMENT="$(cat <<-EOL
|Sonar result was: SUCCESS
|Report: ${Jenkins_URL}/job/${JOB_NAME}/${BUILD_NUMBER}/artifact/report1.txt
|EOL
|)"
|curl -s -u ${apiToken}: ${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children | json_pp -json_opt pretty,canonical > report1.txt
|echo "Voting unsuccessful"
'''.stripMargin().stripIndent()
archiveArtifacts artifacts: 'report1.txt', fingerprint: true
echo 'I Succeeded'
}
}
But I get the error
malformed JSON string, neither array, object, number, string or atom, at character offset 0 (before "(end of string)") at /usr/bin/json_pp
I can't use jq as it's not installed and installing it isn't an option.
The curl command works fine on my terminal but is failing in my Jenkins pipeline.
Also, when I do this instead, it works.
post {
success {
script {
sh '''
|SCORE=+1
|GERRIT_COMMENT="$(cat <<-EOL
|Sonar result was: SUCCESS
|Report: ${Jenkins_URL}/job/${JOB_NAME}/${BUILD_NUMBER}/artifact/report1.txt
|EOL
|)"
|echo "Voting unsuccessful"
'''.stripMargin().stripIndent()
sh """
curl -s -u ${apiToken}: '${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children' | json_pp -json_opt pretty,canonical > report1.txt
"""
archiveArtifacts artifacts: 'report1.txt', fingerprint: true
echo 'I Succeeded'
}
}
But it throws a warning in the console output.
Warning: A secret was passed to "sh" using Groovy String interpolation, which is insecure. Affected argument(s) used the following variable(s): [apiToken]
What am I doing wrong, please?
In a Jenkins pipeline, how would you properly pass a JSON response using curl into a file?
I recommend not using shell scripts whenever possible. Shell scripts are not cross-platform and require installing additional tools (e.g. curl).
In your case, the curl call could be replaced by the httpRequest step.
First, let's replace the curl call and save the result in a componentTree.json file:
httpRequest(
    url: "${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children",
    authentication: 'id-of-credentials-which-was-used-to-create-the-apiToken-variable',
    outputFile: 'componentTree.json'
)
You want to format the JSON data in a human-readable format, so let's use the readJSON and writeJSON steps:
def json = readJSON(file: 'componentTree.json')
writeJSON(json: json, file: 'report1.txt', pretty: 4)
Now the report1.txt file contains JSON formatted with indent 4.
The componentTree.json file is written and read only once, so let's reduce the number of I/O operations by skipping the file entirely:
def response = httpRequest(
url: "${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children",
authentication: 'id-of-credentials-which-was-used-to-create-the-apiToken-variable'
)
def json = readJSON(text: response.content)
writeJSON(json: json, file: 'report1.txt', pretty: 4)
About the warning:
Warning: A secret was passed to "sh" using Groovy String interpolation, which is insecure. Affected argument(s) used the following variable(s): [apiToken]
Secrets should never be interpolated, because they may contain special characters that could be interpreted by the shell. Example:
my secret: My' https://example.org; cat /etc/passwd; echo \
command: curl -u '${password}' https://private.server/path/file.txt
After the interpolation the following command is called:
curl -u 'My' https://example.org; cat /etc/passwd; echo \' https://private.server/path/file.txt
There are two options to fix it:
if apiToken is an environment variable:
sh "curl -s -u \$apiToken: '${Sonar_URL}/api/measures/component..."
if apiToken is a Groovy variable:
withEnv(["CREDENTIALS=${apiToken}"]) {
sh "curl -s -u \$CREDENTIALS: '${Sonar_URL}/api/measures/component..."
}
In both cases the dollar sign ($) is escaped before the credentials variable, which means the shell resolves it (the value is taken from an environment variable rather than being interpolated by Groovy).
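Applied to the original snippet, the withEnv variant could look roughly like this (a sketch: the URL stays in a Groovy string, but the token reaches the shell only through the environment):
withEnv(["API_TOKEN=${apiToken}"]) {
    sh """
        curl -s -u "\$API_TOKEN:" '${Sonar_URL}/api/measures/component_tree?ps=100&s=qualifier,name&component=sonarqube&metricKeys=ncloc,bugs,vulnerabilities,code_smells,security_hotspots,coverage,duplicated_lines_density&strategy=children' | json_pp -json_opt pretty,canonical > report1.txt
    """
}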
I am trying to render information obtained from the shell in an Active Choices parameter with a Groovy script. I can easily access the shell from a Groovy script in a Jenkins pipeline with the sh step like this:
node()
{
sh 'git log ...'
}
But when I try this in the Groovy script of an Active Choices parameter, it crashes and the fallback script is executed.
Is it possible to switch to a node in this context and execute a shell command?
Thanks for the help!
Here is a sample snippet using the Active Choices plugin.
// line continuation backslashes are not needed: execute() tokenizes on any whitespace
def command = $/aws ec2 describe-instances
    --filters Name=tag:Name,Values=Test
    --query Reservations[*].Instances[*].PrivateIpAddress
    --output text /$
def proc = command.execute()
proc.waitFor()
def output = proc.in.text
def exitcode = proc.exitValue()
def error = proc.err.text
if (error) {
    println "Std Err: ${error}"
    println "Process exit code: ${exitcode}"
    return exitcode
}
//println output.split()
return output.tokenize()
I'd like to populate the Groovy variable "committer" with the output of the command:
def committer = utils.sh_out("curl -s -u \${J_USER}:\${J_PASS} \${env.BUILD_URL}/api/json | python -mjson.tool | grep authorEmail | awk '{print \$2}' | tr -d '"|,' ")
Because of a known issue in Jenkins (JENKINS-26133), it is not possible to do that; you can only populate the variable with the exit status of the command.
So I've got these two functions:
import static java.util.UUID.randomUUID // needed for randomUUID()

def gen_uuid() {
    randomUUID() as String
}

def sh_out(cmd) { // As required by bug JENKINS-26133
    String uuid = gen_uuid()
    sh """( ${cmd} ) > ${uuid}"""
    String out = readFile(uuid).trim()
    sh "set +x ; rm ${uuid}"
    return out
}
These functions allow me to wrap my shell commands in sh_out(COMMAND). Behind the scenes I'm using the workaround suggested in the known issue linked above: run the command while redirecting its output to a file (in my function, a random filename) and then read that file into a variable.
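For example, once the functions file is loaded (see below), the wrapper is called like any other helper. A minimal sketch, assuming J_USER and J_PASS are environment variables, that keeps the substitutions simple by passing plain environment-variable references to the shell:
def committer = utils.sh_out('curl -s -u "$J_USER:$J_PASS" "$BUILD_URL/api/json"')
echo "Raw JSON: ${committer}"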
So, in the beginning of my pipeline I load my functions file, which ends with return this;, like so:
fileLoader.withGit('git@bitbucket.org:company/pipeline_utils.git', 'master', git_creds, '') {
    utils = fileLoader.load('functions.groovy');
}
And that's why the "utils.sh_out" that you see in the command, but when I use the shown above command in my Jenkins pipeline, I get the following error:
/home/ubuntu/workspace/-6870-bitbucket-integration-ECOPKSSBUJ6HCDNM4TOY77X7UTZ@tmp/durable-006d5c7e/script.sh: 2: /home/ubuntu/workspace/-6870-bitbucket-integration-ECOPKSSBUJ6HCDNM4TOY77X7UTZ@tmp/durable-006d5c7e/script.sh: Bad substitution
Running the command in a shell works properly:
$ curl -s -u user:password http://IPADDR:8080/job/COMPANY_BitBucket_Integration/job/research/job/COMPANY-6870-bitbucket-integration/3/api/json/api/json | python -mjson.tool | grep authorEmail | awk '{print $2}' | tr -d '"|,'
user@email.com
I suspect it has something to do with the tr command at the end and with the character escaping I did there, but whatever I try fails. Anyone got an idea?
According to the documentation, sh now supports returning standard output.
I know I'm not answering your question directly, but I suggest using Groovy to parse the JSON.
You are trying to get the value of authorEmail from the JSON.
If the response from /api/json looks like this (just an example):
{
"a":{
"b":{
"c":"ccc",
"authorEmail":"user#email.com"
}
}
}
then the Groovy to extract authorEmail would be:
def cmd = "curl -s -u \${J_USER}:\${J_PASS} \${BUILD_URL}/api/json"
def json = sh(returnStdout: true, script: cmd).trim()
//parse json and access it as an object (Map/Array)
json = new groovy.json.JsonSlurper().parseText(json)
def mail = json.a.b.authorEmail
However, you could receive a java.io.NotSerializableException (explained here), so I changed the code like this:
node {
def json = sh(
returnStdout: true,
script: "curl -s -u \${J_USER}:\${J_PASS} \${env.BUILD_URL}/api/json"
).trim()
def mail = evaluateJson(json, '${json.a.b.authorEmail}')
echo mail
}
#NonCPS
def evaluateJson(String json, String gpath){
//parse json
def ojson = new groovy.json.JsonSlurper().parseText(json)
//evaluate gpath as a gstring template where $json is a parsed json parameter
return new groovy.text.GStringTemplateEngine().createTemplate(gpath).make(json:ojson).toString()
}
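If the Pipeline Utility Steps plugin is available (it provides the readJSON step used earlier on this page), the JsonSlurper/GStringTemplateEngine helper can be replaced with readJSON; a minimal sketch under that assumption:
node {
    def json = sh(
        returnStdout: true,
        script: "curl -s -u \${J_USER}:\${J_PASS} \${BUILD_URL}/api/json"
    ).trim()
    // readJSON parses the text into a map-like object that can be used directly in Pipeline code
    def parsed = readJSON(text: json)
    echo parsed.a.b.authorEmail
}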
I am running the following Groovy script, which works perfectly from the Groovy console and from Jenkins (if it runs on a slave/node).
Because I want to run the script from the "This project is parameterised" option, I noticed it always runs on the Jenkins master.
I use the following script:
// setup SSH connection:
sshString = "ssh -T -i keyfile -p 22 test#server.com "
cmdLine = "/appl/test/script.sh"
conString = sshString + cmdLine
// execute command
def proc = conString.execute()
def outputStream = new StringBuffer()
proc.waitForProcessOutput(outputStream, System.out)
output = outputStream.toString()
println(output)
On the slave I get the result of the shell script, on the master the result is NULL.
What am I doing wrong?