Unable to run shell command in Groovy in Jenkins

I am trying to get certain values from the slave by running shell commands such as:
git rev-parse HEAD
git config --get remote.origin.url
The method that I have tried to write for this is:
def executeCommand(String command) {
    stdout = sh script: command, returnStdout: true
    return stdout.trim()
}
Now when I try to run the first command:
output = executeCommand('git rev-parse HEAD')
I get this error:
[Running] groovy "/Users/user-a/Documents/cmd.groovy"
Caught: groovy.lang.MissingMethodException: No signature of method: cmd.sh() is applicable for argument types: (LinkedHashMap) values: [[script:git rev-parse HEAD, returnStdout:true]]
Possible solutions: is(java.lang.Object), use([Ljava.lang.Object;), run(), run(), any(), tap(groovy.lang.Closure)
groovy.lang.MissingMethodException: No signature of method: cmd.sh() is applicable for argument types: (LinkedHashMap) values: [[script:git rev-parse HEAD, returnStdout:true]]
Possible solutions: is(java.lang.Object), use([Ljava.lang.Object;), run(), run(), any(), tap(groovy.lang.Closure)
at cmd.executeCommand(cmd.groovy:2)
at cmd.run(cmd.groovy:6)
I also tried:
output = command.execute().text
But this returns nothing. I'm running out of ideas on how to run shell commands in Groovy in Jenkins and record the output.
MORE DETAILS
I am working with Jenkins shared libraries. I have exposed a method for my Jenkinsfile by the name getLatestBuildDetails(). This method is defined within my library. One of the actions within the method is to execute the git commands locally. So in order to run any shell command locally, I have created the executeCommand function, which takes the actual command to run as a String, executes it, and returns the output to be used later by getLatestBuildDetails().

Library classes cannot directly call steps such as sh or git. They can however implement methods, outside of the scope of an enclosing class, which in turn invoke Pipeline steps, for example:
// src/org/foo/Zot.groovy
package org.foo;

def checkOutFrom(repo) {
    git url: "git@github.com:jenkinsci/${repo}"
}

return this
Which can then be called from a Scripted Pipeline:
def z = new org.foo.Zot()
z.checkOutFrom(repo)
This approach has limitations; for example, it prevents the declaration of a superclass.
Alternately, a set of steps can be passed explicitly using this to a library class, in a constructor, or just one method:
package org.foo
class Utilities implements Serializable {
    def steps
    Utilities(steps) { this.steps = steps }
    def mvn(args) {
        steps.sh "${steps.tool 'Maven'}/bin/mvn -o ${args}"
    }
}
When saving state on classes, such as above, the class must implement the Serializable interface. This ensures that a Pipeline using the class, as seen in the example below, can properly suspend and resume in Jenkins.
@Library('utils') import org.foo.Utilities
def utils = new Utilities(this)
node {
    utils.mvn 'clean package'
}
If the library needs to access global variables, such as env, those should be explicitly passed into the library classes, or methods, in a similar manner.
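For example, a minimal sketch of passing env alongside the steps object (the printBranch helper and the use of BRANCH_NAME are illustrative assumptions, not from the original):
package org.foo
class Utilities implements Serializable {
    def steps
    def env
    Utilities(steps, env) {
        this.steps = steps
        this.env = env
    }
    def printBranch() {
        // assumes a multibranch build, where BRANCH_NAME is set
        steps.echo "Building branch ${env.BRANCH_NAME}"
    }
}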
Instead of passing numerous variables from the Scripted Pipeline into a library, you can pass the script object itself:
package org.foo
class Utilities {
    static def mvn(script, args) {
        script.sh "${script.tool 'Maven'}/bin/mvn -s ${script.env.HOME}/jenkins.xml -o ${args}"
    }
}
The above example shows the script being passed in to one static method, invoked from a Scripted Pipeline as follows:
@Library('utils') import static org.foo.Utilities.*
node {
    mvn this, 'clean package'
}
In your case you should write something like:
def getLatestBuildDetails(context) {
    //...
    executeCommand(context, 'git rev-parse HEAD')
    //...
}

def executeCommand(context, String command) {
    def stdout = context.sh(script: command, returnStdout: true)
    return stdout.trim()
}
Jenkinsfile:
@Library('library_name') _
getLatestBuildDetails(this)
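Alternatively, if the helper lives under vars/ rather than in a class under src/, it runs with the pipeline's own context and steps like sh are resolvable directly. A minimal sketch, with the file name and git commands assumed from the question:
// vars/getLatestBuildDetails.groovy (hypothetical file name)
def call() {
    // sh works here because scripts under vars/ execute in the pipeline context
    def commit = sh(script: 'git rev-parse HEAD', returnStdout: true).trim()
    def origin = sh(script: 'git config --get remote.origin.url', returnStdout: true).trim()
    echo "commit: ${commit}, origin: ${origin}"
}
The Jenkinsfile could then call getLatestBuildDetails() directly after the @Library import, without passing this around.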
For more info see the Jenkins shared library documentation: https://jenkins.io/doc/book/pipeline/shared-libraries/

I am also using shared libraries. This is how I have used it in my code:
String getMavenProjectName() {
    echo "inside getMavenProjectName +++++++"
    // mavenChartName = sh(
    //     script: "git config --get remote.origin.url",
    //     returnStdout: true
    // ).trim()
    def mavenChartName = sh returnStdout: true, script: '''
        #!/bin/bash
        GIT_LOG=$(env -i git config --get remote.origin.url)
        basename "$GIT_LOG" .git; '''
    echo "mavenChartName: ${mavenChartName}"
    return mavenChartName
}
PS: Ignore the commented lines of code.

Try out the sh step instead of execute. :)
EDIT:
I would go with execute(), or, even better in my opinion, grgit.
I think you are not getting any output when you run cmd.execute().text because .text returns the standard output of the command, and your command might only write to standard error. You can check both:
def process = cmd.execute()
def stdOut = process.inputStream.text
def stdErr = process.errorStream.text
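If you also want the exit code, and want to avoid blocking when a stream fills its pipe buffer, a plain-Groovy sketch along these lines works:
def process = cmd.execute()
def out = new StringBuffer()
def err = new StringBuffer()
process.consumeProcessOutput(out, err) // drains both streams in background threads
process.waitFor()                      // block until the command finishes
println "exit code: ${process.exitValue()}"
println "stdout: ${out}"
println "stderr: ${err}"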

Related

Jenkins pipeline error in handling json file

I'm a newbie to Jenkins pipelines and am writing a groovy script to parse a JSON file. However, I'm facing an error which many have faced, but none of the solutions worked for me. Below are my Jenkinsfile and the error message.
def envname = readJSON file: '${env.WORKSPACE}/manifest.json'
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo WORKSPACE
                sh "ls -a ${WORKSPACE}"
            }
        }
    }
}
[Pipeline] Start of Pipeline
[Pipeline] readJSON
[Pipeline] End of Pipeline
org.jenkinsci.plugins.workflow.steps.MissingContextVariableException:
Required context class hudson.FilePath is missing Perhaps you forgot
to surround the code with a step that provides this, such as: node at
org.jenkinsci.plugins.pipeline.utility.steps.AbstractFileOrTextStepExecution.run(AbstractFileOrTextStepExecution.java:30)
I even tried readJSON file: '${WORKSPACE}/manifest.json' but that didn't work either. I'm sure the mistake is with the first line, since when I remove that line the execution is successful. The docs are pretty helpful, but I'm not able to track down where exactly I'm going wrong, which is why I posted here.
UPDATE:
I tried the following methods def envname = readJSON file: "./manifest.json" and def envname = readJSON file: "${env.WORKSPACE}/manifest.json", and even tried defining them under the steps block. Nothing worked. Below is the error message I received when I defined them under the steps block:
WorkflowScript: 5: Expected a step @ line 7, column 13
def envname =
^
Below is the official syntax doc of readJSON, and as far as I can see I'm using the correct syntax, but it still doesn't work as expected.
https://www.jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#readjson-read-json-from-files-in-the-workspace
'${env.WORKSPACE}/manifest.json' is interpolating the Groovy env map as a shell variable. You need to interpolate it as a Groovy variable like "${env.WORKSPACE}/manifest.json".
sh "ls -a ${WORKSPACE}" is interpolating the shell environment variable WORKSPACE as a Groovy variable. You need to interpolate it as a shell variable like sh 'ls -a ${WORKSPACE}'.
echo WORKSPACE is attempting to resolve the shell variable WORKSPACE as a first class Groovy variable expression. You need to use the Groovy env map instead like echo env.WORKSPACE.
As for the global variable indefinite type assignment on the first line: if it still throws the error above after making those fixes, then it may be due to invalid use of scripted syntax in a declarative syntax pipeline. You likely need to place it inside a step block within your pipeline in that case.
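Putting those three fixes together, the corrected lines would look roughly like this:
def envname = readJSON file: "${env.WORKSPACE}/manifest.json" // double quotes: Groovy interpolates env.WORKSPACE
sh 'ls -a ${WORKSPACE}' // single quotes: the shell expands WORKSPACE itself
echo env.WORKSPACE      // use the Groovy env map instead of a bare WORKSPACE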
I've solved this myself with the help of Matt Schuchard's answer above. I'm not sure whether this is the only way to solve it, but it worked for me.
pipeline {
    agent any
    stages {
        stage('Json-Build') {
            steps {
                script {
                    def envname = readJSON file: "${env.WORKSPACE}/manifest.json"
                    element1 = "${envname.dev}"
                    echo element1
                }
            }
        }
    }
}

How do I use sh in Jenkins Global library

I am creating my own global library for Jenkins, which I have hosted on GitHub, and to simplify some run-of-the-mill tasks, I wanted to add a function that returns the git tag.
Therefore I created something like this:
class Myclass {
    static String getGitTag() {
        return "${sh(returnStdout: true, script: 'git tag --sort version:refname | tail -1').trim()}"
    }
}
... which results in this error:
No signature of method: static com.stevnsvig.jenkins.release.ReleaseUtil.sh()
So I'm left with two questions:
Is the solution to import the sh() library that Jenkins' groovy flavor obviously already has imported? (and if so how)
What is the best practice here? I am wondering why there isn't a GIT_TAG global variable when you use declarative pipelines, and something like this should (in my opinion) be easy as pie.
EDIT #1:
static String getGitTag() {
    stdout = script.sh(script: "git tag --sort version:refname | tail -1", returnStdout: true)
    return stdout.trim()
}
produces a similar error:
No signature of method: static com.stevnsvig.jenkins.release.ReleaseUtil.sh() is applicable for argument types: (java.util.LinkedHashMap) values: [[returnStdout:true, script:git tag --sort version:refname | tail -1]]
EDIT #2:
static String getGitTag() {
    def stdout = "git tag --sort version:refname | tail -1".execute()
    return stdout.in.text
}
completes, but the output is blank. Running the same command with pwd returns /, which indicates that the environment is not set. That makes sense, since all the commands running under Jenkins are designed to run under pipelines.
EDIT #3:
I went hunting for the import. Stumbled across the Jenkins CI project on GitHub and started searching the many repositories. Found a promising one... and put a file called pwd.groovy in /vars with this content:
import org.jenkinsci.plugins.workflow.steps.durable_task.ShellStep

static String getPWD() {
    def ret = ShellStep.sh(returnStdout: true, script: "git tag --sort version:refname | tail -1").trim()
    echo "currently in ${ret}"
}
The error I got is a variation of the same. I guess since it's a plugin, the definition is different...
hudson.remoting.ProxyException: groovy.lang.MissingMethodException: No signature of method: static org.jenkinsci.plugins.workflow.steps.durable_task.ShellStep.sh() is applicable ...
Option 1) Use Groovy execute to run cmd and get its output as below:
tag = "git tag --sort version:refname | tail -1".execute().text
Option 2) Use the Jenkins pipeline step sh.
One concept needs to be clear: sh is only available as a global function when it is used directly inside a Jenkinsfile.
In your case, sh is used outside the Jenkinsfile. To make this easier to understand, here is an example Jenkinsfile:
pipeline {
    stages('foo') {
        steps {
            sh 'pwd'
            // In the above sh step, there is an implicit `this` which represents the
            // global object for the Jenkinsfile; you can imagine sh 'pwd' as this.sh 'pwd'
            //
            // Thus if you want to use `sh` outside the Jenkinsfile, you must pass down the
            // implicit `this` into the file where you use `sh`
        }
    }
}
To address your issue:
// ReleaseUtil.groovy
static String getGitTag(steps) {
    // here `steps` is the global object for the Jenkinsfile
    // you can use other pipeline steps here via `steps`
    steps.echo 'test use pipeline echo outside Jenkinsfile'
    steps.withCredentials([steps.string(credentialsId: 'git_hub_auth', variable: 'GIT_AUTH_TOKEN')]) {
        steps.echo '....'
        steps.sh '....'
    }
    return steps.sh(returnStdout: true, script: "git tag --sort version:refname | tail -1").trim()
}

// Jenkinsfile
import com.stevnsvig.jenkins.release.ReleaseUtil

pipeline {
    stages('foo') {
        steps {
            ReleaseUtil.getGitTag(this)
        }
    }
}

How to execute Jenkins shared library functions on slave instead of master?

I need to write a shared library that reads files in the build workspace, but the shared library functions cannot read the files because the pipeline runs on a slave while the shared library is executed on the master. Is there any way to change the execution context of library functions?
Found out the answer. You can read the library resource file and pass it to the writeFile pipeline step, then run it:
writeFile(file: "foo.groovy", text: libraryResource("bar.groovy"))
sh "groovy foo.groovy"
Note that writeFile needs BOTH parameters as named parameters, so the answer given in https://issues.jenkins-ci.org/browse/JENKINS-54646 is not fully right.
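Put together, a sketch of the round trip might look like this (the node label is an assumption; any agent works):
node('my-slave-label') {
    // libraryResource reads bar.groovy from the library's resources/ directory on the master;
    // writeFile then materializes it in the agent's workspace
    writeFile(file: 'foo.groovy', text: libraryResource('bar.groovy'))
    sh 'groovy foo.groovy' // executed on the agent, not the master
}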
To execute Jenkins shared library functions on the slave instead of the master, you can wrap the work in a node("slaveName") block in the call:
def call(Map config = [:], Closure body) {
    def label = 'slave'
    node("${label}") {
        stage('Sonarqube') {
            withSonarQubeEnv('Sonar8') {
                withMaven(maven: 'apache-maven') {
                    sh 'mvn sonar:sonar -Dmaven.test.skip=true -Dsonar.java.binaries=./target'
                }
            }
        }
    } // node
} // call
You can actually do it without writeFile. This shared library code will be executed on the master, but it uses RemotingDiagnostics to execute commands on the slave.
To execute uname -a on the worker node:
import hudson.util.RemotingDiagnostics
import jenkins.model.Jenkins

class test_exec {
    def env
    def propertiesFilePath

    @NonCPS
    def call(cmd) {
        // the Groovy snippet below is executed on the agent via its remoting channel
        def trial_script = """
        println "${cmd}".execute().text
        """.trim()
        String result
        Jenkins.instance.slaves.find { agent ->
            agent.name == "${env.NODE_NAME}"
        }.with { agent ->
            result = RemotingDiagnostics.executeGroovy(trial_script, agent.channel)
        }
        return result
    }
}
In your pipeline:
steps {
    script {
        println(new test_exec(env: env).call('uname -a'))
    }
}

Create custom groovy closure to reuse shell() feature

I am using the Groovy Job DSL plugin on my Jenkins server. I realized this step is repeated in many places and jobs:
steps {
    shell('''#!/bin/bash -ex
        |aws s3 cp s3://${STACK_S3_BUCKET_NAME}/file myfile --region ${AWS_REGION}
        |aws s3 cp s3://${STACK_S3_BUCKET_NAME}/otherfile .
        |
        |'''.stripMargin())
}
I am new to Groovy, and I would like to create a kind of custom Groovy step or closure to avoid this repetition. I would like to do something like:
awsS3cp {
    from: '${ORIGIN}'
    to: '${DESTINATION}'
}
Then implement something like this:
def awsS3cp = { context ->
    shell('''#!/bin/bash -ex
        |aws s3 cp s3://$from $to
        |'''.stripMargin())
}
Attempt 1
I did it this way and it failed:
def awsS3cp(String from, String to) {
    shell("""#!/bin/bash -ex
        |echo 'copy from $from to $to'
        |""".stripMargin())
}

def createEnvironmentJob = freeStyleJob(jobName)
createEnvironmentJob.with {
    description(jobDescription)
    steps {
        awsS3cp("S3-SOURCE-BUCKET", "S3-TARGET-BUCKET")
    }
}
The error output:
No signature of method: create_environment.shell() is applicable for argument types: (java.lang.String) values: [#!/bin/bash -ex
echo 'copy from S3-SOURCE-BUCKET to S3-TARGET-BUCKET'
]
Possible solutions: queue(java.lang.String), sleep(long), every(), grep(), job(java.lang.String), queue(javaposse.jobdsl.dsl.Job)
Finished: FAILURE
One option would be to define a function outside:
def awsS3cp(String from, String to) {
    return sh("""#!/bin/bash -ex
        |aws s3 cp s3://$from $to
        |""".stripMargin())
}
Then, call it inside of the pipeline definition:
steps {
    awsS3cp('source', 'destination')
}

Load file with environment variables Jenkins Pipeline

I am doing a simple pipeline:
Build -> Staging -> Production
I need different environment variables for staging and production, so I am trying to source the variables:
sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh'
But it returns "not found":
[Stack Test] Running shell script
+ source /var/jenkins_home/.envvars/stacktest-staging.sh
/var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: 2: /var/jenkins_home/workspace/Stack Test#tmp/durable-bcbe1515/script.sh: source: not found
The path is right, because I run the same command when I log in via ssh, and it works fine.
Here is the pipeline idea:
node {
    stage name: 'Build'
    // git and gradle build OK
    echo 'My build stage'

    stage name: 'Staging'
    sh 'source $JENKINS_HOME/.envvars/stacktest-staging.sh' // PROBLEM HERE
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To staging
    input message: "Does Staging server look good?"

    stage name: 'Production'
    sh 'source $JENKINS_HOME/.envvars/stacktest-production.sh'
    echo '$DB_URL' // Expects http://production_url/my_db
    sh 'gradle flywayMigrate' // To production
    sh './deploy.sh'
}
What should I do?
I was thinking about not using pipeline (but then I will not be able to use my Jenkinsfile).
Or making different jobs for staging and production, using the EnvInject Plugin (but I lose my stage view).
Or using withEnv (but the code gets big, because today I am working with 12 env vars).
One way you could load environment variables from a file is to load a Groovy file.
For example:
Let's say you have a groovy file in '$JENKINS_HOME/.envvars' called 'stacktest-staging.groovy'.
Inside this file, you define the 2 environment variables you want to load:
env.DB_URL="hello"
env.DB_URL2="hello2"
You can then load this in using:
load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
Then you can use them in subsequent echo/shell steps.
For example, here is a short pipeline script:
node {
    load "$JENKINS_HOME/.envvars/stacktest-staging.groovy"
    echo "${env.DB_URL}"
    echo "${env.DB_URL2}"
}
From the comments to the accepted answer:
Don't use the global 'env' but use the 'withEnv' construct; e.g. see issue #9, "don't set env vars with global env", in the top 10 best practices for the Jenkins pipeline plugin.
In the following example: VAR1 is a plain java string (no groovy variable expansion), VAR2 is a groovy string (so variable 'someGroovyVar' is expanded).
The passed script is a plain java string, so $VAR1 and $VAR2 are passed literally to the shell, and the echos are accessing the environment variables VAR1 and VAR2.
stage('build') {
    def someGroovyVar = 'Hello world'
    withEnv(['VAR1=VALUE ONE',
             "VAR2=${someGroovyVar}"
    ]) {
        def result = sh(script: 'echo $VAR1; echo $VAR2', returnStdout: true)
        echo result
    }
}
For secrets / passwords you can use the credentials binding plugin.
Example:
NOTE: CREDENTIALS_ID1 is a username/password secret registered in the Jenkins settings.
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        echo "User name: $USER"
        echo "Password: $PASSWORD"
    }
}
The Jenkins console log output hides the real values:
[Pipeline] echo
User name: ****
[Pipeline] echo
Password: ****
Jenkins and credentials is a big topic; see the credentials plugin for more.
For completeness: Most of the time, we need the secrets in environment variables, as we use them from shell scripts, so we combine the withCredentials and withEnv like follows:
stage('Push') {
    withCredentials([usernamePassword(
            credentialsId: 'CREDENTIALS_ID1',
            passwordVariable: 'PASSWORD',
            usernameVariable: 'USER')]) {
        withEnv(["ENV_USERNAME=${USER}",
                 "ENV_PASSWORD=${PASSWORD}"
        ]) {
            def result = sh(script: 'echo $ENV_USERNAME', returnStdout: true)
            echo result
        }
    }
}
Another way to resolve this is to install the 'Pipeline Utility Steps' plugin, which provides the readProperties method (for reference see https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#pipeline-utility-steps).
In the example there, the keys are stored in an array and then used to retrieve the values.
The problem in production is that if we later add a variable to the property file, that variable needs to be added to the array in the Jenkinsfile as well.
To get rid of this tight coupling, we can write the code so that the Jenkins build environment automatically picks up all the keys currently present in the property file. Here is an example for reference:
def loadEnvironmentVariables(path) {
    def props = readProperties file: path
    def keys = props.keySet()
    for (key in keys) {
        def value = props["${key}"]
        env."${key}" = "${value}"
    }
}
And the client code looks like:
path = '\\ABS_Output\\EnvVars\\pic_env_vars.properties'
loadEnvironmentVariables(path)
With a declarative pipeline, you can do it in one line (replace path with your value):
script {
    readProperties(file: path).each { key, value -> env[key] = value }
}
Using withEnv() to pass environment variables from a file, split by newline and cast to a List:
writeFile file: 'version.txt', text: 'version=6.22.0'
withEnv(readFile('version.txt').split('\n') as List) {
    sh "echo ${version}"
}
If you are using Jenkins 2.0, you can load the property file (which consists of all the required environment variables along with their corresponding values), read all the environment variables listed there automatically, and inject them into the Jenkins-provided env entity.
Here is a method which performs the above stated action.
def loadProperties(path) {
    properties = new Properties()
    File propertiesFile = new File(path)
    properties.load(propertiesFile.newDataInputStream())
    Set<Object> keys = properties.keySet()
    for (Object k : keys) {
        String key = (String) k
        String value = (String) properties.getProperty(key)
        env."${key}" = "${value}"
    }
}
To call this method, we need to pass the path of the property file as a string variable. For example, in our Jenkinsfile, using a groovy script, we can call it like:
path = "${workspace}/pic_env_vars.properties"
loadProperties(path)
Please ask me if you have any doubts.
Here is a complete example of externalizing environment variables and loading them in Jenkins pipeline execution. The pipeline is written in a declarative style.
stage('Reading environment variable defined in groovy file') {
    steps {
        script {
            load "./pipeline/basics/extenvvariable/env.groovy"
            echo "${env.env_var1}"
            echo "${env.env_var2}"
        }
    }
}
Complete code example:
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/Jenkinsfile
Where variables are loaded from a groovy file placed with the pipeline code only.
https://github.com/dhruv-bansal/jenkins-pipeline-exploration/blob/master/pipeline/basics/extenvvariable/env.groovy
This pattern comes in very handy when you are creating a generic pipeline that could be used across teams.
You can externalize the dependent variables in such a groovy file, and each team can define their values according to their ecosystem.
Another solution is to use a custom method, without granting extra script-security permissions such as for new Properties(), which leads to this error before it is allowed:
org.jenkinsci.plugins.scriptsecurity.sandbox.RejectedAccessException: Scripts not permitted to use new java.util.Properties
or adding extra plugin methods such as readProperties.
Here is a method which reads a simple file named env_vars in this format:
FOO=bar
FOO2=bar
pipeline {
    <... skipped lines ...>
    script {
        loadEnvironmentVariablesFromFile("env_vars")
        echo "show time! ${FOO} ${FOO2}"
    }
    <... skipped lines ...>
}
private void loadEnvironmentVariablesFromFile(String path) {
    def file = readFile(path)
    file.split('\n').each { envLine ->
        def (key, value) = envLine.tokenize('=')
        env."${key}" = "${value}"
    }
}
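One caveat on the method above: tokenize('=') splits on every equals sign, so a value that itself contains = would be truncated. Splitting with a limit of 2 (an alternative sketch, assuming the same KEY=VALUE format) avoids that:
def (key, value) = envLine.split('=', 2) // split only on the first '='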
