Assigning a step to a variable in Jenkins Pipeline

I have the following pipeline script:
node {
    def myStep = sh
    myStep "ls -la"
}
I thought steps were visible as variables and could be assigned to variables so that they can be used later (for example choosing a different step depending on some conditions).
However, this fails with:
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: myStep for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:232)
at org.kohsuke.groovy.sandbox.impl.Checker$6.call(Checker.java:282)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:286)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:28)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at WorkflowScript.run(WorkflowScript:3)
at ___cps.transform___(Native Method)
How can I put a step in a variable to use it later without hardcoding its name?

You can write a method in your pipeline that wraps the behavior you want. It will have access to the script variables.
node {
    myStep("ls -la")
}

def myStep(String script) {
    sh(script)
}

My current workaround:
node {
    def myStep = { script ->
        sh script
    }
    myStep("ls -la")
}
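If the goal is to choose a different step depending on some condition (as mentioned in the question), the same closure trick can select the step up front. A minimal sketch, assuming the built-in isUnix() step is enough to decide between sh and bat:
node {
    // Hypothetical example: pick the step once, call it later.
    def runner = isUnix() ? { cmd -> sh cmd } : { cmd -> bat cmd }
    runner("ls -la")
}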

Related

Cannot move defined variables between stages in JenkinsFile (Scripted)

I have a scripted Jenkinsfile:
node {
    build_id = env.BUILD_ID
    stage("Clone") {
        checkout scm
    }
    stage("Build") {
        def docker_image = docker.build("articlestream:${env.BUILD_ID}")
    }
    stage("test") {
        // we should do testing here in the future
    }
    stage("Docker Push") {
        // our Registry
        docker.withRegistry("https://localhost:4000", "docker-registry-credentials") {
            docker_image.push("latest")
        }
    }
}
My expectation was that docker_image, declared with def, would be accessible in the "Docker Push" stage. However, I get the following error:
groovy.lang.MissingPropertyException: No such property: docker_image for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:270)
at org.kohsuke.groovy.sandbox.impl.Checker$6.call(Checker.java:291)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:295)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:271)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:271)
at org.kohsuke.groovy.sandbox.impl.Checker.checkedGetProperty(Checker.java:271)
at com.cloudbees.groovy.cps.sandbox.SandboxInvoker.getProperty(SandboxInvoker.java:29)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at WorkflowScript.run(WorkflowScript:17)
which suggests that docker_image is not a variable at this stage. I can confirm that Docker is building an image in the Build stage (checked the logs). Can someone help me rephrase the docker_image.push("latest") line so that I can push?
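A minimal sketch of the usual fix, assuming the rest of the pipeline stays as above: declare docker_image before the stage blocks so it is not local to the Build stage:
node {
    def docker_image          // declared once, visible in every stage below
    stage("Build") {
        docker_image = docker.build("articlestream:${env.BUILD_ID}")
    }
    stage("Docker Push") {
        docker.withRegistry("https://localhost:4000", "docker-registry-credentials") {
            docker_image.push("latest")
        }
    }
}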

How to execute Jenkins shared library functions on slave instead of master?

I need to write a shared library that reads files in the build workspace, but the shared library functions cannot read the files because the pipeline runs on a slave while the shared library code is executed on the master. Is there any way to change the execution context of library functions?
Found out the answer: you can read the library resource and hand it to the writeFile pipeline step, then execute the written file:
writeFile(file: "foo.groovy", text: libraryResource("bar.groovy"))
sh "groovy foo.groovy"
writeFile needs BOTH parameters as named parameters, so the answer given in https://issues.jenkins-ci.org/browse/JENKINS-54646 is not fully right.
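Put together inside a shared library step, a minimal sketch might look like this (the vars/runBarScript.groovy step name and the bar.groovy resource name are placeholders, and it assumes the agent has groovy on its PATH):
// vars/runBarScript.groovy (hypothetical)
def call() {
    node {
        // libraryResource loads the file bundled with the library;
        // writeFile and sh run in the allocated node's workspace
        writeFile(file: "foo.groovy", text: libraryResource("bar.groovy"))
        sh "groovy foo.groovy"
    }
}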
To execute Jenkins shared library functions on a slave instead of the master, you can wrap the body in a node("slaveName") block inside the call:
def call(Map config = [:], Closure body) {
    def label = 'slave'
    node("${label}") {
        stage('Sonarqube') {
            withSonarQubeEnv('Sonar8') {
                withMaven(maven: 'apache-maven') {
                    sh 'mvn sonar:sonar -Dmaven.test.skip=true -Dsonar.java.binaries=./target'
                }
            }
        }
    } // node
} // call
You can actually do it without writeFile. This shared library code executes on the master, but it uses RemotingDiagnostics to run commands on the slave. For example, to execute uname -a on the worker node:
import hudson.util.RemotingDiagnostics
import jenkins.model.Jenkins
import com.cloudbees.groovy.cps.NonCPS

class test_exec {
    def env                    // must be set to the pipeline's env before call()
    def propertiesFilePath

    @NonCPS
    def call(cmd) {            // cmd is not used here; the script below hardcodes uname -a
        // Groovy snippet that will be evaluated over the agent's remoting channel
        def trial_script = """
        println "uname -a".execute().text
        """.trim()
        String result
        Jenkins.instance.slaves.find { agent ->
            agent.name == "${env.NODE_NAME}"
        }.with { agent ->
            result = RemotingDiagnostics.executeGroovy(trial_script, agent.channel)
        }
        return result
    }
}
In your pipeline:
steps {
    println(new test_exec().call())
}

Load env variables successfully in Jenkins pipeline but not while the pipeline was used as shared library

In one stage of my declarative Jenkins pipeline, I execute a bash script (sh '''./a.sh'''; the script "a.sh" is maintained outside the pipeline). In that script, the value of "jarVersion" is written to ${WORKSPACE}/.jarVersion (echo "jarVersion=${jarVersion}" > ${WORKSPACE}/.jarVersion). At a later stage we need the value of jarVersion, so we use load "${WORKSPACE}/.jarVersion" and then ${jarVersion} to read it. This works when everything is in the pipeline script.
However, when we move this pipeline into a shared library (put it in /vars/testSuite.groovy) and call it from another pipeline script, it no longer recognizes the variable ${jarVersion}.
Please advise how to solve this issue. The general question is: how do you transfer a value from stage A to stage B within a script?
stage('getJarVersion') {
    steps {
        script {
            load "${WORKSPACE}/.jarVersion"
            currentBuild.description = "jarVersion:${jarVersion}"
        }
    }
}
I expected it to work the same way it does in the pipeline script.
But it shows:
groovy.lang.MissingPropertyException: No such property: jarVersion for class: testSuite
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:458)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:34)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at testSuite.call(/jenkins/jobs/TestSuite1/builds/11/libs/pipelineUtilities/vars/testSuite.groovy:84)
With the stages in the same Groovy file, you have to declare the variable outside the stage blocks, before the node block. Each stage can then assign a value to it:
def my_var
node {
    stage('stage1') {
        // assign my_var here
    }
    stage('stage2') {
        // my_var is still visible here
    }
}
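A filled-in sketch of that pattern (the stage names and the shell command are only illustrative):
def jarVersion
node {
    stage('build') {
        // value produced in one stage...
        jarVersion = sh(script: 'echo 1.2.3', returnStdout: true).trim()
    }
    stage('report') {
        // ...is readable in a later stage, because jarVersion is declared outside the stages
        currentBuild.description = "jarVersion:${jarVersion}"
    }
}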
If you are defining one stage per file, you have to create closures that take the input object and pass it in the call from the parent Groovy file:
test.groovy:
def call(def my_obj, String my_string) {
    stage('my_stage') {
        println(my_obj)
    }
}
parent_test.groovy:
test(obj_value, string_value)

Workspace path in job DSL within Jenkins pipeline

I am creating a Jenkins pipeline job to seed jobs using the Jenkins Job DSL plugin. How do I get the workspace path inside the DSL file? The Jenkins pipeline code is as follows:
#!groovy
node {
    stage("build jobs") {
        ws {
            git poll: true, credentialsId: 'xxx', url: 'ssh://git#aaaaa.cc.xxx.com:/xxx/xxx.git'
            checkout scm
            jobDsl(removedJobAction: 'DISABLE', removedViewAction: 'DELETE', targets: 'jobs/*.groovy', unstableOnDeprecation: true)
        }
    }
}
The DSL code that is failing is:
hudson.FilePath workspace = hudson.model.Executor.currentExecutor().getCurrentWorkspace()
With the error:
Processing DSL script pipeline.groovy
java.lang.NullPointerException: Cannot invoke method getCurrentWorkspace() on null object
at org.codehaus.groovy.runtime.NullObject.invokeMethod(NullObject.java:91)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:48)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.NullCallSite.call(NullCallSite.java:35)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at pipeline.run(pipeline.groovy:1)
at pipeline$run.call(Unknown Source)
Variables created in the pipeline are not accessible inside the Job DSL step.
I stumbled upon this problem too; there seems to be no good way. Here is how I do it:
node {
    stage('test') {
        sh 'pwd > workspace.txt'
        jobDsl scriptText: '''
            String workspace = readFileFromWorkspace('workspace.txt').trim()
            def file = new File(workspace, 'test.txt')
            file.append('It worked!')'''
    }
}
So first grab the workspace in the pipeline script, then pass it to the Job DSL. If you need more than just the workspace variable in your scripts, I suggest transferring the values via a properties file:
node {
    stage('test') {
        sh 'echo "workspace="$(pwd) > build.properties'
        jobDsl scriptText: '''
            Properties props = new Properties()
            props.load(streamFileFromWorkspace('build.properties'))
            def file = new File(props.getProperty('workspace'), 'test.txt')
            file.append('It worked!')'''
    }
}
This can be achieved by using the SEED_JOB variable:
String workspacePath = SEED_JOB.lastBuild.checkouts[0].workspace
It is described in the project's wiki:
Access to the seed job is available through the SEED_JOB variable. The variable contains a reference to the internal Jenkins object that represents the seed job. The actual type of the object depends on the type of job that runs the DSL. For a freestyle project, the object is an instance of hudson.model.FreeStyleProject. See the Jenkins API Documentation for details.
The SEED_JOB variable is only available in scripts, not in any classes used by a script. And it is only available when running in Jenkins, e.g. in the "Process Job DSLs" build step.
The following example shows how to apply the same quiet period for a generated job as for the seed job.
job('example') {
    quietPeriod(SEED_JOB.quietPeriod)
}
You can use the __FILE__ variable in a Job DSL script to get the path of the current script. Maybe you can use that to derive the workspace directory. See Script Location for details.
def scriptDir = new File(__FILE__).parent.absolutePath
You can pass the workspace as an argument to the Job DSL step. For example, the pipeline code is as follows:
node {
    step([
        $class: 'ExecuteDslScripts',
        scriptText: 'job("example-2")'
    ])
    step([
        $class: 'ExecuteDslScripts',
        targets: ['jobs/projectA/*.groovy', 'jobs/common.groovy'].join('\n'),
        removedJobAction: 'DELETE',
        removedViewAction: 'DELETE',
        lookupStrategy: 'SEED_JOB',
        additionalClasspath: ['libA.jar', 'libB.jar'].join('\n'),
        additionalParameters: [
            message: 'Hello from pipeline',
            credentials: 'SECRET',
            WORKSPACE: env.WORKSPACE
        ]
    ])
}
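On the Job DSL side, the additional parameters become variables in the seeded script, so a hypothetical jobs/common.groovy could use the passed workspace like this (a sketch; the job name and description are placeholders):
// jobs/common.groovy (illustrative)
job('uses-workspace') {
    description("Seeded from workspace: ${WORKSPACE}")
}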
https://github.com/jenkinsci/job-dsl-plugin/wiki/User-Power-Moves#use-job-dsl-in-pipeline-scripts

Usage of Global property in Pipeline job

I want to use a global variable that I have created in my Jenkins configuration as follows:
My question is: how can I use it in my Pipeline (aka Workflow) job? I'm doing something like:
When I ran it, it displayed:
[Pipeline] node
Running on master in /opt/devops/jenkins_home/jobs/siman/jobs/java/jobs/demo-job/workspace
[Pipeline] {
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
groovy.lang.MissingPropertyException: No such property: PRODUCTION_MAILS for class: groovy.lang.Binding
at groovy.lang.Binding.getVariable(Binding.java:63)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.SandboxInterceptor.onGetProperty(SandboxInterceptor.java:224)
Instead, if I create a "Free Style Project", I can use the global property as follows without problems:
When I run it, it displays the value if I do an "echo", as follows:
This is how I achieved it:
node('master') {
    echo "${env.PRODUCTION_MAILS}"
}
Or you could use:
echo "This is the value: " + PRODUCTION_MAILS
That should work.
