Jenkins parallel streams with local files?

My parallel steps need access to a local file in the workspace of the job, but it seems they can't access it.
I tried listing the workspace in each stream:
powershell "ls ${workspace}"
They are all empty! The output of ls in each stream's workspace, C:\workspace\branch_name#<stream#>, shows no files.
How do they get access to the workspace? They're pretty much useless to me if they can't even access local files.
Is there a feature to copy files from the main workspace to the stream workspaces?

Since PowerShell support was recently introduced and there is a powershell step (issue JENKINS-34581), do check that you are in the same path each time:
node {
    powershell '$(pwd).Path'
    bat 'echo %cd%'
}
And check that you have checked out a repository first (or your workspace would be empty anyway).

The way you handle this in Jenkins is to stash and unstash the files for each parallel job.
https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-stash-code-stash-some-files-to-be-used-later-in-the-build
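For illustration, a scripted sketch of that pattern: check out once, stash the workspace, then unstash it inside each parallel branch. The branch names and the 'windows' node label below are made up for the example:

node {
    checkout scm                           // populate the main workspace
    stash name: 'sources', includes: '**'  // capture its files
}
parallel(
    streamA: {
        node('windows') {
            unstash 'sources'              // copies the stashed files into this branch's workspace
            powershell 'ls'
        }
    },
    streamB: {
        node('windows') {
            unstash 'sources'
            powershell 'ls'
        }
    }
)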

Related

Why does Jenkins shell file copy not work as expected (does not overwrite existing files)

I have a step in a Jenkins pipeline to copy some source files to the workspace.
stage('Copy Files') {
    script {
        echo 'Staging files'
        sh "cp -ar /home/dev/src/ ${env.WORKSPACE}"
    }
}
Yet, when I rerun the build it uses the old code. The only solution is to delete the workspace prior to the copy. On a normal Linux file system a copy will overwrite the destination. Why does Jenkins behave differently, i.e. why are old files not overwritten? From the syntax it seems like it is just running a shell command, so why does this not have the expected behavior?
It is because Jenkins runs on the master node while the workspace lives on the worker node.
When the checkout scm and sh "" code blocks are in different stages, files are not carried over from the first stage to the others. You should use stash & unstash: when you stash a directory path, the files in that directory become available to the unstash step in later stages.
Jenkins doc: https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-stash-code-stash-some-files-to-be-used-later-in-the-build
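A minimal declarative sketch of that pattern, with a placeholder agent label: the first stage checks out and stashes the sources, and the later stage unstashes them before use:

pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent { label 'linux' }   // placeholder label
            steps {
                checkout scm
                stash name: 'src', includes: '**'
            }
        }
        stage('Copy Files') {
            agent { label 'linux' }
            steps {
                unstash 'src'         // restores the stashed files into this stage's workspace
                sh 'ls -la'
            }
        }
    }
}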

Copy file from Jenkins master to slave in Pipeline

I have some Windows slaves at my Jenkins, so I need to copy files to them in a pipeline. I heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
A direct copy doesn't work:
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy a file from the master to a slave in a Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, and therefore the source path is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also, this example might be helpful.
The Pipeline DSL context runs on the master node even when you write node('someAgentName') in your pipeline. A few options:
Try stash/unstash, but it is bad for large files.
Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
Try intermediate storage: archive() and sh("wget $url") will be helpful.
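As a rough sketch of the intermediate-storage option (the file name and node labels are illustrative; the agent must be able to reach the Jenkins URL, and curl is assumed to be on the Windows agent's PATH):

node('linux') {
    writeFile file: 'parameters.xml', text: '<params/>'  // illustrative file
    archiveArtifacts artifacts: 'parameters.xml'         // store it on the master
}
node('windows') {
    // download the archived artifact from the master over HTTP
    bat "curl -o parameters.xml ${env.BUILD_URL}artifact/parameters.xml"
}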
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (a normal Windows shared folder).
After build: the build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this.
stage('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
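For illustration, the copy from the "After build" step can also be inlined in the pipeline rather than kept in a separate batch file; the UNC path below is a made-up example of such a shared folder:

stage('Copy to slaves') {
    steps {
        // \\test-slave-01\drop is a hypothetical Windows share on a test slave
        bat 'copy /Y build\\app.exe \\\\test-slave-01\\drop\\'
    }
}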
I recommend the Pipeline: Phoenix AutoTest plugin.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin

Jenkins pipeline stages - passing whole file

Running a Jenkins pipeline (based on Groovy) with stages containing many nodes, I need to pass a list from a file on node A in stage A to node B in stage B.
In stage A, on node A, I run
DEVenv = readFile 'somefile.txt'
In stage B I run
println DEVenv
So far so good, I get the output in the console.
Now how do I get the output of that println DEVenv into a file?
println DEVenv > otherfile.txt
doesn't do the trick :-(
I'm sure it's not such a big deal, but I've been scouring the internet for a couple of hours to no avail.
You can write content to a file using the writeFile step:
writeFile file: 'otherfile.txt', text: DEVenv
Btw: in order to transfer workspace contents to another node, you are supposed to use the stash/unstash steps (not sure if you use that already).
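Putting the two together, a scripted sketch (the node labels are placeholders) reads the file on one node and writes it out on another:

def DEVenv
node('nodeA') {
    DEVenv = readFile 'somefile.txt'   // capture the file content as a String
}
node('nodeB') {
    writeFile file: 'otherfile.txt', text: DEVenv
}

For whole directories or binary files, stash on node A and unstash on node B instead.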

Execute a script from a Jenkins pipeline

I have a Jenkins pipeline that builds a Java artifact, copies it to a directory, and then attempts to execute an external script.
I am using this syntax within the pipeline script to execute the external script:
dir('/opt/script-directory') {
    sh './run.sh'
}
The script is just a simple docker build script, but the build will fail with this exception:
java.io.IOException: Failed to mkdirs: /opt/script-directory#tmp/durable-ae56483c
The error is confusing because the script does not create any directories. It is just building a docker image and placing the freshly built java artifact in that image.
If I create a different job in Jenkins that executes the external script as its only build step, and then call that job from my pipeline script using this syntax:
build 'docker test build'
everything works fine: the script executes within the other job and the pipeline continues as expected.
Is this the only way to execute a script that is external to the workspace? What am I doing wrong with my attempt at executing the script from within the pipeline script?
The issue is that the jenkins user (or whatever user runs the Jenkins slave process) does not have write permission on /opt, and the sh step wants to create the script-directory#tmp/durable-ae56483c subdirectory there.
Either remove the dir block and use the absolute path to the script:
sh '/opt/script-directory/run.sh'
or give the jenkins user write permission on /opt (not preferred, for security reasons).
This looks like a bug in Jenkins; durable directories are meant to store recovery information, e.g. before executing an external script using sh.
For now, all you can do is make sure that /opt/script-directory has +r, +w and +x set for the jenkins user.
Another workaround is not to change the current directory at all and just invoke sh with the full path:
sh '/opt/script-directory/run.sh'
I had a similar concern when trying to execute a script in a Jenkins pipeline using a Jenkinsfile.
I was trying to run a script restart_rb.sh with sudo.
To run it, I specified the present working directory ($PWD):
sh 'sudo sh $PWD/restart_rb.sh'
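A hedged variation on the same idea, assuming the script does not depend on its original location: copy it into the (writable) workspace first, so that sh's durable #tmp directory is created in the workspace rather than under /opt:

node {
    sh 'cp /opt/script-directory/run.sh .'  // bring the script into the workspace
    sh './run.sh'                           // the durable dir now lands in the workspace
}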

How to re-use groovy script in Jenkins Groovy Post Build plugin?

I have some Groovy code which I am planning to re-use in the Groovy Post Build plugin of multiple Jenkins jobs. How can I achieve this? Is there a place I can store the script in a global variable and call it in the jobs wherever I need it?
You can load any Groovy file living on the Jenkins master within the Groovy Postbuild step and execute it. For example, you could have a special directory on the C: drive where all the common scripts live. I'll update my answer later with some code that shows you how to load the script in.
Update
Assuming you have a test.groovy file on your C: drive, it should be as simple as the following in Groovy Postbuild:
evaluate(new File("C:\\test.groovy"))
Please see the comment section of the Groovy Postbuild plugin page for more examples and possibly other ways.
Here is the solution that worked for me:
I installed the Scriptler plugin for Jenkins and saved the Groovy script in it. The script is then available in the JENKINS_HOME/scriptler/scripts directory. This way we avoid the manual step of copying files to the Jenkins master.
Then I used the Groovy file in the post-build step:
def env = manager.build.getEnvironment(manager.listener)
evaluate(new File(env['JENKINS_HOME'] + "\\scriptler\\scripts\\GroovyForPostBuild.groovy"))
This is a copy of my answer to this similar question on StackOverflow:
If you wish to have the Groovy script in your code repository, loaded onto the Build/Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the master.
For us, the master is a Unix server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the slave and use a FilePath to the file.
The following worked for us:
// Get an instance of the Build object, and from there
// the channel from the master to the workspace
build = Thread.currentThread().executable
channel = build.workspace.channel

// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")

// Some have suggested that the "not null" check is redundant,
// but I've kept it for completeness
if (fp != null) {
    // 'evaluate' requires a string, so read the file contents into a String
    script = fp.readToString()
    // Execute the script
    evaluate(script)
}
