I have some Windows slaves attached to my Jenkins, so I need to copy files to them in a pipeline. I have heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
A direct copy doesn't work:
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy file from master to slave in Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, so the source path (which is on the master) is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also this example might be helpful.
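As a minimal sketch of that approach (the node labels and the stash name here are placeholders):
node('master') {
    // produce or fetch the file in a workspace on the master;
    // the file content here is purely illustrative
    writeFile file: 'parameters.xml', text: '<params/>'
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    unstash 'params'
    // parameters.xml now exists in this node's workspace
}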
The Pipeline DSL context runs on the master node even when you write node('someAgentName') in your pipeline.
Try stash/unstash, but it is bad for large files.
Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
Try using intermediate storage: archive() and sh("wget $url") will be helpful.
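A rough sketch of the intermediate-storage option (archiveArtifacts is the current name of the archive step; the job name in the wget URL is an assumption, adjust it to your instance):
node('master') {
    writeFile file: 'parameters.xml', text: '<params/>'   // illustrative content
    archiveArtifacts artifacts: 'parameters.xml'
}
node('linux') {
    // archived artifacts are served over HTTP by the master
    sh "wget ${env.JENKINS_URL}job/myjob/lastSuccessfulBuild/artifact/parameters.xml"
}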
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (a normal Windows shared folder).
After build: The build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this.
stage ('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
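For reference, the copy can also be done inline instead of through a separate batch file; a sketch assuming a hypothetical \\testslave1\drop share:
stage ('Copy to slaves') {
    steps {
        // \\testslave1\drop stands in for the shared folder on a test slave
        bat 'copy /Y build\\MyApp.exe \\\\testslave1\\drop\\'
    }
}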
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
For Pipeline, I recommend the Phoenix AutoTest plugin.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin
I am using Copy Artifact 1.46 in Jenkins 2.263.4 and want to copy a file from one job to another. However, it fails to do so. The error is always:
ERROR: Failed to copy artifacts from TestPack with filter: **
I have tried this with both a scripted pipeline job and a freestyle one, on both Windows and CentOS, but with the same result. I know it has found the job, because I get an error if the job name is wrong. The job I want to copy from has only a single text file in its root directory.
My pipeline script is:
node ("${env.Node}") {
stage('dodeploy') {
copyArtifacts(projectName: 'TestPack');
}
}
I have tried copyArtifacts with and without a filter, and with and without a target. In the freestyle project I tried similar settings, but got exactly the same error.
I feel I must be missing something obvious, but cannot see what.
Turns out that I was not interpreting the 'Artifacts' part of 'copyArtifacts' literally enough. It looks as though copyArtifacts can only copy files that have previously been archived in a post-build step (or pipeline stage).
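For example, the upstream TestPack job needs an archiving step like the following before copyArtifacts has anything to pick up (the file name is illustrative):
node {
    stage('Build') {
        writeFile file: 'output.txt', text: 'some build output'
        // without this step the build publishes no artifacts,
        // and copyArtifacts in a downstream job fails
        archiveArtifacts artifacts: 'output.txt'
    }
}
With that in place, copyArtifacts(projectName: 'TestPack') should find the file.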
I have a multibranch pipeline in Jenkins. I want to include my script file (Jenkinsfile) as an svn:externals file in my development branches, to keep the script centralized for all branches. Unfortunately, the multibranch pipeline scan is not able to find the script file, as it only looks inside the declared branch and not in the included svn:externals locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
  scripts/
    jenkinsfile
  code/
    version1/
      branchX/
        ...
    version11/
      branchY/
        ...
SVN external property for branchX, branchY, etc.
Local path: jenkinsfile
URL: ^/scripts/jenkinsfile
Revision Peg: 12345
Multibranch job configuration:
Subversion
Project Repository Base: http://.../root/code/
Include branches: version1/branchX, version11/branchY
Build configuration
Mode: by Jenkinsfile
Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the Subversion SCM plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml.)
But Jenkins is still not able to find the script. In fact, if I put my script directly into e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first to read the script file, and the second because checkout is the first stage in the script itself).
Or maybe my whole setup is wrong, or not the ideal way of doing this?
I would appreciate your help and tips. Thanks and greetings!
If you are working on a Linux or BSD (OS X) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available in all other branches (the file system will keep a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link would be:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks.
I am trying to configure a Jenkins seed job whose whole business is in the provided DSL script. I want to separate that script from its configuration, which I want to put in an additional YAML file. When I try to read that file:
@Grab('org.yaml:snakeyaml:1.17')
import org.yaml.snakeyaml.Yaml
def workDir = SEED_JOB.getWorkspace()
def config = new Yaml().load(("${workDir}/config.yml" as File).text)
I receive the error:
java.io.FileNotFoundException: /var/lib/jenkins/workspace/test.dsl/config.yml (No such file or directory)
I suppose that Jenkins is looking for the file on the master host, not on the agent node where the workspace is located.
Is it possible to read a YAML file in a DSL build step on the agent node? Or do I have to always execute that seed job on my master host?
This seems not to be possible, as the Job DSL script is executed on the master. You can try to force the job to run on the master with the label master.
From the documentation, in the section Script location:
Job DSL scripts are executed on the Jenkins master node, but the seed job's workspace which contains the script files may reside on a build node. This means that direct access to the file specified by FILE may not be possible from a DSL script. See Distributed builds for details.
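A sketch of that workaround, expressing the seed job in Job DSL itself with a master label so that its workspace (and thus config.yml) is local to the master (the job and script names are illustrative):
job('seed') {
    label('master')   // pin the seed job to the master node
    steps {
        dsl {
            external('seed.groovy')   // the DSL script that reads config.yml
        }
    }
}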
I have a simple Jenkinsfile in which I want to load some data from the workspace. I am using the Pipeline plugin to leverage the Jenkinsfile inside the repository. The build is farmed off to a matching Jenkins agent. When I try to use readFile, I get the following message:
java.io.FileNotFoundException: /path/to/jenkins/workspace/XXXXX/project/data.json (No such file or directory)
I also get the same message when trying to load a Groovy file from the workspace.
My Jenkinsfile looks like:
node('master') {
    stage "Start"
    echo "Starting"

    stage "Load File"
    def myJson = readFile "data.json"
}
Any ideas why I can't read these files?
Thanks,
Tim
When Jenkins processes a Jenkinsfile, it does not automatically pull down the entire source repository. You need to execute checkout scm to pull down the contents of the repository; if you fail to do so, no other files will be available to the pipeline script.
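Applied to the Jenkinsfile above, that would look like:
node('master') {
    stage "Start"
    echo "Starting"

    stage "Checkout"
    checkout scm   // pulls the repository contents into this workspace

    stage "Load File"
    def myJson = readFile "data.json"
}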
I'm new to Jenkins. I have a requirement where I need to run part of a job on the Master node and the rest on a slave node.
I tried searching on forums but couldn't find anything related to that. Is it possible to do this?
If not, I'll have to break it into two separate jobs.
EDIT
Basically, I have a job that checks out source code from SVN, then compiles and builds jar files. After that, it builds a Wise installer for this application. I'd like to do the source checkout and compilation on the master (Linux) and delegate the Wise installer setup to a Windows slave.
It's definitely easier to do this with two separate jobs; you can make the master job trigger the slave job (or vice versa).
If you publish the files that need to be bundled into the installer as build artifacts from the master build, you can pull them onto the slave via a Jenkins URL and create the installer there (archived artifacts are served under JENKINS_URL/job/<job-name>/lastSuccessfulBuild/artifact/<path>). Use the "Archive artifacts" post-build action in the master build to do this.
The Pipeline Plugin allows you to write jobs that run on multiple slave nodes. You don't even have to create other separate jobs in Jenkins: just write another node statement in the Pipeline script, and that block will run on an assigned node. You can specify labels if you want to restrict the type of node it runs on.
For example, this Pipeline script will execute parts of it on two different nodes:
node('linux') {
    git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
    sh "make"
    step([$class: 'ArtifactArchiver', artifacts: 'build/program', fingerprint: true])
}
node('windows && amd64') {
    git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
    bat "mytest.exe"   // use bat rather than sh on a Windows node
}
More information is available in the Pipeline plugin tutorial. (Note that the plugin was previously called the Workflow Plugin.)
You can use the Multijob plugin, which adds the idea of a build phase that runs other jobs in parallel as a build step. You can still continue to use the regular freestyle job build and post-build options as well.