How to configure Jenkins "Pipeline script from SCM" with SVN

In my Jenkins project, I've configured "Pipeline script from SCM" with Subversion as the SCM, a Repository URL pointing at a repository on our test VisualSVN server (for the sake of discussion, call it https://foo.bar.com/svn/Kofax/), and a Script Path of "Jenkins files/Jenkinsfile".
When I trigger a build, I get the console output below. The repository path appears to have been improperly concatenated (it contains two copies of 'svn'), but I don't know where the extraneous characters are coming from:
Started by remote host
org.tmatesoft.svn.core.SVNException: svn: E160013: '/svn/Kofax/!svn/bc/10/Jenkins%20files/Jenkinsfile' path not found: 404 Not Found (https://foo.bar.com)
I've tried removing the repository name from the Repository URL, but then Jenkins says it can't connect to the repository (as you might expect). I've also tried using an underscore rather than a space in the Script Path.
Any suggestions would be appreciated.

Unchecking the Lightweight Checkout option allowed the build to continue, so I think this might be a bug in Jenkins.
Also it appears that the repository name has to be part of the Script Path, so in my case the correct Script Path was Kofax/Jenkins_files/JenkinsFile.
Hope this helps some other Jenkins newbie.
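For reference, the combination of settings from this question and answer that ended up working was:
    Repository URL: https://foo.bar.com/svn/Kofax/
    Script Path: Kofax/Jenkins_files/JenkinsFile
    Lightweight checkout: unchecked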

Related

Jenkins Copy Artifact - does not copy any files

I am using Copy Artifact 1.46 in Jenkins 2.263.4 and want to copy a file from one job to another. However, it fails to do so. The error is always:
ERROR: Failed to copy artifacts from TestPack with filter: **
I have tried this with both a scripted pipeline job and a freestyle one, on both Windows and CentOS, with the same result. I know it has found the job, because I get an error if the job name is wrong. The job I want to copy from only has a single text file in its root directory.
My pipeline script is:
node("${env.Node}") {
    stage('dodeploy') {
        copyArtifacts(projectName: 'TestPack')
    }
}
I have tried copyArtifacts with and without a filter, and with and without a target. In the freestyle project I tried similar settings, but get exactly the same error.
Feel I must be missing something obvious, but cannot see what.
Turns out that I was not interpreting the 'Artifacts' part of 'copyArtifacts' literally enough. It looks as though copyArtifacts can only copy files that have previously been archived in a post-build step (or pipeline stage).
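For example (file name hypothetical), the upstream job first has to archive the file; after that, the copyArtifacts call from the question works as-is:
// upstream job 'TestPack': archive the file so Copy Artifact can find it
node {
    stage('archive') {
        writeFile file: 'settings.txt', text: 'example content'  // hypothetical file
        archiveArtifacts artifacts: 'settings.txt'
    }
}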

Jenkins Multibranch Pipeline: script / jenkinsfile as svn external

I have a multibranch pipeline in Jenkins. I want to include my script file (jenkinsfile) as an svn:externals definition in my development branches, to keep the script centralized for all branches. Unfortunately, the multibranch pipeline scan isn't able to find the script file, as it only looks inside the declared branch and not in the included svn external locations.
Does anyone have an idea how I can fix this?
Below is an example of my svn structure, job config and further information.
SVN:
root/
    scripts/
        jenkinsfile
    code/
        version1/
            branchX/
            ...
        version11/
            branchY/
            ...
SVN external property for branchX, branchY, etc.:
    Local path: jenkinsfile
    URL: ^/scripts/jenkinsfile
    Revision Peg: 12345
Multibranch job configuration:
    Subversion
        Project Repository Base: http://.../root/code/
        Include branches: version1/branchX, version11/branchY
    Build configuration
        Mode: by Jenkinsfile
        Script path: jenkinsfile
Log message of scan in multibranch pipeline:
...
Checking candidate branch /code/version1/branchX#HEAD
‘jenkinsfile’ not found
Does not meet criteria
...
I already tried to disable the lightweight checkout of the subversion scm plugin according to this advice:
Multibranch pipeline with jenkinsfile in svn:external
(I've added -Djenkins.scm.impl.subversion.SubversionSCMFileSystem.disable=true under <service><arguments>... in jenkins.xml)
But Jenkins is still not able to find the script. And in fact, if I put my script directly in e.g. branchX, the disabled lightweight checkout leads to a double checkout into my workspace (the first to read the script file, and the second because checking out is the first stage in the script itself).
Maybe my whole setup is wrong too or not the ideal way of doing?
I would be pleased about your help and tips. Thanks and Greetings!
If you are working on a Linux or BSD (OSX) system, you could create a hard link from root/scripts/jenkinsfile to root/code/version#/branchX/jenkinsfile for each active branch.
That way, each branch will have its own jenkinsfile available locally, enabling you to use the lightweight checkout, and any change you introduce to the jenkinsfile in any location will be available to all other branches (the file system keeps a single copy of the file, regardless of it being accessible from many different locations).
The bash command to create such a link is:
ln root/scripts/jenkinsfile root/code/version#/branchX/jenkinsfile
You will need to remember to create a new link each time a branch is created, or automate that using hooks; a sketch of bulk-creating the links follows.
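For instance, a sketch that links every branch in the directory layout above (run from the directory containing root/):
for branch in root/code/*/*/ ; do
    # hard-link the central jenkinsfile into each branch that lacks one
    [ -e "${branch}jenkinsfile" ] || ln root/scripts/jenkinsfile "${branch}jenkinsfile"
done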

Copy file from Jenkins master to slave in Pipeline

I have some Windows slaves in my Jenkins setup, so I need to copy files to them in a pipeline. I've heard about the Copy To Slave and Copy Artifact plugins, but they don't have a pipeline syntax manual, so I don't know how to use them in a pipeline.
Copying directly doesn't work.
def inputFile = input message: 'Upload file', parameters: [file(name: 'parameters.xml')]
new hudson.FilePath(new File("${ENV:WORKSPACE}\\parameters.xml")).copyFrom(inputFile)
This code returns an error:
Caused: java.io.IOException: Failed to copy /var/lib/jenkins/jobs/_dev/jobs/(TEST)job/builds/107/parameters.xml to d:\Jenkins\workspace\_dev\(TEST)job\parameters.xml
Is there any way to copy file from master to slave in Jenkins Pipeline?
As I understand it, copyFrom is executed on your Windows node, and therefore the source path (on the master) is not accessible.
I think you want to look into the stash/unstash steps (Jenkins Pipeline: Basic Steps), which work across different nodes. Also this example might be helpful.
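A minimal stash/unstash sketch (node labels hypothetical): stash on the node that has the file, unstash on the Windows node.
node('master') {
    // make parameters.xml available to later node blocks of this build
    stash name: 'params', includes: 'parameters.xml'
}
node('windows') {
    // recreate parameters.xml in this node's workspace
    unstash 'params'
}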
The Pipeline DSL context runs on the master node even when you write node('someAgentName') in your pipeline.
Try stash/unstash, but note it is bad for large files.
Try the External Workspace Manager Plugin. It has pipeline steps and is good for large files.
Try using intermediate storage: archive() and sh("wget $url") will be helpful. A sketch of this last option follows.
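A sketch of the intermediate-storage idea (node labels hypothetical, and assuming curl is available on the Windows node and artifacts are readable without extra credentials):
node('master') {
    // archived artifacts are served by Jenkins over HTTP
    archiveArtifacts artifacts: 'parameters.xml'
}
node('windows') {
    // the current build's artifacts live under BUILD_URL/artifact/
    bat "curl -O ${env.BUILD_URL}artifact/parameters.xml"
}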
If the requirement is to copy an executable to the test slave and to publish the test results, this is easy to do without the Copy to Slave plugin.
A shared folder should be created on each test slave (normal Windows shared folder).
After build: the build script copies the executable to the shared directory on each slave. A simple batch script using the copy command is sufficient for this (a sketch follows the stage below).
stage('Copy to slaves') {
    steps {
        bat 'call "copy-to-slave.bat"'
    }
}
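A sketch of what copy-to-slave.bat might contain (executable path and share names hypothetical):
REM copy the built executable to the shared folder on each test slave
copy /Y build\app.exe \\testslave1\jenkins-share\
copy /Y build\app.exe \\testslave2\jenkins-share\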
During test: The test script copies the executable to another directory and runs it.
After test: Post-build action "Publish Robot Framework test results" can be used to report the test results. It is not necessary to copy the test result files back to the master first.
I recommend the Pipeline: Phoenix AutoTest plugin.
Jenkins plugin website:
https://plugins.jenkins.io/phoenix-autotest/#documentation
GitHub repository of plugin:
https://github.com/jenkinsci/phoenix-autotest-plugin

Jenkins Pipeline as Code with Docker Error

For one of my projects that I have on GitHub, I wanted to build it as a Docker image and push it to my Docker Hub. The project is an sbt one with a Scala codebase.
Here is how my JenkinsFile is defined:
#!groovy
node {
    // set this in Jenkins server under Manage Jenkins > Credentials > System > Global Credentials
    docker.withRegistry('https://hub.docker.com/', 'joesan-docker-hub-credentials') {
        git credentialsId: '630bd271-01e7-48c3-bc5f-5df059c1abb8', url: 'https://github.com/joesan/monix-samples.git'
        sh "git rev-parse HEAD > .git/commit-id"
        def commit_id = readFile('.git/commit-id').trim()
        println commit_id
        def app  // declared here so both stages can see it
        stage('build') {
            app = docker.build "Monix-Sample"
        }
        stage('publish') {
            app.push 'master'
            app.push "${commit_id}"
        }
    }
}
When I tried to run this from my Jenkins server, I get the following error:
java.io.FileNotFoundException
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:167)
at jenkins.plugins.git.GitSCMFile$3.invoke(GitSCMFile.java:159)
at jenkins.plugins.git.GitSCMFileSystem$3.invoke(GitSCMFileSystem.java:161)
at org.jenkinsci.plugins.gitclient.AbstractGitAPIImpl.withRepository(AbstractGitAPIImpl.java:29)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.withRepository(CliGitAPIImpl.java:65)
at jenkins.plugins.git.GitSCMFileSystem.invoke(GitSCMFileSystem.java:157)
at jenkins.plugins.git.GitSCMFile.content(GitSCMFile.java:159)
at jenkins.scm.api.SCMFile.contentAsString(SCMFile.java:338)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:101)
at org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition.create(CpsScmFlowDefinition.java:59)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:232)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:404)
Finished: FAILURE
Since this is running inside a VM on Azure, I thought the VM was not able to reach outside, but that seems not to be the case as I was able to ssh into the VM and git pull from the Git repo. So what is the problem here? How could I make this work?
For me, unchecking "Lightweight checkout" fixed the issue.
I experienced the exact same error. My setup:
Pipeline build inside a dockerized Jenkins (version 2.32.3)
In the configuration of the job, I specified a check out into a subdirectory: Open the configuration, e.g. https://myJenkins/job/my-job/configure. At the bottom, see section Pipeline -> Additional Behaviours -> Check out into a sub-directory with Local subdirectory for repo set to, e.g., my-sub-dir.
Expectation: Upon check out, the Jenkinsfile ends up in my-sub-dir/Jenkinsfile.
Via the option Script path, you configure the location of the Jenkinsfile so that Jenkins can start the build. I put my-sub-dir/Jenkinsfile as value.
I then received the exception you pasted in your question. I fixed it by setting Script Path to just Jenkinsfile. Even if you don't specify a checkout sub-directory, it is still worth double-checking the Script Path value.
Note: I have another Jenkins instance at work. There I have to specify Script Path including the custom check out sub-directory (as mentioned in Expectation above).
Go to Job --> Configure --> Pipeline and uncheck the "Lightweight checkout" checkbox.
Lightweight checkout: if selected, try to obtain the Pipeline script contents directly from the SCM without performing a full checkout. The advantage of this mode is its efficiency; however, you will not get any changelogs or polling based on the SCM. (If you use checkout scm during the build, this will populate the changelog and initialize polling.) Also, build parameters will not be substituted into SCM configuration in this mode. Only selected SCM plugins support this mode.

Jenkins GitHub pull request builder - get branch name for execute shell

I am using the Jenkins GitHub pull request builder plugin to run my unit tests when a pull request is made via a webhook. For the build step, I need to know the name of the branch that is being merged in (e.g. I need the develop branch if merging it into the master branch). Is there a way to access this in the Jenkins execute shell? Thanks.
Your link has the answer:
The plugin makes some very useful environment variables available.
ghprbActualCommit
ghprbActualCommitAuthor
ghprbActualCommitAuthorEmail
ghprbPullDescription
ghprbPullId
ghprbPullLink
ghprbPullTitle
ghprbSourceBranch
ghprbTargetBranch
sha1
You'll want to use $ghprbSourceBranch to get the name of the source branch anywhere in your build steps.
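For example, in an Execute shell build step (a sketch; the variables are the ones listed above):
# ghprb injects these variables into the build environment
echo "Building pull request from ${ghprbSourceBranch} into ${ghprbTargetBranch}"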
In the "Branch Specifier", enter ${sha1}.
