Can I change the working directory during a Jenkins job for all successive steps?
My job's first step checks out a git project. Unfortunately this project is a mix of technologies; it's not a Java/Maven project (so the 'mvn -f subdir/pom.xml' trick doesn't apply), nor is it a Gradle project. So I'd like to change into a subdirectory of the checked-out project and run the remaining Jenkins steps from there: invoking shell scripts, running tox to test Python code, running docker to build images, and so on.
Maybe Jenkins wants every step to begin in $WORKSPACE, and allowing a directory change during the job would break some vital assumptions?
I know this has been asked before. Similar questions, but with answers specific to Maven:
Jenkins Maven Build -> Change Directory and
Jenkins: How To Build multiple top-level projects from a git repository?
A similar question, but with an answer specific to Gradle:
Change directory during a build job on jenkins
You can split this into separate jobs based on the sub-folders and use a sparse-checkout filter in your SCM configuration, so that only the sub-folder you want for a given job gets cloned into its workspace. Then, as the first build step, use a batch/shell command to move everything from the sub-folder up to the workspace root, and run all the remaining steps from there.
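If the job were written as a scripted pipeline, the same idea might look like the rough sketch below (the sub-folder name subdir and the tox/docker commands are placeholders for whatever the job actually runs); the dir step at the end is Pipeline's built-in way of scoping steps to a sub-folder instead of moving files around:

node {
    checkout scm                 // a sparse checkout of only the sub-folder can be configured on the SCM
    sh 'cp -r subdir/. .'        // copy the sub-folder's contents up to the workspace root
    sh 'tox'                     // later steps now find tox.ini etc. at the workspace root

    // alternative: leave the checkout as-is and scope individual steps to the sub-folder
    dir('subdir') {
        sh 'docker build -t my-image .'
    }
}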
Related
Is there a way (like with the Jenkins Maven plugin) to set the .ivy2/cache dir local to the workspace of an sbt job in Jenkins? The motivation is to be able to perform a 'clean' build each time.
If not, is there some other way I can validate that all sbt dependencies are resolved from external repositories during the build, and not from the local cache?
For changing the default ivy home, take a look at this SO post.
One other possible solution is to add an Execute shell step prior to the Build using sbt step:
rm -rf ~/.ivy2/cache
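If the goal is a cache that lives and dies with the workspace rather than repeatedly wiping the global one, here is a rough sketch, written as pipeline steps and assuming the sbt launcher honours the sbt.ivy.home system property:

node {
    // keep the ivy cache inside the workspace, so wiping the workspace gives a clean resolve
    sh 'rm -rf "$WORKSPACE/.ivy2"'
    sh 'sbt -Dsbt.ivy.home="$WORKSPACE/.ivy2" clean test'
}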
I'm creating a Jenkins pipeline. We have a bash script which needs to be executed but which is not in the repository itself.
How should I execute this script? I have tried:
configFileProvider: using configFileProvider to get the script into a variable. The execution did not work, and I suspect this is not the way to do it anyway; it is meant for config files rather than real scripts.
A shared library which contains resources/. From there I loaded the script, wrote its content to a file and executed it with sh. This worked, but it leaves the script in my workspace, which I'd rather avoid: I want to run commands against my workspace, but when I perform git commits etc. I don't want the extra file sitting there, if that's possible.
What is the right way to execute a managed script (from Managed Scripts or from git) in my Jenkins pipeline?
Without pipelines I use Managed Scripts. When I execute one I see in the logs:
executing script 'test-xxx.sh'
[test-xxx] $ /bin/bash /tmp/build_step_template3284004xxx.sh param1 param2 param3
This is the ideal solution I want to replicate in Jenkins pipelines: a script which is NOT in my workspace, but which is executed against my workspace from a temporary location.
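For context, a configFileProvider call in a pipeline normally takes roughly this shape (the fileId is just a placeholder for the managed file's id):

node {
    // the Config File Provider step copies the managed file to a temporary location
    // and exposes that path through the chosen environment variable
    configFileProvider([configFile(fileId: 'test-xxx-id', variable: 'MANAGED_SCRIPT')]) {
        sh 'bash "$MANAGED_SCRIPT" param1 param2 param3'
    }
}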
I used to use the Scriptler plugin; however, security issues were raised against it.
When I moved all my stuff from freestyle builds to pipelines (expressed in Groovy) in 2017/8, I migrated that functionality to Shared Libraries (sketched after the list below; as an aside, they are also the right place to put code that runs against the Jenkins object model).
All paths in a Jenkins job (freestyle or pipeline) are relative to the job's workspace ($WORKSPACE). Jenkins does not natively like you going above this directory on the filesystem.
I would highly recommend that everything your build process needs lives in the primary Jenkins workspace (preferably via a git checkout or a shared library); versioning everything in git will always be your friend. If that is not what you want to do, other options could be:
Shared Workspace
This post
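To make the shared-library route concrete, here is a rough sketch (the library name, resource path and parameters are placeholders); the script only touches the workspace for the duration of the step and is deleted again afterwards:

@Library('my-shared-library') _

node {
    // read the script's text out of the library's resources/ directory
    def body = libraryResource 'scripts/test-xxx.sh'
    writeFile file: '.test-xxx.sh', text: body
    try {
        sh 'bash .test-xxx.sh param1 param2 param3'
    } finally {
        sh 'rm -f .test-xxx.sh'   // leave nothing behind for later git operations to trip over
    }
}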
I am new to Jenkins 2 and the pipeline feature, and I am setting up a project to use a Jenkinsfile for its pipeline.
I can see there are 3 workspaces created:
project-xxxxx
project-xxxxx#script
project-xxxxx#tmp
When I run tox in the pipeline, it complains that no tox.ini is found. I suspect it is running inside project-xxxxx, which is empty, while the project files are inside project-xxxxx#script.
Should I use checkout scm to populate the workspace with the project files? Or am I supposed to use the project files in project-xxxxx#script, and if so, how do I do that properly?
Can someone please explain to me how those 3 folders work together?
You should not have to worry about the workspace in a pipeline. You start the build, you get a workspace, and anything you check out or copy into it should be there.
How do you start the pipeline? Inline script, from SCM, or via a multibranch or similar job type?
How are you getting files into your workspace?
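For what it's worth: when the pipeline script is loaded from SCM (which is what the project-xxxxx#script folder suggests), the build workspace itself starts out empty, and an explicit checkout scm at the top of a scripted pipeline is the usual way to populate it. A minimal sketch:

node {
    checkout scm    // clones the repository into project-xxxxx, the workspace the build steps actually use
    sh 'tox'        // tox.ini is now present in the directory tox runs from
}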
I'm running a Jenkins job on a slave and I want to store the generated artifacts on the server. Since the job runs on the slave, the artifacts are also created there.
I tried using Post-build Actions -> Archive the artifacts, but it throws the following build error:
ERROR: No artifacts found that match the file pattern "**/*.gz". Configuration error?
ERROR: '**/*.gz' doesn't match anything: '**' exists but not '**/*.gz'
Any help in this regard is highly appreciated.
Sounds like the Copy To Slave Plugin is what you need.
It can copy files to the slave (before the build) and from the slave (after the build).
Copy files back to master node:
To activate this plugin for a given job, simply check the Copy files back to the job's workspace on the master node checkbox in the Post-build Actions section of the job. You then get the same two fields as in the Copy files to slave node before building section.
If you want to copy artifacts from JobA to the workspace of some other job, you can do it using the Copy Artifact Plugin, which is very simple to use.
If you just want to archive the artifacts that JobA itself produces, then you are already heading in the right direction and need to check what you are missing... are you sure the artifacts are actually in the current workspace?
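One quick way to check is to list the matching files on the slave just before archiving; sketched below as pipeline steps (in a freestyle job the same find command can go into an Execute shell build step). Archived artifacts are copied back to the master automatically, so archiving is normally all that is needed to store them on the server:

node {
    sh 'find . -name "*.gz"'                 // confirm the .gz files really exist somewhere under the workspace
    archiveArtifacts artifacts: '**/*.gz'    // archived artifacts are copied back to the master automatically
}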
Doron
I have created a Jenkins job which used to run on the master, and I have the build.xml file on the master.
Now I have added a slave node and enabled the Restrict where this project can be run setting so that the job always runs on a particular slave.
Now my build jobs are failing and I can see:
[EnvInject] - Loading node environment variables.
Building remotely on demo_slave_inst2 (slave1) in workspace /root/slave/workspace/demo_job
FATAL: Unable to find build script at /root/slave/workspace/demo_job/autobvt.xml
Build step 'Invoke Ant' marked build as failure
Recording test results
Finished: FAILURE
The autobvt.xml file already exists on the master, so it looks like I need to copy it over to the slave node manually, which is not exactly handy.
How can I instruct Jenkins to copy it over as part of the build?
Use "Copy data to Workspace" http://wiki.jenkins-ci.org/display/JENKINS/Copy+Data+To+Workspace+Plugin using which you can copy the files from master to slave and run them as a part of build process (No manual effort needed!)
I sorted the issue using the Copy to Slave plugin.
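As a side note, if the job is ever moved to a pipeline, a plugin-free alternative (not what either answer above uses, just a sketch) is to stash the file on the master and unstash it on the slave; the path, the 'master' label and the ant command below are placeholders:

node('master') {
    dir('/path/to/build/files') {            // wherever autobvt.xml lives on the master (placeholder path)
        stash name: 'buildfile', includes: 'autobvt.xml'
    }
}
node('slave1') {
    unstash 'buildfile'                      // autobvt.xml is now in the slave's workspace
    sh 'ant -f autobvt.xml'                  // placeholder for the actual Ant invocation
}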