In my system, I download a new build every day into one folder and then use it for further work, but after the Jenkins job runs I want to delete files in that folder (not the workspace), i.e. remove specific folders from the same directory. This will let me download a fresh build every time, based on the different Jenkins jobs running on the same machine.
E.g.:
I download build x.x and run a Jenkins job on the machine. If I then want to run another job which requires build x.y, it only checks whether SOME build is already present in the folder; if one is there, it will not download any kit. So the simplest thing I can do is delete x.x after every Jenkins run (post build) so that x.y is downloaded the next time.
Please help.
Thanks in advance
You can delete a whole folder with the following syntax; this example deletes a folder called bin:
stage('Setup') {
    steps {
        dir('bin') {
            deleteDir()
        }
    }
}
If my understanding is right, consider the assumptions below.
If your Jenkins is running on a Unix server, you can configure a post-build step as suggested by nerdwaller above:
In the job configuration, in the build step, select the option "Execute shell".
In the box for the shell script, you can use rm -rf <<directoryname>>
Otherwise, if your Jenkins is running on a Windows server, select "Execute Windows batch command" in the build step and give the appropriate command, e.g. rmdir /Q /S nonemptydir
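A minimal sketch of such a Windows post-build "Execute Windows batch command" step could look like the following (C:\builds\downloads is only a placeholder; replace it with your actual build-download location):
REM Post-build cleanup: remove the downloaded build so the next run fetches a fresh one.
REM C:\builds\downloads is a placeholder -- use your real download folder.
if exist "C:\builds\downloads" rmdir /S /Q "C:\builds\downloads"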
However, my preferred approach would be to use a platform-independent tool like Ant to delete the folders using the Ant Delete Task; it can be configured much like the two approaches above, by selecting "Invoke Ant" in the build step / post-build step instead.
This will help you to achieve what you need.
New to Jenkins, so apologies in advance as I'm sure this answer is out there somewhere; I'm just not sure exactly how to search for what I'm after. I'm struggling a bit with the copy-back process in Jenkins.
When I build, I'm running some unit tests that create some log files which I want to be stored as part of the Jenkins build. I'm running on Windows 10 and everything is running on my laptop (I'm purely trying to learn Jenkins so this is fine for me).
So my test results will always appear in C:\TestLogs\*.log. I want the results copied to my build directory, which is URL http://localhost:8080/job/loadrunner_test/1/, absolute path C:\Program Files (x86)\Jenkins\jobs\loadrunner_test\builds\1.
I'm a bit confused about which plugin I should use in my post-build step. The Copy Artifact plugin looks as if it's meant to pass data between builds. For each build, I just want to copy C:\TestLogs\*.* to the current build directory so I can see the files when I click on the link for #1 in the Build History.
Many thanks!
Tim
You can copy it with an additional build step.
Select Execute Windows batch command for that step and add this line:
xcopy "C:\TestLogs" "C:\Program Files (x86)\Jenkins\jobs\jenkins_test\builds\%BUILD_NUMBER%" /s /e /i
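A variant of the same step that avoids hard-coding the Jenkins installation path is to use the environment variables Jenkins exposes to every build step (a sketch; it assumes the default jobs\<name>\builds\<number> layout on disk):
REM Copy the test logs into the directory of the build that is currently running.
REM JENKINS_HOME, JOB_NAME and BUILD_NUMBER are provided by Jenkins.
xcopy "C:\TestLogs" "%JENKINS_HOME%\jobs\%JOB_NAME%\builds\%BUILD_NUMBER%" /s /e /i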
You can also check your test configuration to see whether you can set the output path location there instead.
I have a Jenkins job that runs NUnit tests on a remote machine.
I am using Jenkins' Workspace Cleanup Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Workspace+Cleanup+Plugin) to clean my workspace.
The problem is that I want to kill a process on my machine (otherwise I cannot delete the workspace - some files will be in use and therefore cannot be deleted), and I want to do this before the delete action takes place (it is always the first action of the job).
I know there is an option in the plugin - "External Deletion Command" - but this runs the command on all the files in the workspace, whereas I need it to run only once (not on the specific workspace files), i.e. only this command: "c:/workspace/taskill nunit".
Is there a way to do so?
Thanks
If I can suggest a different approach: use an app called LockHunter, which has an API to unlock and delete your workspace. It's much more surgical than killing a random task and hoping it's the one you meant to kill.
You can trigger it from the command line using "run before SCM", and it will handle the unlocking and deletion of your specific workspace.
You can also use:
"cmd /c wmic /INTERACTIVE:OFF Path win32_process Where \\"CommandLine Like '%workspace%'\\" call terminate"
Where %workspace% is your current workspace path. This goes over all the tasks that are currently running, checks their command-line path, and calls terminate on anything it finds.
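If killing one known process before the cleanup is enough for your case, a plainer alternative is a taskkill in a batch step that runs before the workspace deletion (a sketch; nunit-console.exe is an assumed process name, use whatever actually locks your files):
REM Kill any NUnit runner that may still hold files inside the workspace.
REM The process name is an assumption -- adjust it to the process that locks your files.
taskkill /F /IM nunit-console.exe
REM taskkill exits with a non-zero code when no such process exists; reset it so the step does not fail the build.
exit /b 0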
Good luck!
Is it possible to configure Jenkins to get source code into a subdirectory of %WORKSPACE%? Right now the source gets pulled into %WORKSPACE%, and for the build output I explicitly specify a directory outside of %WORKSPACE%.
Ideally I would like to have something similar to this:
%WORKSPACE%\source for source code and %WORKSPACE%\artifacts for build outputs. Is it possible to have this configuration?
Create an 'Execute Windows batch command' build step and use xcopy; this presumes Jenkins is running on a Windows machine. If it's a deployment directory, make it a post-build step instead.
cd c:\
xcopy /Y /E /I "c:\program files 86\junkies\workspace\app" "c:\path to new directory"
This is just a guess at your directories; replace them with the correct ones. The /Y forces files to be overwritten every time they are copied, /E copies all subdirectories, and /I treats the destination as a directory.
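If the goal is the %WORKSPACE%\source and %WORKSPACE%\artifacts layout from the question, the same idea works with Jenkins' %WORKSPACE% variable instead of absolute paths (a sketch; the app folder name is only an assumption about what your build produces):
REM Copy the build output into an artifacts folder inside the workspace.
REM "app" is a placeholder for your actual build output folder.
xcopy /Y /E /I "%WORKSPACE%\app" "%WORKSPACE%\artifacts\app"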
I have been tasked with looking into using Jenkins as a build server. So far I have managed to pull a project from Git, restore the NuGet packages, build the project and run the unit tests. However, I am struggling to work out how to generate the artifact.
The business would like the build server to generate a zip file into a directory on the build server, or on a remote server, which the systems team can then pick up and deploy to the relevant location. E.g. for a Windows service project, the built bin directory would be zipped up and put in the relevant artifact directory.
I thought that in order to do this I could add an "Archive the artifacts" post-build action. However, I am getting the error below:
‘Watchdog.WinService.Monitor/bin/Release/*.zip’ doesn’t match anything:
‘Watchdog.WinService.Monitor’ exists but not
‘Watchdog.WinService.Monitor/bin/Release/*.zip’
If I look in the workspace for this project I can browse to the bin directory and see all the files, so I am unsure what I have done wrong.
Can someone please let me know if what I am trying to accomplish is possible, and also if our approach to using Jenkins is correct?
The problem is that you are trying to create the artifact using the archive artifacts step.
But that step only collects already-existing artifacts and shows them on the job page.
That means you need to create the artifact first, e.g. using a shell or batch script (see the sketch at the end of this answer).
You can combine this with the Flexible Publish Plugin.
When you select it as a post-build step, you can create a conditional action that runs the artifact archiving task and, as its condition, executes the script that creates the zip file.
So if the script fails, the archiving task won't be executed. It may also cause your job to 'fail', but that may not matter in your case.
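For example, the zip could be created in an "Execute Windows batch command" build step using PowerShell's Compress-Archive (a sketch; it assumes PowerShell 5 or later on the build machine and reuses the paths from the error message in the question):
REM Remove any zip left over from a previous build so it is not swept into the new archive.
if exist "Watchdog.WinService.Monitor\bin\Release\Watchdog.WinService.Monitor.zip" del "Watchdog.WinService.Monitor\bin\Release\Watchdog.WinService.Monitor.zip"
REM Create the artifact first, so the "Archive the artifacts" post-build action has something to match.
powershell -NoProfile -Command "Compress-Archive -Path 'Watchdog.WinService.Monitor\bin\Release\*' -DestinationPath 'Watchdog.WinService.Monitor\bin\Release\Watchdog.WinService.Monitor.zip'"
With that in place, the pattern Watchdog.WinService.Monitor/bin/Release/*.zip from the post-build action will match the generated file.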
I'm looking at a Jenkins job and trying to understand it.
I have an Execute shell command box in my Build section:
mkdir mydir
cd mydir

svn export --force https://example.com/repo/mydir .
When Jenkins is done executing that command, and moves on to the next build step, what is its working directory?
workspace-root/ or workspace-root/mydir?
As the next step, I have Invoke top-level Maven targets (still in the Build section).
What I really want to know is: why does that execute successfully?
Is it because Jenkins automatically moves back to the workspace-root/ folder after executing a shell command box, or is it because the next job is a "top-level" job, and Jenkins therefore changes back to the workspace-root/?
Each build step is a separate process that Jenkins spawns off. They don't share anything: neither the current directory, nor environment variables set or changed within the build step. Each new build step starts by spawning a new process off the parent process (the one running Jenkins).
It's not that Jenkins "moves back" to $WORKSPACE; it's that Jenkins discards the previous session.
I lately saw that if you print the CWD, you get the workspace path ending in the project name, e.g.:
D:\jenkins\workspace\My_Project
Any script you might be running from somewhere else won't be found. Hence we can do a "cd <path>" before we start our scripts.
Slav's explanation is very good, and I thought I would complement it with a real-world example that shows what multiple Windows batch commands look like when they need to work in the same directory:
Command 1
REM #ensures that all npm packages are downloaded
cd "%WORKSPACE%"
npm install
Command 2
REM #performs a prod-mode build of the project
cd "%WORKSPACE%"
ng build --prod --aot=true --environment=pp
So each one ensures that the current working directory points to the current project directory.