How to automatically clean up workspace directories of deleted Jenkins Jobs? - jenkins

I have a Jenkins instance with hundreds of jobs. Recently I noticed that the storage space on the server was almost full, and the main reason was that the JENKINS_HOME/workspace/ directory contained many workspace folders of deleted jobs (freestyle projects).
Even if I manually delete all redundant folders, they will start accumulating again.
I cannot use the Workspace Cleanup Plugin to delete the workspace after each build of each project, or just periodically delete the whole 'workspace' directory, because for many projects the workspace overview in Jenkins is needed.
I could write a bash script that compares JENKINS_HOME/workspace/ and JENKINS_HOME/jobs/, but that feels a bit like hardcoding.
Is there a Plugin that can periodically delete leftover workspaces?
Or maybe I can configure Jenkins to also delete the workspace when a project is deleted? (All I could find is an open bug ticket that was last updated in 2018.)

How about creating a separate Jenkins job that runs periodically and does the cleanup? You can write some simple Groovy code to get the list of configured jobs, compare it to what you have in the workspace directory, and delete the leftovers.
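For example, something along these lines could run from the Script Console or a scheduled system Groovy job (a minimal sketch, assuming the default JENKINS_HOME/workspace/<job name> layout for top-level jobs; the delete is commented out so you can dry-run it first):

    // Minimal cleanup sketch: list workspace directories that no longer
    // correspond to an existing job. Assumes the default on-disk layout.
    import jenkins.model.Jenkins
    import hudson.FilePath
    import hudson.model.Job

    def jenkins = Jenkins.get()
    def jobNames = jenkins.getAllItems(Job).collect { it.fullName } as Set

    def workspaceRoot = new FilePath(new File(jenkins.rootDir, 'workspace'))
    workspaceRoot.listDirectories().each { dir ->
        // strip the "@2" suffix Jenkins appends for concurrent builds
        def jobName = dir.name.split('@')[0]
        if (!jobNames.contains(jobName)) {
            println "Orphaned workspace: ${dir.remote}"
            // dir.deleteRecursive()   // uncomment once the dry-run output looks right
        }
    }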

Related

Jenkins Deleting Workspace

I have Jenkins pipeline projects, and everything works fine as long as I run the project at least once per month. If I wait more than a month, Jenkins will delete the workspace for that pipeline project, causing the project to do a brand new git checkout and compile. This results in a super slow build, since all of the intermediate object files etc. are regenerated from scratch.
I cannot find what setting in Jenkins is causing it to clean up these older workspaces. If I modify the pipeline to check out to a custom directory instead of the workspace directory then it works fine, so it doesn't appear to be the git plugin itself, or anything like that.
'Discard old builds' is disabled in the General settings for these projects.
Can someone point me to the setting that is causing 'older' workspaces to get cleaned up for some reason?

Jenkins workspace root getting overwritten or not accepted

I am implementing Jenkins into an already established Perforce workflow.
Each of the workspaces we have in Perforce (and there are a lot of them) uses a drive letter (for example D:\) as the root directory of the workspace.
I am using the P4 Plugin in Jenkins to sync the code before running the actual scripts, and Jenkins has its own workspace which is used every time I start to sync the code.
I tried using the Spec File workspace behaviour in the P4 Plugin, where I would specify the root to be D:\, but whenever it loads it still creates the Jenkins workspace root.
I also tried using the Static workspace behaviour, and that works, but the problem is that for that workflow someone needs to create a workspace manually on the Jenkins worker and then create the job, which defeats the purpose of using Jenkins in the first place. Plus we need a workspace per job.
Which made me think that if I use an already existing workspace with D:\ as the root, and use the Temp workspace behaviour in Jenkins, it would copy the root setting along with the other ones. But unfortunately it also sets the sync to go to the Jenkins workspace.
In short, all I want is to be able to use the D:\ drive to sync all the code, instead of putting it into the Jenkins root directory and syncing the code to the project folders inside (ex. C:\JenkinsData\syncProject...).
That's the design of the p4 plugin: it puts the workspace where Jenkins asks it to.
See property jenkins.model.Jenkins.workspacesDir here: https://wiki.jenkins.io/display/JENKINS/Features+controlled+by+system+properties
I don't think the default in that wiki is correct.
On your master and all your slaves, you can try changing that to just D:\
That assumes your client view definitions (right hand side) will not overlap.
Otherwise:
A "form-in client" trigger script can alter the root. The script should only change jenkins relevant clients, so you'll need to pass something to the script in the trigger definition to signify that it is for a jenkins job. Examples could be a client naming convention and/or the clientip.
Your Perforce Admin, if that's not you, will have to assist.
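If it helps, here is a quick way to sanity-check what the current workspace root template is before and after changing the property (a Script Console sketch; the property itself has to be passed as a -D option when the Jenkins JVM starts, as suggested above):

    // Script Console sketch: inspect the current workspace root template.
    // The override is a JVM startup option, e.g.
    //   java -Djenkins.model.Jenkins.workspacesDir=D:\ -jar jenkins.war
    import jenkins.model.Jenkins

    println Jenkins.get().rawWorkspaceDir   // the template job workspaces are derived from
    println(System.getProperty('jenkins.model.Jenkins.workspacesDir') ?: '(property not set; built-in default in use)')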

Job configurations are missing when copied from folders in Jenkins

This is reproducible 100%.
We are working on different branches of the release, but each branch should run the same jobs, with some minor changes. So ideally I want to copy all the jobs from one working branch to a new branch.
I select New Item -> Folder and select 'copy from' another folder.
The new folder contains all the jobs from the source folder, but all the job configurations are missing. In other words, I have jobs created with just their names, and I need to refill everything else. This is essentially useless.
I googled and did not see any related errors. Does anyone have any good advice on copying Jenkins folders? I am on Jenkins 1.651.3, Ubuntu 14.04.
I tried the same on Jenkins 2.19.1 and it worked without the issue you are seeing.
The best way to create a similar set of jobs for new branches is via Groovy, using https://jenkinsci.github.io/job-dsl-plugin/
Create a job that executes a Groovy script to iterate over a list of branches and create the jobs, as in the sketch below.
The DSL plugin is available for Jenkins 1.642 and above.
Note that manipulating content in JENKINS_HOME is not advised and is typically restricted.
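A minimal Job DSL seed-script sketch of that approach (the branch names, repository URL, and build command below are placeholders, not your actual setup):

    // Job DSL seed script sketch: one freestyle job per release branch.
    def branches = ['release-1.0', 'release-1.1', 'release-2.0']

    branches.each { branch ->
        job("myproduct-${branch}-build") {
            scm {
                git('https://example.com/myproduct.git', branch)
            }
            triggers {
                scm('H/15 * * * *')   // poll SCM roughly every 15 minutes
            }
            steps {
                shell('./build.sh')   // whatever the branch jobs have in common
            }
        }
    }

Re-running the seed job regenerates the per-branch jobs, so the differences between branches live in one place.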
I should also mention that it turned out to be just a problem with our machine: lack of RAM. After we added more RAM, it's all working perfectly!

Jenkins - Deleting artifacts automatically

I am noticing that every time I run one of my jobs in Jenkins, two files are created in the /workspace/build/distributions dir, with the extensions .tar and .tgz. Every time I run the job, another set of these files is created. So, if I run the job 3 times, there will be 6 files altogether. I have noticed that during the dependency check phase these artifacts slow things down, so I wanted to remove them automatically before each run of this job. I have attempted the configs in the image below. In addition, I have tried the Workspace Cleanup Plugin, and that completely deleted the workspace, which is definitely not what I wanted.
So what would be the best way to go about this?
What scm plugin are you using? Some of the scm plugins allow you to clean the workspace before an update (e.g. SVN's "Emulate clean checkout" and Git's "Clean before checkout" options).
If you're not using a scm plugin, can you remove the files in a batch/shell script during the first build step?
Or perhaps you can go about it from the reverse direction. Can you get rid of the files as the last build step of the job? That way, they are gone when the next build comes along.
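If you would rather not do it in shell, a system Groovy build step (assuming the Groovy plugin is installed), placed before the main build steps, could do the same thing; a rough sketch, with the path taken from the question:

    // System Groovy build step sketch: remove stale archives before the build.
    // 'build' and 'listener' are bound by the Groovy plugin; FilePath also works on agents.
    def dist = build.workspace.child('build/distributions')
    if (dist.exists()) {
        def stale = dist.list().findAll { it.name.endsWith('.tar') || it.name.endsWith('.tgz') }
        stale.each { f ->
            listener.logger.println("Removing stale artifact: ${f.remote}")
            f.delete()
        }
    }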

Renaming a Jenkins Job

I have renamed a Jenkins job from the Jenkins GUI: I changed the Project name on the Configure page and hit Save.
However, the workspace for this job has not been renamed. What I am finding is that on the next job execution a new workspace is created with the new name, and none of the contents of the old workspace are copied over.
So the issue is that the contents of the old workspace are not copied to the new workspace.
What should I do instead?
I know there are several questions in SO in this area. However those do not answer my question.
Renaming job in jenkins/hudson
Rename a job in Jenkins
So please check this before marking this question as a duplicate.
I was able to work around this issue using the Use custom workspace option.
To change this location, configure the job and click the Advanced button in the Advanced Project Options section.
After opening the settings, you will find some more configuration options for your job. Look for the Use custom workspace option on the right hand side and check the box.
Reference: Jenkins: Change Workspaces and Build Directory Locations
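For reference, the same setting expressed in Job DSL would look roughly like this (a sketch; the job name and workspace path are placeholders for your old workspace location):

    // Hypothetical Job DSL equivalent of the GUI 'Use custom workspace' checkbox,
    // pinning the renamed job to its old workspace so the existing contents are reused.
    job('my-renamed-job') {
        customWorkspace('/var/lib/jenkins/workspace/my-old-job-name')  // adjust to your old path
        steps {
            shell('./build.sh')   // placeholder build step
        }
    }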
Workspaces are volatile by nature and may reside on a build node which has gone offline, so your build job should not rely on files being present in the workspace. However, sometimes you will benefit from a speed-up by reusing unchanged files that already exist in the workspace and decide not to clean them.
When you start a build, a new workspace is (as you noted) created. This is the correct behaviour: you should not need to store files in your workspace between builds, but should set up your system to load all sources from your VCS. This way you will always be able to make a fresh build from source; there are also a few options available to clear old files from the workspace.
If you do not want to populate the workspace from a source code plugin, you can always use the custom shell script feature to run a few shell commands to copy the needed files.
