Though I have seen many suggestions to use the Workspace Cleanup plugin and a corresponding script in the post-build actions of the pipeline, here we are using a multibranch pipeline. I'm facing challenges maintaining storage space, as it runs out on every build.
My management does not recommend deleting the whole workspace folder; instead, we should pick specific files in the workspace and delete those.
Using the command du -sh ./*/, I can see the workspace takes 2.5 GB, but when I cd into the workspace and run ls -lart, the listed files add up to only about 1 GB (not 2.5 GB).
Can someone help me figure out what I have missed, and help me understand how to reclaim the space in Jenkins?
Related
I checked "Discard old builds", limited the max number of builds to keep in Jenkins, and deleted the old workspace. However, when I run du -sh /var/lib/*, the Jenkins directory shows 21G. When I go under it and list the files, I see just 236K. Please advise.
In addition to being a duplicate question (see link), I hope you are aware that ls -Sh only lists the space occupied by files, and for a directory only the size of its inode. You must run du -hs to get the actual size of a directory.
The space could be taken by .m2 (the Maven local repo) or .subversion (SVN data, in each directory); also check the obvious Jenkins dirs: logs, jobs, and workspaces.
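As a concrete illustration of the ls vs. du difference: du walks into subdirectories (including hidden ones such as .git, which ls -l shows only as small 4K entries), and that is usually where the "missing" space lives. Below is a minimal scripted-pipeline sketch that first reports the largest entries and then deletes only selected files, matching the "don't wipe the whole workspace" constraint. The cleanup patterns are assumptions; adjust them to whatever your builds actually produce.
node {
    // Show the 20 largest entries in the workspace, hidden dirs included.
    // This usually explains why du reports more than ls -lart suggests.
    sh 'du -ah . | sort -rh | head -n 20'

    // Selective cleanup via the Workspace Cleanup plugin's cleanWs step:
    // delete only matching build outputs, keep the checkout itself.
    // The patterns below are placeholders, not a recommendation.
    cleanWs(deleteDirs: true,
            patterns: [[pattern: 'build/**', type: 'INCLUDE'],
                       [pattern: '**/*.log', type: 'INCLUDE']])
}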
I am new to Jenkins. I am trying to deploy PHP code to a production server via the "Publish over SSH" plugin, which I enabled in "post-build actions". Everything works, but the transfer is far too slow (2 hours for 40 MB).
Here is the scenario:
The entire project is set up locally; the total size is nearly 700 MB.
All code is pushed to Bitbucket.
I configured a build in Jenkins with "Send build artifacts over SSH" as a post-build option, and inside the transfer set I added **/*.* for the source files option.
It is taking hours and hours to transfer the entire project. Within 2 hours it transferred only 140 MB.
Is this normal? Do I need to change any settings? The network connection between the server where Jenkins runs and the production server is fine.
"rsync over ssh " solved the problem of code transfer to production server.Now its taking only 2-3 seconds for a build.
Yes, 2 hours for 40MB is unexpectedly slow. However, it is not uncommon to get abysmally slow archiving of artifacts from agent to master.
Below are links to the 2 open tickets I know of tracking this. There are many others reported that have been closed over the last decade, but in my environment, I'm getting ~13Mbps transfers, despite the 10Gbps links we have between all nodes in our Jenkins cluster.
JENKINS-18276
JENKINS-63517
This isn't directly related to your question of sending artifacts via SSH, but it may help others who come upon this while trying to reduce the time it takes to archive artifacts in general. I used the Pipeline Utility Steps plugin to zip everything I wanted to archive into a single zip file, then unzipped it when I wanted to un-archive it later.
That reduced the time by about 50x for me, presumably because it was just 1 file to transfer/archive/unarchive instead of a bunch of small files, and zipping/unzipping took less than 1 second in my case, much less time than archiving the individual files separately.
zip zipFile: 'artifacts.zip'
archiveArtifacts 'artifacts.zip'
then
copyArtifacts(filter: '**/*', projectName: currentBuild.projectName, selector: specific(params.archived_build_number))
unzip zipFile: 'artifacts.zip', quiet: true
The zip function also has a parameter to archive the zip, but I showed it separately here to be explicit. Again, nothing about this is specific to your SSH scenario, but I thought it would be useful to post it anyway.
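For completeness, the zip step's archive parameter combines the two calls above into one; the glob here is just an example pattern.
zip zipFile: 'artifacts.zip', glob: 'build/**', archive: true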
I wonder if it is possible to remove only one build (including its artifacts) from a job's workspace.
I tried "Delete Build" in the Build History, but all it does is remove the build reference from the Build History table. I know I can SSH into the server and delete the files from the command line, but I am looking for a way to do it from the Jenkins web interface.
After installing the Workspace Cleanup Plugin I am able to wipe out the current workspace, but I want to keep my other builds in the workspace.
In your Jenkins instance, to get a folder per build, set the "Use custom workspace" flag in your job's settings. Here is a brief excerpt from the setting's description:
For each job on Jenkins, Jenkins allocates a unique "workspace directory."
This is the directory where the code is checked out and builds happen.
Normally you should let Jenkins allocate and clean up workspace directories,
but in several situations this is problematic, and in such case, this option
lets you specify the workspace location manually.
One such situation is where paths are hard-coded and the code needs to be
built on a specific location. While there's no doubt that such a build is
not ideal, this option allows you to get going in such a situation.
...
And your custom directory path would look like this:
workspace\$JOB_NAME\$BUILD_NUMBER ~> workspace\my-job-name\123
where $JOB_NAME will be "my-job-name" and $BUILD_NUMBER is the build number, e.g. "123".
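For pipeline jobs, a roughly equivalent declarative sketch is below; the agent label is a placeholder, and env.JOB_NAME/env.BUILD_NUMBER mirror the variables used above.
pipeline {
    agent {
        node {
            label 'linux' // placeholder agent label
            // One directory per build, mirroring the freestyle setting above.
            customWorkspace "workspace/${env.JOB_NAME}/${env.BUILD_NUMBER}"
        }
    }
    stages {
        stage('Build') {
            steps {
                echo "Building in ${env.WORKSPACE}"
            }
        }
    }
}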
There is one nasty problem with this approach, and this is why I wouldn't recommend using it: Jenkins will not be able to reclaim disk space for outdated builds. You would have to handle cleanup of outdated builds manually, and it is a lot of hassle.
An alternative approach that gives you more control and tools, and that keeps disk space usage under control (without your supervision), is to use the default workspace settings and archive your build output (files, original source code, libraries, etc.) as a post-build action. Very handy, and it gives you access to a whole bunch of great tools, like the Copy Artifact Plugin or the ArtifactDeployer Plugin, in other jobs.
Hope that info helps you make a decision that fits your needs best.
I also use "General/Advanced/Use custom workspace" (as in #pabloduo's answer) on a Windows machine with something like:
C:\${JOB_NAME}\${BUILD_NUMBER}
Just wanted to add a solution for getting rid of the build jobs' workspaces.
I use the Groovy Events Listener Plugin for this.
Using the plugin's standard configuration, I just use the following Groovy script:
if (event == Event.JOB_DELETED) {
    new File(env.WORKSPACE).deleteDir()
}
And now the custom workspace is deleted when the build job is deleted.
Just be aware that this would also delete non-custom workspaces (because the event is triggered for all jobs on your Jenkins server).
We have a Jenkins job where HTML reports are generated and placed in workspace/reports. We have been using the "Workspace Cleanup Plugin" so that we keep only the last report. We would like to keep the reports created during the past X days and delete the older ones. Is there a way, or a plugin, to automate this?
It's not a good idea to put something into the job's own workspace as an archive that needs to be accessed later. Someone may eventually clean the workspace (either manually or via a Jenkins pre- or post-build step) and all your precious data and statistics will be gone.
I'd suggest publishing reports to a separate directory and then cleaning that up. You could also define a Jenkins global parameter (via a plugin) such as REPORT_ROOT=/home/${USER}/reports and use it in the job config to save HTML reports to ${REPORT_ROOT}/${JOB_NAME}.
To clean up, you'd run find ${REPORT_ROOT} -type f -mtime +2 -delete, provided there is no parent directory created for each separate report. This can run in a separate job, as a subproject of the job that publishes the HTML report, or in the job itself. -mtime +2 means older than 2 days.
If each build has its own directory for HTML reports, then the cleanup would be find ${REPORT_ROOT}/${JOB_NAME} -mindepth 1 -maxdepth 1 -type d -mtime +2 -exec rm -rf {} + (note that find's -delete only removes empty directories, so -exec rm -rf is needed here).
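If the job itself is a pipeline, the same cleanup can simply run as a post step. A minimal sketch, assuming REPORT_ROOT is exposed as an environment variable by the global-parameter plugin mentioned above, and reusing the 2-day retention from the find example:
pipeline {
    agent any
    stages {
        stage('Publish report') {
            steps {
                // Copy the generated HTML out of the workspace so later
                // workspace cleanups cannot touch it.
                sh 'mkdir -p "${REPORT_ROOT}/${JOB_NAME}" && cp -r reports/. "${REPORT_ROOT}/${JOB_NAME}/"'
            }
        }
    }
    post {
        always {
            // Same retention rule as above: drop report files older than 2 days.
            sh 'find "${REPORT_ROOT}/${JOB_NAME}" -type f -mtime +2 -delete'
        }
    }
}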
Use the PeriodicBackup plugin.
It helps you back up your data periodically, and its BackupExecutor checks the existing backups in each location and deletes backups older than the X days defined in the configuration.
There is a Configuration Slicing Plugin that could help you.
I wonder whether Jenkins has features to capture results/data on a node and persist them on the master.
I have a scenario where I need to check some folders on two machines to see whether they have the same number of files and the same sizes.
If Hudson can save a result like ls -ltR on the master, then I can gather the results on both nodes in two jobs and compare them.
Is there any elegant solution to this simple problem?
Currently I can connect the two machines to each other via SSH and solve it that way, but this connection is not always available.
(With SSH I believe the best way is to use rsync -an /path/to/ hostB:/path/to/)
Simple problem, only a slightly elegant solution:
Write a simple job listdir which does DIR > C:\logs\list1.txt
Go to Post-build Actions
Add "Archive the artifacts", for example from the above: C:\logs\*.*
Now run a build and go to http://jenkinsservername:8080/job/listdir/
You'll see list1.txt, which you can click on to see the contents.
I have given a Windows example; you can of course replace DIR with ls -ltR.
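The same idea as a pipeline job, for anyone who prefers that over freestyle; the agent label and the listed path are placeholders:
pipeline {
    agent { label 'machine-a' } // placeholder agent label
    stages {
        stage('List directory') {
            steps {
                // Write the recursive listing, then archive it on the master.
                sh 'ls -ltR /path/to/dir > list1.txt'
                archiveArtifacts artifacts: 'list1.txt'
            }
        }
    }
}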
Or use archived artifacts in combination with the Copy Artifact Plugin to pull the result of another job into the job where the comparison is to be done.
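A minimal sketch of that variant, assuming two jobs like the one above archived their listings under the made-up names listdir-machine-a and listdir-machine-b:
node {
    // Pull the archived listings from both jobs (Copy Artifact plugin).
    copyArtifacts projectName: 'listdir-machine-a', filter: 'list1.txt', target: 'a'
    copyArtifacts projectName: 'listdir-machine-b', filter: 'list1.txt', target: 'b'
    // diff exits non-zero (failing the build) if the listings differ.
    sh 'diff a/list1.txt b/list1.txt'
}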