I have a couple of Jenkins builds that run roughly every two hours. Since Jenkins stores the data and metadata for the builds, this takes up a lot of space, and most of that space goes to the jars that are stored.
Jenkins keeps every jar for every build, and most of them don't really change from one build to another, so I was wondering if there's a way to either:
a) store only the jars that changed (which would be the best-case scenario), perhaps using symbolic links or something similar; or
b) not store the jars at all; we don't really inspect the builds by using the jars as a debugging tool, so we don't really need them. Of course I could set up a cron job to erase them, but I'd prefer to do that from inside Jenkins if possible.
Jenkins only stores jars and such if you have an "Archive the artifacts" post-build action in your job. If you don't have this, it doesn't archive anything except for logs and results.
If you want to store SOME things but just not the jars, you can set the Excludes field in the advanced settings of the "Archive the artifacts" post-build action.
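In a pipeline job, a roughly equivalent sketch would look like the following; the glob patterns are only examples, not the asker's actual layout:

```groovy
// Archive everything the build produced except the jars; patterns are illustrative.
archiveArtifacts artifacts: 'build/**',
                 excludes: '**/*.jar',
                 fingerprint: true
```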
Related
Is it possible in Jenkins to only delete specific artifacts created by a build job?
In my situation I create 4 artifacts - 3 of which are 2 KB .txt files and one of which is a 0.5 GB tar.gz. I'd like to keep the 3 smaller artifacts indefinitely, but aggressively remove the tar.gz artifacts after 5 builds.
Using LogRotator this doesn't seem to be an option (it can only remove all artifacts or none), but I was hoping there might be a way to write a Groovy library that could be called after each new build, or a plugin that could handle this for me?
See: https://javadoc.jenkins.io/hudson/tasks/LogRotator.html
Does anyone have any pointers on how to get started with this?
You can safely remove build artifacts directly at the filesystem level.
If you want to trigger this directly from within Jenkins, the easiest way is probably to write a Groovy post-build script that takes care of the deletion.
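As a rough illustration only (not a drop-in solution), a system Groovy post-build step along these lines could keep the small .txt artifacts while deleting the tar.gz from all but the five most recent builds; the job name and file pattern below are assumptions taken from the question:

```groovy
// Sketch for a system Groovy post-build step (e.g. via the Groovy Postbuild plugin).
// 'my-job' and the .tar.gz pattern are placeholders based on the question above.
import jenkins.model.Jenkins

def job = Jenkins.instance.getItemByFullName('my-job')
// builds are listed newest first; keep the five most recent untouched
job.builds.drop(5).each { build ->
    def archiveDir = build.artifactsDir          // <JENKINS_HOME>/jobs/my-job/builds/<id>/archive
    if (archiveDir.exists()) {
        archiveDir.eachFileRecurse { file ->
            if (file.name.endsWith('.tar.gz')) {
                file.delete()                    // drop only the big archive, keep the .txt files
            }
        }
    }
}
```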
I was recently put in charge of all Jenkins-related work at my job, and was tasked with storing build artifacts from our declarative pipelines in a place where:
- They are accessible to everyone on the team
- They can be stored for long periods of time
Ideally they would be visible on the Jenkins interface, where they appear when using the default 'archiveArtifacts' command. I know this saves them in the JENKINS_HOME directory. The problem is that I have to discard old builds to avoid running out of space and the artifacts are deleted with them. Furthermore, I don't have access to the server that Jenkins runs on because it's managed by a separate team, so I can't go into JENKINS_HOME.
I looked into a few artifact repository managers like Nexus and Artifactory, but from my understanding those are only supposed to be used for full releases. I'm looking to save artifacts after every new merge, which can happen multiple times a day.
I'm currently saving them on a functional user's home directory, but I'm the only one with direct access to it so that's no good. I also looked into plugins like ArtifactDeployer, which doesn't support pipelines and only does as much as a 'cp' command as far as I could tell.
I ended up creating some freestyle jobs that copy artifacts from the pipelines and save them directly in their workspace. This way they're stored on our Jenkins slaves and visible through the interface to anyone who has permission to view job workspaces.
Nexus does not care what kind of artifacts you drop into it. It's a good idea to use it.
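As a hedged sketch of that idea, publishing every merge build from a declarative pipeline to a Nexus raw repository can be as simple as a curl upload; the repository URL, credentials ID, and artifact path here are placeholders, not details from the question:

```groovy
// Minimal declarative-pipeline sketch: push a build artifact to a Nexus raw
// repository after every merge. The URL, credentials ID and app.war path are
// assumptions for illustration only.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './gradlew build'   // assumed build command
            }
        }
        stage('Publish to Nexus') {
            steps {
                withCredentials([usernamePassword(credentialsId: 'nexus-creds',
                                                  usernameVariable: 'NEXUS_USER',
                                                  passwordVariable: 'NEXUS_PASS')]) {
                    sh '''
                        curl --fail -u "$NEXUS_USER:$NEXUS_PASS" \
                             --upload-file build/libs/app.war \
                             "https://nexus.example.com/repository/ci-artifacts/${JOB_NAME}/${BUILD_NUMBER}/app.war"
                    '''
                }
            }
        }
    }
}
```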
I've used Jenkins for quite a few years, but had never set it up myself until I did so at my new job. There are a couple of questions and issues that I ran into.
Default workspace location - It seems like the latest Jenkins has the default workspace in Jenkins\jobs\[projectName]\workspace, which is overwritten (or wiped, if selected) for every build. I thought that it should instead be in Jenkins\jobs\[projectName]\builds\[build_id]\ so that it would store the workspace state of every build for future reference?
Displaying the workspace on the project > Build_ID page - This goes along with the previous point, as I expected each 'workspace' for previous builds to show up here. Currently, in my setup, this individual page gives you nothing except the Git revision, which repo changes triggered the build, and the console output. Where are the artifacts? Where is the link to the workspace that was used for this build?
Archiving artifacts in builds - When choosing artifacts, the filter doesn't seem to work. My build creates a file structure with the artifacts in it inside the workspace. I want to store this, and the artifacts filter says it starts at the workspace. So I put in 'artifacts' and nothing gets stored (also, where would this get stored?). I have also tried '/artifacts' and 'artifacts/*'.
Any help would be great! Thanks!
It does seem like you are confused about several aspects of Jenkins. I think your question basically boils down to the following:
What is the difference between a workspace and a build?
So, here are some thoughts on this topic:
1. Builds are historical data. They (usually) don't change the way a workspace does during checkout and building.
2. Builds contain information about a run (e.g. its status, build number, change log, etc.) and any artifacts that you tell Jenkins to archive (logs, test results, etc.). They (usually) don't contain source code like a workspace does.
3. Builds are stored in the Jenkins\jobs\[projectName]\builds\[build_id]\ directory. This is a directory managed by Jenkins, and you (usually) do not need to modify anything in it. Workspaces, however, are directories meant for the build itself; you can do pretty much anything with them and place them anywhere (they do not need to be in the default Jenkins\jobs\[projectName]\workspace directory).
4. A workspace should be able to be wiped at any given time. To restore it, just rebuild the job with the same parameters/revision. If you need to keep something after a build, tell Jenkins to archive it before the build is done.
5. In regard to saving the entire state, I don't think you need to do that. As mentioned in #4, you should be able to reproduce the same build by kicking off the same revision/parameters as the build in question. If you cannot get back to the original state from the same revision/parameters, then that is something to strive for, as debugging will otherwise be a nightmare. :)
6. A workspace is an aspect of the project, not of a build, and that is why there is no link to the workspace from that page. Again, a build is just saved data from a previous run. A project uses the workspace to build stuff, and that is why you can get to the workspace from the project page.
7. In regard to how to save artifacts, you must specify the names of the files you want to save. Unless you are trying to save a file literally called "artifacts", you should probably use something else. How about **/*.log for all log files, or **/*.xml for all XML files? (See the sketch just below.)
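For the directory layout described in the question, a hedged example of what the patterns could look like (shown in pipeline syntax; the 'artifacts' directory name comes from the question, so adjust it to your actual layout):

```groovy
// Pipeline-syntax equivalents of the "Archive the artifacts" field; patterns
// are Ant-style globs relative to the workspace root.
archiveArtifacts artifacts: 'artifacts/**'          // everything under workspace/artifacts
archiveArtifacts artifacts: '**/*.log, **/*.xml'    // all log and XML files anywhere in the workspace
```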
Hope this helps.
I'd like a hint about how (i.e., with which plugin) it is possible to run a SINGLE Jenkins job in a user-chosen way. The user MUST be able to choose the job he/she wants to run and choose the rule of execution:
E.g.:
Create only jar files;
Create jars and send them over ssh
Create jars, generate documentation, etc...
I've found a few plugins (Artifactory, Release plugin) but it seems they don't support such logic.
I know that such a thing can be implemented by creating several jobs, but this would require additional disk space.
Many Thanks!
In order to solve my issue, I decided to create a few Jenkins jobs that share the same custom workspace. That way, when an IT engineer runs any of these "connected" jobs (which have the same workspace), the workspace is updated (have a look at the CVS rules for your job), and that is how we avoid wasting space.
Additionally, each job's behaviour can be configured easily: the sets of rules (shell scripts, Gradle, batch, etc.) and their sequence needed to achieve the desired result.
Last but not least, security (access control) remains very easy to configure.
I think that is the correct way.
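A different sketch of the same goal, using a single parameterized pipeline instead of several jobs sharing a workspace; the parameter values, build commands, and ssh target below are purely illustrative:

```groovy
// Hedged sketch: one parameterized pipeline instead of several near-identical jobs.
pipeline {
    agent any
    parameters {
        choice(name: 'MODE',
               choices: ['jar-only', 'jar-and-ssh', 'jar-and-docs'],
               description: 'How the engineer wants this run to behave')
    }
    stages {
        stage('Build jars') {
            steps {
                sh './gradlew jar'                              // assumed build command
            }
        }
        stage('Send over ssh') {
            when { expression { params.MODE == 'jar-and-ssh' } }
            steps {
                sh 'scp build/libs/*.jar user@host:/deploy/'    // illustrative target
            }
        }
        stage('Generate documentation') {
            when { expression { params.MODE == 'jar-and-docs' } }
            steps {
                sh './gradlew javadoc'                          // assumed docs command
            }
        }
    }
}
```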
I have a fairly complicated Jenkins job that builds, unit tests and packages a web application. Depending on the situation, I would like to do different things once this job completes. I have not found a re-usable/maintainable way to do this. Is that really the case or am I missing something?
The options I would like to have once my complicated job completes:
Do nothing
Start my low-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to production
Start my high-risk-change build pipeline:
copies my WAR file to my artifact repository
deploys to test
run acceptance tests
deploy to production
I have not found an easy way to do this. The simplest, but not very maintainable approach would be to make three separate jobs, each of which kicks off a downstream build. This approach scares me for a few reasons including the fact that changes would have to be made in three places instead of one. In addition, many of the downstream jobs are also nearly identical. The only difference is which downstream jobs they call. The proliferation of jobs seems like it would lead to an un-maintainable mess.
I have looked at using several approaches to keep this as one job, but none have worked so far:
Make the job a multi-configuration project (https://wiki.jenkins-ci.org/display/JENKINS/Building+a+matrix+project). This provides a way to inject the job with a parameter. I have not found a way to make the "build other projects" step respond to a parameter.
Use the Parameterized-Trigger plugin (https://wiki.jenkins-ci.org/display/JENKINS/Parameterized+Trigger+Plugin). This plugin lets you trigger downstream-jobs based on certain triggers. The triggers appear to be too restrictive though. They're all based on the state of the build, not arbitrary variables. I don't see any option provided here that would work for my use case.
Use the Flexible Publish plugin (https://wiki.jenkins-ci.org/display/JENKINS/Flexible+Publish+Plugin). This plugin has the opposite problem as the parameterized-trigger plugin. It has many useful conditions it can check, but it doesn't look like it can start building another project. Its actions are limited to publishing type activities.
Use Flexible Publish + Any Build Step plugin (https://wiki.jenkins-ci.org/display/JENKINS/Any+Build+Step+Plugin). The Any Build Step plugin allows making any build action available to the Flexible Publish plugin. While more actions were made available once this plugin was activated, those actions didn't include "build other projects."
Is there really not an easy way to do this? I'm surprised that I haven't found it, and even more surprised that I haven't really seen anyone else trying to do this. Am I doing something unusual? Is there something obvious that I am missing?
If I understood it correctly, you should be able to do this by following these steps:
First Build Step:
Does the regular work. In your case: building, unit testing and packaging of the web application
Depending on the result, have it create a file with a specific name.
This means that if you want the low-risk-change pipeline to run afterwards, create a file called low-risk.prop.
Second Build Step:
Create a Trigger/call builds on other projects step from the Parameterized Trigger plugin.
Enter the name of your low-risk job into the Projects to build field
Click on: Add Parameter
Choose: Parameters from properties File
Enter low-risk.prop into the Use properties from file Field
Enable Don't trigger if any files are missing
Third Build Step:
Check if a low-risk.prop file exists
Delete the File
Do the same for the high-risk job
Now you should have the following Setup:
if a file called low-risk.prop occurs during the first Build Step the low-risk job will be started
if a file called high-risk.prop occurs during the first Build Step the high-risk job will be started
if there's no .prop File nothing happens
And that's what you wanted to achieve. Isn't it?
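For reference, here is a hedged pipeline-style sketch of the same marker-file idea; the build command, the risk check, and the downstream job names are placeholders rather than the asker's actual setup:

```groovy
// Scripted-pipeline sketch of the "create a properties file, then trigger the
// matching downstream job" approach described above. Everything named here is
// illustrative: build.sh, change-summary.txt and the downstream job names.
node {
    stage('Build, test, package') {
        sh './build.sh'
        // decide which downstream pipeline should follow; the condition is made up
        sh '''
            if grep -q "LOW_RISK" change-summary.txt; then
                echo "RISK=low" > low-risk.prop
            else
                echo "RISK=high" > high-risk.prop
            fi
        '''
    }
    stage('Trigger downstream') {
        // mirrors the Parameterized Trigger configuration: only fire the job
        // whose marker file was actually created, then clean up
        if (fileExists('low-risk.prop')) {
            build job: 'low-risk-pipeline', wait: false
        } else if (fileExists('high-risk.prop')) {
            build job: 'high-risk-pipeline', wait: false
        }
        sh 'rm -f low-risk.prop high-risk.prop'
    }
}
```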
Have you looked at the Conditional BuildStep Plugin? (https://wiki.jenkins.io/display/JENKINS/Conditional+BuildStep+Plugin)
I think it can do what you're looking for.
If you want a conditional post-build step, there is a plugin for that:
https://wiki.jenkins-ci.org/display/JENKINS/Post+build+task
It will search the console log for a RegEx you specify, and if found, will execute a custom script. You can configure fairly complex criteria, and you can configure multiple sets of criteria each executing different post build tasks.
It doesn't provide you with the usual "build step" actions, so you've got to write your own script there. You can trigger execution of the same job with different parameters, or another job with some parameters, in standard ways that Jenkins supports (for example using curl).
Yet another alternative is the Jenkins Text-finder plugin:
https://wiki.jenkins-ci.org/display/JENKINS/Text-finder+Plugin
This is a post-build step that allows you to forcefully mark a build as "unstable" if a RegEx is found in the console text (or even in some file in the workspace). So, in your build steps, depending on your conditions, echo a unique line into the console log, and then match that line with a RegEx. You can then use "Trigger parameterized builds" and set the condition to "unstable". This has the added benefit of visually marking the build differently (with a yellow ball); however, you only have one conditional option with this method, and from your OP it looks like you need two.
Try a combination of these two methods.
Do you use Ant for your builds?
If so, it's possible to do conditional building in Ant by having a set of environment variables your build scripts can use to decide what to build. In Jenkins, your job will then nominally build all of the projects, but your actual build script will decide whether it builds or just short-circuits.
I think the way to do it is to add an intermediate job that you trigger in the post-build step, pass it all the parameters your downstream jobs could possibly need, and then within that job place conditional build steps for the real downstream jobs.
The simplest approach I found is to trigger other jobs remotely, so that you can use the Conditional BuildStep plugin or any other plugin to build other jobs conditionally.