I am trying to configure the global build discard option in CloudBees Jenkins using Groovy. For now, I am configuring the global build discarder manually under Configure System.
But I couldn't find much documentation on doing this with Groovy. The scripts I could find only fetch the list of Jenkins jobs and add a build discarder property to individual pipelines.
CloudBees recommends the following to run the build discarder:
ExtensionList.lookupSingleton(BackgroundGlobalBuildDiscarder.class).doRun()
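For the global configuration itself, something along the following lines can be run in the Script Console. This is a sketch only: it assumes the core classes GlobalBuildDiscarderConfiguration and SimpleGlobalBuildDiscarderStrategy that back the "Global Build Discarders" section of Configure System, and the retention of the last 30 builds is just an example value.

import hudson.ExtensionList
import hudson.tasks.LogRotator
import jenkins.model.BackgroundGlobalBuildDiscarder
import jenkins.model.GlobalBuildDiscarderConfiguration
import jenkins.model.SimpleGlobalBuildDiscarderStrategy

// Assumption: these are the classes backing "Global Build Discarders" in Configure System.
def config = ExtensionList.lookupSingleton(GlobalBuildDiscarderConfiguration.class)
// Keep at most the last 30 builds per job (example values; -1 means "no limit").
config.getConfiguredBuildDiscarders().add(
    new SimpleGlobalBuildDiscarderStrategy(new LogRotator(-1, 30, -1, -1)))
config.save()

// Trigger the background discarder immediately instead of waiting for its schedule.
ExtensionList.lookupSingleton(BackgroundGlobalBuildDiscarder.class).doRun()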
We have a small Java development environment which uses Gradle, Jenkins, and Git. We use an in-house Gradle plugin to increment build numbers, using a file to store the current number. The build number is baked into each build as part of its version data. The build number file is checked into the Git workspace for the project.
We are now adding Jenkins to the environment for CI. Jenkins has its own build number, which we can access via the env var $BUILD_NUMBER.
The downside of our in-house Gradle build number plugin is that it uses a local file, so builds by multiple developers do not have synchronized build numbers. If we use the Jenkins BUILD_NUMBER, that is a completely different sequence from the Gradle plugin's build numbers.
What is the best practice for this type of scenario?
If you state that only builds produced by your CI are valid for future use, it seems that you have to rely on the Jenkins BUILD_NUMBER.
If you want a Jenkins job's BUILD_NUMBER to start from a specific value, do the following:
Manage Jenkins -> Script Console
Jenkins.instance.getItemByFullName("YOUR_JOB_NAME").updateNextBuildNumber(YOUR_BUILD_NUMBER)
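If you go with the Jenkins number, the Gradle side only needs to read the environment variable. A minimal sketch (the fallback value "local" and the version scheme are just assumptions for illustration):

// build.gradle: use the Jenkins build number when available, otherwise a local marker.
def ciBuildNumber = System.getenv('BUILD_NUMBER') ?: 'local'
version = "2.0.0.${ciBuildNumber}"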
I installed the Deploy Plugin on my Jenkins in order to automate the deployment of my Maven built war packages to Tomcat 7. The problem is that I am able to use the plugin to deploy to a remote Tomcat server only if they are made within the same job that uses the deploy plugin. In other words, I have not been able to set up a standalone job that deploys artifacts made by a different job.
For example, I have a job named pack.foo. It uses the source code in /var/lib/project/module to create module.war and put it in /var/lib/project/module/target. However, because of the Maven version setup, the artifact posted on pack.foo's artifact page is something like module-2.0.0-SNAPSHOT.war.
The only way I am able to deploy module.war is if I add a Post-build Action to pack.foo and specify **/module.war to be deployed to a remote Tomcat manager URL (provided I have the manager's credentials in the Jenkins config). Then the job's console output logs that /var/lib/project/module/target/module.war was deployed to that URL:
Deploying /var/lib/project/module/target/module.war to container Tomcat 7.x Remote with context
[/var/lib/project/module/target/module.war] is not deployed. Doing a fresh deployment.
Deploying [/var/lib/project/module/target/module.war]
How can I use this, or another plugin, to deploy a WAR artifact that was made in a separate Jenkins job? I would like to have separate jobs for artifact creation and deployment. The plugin wasn't finding **/module-2.0.0-SNAPSHOT.war or even **/module.war built by another job even though there was definitely a file on disk that matched that pattern.
See the paragraph on the Deploy Plugin's page you linked:
How to rollback or redeploy a previous build
There may be several ways to accomplish this, but here is one suggested method:
Install the Copy Artifact Plugin
Create a new job that you will trigger manually only when needed
Configure this job with a build parameter of type "Build selector for Copy Artifact", and a copy artifact build step using "Specified by build parameter" to select the build.
Add a post-build action to deploy the artifact that was copied from the other job
Now when you trigger this job you can enter the build number (or use any other available selector) to select which build to redeploy. Thanks to Helge Taubert for this idea.
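For reference, the same flow can also be sketched as a Pipeline job, assuming the Copy Artifact and Deploy to container plugins are installed (the job name, credentials ID, URL and context path below are placeholders, and step/adapter names can vary between plugin versions):

node {
    // Copy the war built by the packaging job; BUILD_SELECTOR is a
    // "Build selector for Copy Artifact" job parameter.
    copyArtifacts projectName: 'pack.foo',
                  selector: buildParameter('BUILD_SELECTOR'),
                  filter: '**/module.war'

    // Push the copied artifact to the remote Tomcat 7 manager.
    deploy adapters: [tomcat7(credentialsId: 'tomcat-manager',
                              url: 'http://tomcat.example.com:8080')],
           war: '**/module.war',
           contextPath: '/module'
}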
How do you maintain the Jenkins job configuration in SCM along side the source code?
As source code evolves, so does the job configuration. It would be ideal to be able to keep the job configuration in SCM, for the following benefits:
easy to see a history of the changes, including the author and the description
able to rebuild an old branch/tag by checking out that revision and have the build just work
not having to scroll through the UI to find the appropriate section and make changes
I see there is a Jenkins Job Builder plugin. I prefer a solution along the lines of Travis CI, where the job configuration is maintained in a YAML file (.travis.yml). Any good suggestions?
Note: Most of our projects are using Java & Maven.
Update 2016: Jenkins now provides a Jenkinsfile, which offers exactly this. It is supported by the core Jenkins developers and actively developed.
Benefits:
Creating a Jenkinsfile, which is checked into source control, provides a number of immediate benefits:
Code review/iteration on the Pipeline
Audit trail for the Pipeline
Single source of truth for the Pipeline, which can be viewed and edited by multiple members of the project.
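For a Maven project like the ones mentioned in the question, a minimal declarative Jenkinsfile might look like the sketch below (the tool names 'jdk8' and 'maven3' are assumptions and must match what is configured under Global Tool Configuration):

pipeline {
    agent any
    tools {
        jdk 'jdk8'
        maven 'maven3'
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
    post {
        always {
            // Publish unit test results so trends are tracked per branch.
            junit '**/target/surefire-reports/*.xml'
        }
    }
}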
I've written a plugin that does this!
Other than my plugin, you have some (limited) options with existing Jenkins plugins:
Use a single test script
If you configure your Jenkins to simply run:
$ bash run_tests.sh
You can then check in a run_tests.sh file into your SCM repo and you're now tracking changes for how you run tests. However, this won't track configuration of any plugins.
Similarly, if you're using Maven, the Maven Project Plugin simply runs a specified goal for your repo.
The Literate Plugin does allow Jenkins to run the commands in your README.md, but it hasn't yet been released.
Track changes to Jenkins configuration
You can use the SCM Sync configuration plugin to write configuration changes to SCM, so you at least have a persistent record. This is global, across all projects on your Jenkins instance.
There's also the job config history plugin, which stores config history on the filesystem.
Write Jenkins configuration from SCM
The Jenkins job builder project you mentioned lets you check config changes into SCM and have them applied to your Jenkins instance. Again, this is across all projects on your Jenkins instance.
Write Jenkins configuration from another job
You can use the Job DSL Plugin with a repo of groovy scripts. Jenkins then polls that repo, executes the groovy scripts, which create job configurations.
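A minimal Job DSL script, kept in such a repo and run by a seed job, might look like this sketch (the job name, repository URL and trigger schedule are placeholders):

// Creates (or updates) a freestyle job that builds the project with Maven.
job('module-build') {
    scm {
        git('https://example.com/your/repo.git', 'master')
    }
    triggers {
        scm('H/5 * * * *')
    }
    steps {
        maven('clean verify')
    }
}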
Discussions
Issue 996 (now closed) discusses this, and it has also been discussed on the mailing list: 'Keeping track of Hudson's configuration changes', and 'save hudson config in svn'.
You can do all of this with the Workflow plugin, and a lot more. Workflow is one of the most advanced ways of using Jenkins and it has very strong support.
It is based on a Groovy DSL and allows you to keep the whole configuration in the SCM of your choice (e.g. Git, SVN...).
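A minimal scripted Workflow/Pipeline definition kept in SCM could look like this sketch (the repository URL is a placeholder, and the block-style stage syntax assumes a reasonably recent plugin version):

node {
    stage('Checkout') {
        git url: 'https://example.com/your/repo.git'
    }
    stage('Build') {
        sh 'mvn -B clean verify'
    }
}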
I am using Maven as a build tool and Jenkins as a CI tool. Currently I have a Jenkins job configured with a Maven build step.
I started using SonarQube and was wondering what is the advantage of using the Jenkins SonarQube plugin and configuring the SonarQube analysis as a post-build-action over simply adding sonar:sonar to the goals of my existing Maven build step.
You can save a lot of configuration. If you use the Jenkins SonarQube plugin, you can centralize the database and SonarQube credentials; but if you decide to execute sonar:sonar in each Jenkins job, you will have to configure each one with the same credentials.
I just found: Why use sonar plugin for Jenkins rather than simply use maven goal "sonar:sonar"?
And to add one reason: using the Jenkins SonarQube plugin, one can specify "Skip if triggered by SCM Changes". This is nice if you trigger your Jenkins job for each commit but only want to do a SonarQube analysis at a scheduled time, e.g. once per night.
And here is a summary of the points made by "emelendez":
Centralize the database credentials and SonarQube credentials
Use the Jenkins SonarQube plugin to configure the Sonar Runner for non-Java projects
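In a Pipeline job the two approaches can also be combined: the SonarQube Scanner for Jenkins plugin injects the centrally configured server URL and credentials, while the analysis itself still runs through the Maven goal. A sketch, assuming a SonarQube installation named 'MySonarQube' is configured under Configure System:

node {
    stage('SonarQube analysis') {
        // Injects the server URL and auth token from the global Jenkins configuration.
        withSonarQubeEnv('MySonarQube') {
            sh 'mvn -B sonar:sonar'
        }
    }
}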
I've just changed to maven-sonar-plugin from the Jenkins SonarQube plugin to avoid divergence of information between the pom.xml and sonar-project.properties.
For example, developers elsewhere had bumped the project version number in the pom.xml, but they don't use the Jenkins builds and didn't care about (or probably didn't understand) the sonar-project.properties. By switching to the Maven plugin instead, the project version is defined once and referenced in the sonar property set within the pom.
The downside is that I no longer have the SonarQube link from the project's Jenkins page.
I'm not sure where the responsibility would lie for adding this link back for projects using maven-sonar-plugin... The link is "owned" by the Jenkins SonarQube plugin, but that is not being used here. Meanwhile the maven-sonar-plugin component integrates with Maven, not Jenkins.
Something would need to observe the build and extract the SonarQube link, which is emitted as an "[INFO] ANALYSIS SUCCESSFUL, you can browse http://..." line in the log.