Retrieving old builds for re-deployment in Jenkins - ant

I have built an automated deployment system using Jenkins, Subversion and ANT on a set of environments. It all works, allowing me to deploy old tagged releases to a set of environments or automatically deploy the latest build using the Subversion Release Manager within Jenkins.
The problem is that on the client site we have to use Perforce (which does not currently have a Release Manager plugin within Jenkins [I don't really want to write one, but it's possible]). What is the best way to set up Jenkins to be able to deploy certain releases to environments? I started looking at Ivy and Artifactory as a possibility.
If anyone has any suggestions, or any guides online, that would be great!

We currently build a WAR file for each Subversion checkin - and make this available with the copy artifact plugin. Redeploy is merely a matter of executing the deploy task for this WAR.
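For illustration, a rough Pipeline sketch of that pattern (the upstream job name build-war and the Ant property name are placeholders; copyArtifacts requires the Copy Artifact plugin):

    // Jenkinsfile sketch: fetch the WAR built by another job, then hand it to the Ant deploy task.
    pipeline {
        agent any
        stages {
            stage('Fetch WAR') {
                steps {
                    // Copy Artifact plugin: grab the last successful build's artifacts
                    copyArtifacts projectName: 'build-war', selector: lastSuccessful()
                }
            }
            stage('Deploy') {
                steps {
                    // Run the existing Ant deploy target against the copied WAR
                    sh 'ant deploy -Dwar.file=module.war'
                }
            }
        }
    }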

Related

What is the best practice for CI development?

We are starting to develop a CI workflow for our systems in my company.
Currently we just run a few basic tasks like build, tests, and upload to Nexus.
The tech stack is a Java project built with Gradle, and Jenkins runs our builds.
Currently I'm working with some basic Groovy scripts to do what we need, but each time I copy and paste my updated code into Jenkins and run the job from the Jenkins UI to see the results, which does not seem like a very good approach for developing such automation code.
My question is, what is the best practice to build and run Jenkins jobs?
Is it possible to run it straight from IntelliJ?
Do we need to create a Jenkins project which is saved in a repository and then deployed to the Jenkins machine?
Do we need to use some IntelliJ plugins in order to work with Jenkins?
More best practices are welcome :)
Jenkins has an API - so you can do whatever you want!
But in general, for small to medium teams it's better to use a Jenkinsfile and let Jenkins pull code changes (or pull requests) from SCM and trigger builds. You can also configure hooks to trigger builds if your SCM supports this (GitHub and Bitbucket both do).
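As a rough sketch, a Jenkinsfile kept at the root of the repository for a Gradle project could look like this (the stage layout and the publish task are just an example):

    // Jenkinsfile committed to the repository; Jenkins picks it up via a
    // Multibranch Pipeline job or an SCM-triggered Pipeline job.
    pipeline {
        agent any
        stages {
            stage('Build') {
                steps { sh './gradlew assemble' }
            }
            stage('Test') {
                steps { sh './gradlew test' }
            }
            stage('Upload to Nexus') {
                steps {
                    // placeholder: wire this to however you publish (e.g. maven-publish)
                    sh './gradlew publish'
                }
            }
        }
    }

This way the automation code lives in the repository and is reviewed and versioned like any other code, instead of being pasted into the Jenkins UI.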
If you are eventually pushing your artifacts into a Docker image, I would highly recommend Docker multi-stage builds.
If you are completely new to CI/CD stuff - Atlassian has a lot of good resources https://www.atlassian.com/continuous-delivery/principles/continuous-integration-vs-delivery-vs-deployment

Migrate Jenkins jobs from Cloud Bees to another Jenkins server

I have a Jenkins server hosted at CloudBees and it has a lot of jobs.
I have created a new Jenkins server on an AWS EC2 instance.
Now I need to migrate all Jenkins jobs from CloudBees to the new Jenkins server (AWS EC2 instance).
How can I do this? Is there any way to migrate all jobs via the CLI?
Use the Backup Plugin or thinBackup.
You first need to ensure that you do not use proprietary CloudBees features (RBAC, Folders+ plugins). This is the only thing that's really specific to migrating from a CloudBees Jenkins.
After that, standard steps for migrating Jenkins apply:
ensure that you have the same plugins installed on the new Jenkins
align credentials and credential IDs
API tokens need special handling
After that, you can just copy all $JENKINS_HOME/jobs/*/config.xml files (if using folders, copy recursively).
You can also copy job configs via CLI or REST API, but usually the fastest way is to copy directly on filesystem level.
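If you do want to script the REST route, here is a rough Groovy sketch (server URLs, credentials, and job names are placeholders; authenticate with an API token so no CSRF crumb is needed):

    // Copy job configs from one Jenkins to another via the REST API:
    //   GET  <jenkins>/job/<name>/config.xml   returns a job's XML config
    //   POST <jenkins>/createItem?name=<name>  creates a job from posted XML
    def oldJenkins = 'https://example.ci.cloudbees.com'   // placeholder
    def newJenkins = 'http://ec2.example.com:8080'        // placeholder
    // assumes the same user/API token on both servers, for brevity
    def auth = 'user:apitoken'.bytes.encodeBase64().toString()

    ['job-a', 'job-b'].each { name ->                     // your job names here
        def get = new URL("${oldJenkins}/job/${name}/config.xml").openConnection()
        get.setRequestProperty('Authorization', "Basic ${auth}")
        def xml = get.inputStream.text

        def post = new URL("${newJenkins}/createItem?name=${name}").openConnection()
        post.requestMethod = 'POST'
        post.doOutput = true
        post.setRequestProperty('Authorization', "Basic ${auth}")
        post.setRequestProperty('Content-Type', 'text/xml')
        post.outputStream.withWriter('UTF-8') { it << xml }
        println "${name}: HTTP ${post.responseCode}"
    }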

Deploying Jenkins Artifact Built by Another Job

I installed the Deploy Plugin on my Jenkins in order to automate the deployment of my Maven-built WAR packages to Tomcat 7. The problem is that I am able to use the plugin to deploy to a remote Tomcat server only if the WAR files are built within the same job that uses the deploy plugin. In other words, I have not been able to set up a standalone job that deploys artifacts made by a different job.
For example, I have a job named pack.foo. It uses the source code in /var/lib/project/module to create module.war and put it in /var/lib/project/module/target. However, because of the Maven version setup, the artifact posted on pack.foo's artifact page is something like module-2.0.0-SNAPSHOT.war.
The only way I am able to deploy module.war is if I add a Post-build Action to pack.foo and specify **/module.war to be deployed to a remote Tomcat manager URL (provided I have the manager's credentials in the Jenkins config). Then the job's console output logs that /var/lib/project/module/target/module.war was deployed to that URL:
Deploying /var/lib/project/module/target/module.war to container Tomcat 7.x Remote with context
[/var/lib/project/module/target/module.war] is not deployed. Doing a fresh deployment.
Deploying [/var/lib/project/module/target/module.war]
How can I use this, or another plugin, to deploy a WAR artifact that was made in a separate Jenkins job? I would like to have separate jobs for artifact creation and deployment. The plugin wasn't finding **/module-2.0.0-SNAPSHOT.war or even **/module.war built by another job even though there was definitely a file on disk that matched that pattern.
See the paragraph on the Deploy Plugin's page you linked:
How to rollback or redeploy a previous build
There may be several ways to accomplish this, but here is one suggested method:
Install the Copy Artifact Plugin
Create a new job that you will trigger manually only when needed
Configure this job with a build parameter of type "Build selector for Copy Artifact", and a copy artifact build step using "Specified by build parameter" to select the build.
Add a post-build action to deploy the artifact that was copied from the other job
Now when you trigger this job you can enter the build number (or use any other available selector) to select which build to redeploy. Thanks to Helge Taubert for this idea.
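With Pipeline jobs, the same idea looks roughly like this (the job name, credentials ID, and Tomcat URL are placeholders; copyArtifacts comes from the Copy Artifact plugin and deploy from the Deploy plugin):

    // Parameterized redeploy job: pick a build of pack.foo, copy its WAR, push it to Tomcat.
    pipeline {
        agent any
        parameters {
            string(name: 'SOURCE_BUILD', description: 'pack.foo build number to redeploy')
        }
        stages {
            stage('Copy artifact') {
                steps {
                    copyArtifacts projectName: 'pack.foo',
                                  selector: specific(params.SOURCE_BUILD),
                                  filter: '**/*.war'
                }
            }
            stage('Deploy') {
                steps {
                    deploy adapters: [tomcat7(credentialsId: 'tomcat-manager',       // placeholder
                                              url: 'http://tomcat.example.com:8080')],
                           contextPath: 'module',
                           war: '**/*.war'
                }
            }
        }
    }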

Does Sonatype's Nexus Repository offer any benefit with Jenkins?

So I'm setting up a CI solution using Jenkins and I've been instructed to use Sonatype's Nexus Repository as a binary repository that ties into Jenkins. The idea as I understand it is that it will provide immediate rollback to previously compiled binaries.
Some of the other engineers who have experience with Jenkins have questioned this decision, because they believe Jenkins can already do this. Apparently Jenkins will store build results for immediate rollback deployment anyway, so the inclusion of Nexus is of dubious benefit.
Is it true that Jenkins can already offer immediate rollback without a third-party service or plugin? If so, what is the benefit of using Nexus with Jenkins if any?
One of the benefits of using an artifact repository (Nexus, JFrog Artifactory, ...) with Jenkins (or another CI tool like Bamboo) is that you can deploy your artifacts to a repository in Nexus (or Artifactory) under version control (including SNAPSHOTs in Maven) before sending those artifacts to each environment (integration environment, production environment, ...).
This is a good practice because when you do an install of your projects, for example:
mvn install
your project downloads all its dependencies from the artifact repository (Nexus, Artifactory, ...), and these dependencies are organized and available to your whole team.
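For comparison, a minimal Gradle sketch of publishing to such a repository (the Nexus URL and property names are placeholders; the credentials would live in gradle.properties or be passed with -P):

    // build.gradle: publish the project's artifact to a Nexus hosted repository
    apply plugin: 'java'
    apply plugin: 'maven-publish'

    publishing {
        publications {
            mavenJava(MavenPublication) {
                from components.java
            }
        }
        repositories {
            maven {
                // placeholder URL; use your snapshots repo for -SNAPSHOT versions
                url 'https://nexus.example.com/repository/maven-releases/'
                credentials {
                    username project.findProperty('nexusUser')
                    password project.findProperty('nexusPassword')
                }
            }
        }
    }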

Do I specify binary artifact settings in the build scripts or the CI server?

I'm prototyping a new build system using Jenkins, Gradle, and Artifactory. There seem to be conflicting, or rather overlapping, features in these tools with regard to specifying the build artifacts and their destination. I see three paths going forward:
Specify the artifact settings on the particular task in Jenkins, using the Jenkins Artifactory plugin.
Specify the artifact settings in the Gradle build scripts, using the Gradle Artifactory plugin.
Specify generic maven repo settings in the Gradle build scripts, using the standard Gradle "maven" plugin.
I see pros and cons to all of these approaches, but none of them is missing a critical feature for our builds, as far as I can see.
To further my confusion, the Gradle Artifactory plugin wiki states:
Build Server Integration - When running Gradle builds in your continuous integration build server, it is recommended to use one of the Artifactory Plugins for Jenkins, TeamCity or Bamboo to configure resolution and publishing to Artifactory with build-info capturing, via your build server UI.
So, some questions to get the conversation going:
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
What about version control for artifact changes done through the CI interface?
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the pom's. Is this just a bad build practice?
Is there a killer tool integration feature that separates one of these approaches from the other?
How useful is the build info object? Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
I am really hoping to hear from existing users of these tools and what pitfalls/requirements may have led them to one of the approaches above (or perhaps even a better one that I haven't considered yet).
Does it make sense to clutter the build scripts with artifact logic? It might help to add that developers don't deploy. Currently, I only see build artifacts being uploaded from the Jenkins task.
I'd say that's the way to go. Your build server is the single point of truth, and only artifacts built in the build server should be deployed.
Does leaving all of this build logic in the task configuration expose us to issues, in the event that the CI server is down?
That one is simple - you shouldn't deploy while your CI server is down. Building on a local machine might produce wrong artifacts, which shouldn't be deployed.
What about version control for artifact changes done through the CI interface?
Not sure I understood your question.
I've seen simple Bamboo configurations that specify the build artifacts through the CI server UI, rather than the pom's. Is this just a bad build practice?
This configuration ignores Maven's ability to deploy, and I am not sure I can find a good scenario to justify it. The only thing I can think of is deferred deployment, but the Artifactory plugin can take care of that.
Is there a killer tool integration feature that separates one of these approaches from the other?
Now we got to the essence :)
Well, the advantage of defining what you deploy in your build script (in the case of Gradle) is the flexibility to fine-tune every aspect of the deployment (think about the dynamic properties you might want to add in certain cases). Another very serious advantage is that your build script is source code, which means it is versioned in your version control.
The advantage of defining the deployment details in the build server configuration is that the build server is the only place the deployment should occur. So, if you don't have the deployment details in your build script, you know for sure it won't be deployed standalone.
So, how can you combine the two to get the best of both worlds?
Code your deployment logic in your Gradle script using the Artifactory plugin DSL. Provide details like username and password from properties, which exist on the build server only.
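For example, a minimal sketch of that split, using the Artifactory Gradle plugin DSL with hypothetical property names that only the build server defines:

    // build.gradle: the deployment logic is versioned with the sources...
    // (the plugin itself must be on the buildscript classpath)
    apply plugin: 'com.jfrog.artifactory'

    artifactory {
        contextUrl = 'https://artifactory.example.com/artifactory'   // placeholder
        publish {
            repository {
                repoKey = 'libs-release-local'
                // ...but the credentials exist only on the build server,
                // e.g. injected by the Jenkins Artifactory plugin or via -P flags
                username = project.findProperty('artifactory_user')
                password = project.findProperty('artifactory_password')
            }
            defaults {
                // the maven-publish publication to deploy
                publications('mavenJava')
            }
        }
    }

A developer machine without those properties simply cannot deploy, while the script itself stays in version control.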
How useful is the build info object?
Extremely useful. The information in buildInfo was harvested during the build process, and the buildInfo is the only place it exists. Having this information is the only way you will be able to reproduce this build in the future.
Is that only available in the Jenkins Artifactory plugin and not the Gradle Artifactory plugin?
The 'artifactory' and 'artifactory-publish' Gradle plugins both generate the buildInfo object, regardless of where they are running (be it your local machine or a Jenkins build server).
