Continuous deployment of a Grails app

I am working on a Grails app and regularly need to deploy hot fixes to a remote server. I am using Jenkins with the Grails plugin for automation.
My problem is the following:
Most of the time I fix a few classes, with no big changes to the app (such as a new database schema, new plugins...). However, each time I create a patch I have to upload a 75 MB war file through SSH, which takes between 15 and 20 minutes. Most of that data is not needed (i.e. all the packaged jars). It would be sufficient to upload only the freshly compiled classes from WEB-INF/classes/ and reload the servlet container (in my case Jetty).
Has anybody experience with this, preferably with Jenkins?

Check the nojars argument for the war task: http://www.grails.org/doc/1.3.7/ref/Command%20Line/war.html
This way you can place all your .jars (which are usually the biggest files inside a .war) in some other directory on the server and just reference that directory in your Jetty classpath.
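A hedged sketch of that setup (Grails 1.3.x syntax; the local jar location and the Jetty directory are assumptions, not fixed paths):

```bash
# Build the war without bundling its dependency jars:
grails war --nojars

# One-time step on the server: put the dependency jars (normally found in
# a full war's WEB-INF/lib) in a directory on Jetty's server classpath,
# e.g. lib/ext -- the exact location depends on your Jetty version.
scp lib/*.jar user@server:/opt/jetty/lib/ext/
```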
Or you could write a shell script to explode the .war file (after all, it's just a regular .zip file), add the compiled classes, and then re-package it.
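A minimal sketch of such a script (all paths are hypothetical):

```bash
#!/bin/sh
# Explode the existing war, drop in the freshly compiled classes,
# then re-package everything as a new war.
set -e
WAR=target/myapp.war
WORK=target/exploded

rm -rf "$WORK" && mkdir -p "$WORK"
unzip -q "$WAR" -d "$WORK"                       # a .war is just a .zip
cp -R target/classes/. "$WORK/WEB-INF/classes/"  # add the fresh classes
(cd "$WORK" && zip -qr ../myapp-patched.war .)   # re-package
```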

You could try using CloudBees to do continuous delivery releases. They also use deltas to upload your changes, and deployments don't affect the user experience at all.
An easy-to-use plugin is available to make the process seamless from within your Grails app and in a Jenkins build. I've written a blog post about how to get it all working easily.

I remember seeing this subject on the mailing list...
http://grails.1312388.n4.nabble.com/Incremental-Deployment-td3066617.html
...they recommend using rsync or xdelta3 to only transfer updated files. Haven't tried it, but it might help you?
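Untested, but an rsync invocation along those lines might look like this (the exploded layout and server path are assumptions):

```bash
# Keep an exploded copy of the app on the server; rsync then transfers
# only the files that actually changed instead of the whole 75 MB war.
rsync -avz --delete target/exploded/ user@server:/opt/jetty/webapps/myapp/
```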

Maybe the Cloud Foundry Micro Cloud is an option; a deployment just transfers the deltas and not the whole war file.

Related

What is the appropriate location of Azure Devops/TFS Release "support files" (Powershell, Configuration etc.)?

On TFS 2018.2, I am building a release pipeline involving the use of:
Applications configuration files
PowerShell scripts
HTML/Markdown templates (for release notes)
My applications' configuration files are located on a network share for now, which works fine, but I would like to version them later on.
I was about to store the other files in my existing TFVC repository, but I did not find a way to get them (with their directory structure) without adding the entire repository as a release input artifact.
I do not want to add them to my build artifacts, since these files will be used for all my releases, no matter which application I am building.
What is the recommended way to store these files and to get them at release execution time?
I have been tempted to use the Library, but I feel this would be a misuse of it, since it was designed for secure files...
The correct solution to this problem is something you've already hit upon: Add them to your build artifacts. In fact, it's better than pulling them from a separate repo for a very important reason:
Your deployment scripts are going to evolve along with your application. You lose the connection between "this version of the application was deployed with these particular scripts" if the scripts come from a separate location.
You have a lot of options to control the circumstances under which they get published/downloaded:
You can use conditions on the publish artifacts tasks to control when they get published
You can use artifact filters on the release definition to control when they get downloaded as part of a release

Continuous Integration with BitBucket

I'm developing a private webapp in JSF which is available over the internet, and it has now reached a stage where I want to introduce CI (which I'm fairly new to) into the whole process. My current project setup looks like this:
myApp-persistence: maven project that handles DB access (DAOs and hibernate stuff)
myApp-core: maven project, that includes all the Java code (Beans and Utils). It has a dependency on myApp-persistence.jar
myApp-a: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-b: maven project just with frontend code (xhtml, css, JS). Has a dependency on myApp-core.jar
myApp-a and myApp-b are independent from each other, they are just different instances of the core for two different platforms and only display certain components differently or call different bean-methods.
Currently I'm deploying manually, i.e. I use Eclipse's built-in "export as WAR" function and then manually upload the file to the deployments directory of my WildFly server on prod. I'm using BitBucket for version control and just recently discovered Pipelines in BitBucket, and I implemented one for each repository (every project is a separate repo). Now myApp-persistence builds perfectly fine, because all its dependencies are available from the public Maven repository, but myApp-core (and hence myApp-a and myApp-b, too) fails, of course, because myApp-persistence isn't published on the central Maven repo.
Is it possible to tell BitBucket somehow to use the myApp-persistence.jar in the corresponding repo on BitBucket?
If yes, how? And can I also tell BitBucket to deploy directly to prod in case the build including tests ran fine?
If no, what would be a best practice to do that? I was thinking of using a second dev server (already available, so no big deal) as a CI server, but then I would still need some advice or recommendations on which tools (Jenkins, Artifactory, etc.) to use.
One important note maybe: I'm the only person working on this project, so this might seem like overkill, but for me the process of setting that up is quite some valuable experience. That said, I'm not necessarily looking for the quickest solution but for the most professional and convenient solution.
From my point of view, you can find the solution in this post: https://christiangalsterer.wordpress.com/2015/04/23/continuous-integration-for-pull-requests-with-jenkins-and-stash/. It guides you step by step through setting everything up. The post is from 2015 but the process and idea are still the same. Hope it helps.
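If you would rather stay entirely within BitBucket Pipelines, one common alternative (not covered in that post) is to merge the modules into a single repository with an aggregator pom, so that myApp-persistence is resolved from the same build instead of from Maven Central. A minimal, hypothetical sketch of the pipeline's build step:

```bash
# Assumes an aggregator pom.xml at the repository root whose <modules>
# list myApp-persistence, myApp-core, myApp-a and myApp-b; Maven then
# builds them in dependency order within one reactor.
mvn -B clean verify
```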

Any quick way to convert VS .net manual build into Jenkins?

We are migrating 50+ .NET projects from TFS to GitHub, and at the same time we want to use Jenkins to automate the builds. Currently all the builds are done manually inside Visual Studio. I know how to automate a build using MSBuild, and we already have a lot of these projects building inside Jenkins.
My question: is there a way to set up these 50+ projects quickly, without creating them one by one manually? Any way to script this? E.g. a Jenkins project keeps everything inside a folder, so could I copy a sample project/folder to create a new one and modify a few things? Or create a Jenkins project with a script that reads a config file? Any idea that can save some time is appreciated.
Not a direct answer, but too long for a comment, so here it goes anyway. Following the Joel test (which is in no way dogmatic for me, but does make a lot of good points), and in my experience, you should already have an msbuild file that builds all those projects 'in one click'. Then setting up a build server, in fact any build server, is just a matter of making it build that single parent project. This might not work for everyone, but for several projects I've worked on this had the following advantages:
the entire build process gets defined by developers, working locally on their machines, using 'standard' tools
as such they don't need to spend hours in a web interface figuring out the appropriate build steps, dependencies and whatnot (hours which would also have been worthless in the end if you switch to a different build server)
since a complete build is now just a matter of msbuild master.proj, possibly along with some options to define configuration/platform/output directories, getting this running on any build server should be painless and quick
in the same manner this makes it easy to test different build servers with a minimum of time and migrate between them (also no need to ask SO questions on how to set everything up :)
this also makes it easy for other developers to get complete builds as well, without having to go through a build server
Anecdote: we once had Jenkins running on multiple different projects as well. It took us days to get everything running, with the templates etc., and we found the web interface slow and cumbersome (and getting to know the API would have taken even more days). Then one day I got sick of this and made a bunch of msbuild scripts which could build everything from one msbuild command. That took much less time than setting up Jenkins, a couple of hours or so. Then I took a TeamCity installation we already had and made it build the new master project. It took like an hour and everything worked. Just recently I took the same project and got it working on Visual Studio Online, again in no time.
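To make that concrete: the entire CI configuration then boils down to one command, something like the following (the project file name and options are illustrative, not a standard):

```bash
# One command builds everything; any build server just runs this line.
msbuild master.proj /m /p:Configuration=Release
```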
If those projects are more or less similar to build, you will probably be interested in using the template plugin for Jenkins. There you configure a dummy project such that it does what is common to (most of) the 50+ projects.
Afterwards you create a separate project for each: Create the first project and make it use the template project for each of the steps which can be shared with the template project (use build step from other project). All subsequent projects can be created as a slightly adapted copy of this first 'real' project.
I use it such that the variable $JOB_NAME (the actual project name in Jenkins, that is) is part of the repository path and I can thus clone from http://example.org/$JOB_NAME/
Configured that way, I can include the source code management step in the templating job and use it unmodified. Similar with the build step and post-build step: they are run by a script which is somewhat universal across all my projects (mostly calling make and guessing deployment / publication paths from $JOB_NAME again).
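A hedged sketch of such a universal script (the path convention is mine, not a standard):

```bash
#!/bin/sh
# Universal build/post-build step: Jenkins sets $JOB_NAME, and both the
# repository path and the publication path follow a naming convention.
set -e
make all                                  # same entry point for every project
make install PREFIX="/srv/www/$JOB_NAME"  # publication path derived from the job name
```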

Jenkins plugins directory backup and restore

I'm trying to work out a way to back up and restore Jenkins so we can provision a new Jenkins instance automatically.
I cannot work out a way to back up and restore /path/to/jenkins/plugins without including the binaries. We would like the backup to be in XML file format, just like everything else in Jenkins. My assumption was that we could somehow back up the XML files, and Jenkins could restore the plugin binaries if they are missing, since it has access to Maven.
I would prefer to avoid using config management tools to install plugins, as then I would have to manage plugin versions in a way that feels too controlled. I'm happy to just back up whatever is there and restore it elsewhere when needed. The developers should be free to install plugins without involving me or Puppet.
Googling the issue is difficult, since "plugin" is used in so many other contexts.
The link below says it covers plugins as well, but I cannot see how - maybe I'm missing something.
http://jenkins-ci.org/content/keeping-your-configuration-and-data-subversion
I have ported the idea to git and it generally works, except that the plugins do not re-appear by magic on the new machine - only the default plugins come back.
Can anyone suggest anything?
If you don't want to back up the plugin binary files, you can use the Jenkins REST API to get the list of current plugins:
http://jenkins:8080/pluginManager/api/json?tree=plugins[shortName,version]&pretty=true
(You can use tree=plugins[*] to see a more complete list of fields in the API.)
Save this data as part of your configuration backup and use the Jenkins API to restore the plugins when you're re-deploying.
There's additional documentation, including how to update plugins, on the pluginManager API page: http://jenkins:8080/pluginManager/api
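A hedged sketch of both halves (host, port and authentication are assumptions, and newer Jenkins versions also require a CSRF crumb on POST requests):

```bash
# Backup: save the installed plugin list (short name + version) as JSON.
curl -s 'http://jenkins:8080/pluginManager/api/json?tree=plugins[shortName,version]' \
  > plugins.json

# Restore: ask the new instance to install each plugin at the recorded
# version via the pluginManager API (jq is assumed to be available).
for p in $(jq -r '.plugins[] | "\(.shortName)@\(.version)"' plugins.json); do
  curl -s -X POST 'http://jenkins:8080/pluginManager/installNecessaryPlugins' \
    -H 'Content-Type: text/xml' \
    -d "<jenkins><install plugin=\"$p\"/></jenkins>"
done
```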
The best idea I've come up with to date is to split the instance into an OS disk and a Jenkins disk mounted on /var/lib/jenkins, and use your cloud provider's snapshot feature to back up the Jenkins disk periodically. For many organisations, I believe, Jenkins is always going to be a "flake" server, or a pet, that needs nurturing and does not benefit much from automation, other than what is used to maintain the OS.
To back up Jenkins components, you can use Handy Backup. A best practice is to set up at least a daily backup in differential or mixed (full/differential) mode. This is an advantage over using any plug-in, due to the assured regularity of backups.

Hot deploy project with a lot of static content in Weblogic

I have a Java EE application with a lot of static content: JavaScript, images, CSS and such. Right now we are using the WebLogic plugin for Eclipse to deploy our applications for testing purposes, but it's getting pretty slow and it's only going to get slower. Since we have a lot of JavaScript, we often have to make small changes and test them in quick succession, which is becoming a big headache.
We also want to move away from using the WebLogic plugin in Eclipse. We want to find a way to deploy to a test environment that only copies content changed since the last deploy. We thought about using an Ant script, but all the solutions I found on the internet involve building an .EAR and copying it to the autodeploy folder of the test domain on the server, which would not solve the problem, since generating the EAR would add even more overhead.
Is there any way to make this work?
If you deploy to a local WebLogic environment on the development machine, you can use an exploded deployment (i.e. an unzipped war or ear). All changes to static content and JSPs are then available immediately for testing.
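For example, refreshing the static content of an exploded deployment can be a plain file sync; the domain path and application directory below are hypothetical:

```bash
# Copy only changed static resources into the exploded deployment directory.
rsync -av --checksum src/main/webapp/ \
  /opt/weblogic/domains/testdomain/autodeploy/myapp/
```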
