Automated Deployment in Rails

I'm working on my first rails app and am struggling trying to find an efficient and clean solution for doing automated checkouts and deployments.
So far I've looked at both CruiseControl.rb (having been familiar with CruiseControl.NET) and Capistrano. Unfortunately, unless I'm missing something, each one of them only does about half of what I want (with each one doing a different half).
From what I've seen so far:
CruiseControl
Strengths
Automated builds on repository checkouts upon commit
Also runs unit/functional tests and reports back
Weaknesses
No built-in deployment mechanisms (best I can find so far is writing your own bash scripts)
Capistrano
Strengths
Built for deployments
Weaknesses
Has to be kicked off via a command (i.e. doesn't do automated checkouts upon commit)
I've found ways that I can string the two together -- i.e. have CruiseControl ping the repository for changes, do a checkout upon commit, run the tests, etc. and then make a call to Capistrano when finished to do the deployment (even though Capistrano is also going to do a repository checkout).
Basically, when all is said and done, I'd like to have three projects set up:
Dev: Checkout/Deployment is entirely no touch. When someone commits a file, something checks it out, runs the tests, deploys the changes, and reports back
Stage: Checkout/Deployment requires a button click
Prod: Button click does either a tagged check out or moves the files from stage
I have this working with a combination of CruiseControl.NET and MSBuild in the .NET world, and it was fairly straightforward. I would guess this is also a common pattern in the ruby deployment world, but I could easily be mistaken.
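As a sketch of that glue (assuming CruiseControl.rb's convention of running a Rake task named cruise when the project's Rakefile defines one; the task wiring and the Capistrano hand-off here are illustrative, not a confirmed recipe):

# Rakefile in the Rails project (sketch)
task :cruise do
  Rake::Task["test"].invoke   # unit/functional tests; a failure aborts the build here
  sh "cap deploy"             # hand off to Capistrano only when the tests pass
end

CruiseControl.rb would still handle the polling, checkout, and reporting, and Capistrano would still do its own checkout on the target machines, as noted above.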

I would give Hudson a try (free and open source). I started off using CruiseControl but got sick of having to relearn the XML configuration every time I needed to change a setting or add a project. Then I started using Hudson and never looked back. Hudson is more or less completely configurable over the web. It was initially a continuous integration tool for Java, but it has plugins for other development stacks such as .NET and Ruby on Rails. There's a Rake plugin; if that doesn't work, you can configure it to execute any arbitrary command line after running your Rake builds/tests.
I should also add it's extremely easy to get Hudson going:
java -jar hudson.war
Or you can drop the war in any servlet container.

I would use two systems to build and deploy anyway, for at least two reasons: you should be able to run them separately, and you should have two config files, one for building and one for deploying. But you can easily glue the two systems together.
Just create a simple Capistrano task that tests and reports back to you. You can use the "run" command to do anything you want; a sketch is below.
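For example, something along these lines (a minimal sketch assuming Capistrano 2 syntax; the task name, role, and paths are illustrative):

# config/deploy.rb (sketch)
task :smoke_test, :roles => :app do
  # run the test suite on the app server and report the result; run raises on a non-zero exit
  run "cd #{current_path} && rake test RAILS_ENV=production"
end

# run it automatically after every deploy
after "deploy", "smoke_test"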
If you don't want a command-line tool, there was also Webistrano a couple of years ago.
You could use something like http://github.com/benschwarz/gitnotify/tree/master to trigger the build and deploy if you use git as your repository.

At least for development automated deployments, check out the hook scripts available in git:
http://git-scm.com/docs/githooks
I think you'll want to focus on the post-receive hook script, since this runs after a push to a remote server.
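A minimal post-receive hook along those lines might look like this (a sketch only; the repository and application paths are placeholders, and the restart command depends on your app server):

#!/usr/bin/env ruby
# hooks/post-receive in the repository being pushed to (sketch)
REPO = "/srv/git/myapp.git"   # hypothetical bare repository
APP  = "/var/www/myapp"       # hypothetical working copy that gets deployed

ENV.delete("GIT_DIR")         # hooks run with GIT_DIR set to the repo; clear it before touching the work tree
Dir.chdir(APP) do
  system("git --git-dir=#{REPO} --work-tree=#{APP} checkout -f") or abort "checkout failed"
  system("rake test")                                            or abort "tests failed, not restarting"
  system("touch tmp/restart.txt")  # Passenger-style restart; swap in your own restart command
end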
Also worth checking out is Mislav's git-deploy on GitHub. It makes managing deployments pretty clean.
http://github.com/mislav/git-deploy

Related

Is it necessary to use both Jenkins and GitHub?

My former web developer set up my site so that it uses Jenkins and GitHub. I understand the very basics of GitHub and even less of Jenkins. But in theory, when I make minor text changes to my website, can't GitHub manage the process of pushing those changes to the server? Or is there some good reason that Jenkins is also involved?
Thank you.
Yes. It's not a must, but using both Jenkins and GitHub will make your life easier. GitHub and Jenkins are two tools that serve different functions.
GitHub mainly helps you manage your codebase, resolve conflicts, etc., so it basically acts as a repository. You can commit your changes, get others' updates, and always stay up to date. There are plenty of other advantages, but I'll keep it simple for the sake of understanding.
Jenkins is an open-source automation server. In your case, you can automate building the product. For example, if you have a test environment, or even when you deploy changes to live, you can do all of that with just a click. You can also build the test and live environments separately, and with concepts like pipelines you can even integrate the build with tests, etc.
But if you are talking about your local environment, then yes, git is enough because you can build the project manually; in production, having both git and Jenkins is a handy option.
Read more on Jenkins

FitNesse running remote test cases locally?

The background is: I am trying to implement an automated integration test solution. I want to have a FitNesse server running on which QA/users can maintain the test cases. During our nightly build, we want to have the tests run locally on the build machine. (In our build script, we are going to start up Jetty, and the FitNesse test cases invoke the RESTful APIs.)
When I am looking into the fitnesse-maven-plugin (http://mojo.codehaus.org/fitnesse-maven-plugin/), in the description of goal fitnesse:run, it said that:
This goal uses the fitnesse.runner.TestRunner class for calling a remote FitNesse web page and executes the tests or suites locally into a forked JVM
However, when I am using this plugin with FitNesse version 2009xxxx or 2008xxxx (with a special patch of this maven plugin), I found that the tests do not run locally. Instead, I saw new test results created on the remote FitNesse wiki server.
May I know if this is due to a change in FitNesse's behavior? (The fitnesse-maven-plugin depends on a much older version of FitNesse.) Also, with the original TestRunner being deprecated, is it possible to get the behavior I am looking for (pages defined on the remote server, but run locally on the build machine)?
Or is this way of working no longer a recommended approach to using FitNesse? (If so, I will need to change the approach to the automated tests.)
One solution I've used is the wiki import feature. This can import the latest changes from the remote wiki to your local build server.
http://fitnesse.org/FitNesse.UserGuide.WikiImport
You can also tell it to auto-update when running the tests rather than having to re-import manually whenever they change.
Another possibility is to use a source control plugin to automatically commit changes by QA/Users from the remote wiki and pull them down as part of your build.

ec2 deployments in rails

I know this question has probably been asked many times here, but I didn't really find a good answer. I'm trying to find the simplest solution to basically just check out a git project on an EC2 instance, check out a specific branch, and then restart the Apache server.
I'm not sure if Capistrano is what I need. I'm fine with a shell script or Ruby script that basically just invokes commands like 'git clone....', 'git checkout branch...', and 'restart apache server'.
Is there a framework that lets me do this so I don't really have to write a script from scratch?
First off, are we talking about "Deployment" or about "just get a codebase and copy to the server"?
In the first case, deployment is a set of common practices, such as:
treating "servers" as abstractions
keeping versions and the ability to roll back
database migration, rollback, and backup functionality
managing background processes
In other words, it means managing the full-stack application, not just running "restart httpd" or the like.
Capistrano was developed with contributions from 37signals, and they use it on many projects. A lot of projects use Capistrano or a similar tool for deployments. Take another look at it; it's easy to set up and use.
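If Capistrano does turn out to be a fit, a minimal config/deploy.rb for an EC2 box might look roughly like this (Capistrano 2 syntax; the host, repository URL, user, branch, and paths are all placeholders):

# config/deploy.rb (sketch)
set :application, "myapp"
set :repository,  "git@github.com:me/myapp.git"
set :scm,         :git
set :branch,      "my-feature-branch"       # the specific branch you want checked out
set :deploy_to,   "/var/www/myapp"
set :user,        "ubuntu"

server "ec2-203-0-113-10.compute-1.amazonaws.com", :app, :web, :db, :primary => true

namespace :deploy do
  task :restart, :roles => :app do
    run "sudo /etc/init.d/apache2 restart"  # or touch tmp/restart.txt if you serve through Passenger
  end
end

After that, cap deploy:setup once and cap deploy thereafter should cover the clone-a-branch-and-restart steps you listed.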

Creating an on Demand Deployment from TFS using a label

I have build scripts that build, test, version, and package my projects as artifacts to a staging area for each of our environments, ready for a versioned release to a given environment (and label the changeset).
I want to stop doing this automatically and only deploy on demand.
My problem is I am using TFS and the friction is just immense.
I basically want to have an easy way to
Get a specific version from source control
build it for a specific environment -DONE
deploy it. -DONE
The last two steps are trivial. The "getting by label" part is not much fun with TFS.
Any ideas/pointers, other than to stop using TFS?
just ask on twitter next time :-)
Seriously though, have a look at TFS Deployer on codeplex. The way it works is that you do a normal build with versioning of the output like you normally would, but you pull out all the deployment stuff from it.
Next, set up TFS Deployer - it listens for changes in the build quality and then fires off a PowerShell script that you write to do all the deployment work. For example, when you change the quality of a build to something like "Deploy to UAT", it can fire off a PowerShell script that then does whatever you need it to. To do a deploy you just go to Build Explorer, set the quality to whatever you want, and let PowerShell do the rest - you'll get an email of the results as well, so you know whether it worked or failed.
Go have a look at it and if you get stuck just ping me and I'll help you out.

Deploying a website to Production From Team Build Server

I have a Team Foundation Server with a build server; when I run a build, it deploys to a website on that box. However, I also want to do the same on production, which is a server on an external network and not part of the same domain.
I thought about looking at TFS Deployer, but that seemed to work only within a network. I'm going to test it out as soon as I get a chance, but I thought the best idea was to ask here when working with something so critical.
Is it a really bad idea to have a way of easily deploying to production?
Does anyone here deploy to production using whatever method? How do you do it?
Essentially, the accepted answer will go to the person who can tell me the best method for achieving a deployment, but pointing me in the right direction is sure to get an upvote as long as it's not too obvious.
Depending on the infrastructure you have available, you can use WiX to create MSIs and use SMS Configuration Manager to deploy them to a target collection. This is the direction we are moving in but have not reached yet. We also integrated WiX into our build process to create the MSI artifacts. The reason we wanted to go down this path is that we are using CruiseControl.NET as our continuous integration server, and we have a NAnt script that we use to perform both the build process and the deployment process. They are separate targets in the NAnt file, but what we wanted was a consistent model of deployment to all environments, including production.
What we are doing currently is manually moving zips (which are artifacts of our current build process) to production. When the zips are unpacked in the production environment, we have to remove all the web.config, app.config, etc. files from the zips, and if there are new entries in the configs, they are made manually.
I also found MSDeploy: http://blogs.iis.net/msdeploy/archive/2008/01/22/welcome-to-the-web-deployment-team-blog.aspx
