Best alternative for data fixes in Rails?

When a Rails project grows a lot, you can find yourself having trouble with fixes to the data in the production database.
I have normally used migrations or specific rake tasks for this, but I was wondering whether a system similar to migrations exists for keeping track of data fixes and running them when needed.
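For reference, the kind of one-off rake task I mean looks roughly like this (the task and model names are just examples):

    # lib/tasks/data_fixes.rake
    namespace :data_fixes do
      desc "Backfill missing slugs on posts"
      task backfill_post_slugs: :environment do
        Post.where(slug: nil).find_each do |post|
          post.update_column(:slug, post.title.parameterize)
        end
      end
    end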

I know you have probably figured this out by now, but there IS a gem for this... it is called datafix.
https://github.com/Casecommons/datafix
Basically, you create a datafix, much like a migration, plus a spec for it, and then you can run it as needed on the server.
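To give a rough idea of the shape, the same kind of fix written as a datafix looks something like a migration. This is only a sketch with illustrative class and model names; the exact generator and rake task names are in the gem's README:

    # db/datafixes/20160101120000_backfill_post_slugs.rb
    class Datafixes::BackfillPostSlugs < Datafix
      def self.up
        Post.where(slug: nil).find_each do |post|
          post.update_column(:slug, post.title.parameterize)
        end
      end
    end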

The following gems can be also used for this purpose:
nondestructive_migrations
datafix
migrake
migration-data
data-migrate
I prefer nondestructive_migrations and datafix; they are very similar, with nondestructive_migrations being the simpler implementation, building on Rails migrations.

Related

How do you make rake db:schema:dump have the charset and collation of the fields in schema.rb?

One of our fields needs to be case sensitive. We can write a migration to change the collation, which works fine, but the change is not reflected in schema.rb. This creates issues, for example when running tests: the cloned test database will not have the collation we want for that field.
We use MySQL.
I have searched for a way to make this happen with no results.
I managed to find this on GitHub, but I'm not sure how it was accomplished: https://github.com/cantino/huginn/blob/db792cdd82eb782e98d934995964809d9e8cb77d/db/schema.rb
I think there is no "official" way (provided by the Rails or ActiveRecord gems) to accomplish that kind of dump. Following the git history on the Huginn repo itself, you can find the code you need to achieve this dump. Take a look at this commit: https://github.com/cantino/huginn/commit/db792cdd82eb782e98d934995964809d9e8cb77d
The most relevant code is currently here: https://github.com/cantino/huginn/blob/master/lib/ar_mysql_column_charset/main.rb
So if you need this feature, you'll probably need to copy/paste this extension into your project.
UPDATE
I made a deeper review of the Huginn repo (git history and issues), and as you can read in this comment, this functionality was extracted into a gem: https://github.com/kamipo/activerecord-mysql-awesome.
As mentioned in #house9's comment, you can use structure.sql instead. Add this to your project's application.rb:
config.active_record.schema_format = :sql
Then run bundle exec rake db:structure:dump to generate the actual SQL structure. This retains charsets and collations (which should ideally be there in schema.rb, but alas).
"structure" is by nature less portable than "schema", but it's usually a good thing for all team members and environments to be using the same database and version anyway.

db:migrate for a Models gem

So we abstracted our models into a gem because multiple applications use the same model set. The trouble is creating and performing migrations. Because it is a gem, we basically removed Rails.
It can't run rails g or rake.
If we try to keep the config and script folders that allow that, the other applications complain when they use the models gem.
We're hacking around this by allowing one specific application to perform all migrations.
Perhaps the better question is: What is the best way to modularize common models such that you retain rails g and rake db:migrate?
I probably explained this poorly; please ask any questions.
Thanks,
Justin
Are you using version control? You could look into just using a git submodule for the models folder, which would allow you to use Rails generators on all applications and keep them all in sync. Basically, a submodule is a git repository inside an existing repository.
The commands are simple as well; to get started, look into a guide on git submodules. Overall, it should help you reduce the complexity of your application.

How to handle one-off deployment tasks with capistrano?

I am currently trying to automate the deployment process of our rails app as much as possible, so that a clean build on the CI server can trigger an automated deployment on a test server.
But I have run into a bit of a snag with the following scenario:
I have added the friendly_id gem to the application. There's a migration that creates all the necessary tables. But to fill these tables, I need to call a rake task.
Now, this rake task only has to be called once, so adding it to the deployment script would be overkill.
Ideally, I am looking for something like migrations, but instead of the database, it should keep track of scripts that need to be called during a deployment. Does such a beast already exist?
Looks like the after_party gem does exactly what you want.
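Roughly, you generate a one-off deployment task and run any pending ones on each deploy with rake after_party:run. A hedged sketch of what such a task might look like (the file name and task body are illustrative; check the gem's README for the exact generated template):

    # lib/tasks/deployment/20160101120000_populate_friendly_id_slugs.rake
    # generated with something like: rails generate after_party:task populate_friendly_id_slugs
    namespace :after_party do
      desc 'Deployment task: populate_friendly_id_slugs'
      task populate_friendly_id_slugs: :environment do
        Post.find_each(&:save) # friendly_id fills in slugs as records are saved
        # the generated template also records the task as run,
        # so it will not execute again on later deploys
      end
    end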
I can't think of anything that does exactly what you want, but if you just need to be able to run tasks on remote servers in a one off fashion you could always use rake through capistrano.
There's an SO question for that here: How do I run a rake task from Capistrano?, which also links to this article http://ananelson.com/said/on/2007/12/30/remote-rake-tasks-with-capistrano/.
Edit: I wonder if it's possible to create a migration which doesn't do any database changes, but just invokes a rake task? Rake::Task["task:name"].invoke. Worth a try?
I would consider that running that rake task is part of the migration to using friendly_id. Sure, you've created the tables, but you're not done yet! You still have to do some data updates before you've truly migrated.
Call the rake task from your migration. It'll update the existing data and new records will be handled by your app logic in the future.
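A minimal sketch of that idea, with a placeholder task name standing in for whatever slug-populating task friendly_id gives you:

    class PopulateFriendlyIdSlugs < ActiveRecord::Migration
      def up
        # rake tasks are already loaded when this runs via `rake db:migrate`;
        # otherwise load them first with YourApp::Application.load_tasks
        Rake::Task["friendly_id:populate_slugs"].invoke # placeholder task name
      end

      def down
        # the data backfill is not meaningfully reversible
      end
    end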

Testing rails application with a I18N database backend

I use Rails 2.3 i18n with a database backend plugin:
http://github.com/dylanz/i18n_backend_database
This stores my translations and locales in two DB tables. What would be the best way to get these tables working with my tests? I'm guessing I could write a rake task that would copy the tables from the development DB to the test DB.
Any suggestions?
You could put the data in a seeds.rb file and run that task when loading your test environment. The benefit of this is that you'll also have some way of regaining a basic data structure if you, for some reason, wipe your computer.
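Purely as a sketch (the model names depend on the plugin's schema), you might seed a couple of records and then load the same file when the test environment boots:

    # db/seeds.rb -- illustrative; use whatever models the plugin actually defines
    Locale.create!(:code => 'en')
    Translation.create!(:key => 'greeting', :value => 'Hello', :locale_code => 'en')

    # test/test_helper.rb
    load File.join(RAILS_ROOT, 'db', 'seeds.rb') # Rails 2.3; use Rails.root on newer versions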
One thing you could try is using fixtures for this. Do a Google search for db:fixtures:dump or db:fixtures:export_all. Rolling your own implementation should be pretty easy as well.

After you download an open-source Rails project, do you have to db:migrate?

This is probably a stupidly easy question for a Rails person but is causing me no end of confusion. I downloaded several open source Rails projects but am not able to run them. Usually, are you supposed to do a db:migrate before you try to run a Rails project? I thought they were supposed to just run.
I guess it depends on how the database is configured. If it's pointing to a SQLite db, then it's probably all ready to go; otherwise, if it's a full-blown RDBMS, then yes, the database would need to be migrated, assuming of course that the settings in database.yml are configured correctly.
