This one's really an offshoot of another question I asked earlier today.
Basically, it seems like database-level constraints (foreign keys) aren't the Rails way. At least, they're not natively supported. There's the Foreigner gem, or I could go primitive and use raw SQL through execute in Rails migrations.
My question is, are there any pitfalls I need to be aware of when using the execute route? Here are a few that I'm aware of:
Writing db seeds/fixtures can get tricky, perhaps impossible for some cases
Managing db migrations becomes difficult, as the foreign keys would never be dumped in db/schema.rb
Polymorphic foreign keys would not be possible (I don't even know what they are, so I shouldn't miss them)
Are there any other pitfalls that I should be aware of?
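For reference, the "execute route" I mean is something along these lines (table, column and constraint names are made up; Postgres-style syntax assumed):

# Hypothetical migration adding a foreign key via raw SQL (Rails 3.1+ up/down style).
class AddUserForeignKeyToOrders < ActiveRecord::Migration
  def up
    execute <<-SQL
      ALTER TABLE orders
        ADD CONSTRAINT fk_orders_user_id
        FOREIGN KEY (user_id) REFERENCES users(id)
    SQL
  end

  def down
    # DROP CONSTRAINT is Postgres/Oracle syntax; MySQL uses DROP FOREIGN KEY instead.
    execute 'ALTER TABLE orders DROP CONSTRAINT fk_orders_user_id'
  end
end

As far as I understand, the schema.rb pitfall above can at least be worked around by setting config.active_record.schema_format = :sql, which makes Rails dump db/structure.sql (including the constraints) instead of db/schema.rb.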
Related
I've been working with Rails for a few years and am very used to ActiveRecord, but have recently landed a task that would benefit from (some) NoSQL data storage.
A small amount of data would be best placed in a NoSQL system, but the bulk would still be in an RDBMS. Every NoSQL wrapper/gem I've looked at, though, seems to necessitate the removal of ActiveRecord from the application.
Is there a suggested method of combining the two technologies?
Not sure which NoSQL store you are looking into, but we have used MongoDB in concert with Postgres for a while now. Helpful hint: people say you need to get rid of ActiveRecord, but in reality you don't. Most say that only because you end up not setting up your database.yml and/or not running the rake commands that set up the ActiveRecord database.
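To make that concrete, here is a minimal sketch of an ActiveRecord model and a Mongoid document living in the same app (the model names and fields are made up for illustration; assumes Rails 4+ and the mongoid gem):

# app/models/user.rb -- ordinary ActiveRecord model backed by Postgres
class User < ActiveRecord::Base
  has_many :orders
end

# app/models/activity_event.rb -- Mongoid document stored in MongoDB
class ActivityEvent
  include Mongoid::Document

  field :user_id,     type: Integer  # plain integer pointing back at the AR user
  field :payload,     type: Hash
  field :happened_at, type: Time

  # Manual bridge between the two stores; there is no cross-store association.
  def user
    User.find_by(id: user_id)
  end
end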
Remember also that Postgres has hstore and JSON datatypes, which give you functionality similar to NoSQL datastores. And if the data you are looking to store outside of your ActiveRecord database is not very complex, I would highly recommend looking into Redis.
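If you do stay entirely in Postgres, the JSON route can look roughly like this (assumes Rails 4.2+ and PostgreSQL 9.4+; table and column names are made up):

class AddPreferencesToUsers < ActiveRecord::Migration
  def change
    # jsonb gives you indexable, schemaless storage inside a relational table
    add_column :users, :preferences, :jsonb, default: {}, null: false
  end
end

# The column then behaves like a plain Hash on the model:
# user.update(preferences: { 'theme' => 'dark', 'beta' => true })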
If you look at the Gemfile.lock of this project, you can see that it uses ActiveRecord with Mongoid.
Even if you use other gems that don't need ActiveRecord, that shouldn't matter; if you are keeping ActiveRecord around, you presumably have a valid reason to do so.
I have to add foreign keys to several different tables in our Rails app. Is it better for me to add all the keys in one migration or to make several single-purpose migrations, one for each table being altered?
Generally speaking, "several single-purpose migrations" is better. Make sure your migrations run both up AND down. On a side note, each migration's filename should be descriptive enough for a third party to understand what it does.
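A single-purpose migration in that spirit might look like this (Foreigner-style add_foreign_key/remove_foreign_key helpers assumed; table names are made up):

class AddAccountForeignKeyToInvoices < ActiveRecord::Migration
  def up
    add_foreign_key :invoices, :accounts
  end

  def down
    remove_foreign_key :invoices, :accounts
  end
end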
When I do migrations I group changes based on common purpose. For instance if there are 3 changes that are for one feature of my app and 2 for another, I'll migrate those 3 together and then those 2 as another migration. That way in the future if I need to do a rollback, I'm only rolling back the changes that pertain to the area I'm working on.
I think whether or not to split the migrations is a question of what makes your code clearer. If adding multiple foreign keys is intended to be one task, then do it in one migration. If your foreign keys come along with significant changes to the related models/controllers/views, it might be clearer to split them, so you can keep track of which migration belongs to which changes in the rest of your application.
When you are using git (or something similar), it might be helpful to bundle the corresponding changes into one commit.
From a performance point of view, it doesn't make a difference.
I have a Rails project that uses MongoDB. The issue I am having is with records (documents) created from a previous version of a model (I'm getting klass errors, but only for the older records).
Is there a quick way to fix those MongoDB documents the Rails way, using some command?
Or is there a command I can run with Mongoid to open the specific model up in Mongo, so I can poke at the documents manually (removing unneeded associations)?
The concept of schema migration would need to exist in mongoid and I don't think it does. If you have made simple changes like renaming or removing fields then you can easily do that with an update statement, but for anything more complicated you will need to write code.
The code you will need to write will most likely need to go down to the driver level to alter the objects since the mapping layer is no longer compatible.
In general you need to be careful when you make schema changes in your objects, since the server doesn't have that concept and can't enforce them. It is ultimately up to your code, or the framework you are using, to maintain compatibility.
This is generally an issue when you change your mapping without doing batch upgrades to keep the existing documents at the same schema, from the mapping layer's perspective.
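For the simple rename/remove case mentioned above, a batch update issued through the driver is usually enough. A rough sketch, assuming Mongoid 5+ where Model.collection exposes the underlying Ruby driver collection (the model and field names are made up):

# Rename a field on all existing documents and drop one that no longer exists
# in the model, so documents written by the old model become readable again.
LegacyWidget.collection.update_many(
  {},
  {
    '$rename' => { 'old_name' => 'name' },
    '$unset'  => { 'obsolete_association_id' => '' }
  }
)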
Here is one issue which I am unable to debug. When running rake db:test:clone_structure, the foreign keys are not copied from the development database to the test database. Is there anything I am missing?
Your problem is that Rails (or ActiveRecord) doesn't understand foreign keys inside the database, nor does it understand CHECK constraints or anything else fancier than a unique index. Rails is generally nice to work with but sometimes there is more attitude than good sense in Rails.
There is Foreigner for adding FK support to ActiveRecord but that doesn't know about Oracle. You might be able to adapt the PostgreSQL support to Oracle but I don't know my way around Oracle so that might not be a good idea. Foreigner also doesn't support CHECK constraints (yet).
The quick solution would be to dump the FKs and CHECKs as raw SQL and pump that SQL into your test and production databases. Then wrap a quick script around that that does a rake db:test:clone_structure followed by the raw SQL FK and CHECK copying.
Sorry that there's no easy way to do this but once you get outside the bounds of what a framework wants to do things get ugly (and the more comprehensive the framework, the uglier things get). A little bit of SQL wrangling wrapped around the usual rake command isn't that nasty though.
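Such a wrapper could be a small rake task along these lines (the file name, task name and the db/constraints.sql dump are all assumptions, not an existing Rails feature):

# lib/tasks/clone_structure_with_constraints.rake
namespace :db do
  namespace :test do
    desc 'Clone structure, then re-apply FK and CHECK constraints from raw SQL'
    task clone_structure_with_constraints: :environment do
      Rake::Task['db:test:clone_structure'].invoke

      # constraints.sql is a hand-maintained dump of the FK/CHECK statements
      sql = File.read(Rails.root.join('db', 'constraints.sql'))

      ActiveRecord::Base.establish_connection(:test)
      sql.split(/;\s*$/).each do |statement|
        next if statement.strip.empty?
        ActiveRecord::Base.connection.execute(statement)  # naive split; fine for plain ALTER TABLE statements
      end
    end
  end
end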
Applications have bugs, or get bugs when updated, some so well hidden that they are only detected months or years later, producing orphaned records, keys pointing nowhere, etc., even with proper test suites.
Although Rails doesn't enforce referential integrity on the database level - and for some good reasons discussed elsewhere it will stay like that - it would still be nice to have tools that can check whether the database is in a consistent state.
Since the models describe what 'should be', wouldn't it be possible for an offline tool to validate the integrity of all the data? It could be run regularly, before backing up data, or just for the sake of the developer's good sleep.
Is there anything like this?
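For what it's worth, a rough sketch of what such an offline check could look like, walking every belongs_to association via ActiveRecord reflection and counting orphaned rows (assumes Rails 4+; the task name is made up):

namespace :db do
  desc 'Report child rows whose foreign key points at a missing parent row'
  task check_referential_integrity: :environment do
    Rails.application.eager_load!  # make sure every model class is loaded

    ActiveRecord::Base.descendants.each do |model|
      next if model.abstract_class?

      model.reflect_on_all_associations(:belongs_to).each do |assoc|
        next if assoc.options[:polymorphic]  # would need per-type handling, skipped here

        parent  = assoc.klass
        orphans = model.where.not(assoc.foreign_key => nil)
                       .where.not(assoc.foreign_key => parent.select(parent.primary_key))
                       .count

        puts "#{model.name}##{assoc.foreign_key}: #{orphans} orphaned rows" if orphans > 0
      end
    end
  end
end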
I don't know of such a tool. At least you are aware of the referential integrity hazards. So why make yourself suffer? Just use foreign key references in the first place, as dportas suggested.
To use it in a migration, add something like this:
execute('ALTER TABLE users ADD FOREIGN KEY (language_id) REFERENCES languages(id)')
to make the language_id column of users reference a valid row in the languages table.
Depending on your DBMS, this will even automatically create an index for you. There are also plugins for Rails (check out pg_on_rails) which define easy-to-use alias functions for those tasks.
Checking the integrity only on the backup file is pointless, as by then the error has already occurred and your data may already be messed up. (I've been there.)
On the other hand, when using foreign key constraints as stated above, every operation that would mess up the integrity will fail.
Think of it as going to the dentist when you feel pain (=having surgery) vs. brushing your teeth ONCE with a magical tooth paste that guarantees that your teeth will be fine for the rest of your life.
Another thing to consider: an error in your application will be much easier to locate, because an exception will be raised at the code which tries to insert the corrupt data.
So please, use foreign key constraints. You can easily add those statements to your existing database.
How about using the DBMS to enforce RI? You don't need Rails to do that for you.