In Django, you fully describe your models in models.py. In Rails with ActiveRecord, you describe part of a model in the /models directory and part of it in migrations; ActiveRecord then introspects model properties from the existing database tables.
But I find migrations, columns, and tables to be a headache.
How can I do like Django -- just declare all model properties instead of introspecting them from the database tables?
And for extra credit, explain where and why this would be a bad idea. :)
If you hate on Migrations, try going NoSQL. No migrations!
So you'd just add properties to your document when you need them. In your code, handle the fact that they may not exist and bam!
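In plain Ruby terms, the "handle the fact that they may not exist" part looks something like this (a hash stands in for a schemaless document; the field names are made up):

```ruby
# An older document, saved before the 'tags' field existed.
doc = { 'title' => 'Intro' }

# Reading code supplies a default for the missing field instead of
# relying on a schema migration to backfill it.
tags = doc.fetch('tags', [])
tags # => []
```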
I took the following model definition from a blog post about Tekpub (notice you don't inherit from ActiveRecord). I also recommend the Herding Code podcast.
class Production
  include MongoMapper::Document

  key :title, String, :required => true
  key :slug, String, :unique => true, :required => true, :index => true
  key :description, String
  key :notes, String
  key :price, BigDecimal, :numeric => true
  key :released_at, Date, :default => Date.today
  key :default_height, String, :default => '600'
  key :default_width, String, :default => '1000'
  key :quotes, String

  # royalty info
  key :producers, String

  timestamps!
end
Try the auto_migrations plugin. I don't think it's a bad idea for development, but I would switch to migrations after going to production when there is critical data in the database.
You may also be interested in replacing ActiveRecord with DataMapper, which works in Rails 3. It has the style you are talking about, with the description of the data fields of a model in the model code instead of a separate database schema file.
I think DataMapper is what you are asking for. Once set up, you'd use either DataMapper.auto_migrate! or DataMapper.auto_upgrade!. The former drops tables if they exist before recreating them, destroying any data -- that would be bad for production. The latter is how you avoid losing data, and should be just fine for production.
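As a rough illustration of the difference (pure Ruby, no DataMapper; the in-memory "table" and helper names are entirely made up):

```ruby
# A toy "table": auto_migrate! drops and recreates it (data lost),
# while auto_upgrade! only adds the missing columns (data kept).
table   = { columns: [:id, :title], rows: [{ id: 1, title: 'Hello' }] }
desired = [:id, :title, :body]

def auto_migrate!(table, desired)
  # Drop the table and recreate it with the new column list.
  { columns: desired, rows: [] }
end

def auto_upgrade!(table, desired)
  # Add only the columns that are missing; existing rows survive.
  missing = desired - table[:columns]
  { columns: table[:columns] + missing,
    rows: table[:rows].map { |r| r.merge(missing.to_h { |c| [c, nil] }) } }
end

auto_migrate!(table, desired)[:rows]      # => []  (data destroyed)
auto_upgrade!(table, desired)[:rows].size # => 1   (data preserved)
```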
Without knowing exactly what it's doing, I'd guess it inspects tables during startup to determine whether to make database changes. That can drag down startup time, especially with a lot of models/tables. Which is actually one of the good reasons to consider NoSQL -- specifically Mongo, as mentioned above. It's fast. Really fast, and thus the startup cost is much, much less. MongoMapper is the way to go. The Tekpub blog post is a must-read.
I first heard about DataMapper when reading about Merb, so it makes sense that it's in Rails 3. I don't know whether you can get it working in Rails 2.x.
This is most likely a noob question since people use this gem and a lot of people love it, but I don't get the purpose. I'm looking at a project where it's been used in many places, such as t.references :foreign_key_table_name, :foreign_key => true, add_foreign_key :table, :foreign_key_table_name, :options, and, inside a create block, t.foreign_key :foreign_key_table_name. Hope those weren't confusing since they're out of context.
But I don't get how this is different from what Rails does built in with t.references :foreign_key_table_name, or from me just adding t.integer :foreign_key_table_name_id. Does it simply make it more readable by making clear that this is a 'foreign key'? I could just add a comment instead of a gem if that's the case... The only advantage I see is that you can move options such as :dependent into the migration instead of having them in the model, but who cares?
Some database engines support legit foreign key constraints: if someone tries to save a Child with a parent_id of 5, but there's no Parent with id 5, then the database itself (not Rails) will reject the record if there's a foreign key constraint linking children.parent_id and parents.id.
A foreign key can also specify what happens if the parent is deleted: in MySQL, for example, we can delete or nullify the dependent records, like how Rails does with :dependent, or even just straight-up reject the deletion and throw an error instead.
Since not all database engines offer this functionality, Rails offers to emulate it with :dependent, and it's nice to have it at the software level so that dependent child records can fire their destroy callbacks when the parent is deleted. Since the feature is engine-independent and therefore pretty much schema-independent, Rails doesn't handle the creation/deletion of foreign keys. That's where foreigner comes in: if your engine supports foreign key constraints, and you want that extra confidence in your data integrity, foreigner can help with that.
Resurrecting an old question here, but…
Having Rails enforce the relationship is fine, within Rails itself.
However, if your project grows to include code that accesses these tables from other languages, that code will not have the benefit of Rails enforcing the relations. Foreign key constraints are baked into the SQL tables themselves, so they can protect non-Rails code too.
This will also protect you if you need to perform datafixes or otherwise manipulate your data via native SQL.
Another reason is that some documentation tools for SQL look at foreign keys on the DB, so it is cool to have a gem that generates them. Rails 4 added the ability to define foreign keys in the same migration that creates the table with:
t.references :something, foreign_key: true
And the generators will do this for you if you use the references type. Rails adds an index on something_id by default when using foreign_key like this.
Can someone give me a short introduction to doing DB migrations in Rails using Mongoid? I'm particularly interested in lazy per document migrations. By this, I mean that whenever you read a document from the database, you migrate it to its latest version and save it again.
Has anyone done this sort of thing before? I've come across mongoid_rails_migrations, but it doesn't provide any sort of documentation, and although it looks like it does this, I'm not really sure how to use it.
I should point out I'm only conceptually familiar with ActiveRecord migrations.
If you want to do the entire migration at once, then mongoid_rails_migrations will do what you need. There isn't really much to document; it duplicates the functionality of the standard ActiveRecord migrations. You write your migrations, then run rake db:migrate to apply them, and it handles figuring out which ones have and haven't been run. I can answer further questions if there is something specific you want to know about it.
For lazy migrations, the easiest solution is to use the after_initialize callback. Check if a field matches the old data scheme, and if it does you modify the object and update it. For example:
class Person
  include Mongoid::Document

  after_initialize :migrate_data

  field :name, :type => String

  def migrate_data
    if !self[:first_name].blank? or !self[:last_name].blank?
      self.set(:name, "#{self[:first_name]} #{self[:last_name]}".strip)
      self.remove_attribute(:first_name)
      self.remove_attribute(:last_name)
    end
  end
end
The tradeoffs to keep in mind with the specific approach I gave above:
If you run a request that returns a lot of records, such as Person.all.each {|p| puts p.name} and 100 people have the old format, it will immediately run 100 set queries. You could also call self.name = "#{self.first_name} #{self.last_name}".strip instead, but that means your data will only be migrated if the record is saved.
General issues you might have is that any mass queries such as Person.where(:name => /Foo/).count will fail until all of the data is migrated. Also if you do Person.only(:name).first the migration would fail because you forgot to include the first_name and last_name fields.
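Stripped of Mongoid, the lazy-migration pattern is just "fix the record as you load it". Here is a plain-Ruby sketch of that idea (PersonRecord and its hash-based storage are hypothetical stand-ins for a document class):

```ruby
# A hash stands in for the stored document; migrate_data runs on every
# load, the way an after_initialize callback would.
class PersonRecord
  attr_reader :attributes

  def initialize(attributes)
    @attributes = attributes
    migrate_data
  end

  def migrate_data
    first = @attributes.delete(:first_name)
    last  = @attributes.delete(:last_name)
    return unless first || last
    # Collapse the old two-field format into the new single :name field.
    @attributes[:name] = [first, last].compact.join(' ').strip
  end
end

old_doc = PersonRecord.new(:first_name => 'Ada', :last_name => 'Lovelace')
old_doc.attributes # => {:name=>"Ada Lovelace"}

new_doc = PersonRecord.new(:name => 'Grace Hopper')
new_doc.attributes # => {:name=>"Grace Hopper"}
```

The same caveat from above applies: records are only fixed when they are actually loaded, so mass queries against the old field names keep failing until everything has been touched.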
Zachary Anker has explained a lot in his answer. Using mongoid_rails_migrations is a good option for migration.
Here are some links with examples that you may find useful when working with mongoid_rails_migrations:
Mongoid Migrations using the Mongo Driver
Embedding Mongoid documents and data migrations
Other than this, the README along with these examples should be enough to implement Mongoid migrations.
I have the same need.
Here is what I came up with: https://github.com/nviennot/mongoid_lazy_migration
I would gladly appreciate some feedback
I have just started with Rails, and coming from a .NET background I find models inheriting from ActiveRecord hard to understand, since they don't contain the corresponding attributes for the model. I cannot imagine a new developer exposed to a large codebase where the models contain only references to other models and business logic.
From my point of view the DataMapper model is much easier to grasp, but since ActiveRecord is the de facto standard it feels weird to change the ORM just for this little problem.
DataMapper
class Post
  include DataMapper::Resource

  property :id,         Serial   # An auto-increment integer key
  property :title,      String   # A varchar type string, for short strings
  property :body,       Text     # A text block, for longer string data.
  property :created_at, DateTime # A DateTime, for any date you might like.
end
ActiveRecord
class Post < ActiveRecord::Base
end
I'm not sure if this is an issue people simply get used to -- how do experienced Rails users handle models without attributes?
I don't think using the database manager or looking at loads of migrations scripts to find the attributes is an option?
Specifying attr_accessible will make the model more readable but I'm not sure if it's a proper solution for my problem?
Check out the annotate_models plugin on GitHub. It will insert a commented schema for each model in a comment block, and it can be installed to run whenever migrate is run.
You don't have to "look at loads of migration scripts to find the attributes" - they're all defined in one place in db/schema.rb.
A few tips:
- Load up the Rails console and enter Post.column_names for a quick reminder of the attribute names.
- Post.columns gives you the column objects, which show the datatypes.
- db/schema.rb contains all the migration code in one place, so you can easily see all the column definitions.
- If you are using a decent editor/IDE there should be a way to jump from the model file to the migration file (e.g. Emacs with ROR or Rinari).
I want to encrypt the database because confidential data is being stored. I use MongoDB with Mongoid. Is it possible for this kind of database? And what alternatives can you recommend if it is not?
P.S. The main purpose is: if anybody hacks the server and steals the DB, it should be undecryptable.
UPDATE: thanks to nickh, I found very many solutions for ActiveRecord, but nothing for Mongoid and other Mongo clients. It would be great to find a solution for Mongo and Mongoid!
I have gotten attr_encrypted working with Mongo and Mongoid. It takes only a few tweaks.
Make sure that all of the encrypted_ fields that are automatically created by attr_encrypted are explicitly created in the model. For instance, if you have:
attr_encrypted :email, :key => 'blah blah blah', :encode => true
you need to have:
field :email, :type => String
field :encrypted_email, :type => String
Also notice you need to tell it to encode the encrypted string, otherwise Mongo will complain loudly.
Lastly, if you're encrypting a hash, do this:
field :raw_auth_hash, :type => Hash
field :encrypted_raw_auth_hash, :type => String
attr_encrypted :raw_auth_hash, :key => 'blah', :marshal => true, :encode => true
I've had a lot of success with the attr_encrypted gem. However, I've only used it with ActiveRecord. I don't know if it works with MongoMapper or Mongoid.
Regardless of how you implement this, I strongly recommend only encrypting certain fields. Don't encrypt every field in every table. Doing that will make it difficult to use associations, search using LIKE, etc.
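For a sense of what per-field encryption involves under the hood, here is a minimal sketch using only Ruby's standard OpenSSL library. The helper names are made up for illustration; attr_encrypted and similar gems do something comparable per field, including the Base64 encoding so the result fits in an ordinary string field:

```ruby
require 'openssl'
require 'base64'

# In practice, load the key from configuration, never hard-code it.
KEY = OpenSSL::Random.random_bytes(32)

def encrypt_field(plaintext)
  cipher = OpenSSL::Cipher.new('aes-256-gcm').encrypt
  cipher.key = KEY
  iv = cipher.random_iv # 12 random bytes, generated per value
  ciphertext = cipher.update(plaintext) + cipher.final
  # Base64-encode so the value can be stored in a plain string column.
  Base64.strict_encode64(iv + cipher.auth_tag + ciphertext)
end

def decrypt_field(encoded)
  raw = Base64.strict_decode64(encoded)
  iv, tag, ciphertext = raw[0, 12], raw[12, 16], raw[28..-1]
  cipher = OpenSSL::Cipher.new('aes-256-gcm').decrypt
  cipher.key = KEY
  cipher.iv = iv
  cipher.auth_tag = tag # GCM authenticates: tampered data raises an error
  cipher.update(ciphertext) + cipher.final
end

stored = encrypt_field('123-45-6789')
decrypt_field(stored) # => "123-45-6789"
```

Note that an encrypted field like this can't be searched with LIKE or used in an index, which is exactly why encrypting only the sensitive fields is the right call.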
Try the mongoid-encrypted-fields gem - it is seamless as it handles encryption using mongoize/demongoize methods.
Just define your field like:
field :ssn, type: Mongoid::EncryptedString
Then you can access it like normal, but the data is stored encrypted.
http://ezcrypto.rubyforge.org/
Using PostgreSQL with the ezcrypto gem at the moment -- it works reasonably well, although there are limitations in using associations between models with encrypted fields (this may be down to my inability to find the correct up-to-date fork of this project).
The encrypted fields are stored in the PostgreSQL database as the BYTEA datatype and will usually require single quotes to be escaped (another issue with the plugin).
PostgreSQL also has its own encryption/decryption module, 'pgcrypto', which likewise returns a BYTEA datatype. Not sure how this would integrate with Rails ActiveRecord and associations between models (probably badly :D).
I use MongoDB in an app with the Mongoid ruby adapter. Ryan Bates (the demigod of Rails) recently made an outstanding railscast on this very issue http://railscasts.com/episodes/250-authentication-from-scratch.
I'm using this in a MongoDB app and it works perfectly for encrypting data. His tutorial video is mostly for encrypting passwords, but you can adapt it to any other field value you want.
I also have used attr_encrypted with much success but I'm not sure if it will work with MongoDB; only used it with ActiveRecord.
Simple question that used to puzzle me about Rails:
Is it possible to describe a Model's structure from within the model rb file?
From what I understand a model's data structure is kept within the migration, and the model.rb file is supposed to contain only the business logic.
Why is it so? Why does it make more sense to migrate the database with a rake task than to extract it from the class?
The reason migrations are stored separately is so that you can version your database. This would be unwieldy if done inline in the model.
Other ORMs (like DataMapper) do store the schema in the model definition. I think it's really convenient to be able to see model attributes right there, but it is unfortunate to not have the history of your database structure.
What I really wish is that running the migrations would just insert some comments at the top of the model file detailing the schema. That should be a simple hack.
Migrations do not simply show the state of the database schema.
They define the transitions from one state to another.
In a comment on cam's post, you said having the schema in the model would do the same thing: if you had the model's source stored in a VCS, you could look up previous versions of the schema.
Here is why that is not equivalent to migrations:
Schema Version 1
string :name
string :password
string :token
Schema Version 2
string :username
string :displayname
string :password
string :token
So, what did I do here? What happened to "name"? Did I rename it to username? Or maybe I renamed it to displayname? Or did I drop it entirely?
You don't know. There's no way to tell. You only see the "before" and "after" of the schema. You don't see the transition.
Let's instead look at what I really did with this migration:
class UpdateNameFields < ActiveRecord::Migration
  def self.up
    rename_column :users, :name, :username
    add_column :users, :displayname, :string
    User.update_all("displayname = username")
  end

  def self.down
    remove_column :users, :displayname
    rename_column :users, :username, :name
  end
end
See, I had been using "name" for usernames. But you wouldn't be able to tell that without the migration here. Plus, in an effort to not have my new displayname column be blank on all my existing records, I have seeded it with everyone's existing usernames. This lets me gently introduce this new feature - I can use it and know that existing records aren't going to just see a blank field.
Note that this is a trivial example. Because it was so trivial, you could take a guess that it was one of a couple possible options. But had it been a much more complex transition, you simply would not know.
Transitions from one schema to another can involve more than just adding/deleting/renaming columns. I gave a little example above with User.update_all. Whatever code you need to execute to migrate data to the new schema, you can put in the migration.
When people say migrations are about "versioning the database", they don't just mean versioning the snapshot of the schema. They mean the ability to move between those schema states, and triggering all of the actions involved in going from one state to another.