How to share a database, migrations, and models between Rails projects? - ruby-on-rails

I know there are some questions to this effect already on StackOverflow, but they tend to be pretty outdated and don't adequately address how migrations are supposed to work in the following scenario, which should be fairly common:
You have some kind of application, implemented in Rails.
You have some kind of admin application for your data, and it is a separate application implemented in Rails.
Both applications operate on the same database and models.
My question is: what is the best way to factor our models so that neither of these applications has to duplicate model code?
We are concerned with the following:
For the shared models, where should database migrations live?
What if each individual application wishes to add additional models on top of shared models? Where do these migrations live?
What is the best way to move existing migrations into the proposed shared migration scheme?
Thanks.

I don't know if this is THE approach, and would love to see other ideas, but here's what we do in one of our products that matches this scenario:
For the shared models, where should database migrations live?
We keep all of our migrations under the admin system. You don't need them to exist twice, so that's where they go.
What if each individual application wishes to add additional models on top of shared models? Where do these migrations live?
We share all models. A model might only be relevant to one application at the moment - say, a favourited_items concept might only matter to the end-user app - but at a later date the admin might want to know which items are most frequently favourited.
Secondly, if you want to investigate anything via the console, it's really quite helpful not to have to visit separate applications because neither has models for every table.
Functionality in shared models that differs per application checks the Rails environment, which we have extended to include more context, e.g. if Rails.env == 'admin_production'.
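For example, a shared model can branch on that extended environment name. A minimal sketch, assuming a hypothetical Item model and an admin-only validation:

class Item < ActiveRecord::Base
  # hypothetical: only enforce this check in the admin application
  validates :internal_sku, presence: true, if: -> { Rails.env == 'admin_production' }
end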
What is the best way to move existing migrations into the proposed shared migration scheme?
Again, migrations should only ever exist once, and the shared database knows which have already been run, so unless you're renaming the migrations you just need to pick a location and move the files.

Related

Ruby on Rails database workflow

I'm a bit of a Rails beginner, so I'm sorry if this question is easy. I'm wondering how I am supposed to modify the Rails database as a developer without letting users of the site modify data.
More information:
Let's say I am trying to create a website that lists books along with their information. I want users to be able to see these books, but not add/delete them. As a developer, I want a way to add books without using the command line (hard to edit mistakes). What's the easiest way for me to do this? Would there be any differences between the development database and a live hosted one?
I think there are a few approaches to do this.
Different user roles. This technically breaks the rule of "without letting users of the site modify data", but being able to differentiate between normal and admin users means you can limit who can actually add data to the database. I used CanCanCan to authorize requests, but I know there are others.
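As a rough sketch of the roles idea (the Book model and the admin? flag on User are assumptions from the question, not something CanCanCan provides):

# app/models/ability.rb
class Ability
  include CanCan::Ability

  def initialize(user)
    can :read, Book                          # everyone can browse books
    can :manage, Book if user && user.admin? # only admins can add/edit/delete
  end
end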
I agree doing it using the command line isn't ideal, but Rails does support rake tasks. You can create a task that handles most of the logic, and all you need to do is run something like
rake create_book["name here"]
This will make the process less error-prone.
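A sketch of what such a task could look like - a minimal example, assuming a Book model with a name attribute (both taken from the question, not anything Rails generates for you):

# lib/tasks/books.rake
desc "Create a book by name"
task :create_book, [:name] => :environment do |_task, args|
  Book.create!(name: args[:name])
  puts "Created book: #{args[:name]}"
end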
Create data using Rails migrations. Rails can generate the skeleton file for you, and you just run any ActiveRecord methods inside it. The main purpose of migrations is to update the database schema, but they can seed the database as well - there's an example of this in the docs.
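Not the docs' exact example, but a sketch of the idea, again assuming a Book model:

class AddInitialBooks < ActiveRecord::Migration
  def up
    Book.create!(name: "The Pragmatic Programmer")
  end

  def down
    Book.where(name: "The Pragmatic Programmer").destroy_all
  end
end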
Would there be any differences between the development database and a live-hosted one?
Yeah, they should be totally separate database instances. You don't want your development database to be the same as the live one - that would cause major problems. Rails has a concept of environments where you can use different configurations, so you can pick and choose which database URL to use.
Like #davidhu said, the best approach really is to use authorization. If only an admin can see a page to CRUD the books, then you don't have to worry about normal users doing the same, plus it makes it easy for you, as the admin, to make changes or add to the collection. Add the gem (or reinvent the wheel) and Rails will take care of the rest for you.

How do I link two rails apps databases and models together?

So I have an e-commerce site, and I was tasked with linking my Rails (4) app with another website (our manufacturing vendor's) to show orders placed and to update inventory from their side. In other words, I need to share a database and models between two Rails apps. FYI, the vendor website is not built yet - I have to build it from scratch.
Side note: I know a custom admin panel would be easier, but my boss, who doesn't know tech at all, seems adamant about having these on separate URLs, even though he owns both companies - the one I work for and the manufacturing company.
I just need a roadmap, or a point in the right direction, because I am not exactly sure what I am looking for - just some solid information about how to do this.
I currently use heroku and AWS for my existing rails app.
I use PostgreSQL as my DB.
You would need to use establish_connection
I've only done this sort of thing once, so the below might be wrong, but it will hopefully give you an idea. I'm not sure what you mean by "link the models", but this would allow you to query the vendor DB and have models (which you'd want to namespace) that relate to those databases.
I'd set up an entry in database.yml for the vendor database (probably one for each environment).
I would then have a base vendor model that all the others inherit from:
class VendorApplicationRecord < ActiveRecord::Base
  self.abstract_class = true
  establish_connection "vendor_#{Rails.env}"
end
This assumes you set up database.yml to have vendor_development etc.
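For example, the database.yml entries might look something like this (adapter, names, and credentials are placeholders), and the vendor tables then get namespaced models inheriting from the base class above (Vendor::Order is hypothetical):

# config/database.yml
vendor_development:
  adapter: postgresql
  database: vendor_development
  host: localhost

vendor_production:
  adapter: postgresql
  database: vendor_production
  host: vendor-db.example.com
  username: vendor_app
  password: <%= ENV['VENDOR_DB_PASSWORD'] %>

# app/models/vendor/order.rb
module Vendor
  class Order < VendorApplicationRecord
    self.table_name = 'orders'
  end
end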
Sharing a database is very easy. Actually, it's just the same as not sharing a database: you put the appropriate configuration in database.yml and declare your models.
Now you might be thinking "hey, that's duplication of code! Why don't I just extract models in a gem or something?". To which I say, don't. Just duplicate what's necessary. It is very likely that in one app you'll need full-blown models, with validations, before/after_save hooks and whatnot. And the other app will only read data, so it doesn't need any of that extra complexity.

Strategies for Rails development/test development with complex legacy schema

When developing a Rails app against a legacy schema that is in use by an existing application, if tables have NOT NULL constraints on foreign key columns throughout the schema, then in order to create/save models for tests, models need to exist for those associations, and their associations, and so on. So it isn't as easy as just creating one model at a time as you need it and testing with it.
As far as testing goes, this seems to be a problem if you are using FactoryGirl and want to create and save model instances to return from controllers, etc. when all of the association dependencies are involved. Another option is to mock, but mocking can be a little more time consuming and it doesn't allow you to do integration testing as easily. Another is to use fixtures, but those are time consuming and brittle. Another is to pre-populate the test database with production data, but that doesn't solve the need for factories, etc./known data to expect in tests, and Rails usually expects to start with a clean DB for the test environment.
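For what it's worth, a minimal FactoryGirl sketch of that dependency problem, using hypothetical Customer/Order models where orders.customer_id is NOT NULL:

FactoryGirl.define do
  factory :customer do
    name "Acme Corp"
  end

  factory :order do
    association :customer   # must be declared, or saving an order violates the NOT NULL constraint
    total 100
  end
end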
What strategies for developing models, tests, etc. do you use when you have an existing complex schema that you are bolting a Rails app onto - not just for reading data, but also for writing to the existing schema that is in use by an existing production application? (i.e. the "rebuild the ship while at sea" problem)
When we first started out on this problem, I was looking for something that could autogenerate models from the existing schema. I found Dr. Nic's magic model generator, but it only stated that it worked in Rails 2.x. I got a Rails 2.x environment running (I forget the specific version it worked in), but it didn't seem to help much, so I wrote a script to generate our models. However, when we started development we then had a load of models, so we started to move out the ones we didn't need, then needed to comment out associations to models that were no longer there - but some of those were required/NOT NULL foreign keys, so we're now having to move them back in and uncomment more of them than we thought we would.
We did write and uncover some things that were helpful during the process, not all of which we've used, but they might be helpful to others:
https://github.com/influitive/apartment
https://github.com/karledurante/secondbase
https://github.com/drnic/composite_primary_keys
https://github.com/garysweaver/activerecord-define_nils
https://github.com/garysweaver/activerecord-native_db_types_override
https://github.com/garysweaver/modelist
https://github.com/garysweaver/stepford
https://github.com/garysweaver/structure_sql_to_migration
As for test data setup, I started writing a tool called Stepford to automate the development of FactoryGirl factories, and wrote Modelist to help us test models, resolve circular dependencies, and identify model dependencies, so we don't have to include every automatically generated model.
So far, really, the only answer I've come up with - and have heard others say as well - is that rebuilding an existing application, even piecemeal, with different technologies is difficult, slow, and error-prone.

Should I use multiple databases?

I am about to create an application with Ruby on Rails and I would like to use multiple databases. Basically it is an accounting app that will have multiple companies for each user, and I would like to create a database for each company.
I found this post http://programmerassist.com/article/302
But I would like to read more thoughts about this issue.
I have to decide between MySQL and PostgreSQL - which database might fit my problem better?
There are several options for handling a multi-tenant app.
Firstly, you can add a scope to your tables (as suggested by Chad Birch - using a company_id). For most use-cases this is fine. If you are handling data that is secure/private (such as accounting information) you need to be very careful about your testing to ensure data remains private.
You can run your system using multiple databases. You can have a single app that uses a database for each client, or you can actually have a separate app for each client. Running a database for each client cuts a little against the grain in Rails, but it is doable. Depending on the number of clients you have and the load expectations, I would actually suggest having a look at running individual apps. With some work on your deployment setup (Capistrano, Chef, Puppet, etc.) you can make this a very streamlined process. Each client runs in a completely separate environment, and if a particular client has high loads you can spin them out to their own server.
If using PostgreSQL, you can do something similar using schemas.
PostgreSQL schemas provide a very handy way of isolating your data from different clients. A database contains one or more named schemas, which in turn contain tables. You need to add some smarts to your migrations and deployments, but it works really well.
Inside your Rails application, you attach filters to the request that switch the current user's schema on or off.
Something like:
before_filter :set_app

def set_app
  current_app = App.find_by_subdomain(...)
  schema = current_app.schema
  set_schema_path(schema)
end

def set_schema_path(schema)
  connection = ActiveRecord::Base.connection
  connection.execute("SET search_path TO #{schema}, #{connection.schema_search_path}")
end

def reset_schema_path
  connection = ActiveRecord::Base.connection
  connection.execute("SET search_path TO #{connection.schema_search_path}")
end
The problem with answers about multiple databases is that they often come from people who have no need for, or experience with, multiple databases. The second problem is that some databases just don't allow switching between multiple databases, including letting users do their own backup and recovery, or scaling by pointing some users to a different data server. Here is a link to a useful video:
http://aac2009.confreaks.com/06-feb-2009-14-30-writing-multi-tenant-applications-in-rails-guy-naor.html
This link will help with Ruby on Rails and PostgreSQL.
I currently have a multi-tenant, multi-database, multi-user (many logons to the same tenant with different levels of access) online SaaS application. There are actually two applications: one is in the accounting category and the other is banking. Both apps are built on the same structure and methods. A client-user (tenant) can switch databases under that user's logon. An agent-user, such as a tax accountant, can switch between databases for his clients only. A super-user can switch to any database. There is one data dictionary, i.e. only one place where tables and columns are defined. There is global data and local data: global data, such as a master chart of accounts, is available to everyone (read only); local data is the user's database. A new user can get a clone of a master database - there are multiple clones to choose from - and a super-user can maintain the clone databases.
The problem is that it is written in COBOL, uses ISAM files, and uses the CGI method. The issues with that are a) the perception that COBOL is outdated, b) finding trained people, c) price, and d) online help. Otherwise it works and I'm happy with it.
So I'm researching what to replace it with and what a minefield that is.
Time has passed, and the decision has been to use PostgreSQL schemas to make the application multi-tenant. I have a schema called common where shared data is stored.
# app/models/organisation.rb
class Organisation < ActiveRecord::Base
  self.table_name = 'common.organisations'
  # set relationships as usual
end

# app/models/user.rb
class User < ActiveRecord::Base
  self.table_name = 'common.users'
  # set relationships as usual
end
Then for migrations, I followed this excellent tutorial: http://timnew.github.com/blog/2012/07/17/use-postgres-multiple-schema-database-in-rails/ - use this; it's way better than what I saw in other places, even the way Ryan Bates did it on RailsCasts.
When a new organisation is created, a new schema is created, named after the organisation's subdomain. I have read in the past that it's not a good idea to use different schemas, but it depends on the job you are doing; this app has almost no social component, so it's a good fit.
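A rough sketch of that schema-creation step (the after_create callback, the subdomain column, and the raw SQL are my own illustration, not the tutorial's code):

# app/models/organisation.rb
class Organisation < ActiveRecord::Base
  self.table_name = 'common.organisations'
  after_create :create_tenant_schema

  private

  def create_tenant_schema
    # create a per-tenant schema named after the organisation's subdomain
    ActiveRecord::Base.connection.execute(%(CREATE SCHEMA "#{subdomain}"))
    # the tenant's tables are then created/migrated inside the new schema
  end
end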
No, you shouldn't use multiple databases.
I'm not really sure what advice to give you though, it seems like you have some very basic misunderstandings about database design, you may want to educate yourself on the basics of databases first, before going further.
You most likely just want to add a "company id" type column to your tables to identify which company a particular record belongs to.
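A minimal sketch of that, assuming a hypothetical invoices table:

class AddCompanyIdToInvoices < ActiveRecord::Migration
  def change
    add_column :invoices, :company_id, :integer
    add_index  :invoices, :company_id
  end
end

Queries are then scoped by that column, e.g. Invoice.where(company_id: some_company.id).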

Generate new models and schema at runtime

Let's say your app enables users to create their own tables in the database to hold their own custom data. Each table would have its own schema. What are some good approaches?
My first stab involved dynamically creating migration files and model files, but I'd like to run this on Heroku, where you can't write to the filesystem.
I'm thinking eval may be the way to go to create and run the migration class and the model class. But I want to make sure the model class exists when a new process of the app is spawned. I can probably do this by storing these class definitions with each user as they create new tables, and then running through them all at startup. But now it's convoluted enough that I may be missing something obvious.
It's probably a better idea not to generate new classes at runtime. Besides all of the security risks, each thread's startup time will be abominable if you ever get a significant number of users.
I would suggest rethinking your app design and aiming at generic tables to hold the users' custom data. If you have examples of data structures that users can create, we might be able to help.
Have you thought about a non-SQL database for those tables? Look at CouchDB - there are several plugins on GitHub that integrate it with Rails. Records in the database are JSON documents with an arbitrary key-value structure. It may be perfect for a user-defined schema.
There is (was?) a cool Wiki project, called Informl. It was a Wiki, not just for web pages but for web applications. (Get it? It's informal because it's a Wiki, it's got forms because it is an application, and it's user-generated, thus Web 2.0, which means that according to an official UN resolution it is legally required to have a name which is missing a vwl.)
So, in other words, it was not just about user-generated content, but also user-generated structured data.
They did this by generating PostgreSQL-specific SQL at runtime to create new tables and then having ActiveRecord reload the schemas.
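Not Informl's actual code, but a sketch of the general technique - create a table at runtime, then define a model class on the fly and point it at the new table (table and class names here are made up):

ActiveRecord::Base.connection.create_table(:user_defined_entries) do |t|
  t.string :name
  t.text   :data
end

# Define a model class at runtime and register it as a constant
klass = Class.new(ActiveRecord::Base) do
  self.table_name = 'user_defined_entries'
end
Object.const_set('UserDefinedEntry', klass)

UserDefinedEntry.reset_column_information
UserDefinedEntry.create!(name: 'example', data: 'free-form text')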
The code is up on RubyForge. It's based on Rails 1.2.3. I guess you could do much better than that today, especially with the upcoming extensibility interfaces in Rails 3.
