Integrate new rails app with existing external MS SQL Server database - ruby-on-rails

I need to create a new Rails application that will use an existing external Microsoft SQL Server database as its primary database. This DB is running on an external server that I'll need to connect to. Where I work there's a separate DB admins team, so I'm not sure what steps I need to perform. I think the easy part is connecting to the server and pointing the app at that DB, but I get confused after that.
Should I create a DB dump and migration?
Should I add models to represent the existing database structure?
In case I need to create new models and tables in the DB...
Should I generate the models with the Rails generator and run migrations on the DB, or contact the DB admins to create the tables and then write the models manually inside the app? <- this really confuses me
Thank you in advance, and sorry for my English! :)
EDIT!! -> The Rails Way
I've found another great way to integrate data from multiple databases... You can define a model and specify which database it lives in (so the DB connection is never closed)... This is definitely The Rails Way and should be followed...
Here's a nice tut:
Multiple Databases in Rails
TL;DR
Define your secondary_db in database.yml
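A minimal sketch of what that entry might look like, assuming the activerecord-sqlserver-adapter gem (host, credentials and database name are placeholders):
# config/database.yml
secondary_db:
  adapter: sqlserver
  host: sqlserver.example.com
  port: 1433
  database: legacy_db
  username: rails_app
  password: secret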
Define your model and specify its database and table name
class ExternalDatabaseModel < ActiveRecord::Base
  establish_connection(:secondary_db)
  self.table_name = "the_table"
  self.primary_key = 'someWeirdIDColumn'
end
Then use it like any other Rails model: specify relationships and more :)
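For example (the status attribute is just illustrative):
ExternalDatabaseModel.where(status: 'active').limit(10)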
Hope it helps to somebody!

I would suggest you take a look at the tiny_tds gem: https://github.com/rails-sqlserver/tiny_tds. It's really easy to configure and use... you can use it to connect to your external SQL Server DB and run the queries you want. I don't know what kind of system you're working on, but if you're constantly retrieving data, making external requests may slow your system down. Depending on your application, you could consider synchronizing the data into a redundant local database. Hope it helps you, good luck!
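A quick sketch of querying the external server directly with TinyTds (host, credentials and table name are placeholders):
require 'tiny_tds'

client = TinyTds::Client.new(
  username: 'rails_app', password: 'secret',
  host: 'sqlserver.example.com', port: 1433,
  database: 'data_collection'
)
result = client.execute('SELECT TOP 10 * FROM employees')
result.each do |row|
  puts row  # each row comes back as a Hash keyed by column name
end
client.close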
EDIT: about the models part: if you're going to retrieve the data from that SQL Server every time you need it, then you don't need to store it locally. But if you're doing that redundancy, you should create models with the attributes you'll need; then, when you retrieve the data from the server, you save it normally, just assigning the values to your model and calling the save! method.
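Building on the TinyTds client above, a rough sketch of that synchronization idea, assuming a hypothetical local Employee model with an external_id column and attributes matching the remote table:
result = client.execute('SELECT id, name, department FROM employees')
result.each do |row|
  employee = Employee.find_or_initialize_by(external_id: row['id'])
  employee.name = row['name']
  employee.department = row['department']
  employee.save!
end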

Related

How to access multiple data sources in a single Ruby on Rails model?

The application I'm currently working on requires me to access multiple data sources from within a single model. I want to know what's the best way to accomplish it. Can I do something like the following?
class SomeModel < ActiveRecord::Base
  establish_connection :data_source1
  # Get some data from data_source1 and store it in an instance variable
  establish_connection :data_source2
  # Get some data from data_source2 and store it in an instance variable
end
I'd appreciate your suggestions.
This is a super bad idea, as @jvillian has said. What you want here are two different models, each connected to its respective data source, something ActiveRecord does allow by default.
That part can be a bit tricky but won't confuse ActiveRecord.
The big issue here is that ActiveRecord prefers to cache the schema when it connects and persists it for the life of the process. If the databases differ in any way you'll have chaos; plus, if you keep switching the connection you may completely mess up the transaction planner, which has to take into account which models live in which database for rollback purposes.
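A sketch of the two-model approach (connection names are assumed to match entries in database.yml; table names are made up):
class SourceOneRecord < ActiveRecord::Base
  establish_connection :data_source1
  self.table_name = 'some_table'
end

class SourceTwoRecord < ActiveRecord::Base
  establish_connection :data_source2
  self.table_name = 'other_table'
end
You can then read from both inside a service object or plain Ruby class and combine the results there, instead of inside a single ActiveRecord model.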

Can a Drupal based website's schema be imported to Rails?

I'm working on a web site that needs to be re-written in Rails. The website was previously in Drupal, and there are almost 100,000 records in the database. Now, in Drupal there are tables that don't have any obvious place in a Rails app, in my opinion. For example,
Table name: node_type
It stores information regarding modules in Drupal.
Table name: node
It stores information for node(s) in Drupal.
Table name: semaphore # I've no idea what it is!
Table name: rdf_mapping # No idea
I've not worked with Drupal, so all I want to ask is: is it possible to have a schema for Rails into which the existing 100,000 records can be imported from Drupal? If so, how? If not, what other options am I left with? Or do I have to design an entirely new database schema?
Drupal's database schema is not extensively documented for a reason... it's considered an implementation detail, is not a public API and should not be accessed directly, especially by outside applications.
It is also very hard to document because, for a given site, any enabled module can add its own tables and alter existing ones (usually adding columns). Plus you have modules like Fields (part of Drupal core) that create tables dynamically depending on the defined content types.
For a RoR developer, the Drupal schema will probably look weird and be uncomfortable to work with. I would follow the suggestion from others: create a new schema for your new application and write a migration script to get the data from the old Drupal database into your new database. I don't know much about RoR, but try to find a good data migration tool that allows replay, updates, rollback, etc. You will probably have to migrate the data multiple times to fix bugs in the process.
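A rough sketch of what such a one-off migration script might look like in Rails. The Article model, its columns and the :drupal connection entry are hypothetical; nid, title and created are the usual columns of a stock Drupal node table, but double-check against your site:
# a model pointed at the old Drupal database
class DrupalNode < ActiveRecord::Base
  establish_connection :drupal
  self.table_name  = 'node'
  self.primary_key = 'nid'
end

# lib/tasks/import_drupal.rake
namespace :import do
  desc 'Copy Drupal nodes into the new articles table'
  task drupal_nodes: :environment do
    DrupalNode.find_each do |node|
      Article.where(legacy_nid: node.nid).first_or_initialize.update!(
        title:      node.title,
        created_at: Time.at(node.created)
      )
    end
  end
end
Running it multiple times is safe because rows are matched on legacy_nid, which makes the replay-and-fix cycle mentioned above much easier.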
Well, I don't have straightforward answers, but I have some ideas for avoiding too many changes to the database; alternatively, as per the comment, you can write an SQL script that migrates the data to match the Rails schema, including the types for each table. I'm just sharing my thoughts here; there are probably more explicit solutions and this is doable in many ways, and you may need some customisation overriding the default conventions. Based on my thoughts, you can try the following things.
Generate the related models, skipping migrations.
Define table names explicitly for each model, like the following snippet:
class Semaphore < ActiveRecord::Base
  self.table_name = "semaphore"
end
You have to define foreign keys and primary keys explicitly, both for the record id and for associations, as in the sketch below.
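Something like this, using the stock Drupal node and users tables as an example (check the column names against your own schema):
class DrupalUser < ActiveRecord::Base
  self.table_name  = 'users'
  self.primary_key = 'uid'
end

class Node < ActiveRecord::Base
  self.table_name  = 'node'
  self.primary_key = 'nid'
  belongs_to :author, class_name: 'DrupalUser', foreign_key: 'uid'
end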
You have to handle timestamps, or you can explicitly turn them off like this:
ActiveRecord::Base.record_timestamps = false
These are the basic things that I can see are important.

How can someone add a sql server table from a different database to a ruby on rails application?

I have a database labor which contains the tables that the Rails application has created, and I have a different database called data_collection that contains a table called employees that I need to pull information from so that my application can use it. It doesn't need to be editable from the application, only readable. All of these databases are SQL Server. How would I go about doing this?
Thanks
You could define an ActiveRecord connection for each model you have; here is an example.
link
Have a nice day!
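A sketch of that approach for the employees table, assuming a data_collection entry exists in database.yml:
class Employee < ActiveRecord::Base
  establish_connection :data_collection
  self.table_name = 'employees'

  # the app only reads this data, so make accidental writes fail
  def readonly?
    true
  end
end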
You could also see if your friendly DBA has set up a linked server connection for you to use. Or you could try using the OPENROWSET feature in T-SQL. Or you could just do what @andrea suggests.

Should I use multiple databases?

I am about to create an application with Ruby on Rails and I would like to use multiple databases. Basically it is an accounting app that will have multiple companies for each user, and I would like to create a database for each company.
I found this post http://programmerassist.com/article/302
But I would like to read more thoughts about this issue.
I have to decide between MySQL and PostgreSQL; which database might fit my problem better?
There are several options for handling a multi-tenant app.
Firstly, you can add a scope to your tables (as suggested by Chad Birch - using a company_id). For most use-cases this is fine. If you are handling data that is secure/private (such as accounting information) you need to be very careful about your testing to ensure data remains private.
You can run your system using multiple databases. You can have a single app that uses a database for each client, or you can actually have a separate app for each client. Running a database for each client cuts a little against the grain in Rails, but it is doable. Depending on the number of clients you have and the load you expect, I would actually suggest having a look at running individual apps. With some work on your deployment setup (Capistrano, Chef, Puppet, etc.) you can make this a very streamlined process. Each client runs in a completely separate environment, and if a particular client has high loads you can spin them out to their own server.
If using PostgreSQL, you can do something similar using schemas.
PostgreSQL schemas provide a very handy way of isolating your data from different clients. A database contains one or more named schemas, which in turn contain tables. You need to add some smarts to your migrations and deployments, but it works really well.
Inside your Rails application, you attach filters to the request that switch the current user's schema on or off.
Something like:
before_filter :set_app

def set_app
  current_app = App.find_by_subdomain(...)
  schema = current_app.schema
  set_schema_path(schema)
end

def set_schema_path(schema)
  # prepend the tenant's schema to the default search path
  connection = ActiveRecord::Base.connection
  connection.execute("SET search_path TO #{schema}, #{connection.schema_search_path}")
end

def reset_schema_path
  # fall back to the default search path
  connection = ActiveRecord::Base.connection
  connection.execute("SET search_path TO #{connection.schema_search_path}")
end
The problem with answers about multiple databases is that they often come from people who have no need for, or experience with, multiple databases. The second problem is that some setups just don't allow switching between multiple databases, including letting users do their own backup and recovery, or scaling by pointing some users to a different data server. Here is a link to a useful video:
http://aac2009.confreaks.com/06-feb-2009-14-30-writing-multi-tenant-applications-in-rails-guy-naor.html
This link will help with Ruby on Rails and PostgreSQL.
I currently have a multi-tenant, multi-database, multi-user (many logons to the same tenant with different levels of access) online SaaS application. There are actually two applications: one is in the accounting category and the other is banking. Both apps are built on the same structure and methods. A client-user (tenant) can switch databases under that user's logon. An agent-user, such as a tax accountant, can switch between databases for his clients only. A super-user can switch to any database. There is one data dictionary, i.e. only one place where tables and columns are defined. There is global data and local data. Global data, such as a master chart of accounts, is available to everyone (read only). Local data is the user's database. A new user can get a clone of a master database; there are multiple clones to choose from, and a super-user can maintain the clone databases.
The problem is that it is written in COBOL, uses ISAM files and uses the CGI method. The problems with this are a) the perception that COBOL is outdated, b) getting trained people, c) price and d) online help. Otherwise it works and I'm happy with it.
So I'm researching what to replace it with and what a minefield that is.
Time has passed, and the decision has been to use PostgreSQL schemas to build the multi-tenant application. I have a schema called common where shared data is stored.
# app/models/organisation.rb
class Organisation < ActiveRecord::Base
  self.table_name = 'common.organisations'
  # set relationships as usual
end

# app/models/user.rb
class User < ActiveRecord::Base
  self.table_name = 'common.users'
  # set relationships as usual
end
Then for migrations I followed this excellent tutorial: http://timnew.github.com/blog/2012/07/17/use-postgres-multiple-schema-database-in-rails/. Use it; it's way better than what I saw in other places, even the way Ryan Bates did it on RailsCasts.
When a new organisation is created, a new schema is created with the name of the organisation's subdomain. I have read in the past that it's not a good idea to use different schemas, but it depends on the job you are doing; this app has almost no social component, so it's a good fit.
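The schema creation step is roughly this (simplified; it assumes the subdomain has already been validated as a safe SQL identifier):
# e.g. in an after_create callback or a background job
ActiveRecord::Base.connection.execute("CREATE SCHEMA #{organisation.subdomain}")
After that, the tenant's tables are created inside the new schema following the migration approach from the tutorial above.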
No, you shouldn't use multiple databases.
I'm not really sure what advice to give you, though; it seems like you have some very basic misunderstandings about database design. You may want to educate yourself on the basics of databases first, before going further.
You most likely just want to add a "company id" type column to your tables to identify which company a particular record belongs to.
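For example, something along these lines (model names are illustrative):
class Company < ActiveRecord::Base
  has_many :invoices
end

class Invoice < ActiveRecord::Base
  belongs_to :company  # backed by a company_id column
end

# then always go through the current tenant when querying,
# however you look up the signed-in user's company:
current_company.invoices.find(params[:id])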

Generate new models and schema at runtime

Let's say your app enables users to create their own tables in the database to hold their own, custom data. Each table would have its own schema. What are some good approaches?
My first stab involved dynamically creating migration files and model files, but I'd like to run this on Heroku, where you can't write to the filesystem.
I'm thinking eval may be the way to go to create and run the migration class and the model class. But I want to make sure the model class exists when a new process of the app is spawned. I can probably do this by storing these class definitions with each user as they create new tables and then running through them all at startup. But now it's convoluted enough that I may be missing something obvious.
It's probably a better idea not to generate new classes at runtime. Besides all of the security risks, each thread's startup time will be abominable if you ever get a significant number of users.
I would suggest rethinking your app design and aiming at generic tables to hold the users' custom data. If you have examples of the data structures users can create, we might be able to help.
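One common shape for those generic tables is a small metadata layer plus a serialized values column; a rough sketch (all names are made up):
class CustomTable < ActiveRecord::Base
  belongs_to :user
  has_many :custom_fields
  has_many :custom_records
end

class CustomField < ActiveRecord::Base
  belongs_to :custom_table  # stores name and field_type
end

class CustomRecord < ActiveRecord::Base
  belongs_to :custom_table
  serialize :data, Hash     # column values keyed by field name
end
You lose per-column indexing and database-level types, but the schema itself never has to change at runtime.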
Have you thought about a non-sql database for those tables? Look at CouchDB - there are several plugins on Github that integrate it with rails. Records in the database are JSON documents, with arbitrary key-value structure. May be perfect for a user-defined schema.
There is (was?) a cool Wiki project, called Informl. It was a Wiki, not just for web pages but for web applications. (Get it? It's informal because it's a Wiki, it's got forms because it is an application, and it's user-generated, thus Web 2.0, which means that according to an official UN resolution it is legally required to have a name which is missing a vwl.)
So, in other words, it was not just about user-generated content, but also user-generated structured data.
They did this by generating PostgreSQL-specific SQL at runtime to create new tables and then having ActiveRecord reload the schema.
The code is up on RubyForge. It's based on Rails 1.2.3. I guess you could do much better than that today, especially with the upcoming extensibility interfaces in Rails 3.
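If you do go down the runtime route, the core of it with plain ActiveRecord looks roughly like this (table and column names are made up):
# create the table directly through the connection, no migration file needed
ActiveRecord::Base.connection.create_table(:user_defined_things) do |t|
  t.string  :name
  t.integer :quantity
end

# define (or eval) a model for it and make ActiveRecord re-read the columns
class UserDefinedThing < ActiveRecord::Base; end
UserDefinedThing.reset_column_information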
