One Rails model with two database choices, chosen on instantiation - ruby-on-rails

My Rails app (let's call it "Mira") will be interfacing with an existing app (let's call it "Jira"). Mira will store information about Jira and will be able to directly manipulate its database (because Jira, we'll say, has an incomplete API).
Since I want to directly manipulate Jira's database, it makes sense to have models representing each of Jira's tables in my Mira app. That way I can use ActiveRecord to manipulate it.
But in fact, there are two Jiras: a Staging instance and a Production instance.
So now I want my model that was happily interfacing with one instance of Jira to be able to use a different database.
It would be super sweet if I could do this when I instantiate my model, perhaps like this:
Jira::CustomField.new(:staging)
or something like that.
Thoughts? Better ways to accomplish this? Is my goal as stated even possible?

As the documentation for ActiveRecord::Base discusses, it is easy to have different Rails model objects connecting to different databases using the establish_connection method.
However, if you want the same class to connect to multiple databases based on configuration, that will be kind of a pain. Do you need to use ActiveRecord here, or could you use DataMapper? I think that would work better in this scenario. Check out "What ORM to use in one process multiple db connections sinatra application?" for an example.
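For example, here is a minimal sketch of the establish_connection approach, assuming two entries named jira_staging and jira_production in database.yml and a Jira table called customfield (all of those names are assumptions):

module Jira
  module Staging
    class Base < ActiveRecord::Base
      self.abstract_class = true
      establish_connection :jira_staging      # entry in database.yml
    end
    class CustomField < Base
      self.table_name = "customfield"         # adjust to the real Jira table name
    end
  end

  module Production
    class Base < ActiveRecord::Base
      self.abstract_class = true
      establish_connection :jira_production
    end
    class CustomField < Base
      self.table_name = "customfield"
    end
  end

  # Convenience lookup, close to the syntax you asked for
  def self.custom_field(env)
    env == :staging ? Staging::CustomField : Production::CustomField
  end
end

Jira.custom_field(:staging).new(...) then gets you close to what you described, without a single class ever having to juggle two connections.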

Related

Rails 5 App with Elasticsearch as database instead of a relational one

Is it possible to use Elasticsearch as database for a Rails Application?
I have gone through many sites, blogs, and videos to find the answer to this, but was unable to; this was the closest I found.
I am not sure how it can be done, what goes in config/database.yml, and whether a schema will be generated after running migrations.
Yes, of course it is, but you cannot use the ActiveRecord ORM; basically you'll have to create your own adapter.
If you want to go quick, I would advise you to create the ActiveRecord models just like in any regular app, then use Searchkick and create mappings from your models.
You need to be aware that if you're not using a database to hold the values you'll need to create a repository to handle the CRUD operations in Elasticsearch.
Another option is to use https://github.com/elastic/elasticsearch-rails, but in both cases you need to have the Rails models.
If you really want to go for Elasticsearch only, your controllers need to call the repositories you created to fetch and save records in Elasticsearch.
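For reference, the Searchkick route mentioned above only needs a couple of lines per model; a minimal sketch (the Post model and the query are assumptions):

# Gemfile: gem "searchkick"
class Post < ActiveRecord::Base
  searchkick                                   # index this model into Elasticsearch
end

Post.reindex                                   # push existing records into the index
results = Post.search("rails elasticsearch")   # query Elasticsearch for matching records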
No. If you want to use ActiveRecord, only relational databases such as MySQL, PostgreSQL, and SQLite are supported, though there are also mappers for Mongo and the like.
There are some mappers and adapters out there though but I wouldn't touch them with a 10 foot barge pole - some things just shouldn't exist in this world.

In-memory ActiveRecord for production use

I'm currently extending some functionality from Redmine 1.4 issue tracker (it uses Rails 2.3.14 and Ruby 1.8.7).
One thing I don't really like about it for my current use case is that some classes that encapsulate business logic (roles, workflows) are database-based instead of code-based: our customer shouldn't be able to change roles and workflows without our intervention. While I could just preload a pre-baked set of DB data with the roles and workflows I want, I still don't think that would be the best solution for maintainability's sake.
What I'd like to do is something like this:
class Person < SomeInMemoryActiveRecordBaseClass
end
p = Person.new(:name => "Somebody")
p.save
And then I'd like to be able to invoke ActiveRecord methods on any Person object and from the Person class, even though there's no real database behind it.
This may seem like a silly thing to do (if I were creating the project from scratch I'd simply not use an AR-based object, but since I'm building on something that already exists this is not an option), but the objects I need to work with in-memory are used in plenty of places and they're supposed to be AR-based; changing their usage everywhere would be really time-consuming.
I've seen some projects, like nulldb, which are test-oriented rather than production-oriented, and they don't implement all methods. Besides that, the AR interface seems quite SQL-oriented and I don't know if it's really feasible to get an SQL-emulation layer "right".
Is there any library that could suit my use-case of creating a fully-functional activerecord in-memory database?
(I've already thought about using a different connection and leveraging an in-memory SQLite DB, but it seems a bit over-engineered to me.)
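For what it's worth, a rough sketch of that in-memory SQLite idea (Rails 2.3-era hash syntax; the table and column are assumptions):

class Person < ActiveRecord::Base
  # Point just this class at a throwaway in-memory database
  establish_connection(:adapter => "sqlite3", :database => ":memory:")
end

# The in-memory database starts empty, so create the table on boot,
# e.g. from an initializer:
Person.connection.create_table(:people) do |t|
  t.string :name
end

p = Person.new(:name => "Somebody")
p.save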

Building consumable uri/urls for a model (rails/datamapper/SOA)

Perhaps you can help me think this through to greater detail.
I need to build or make available a uri for a model instance that can be referenced or used by another application which may or may not be a rails application.
e.g.
I create a standard Post with content; I want to build a URL for that post that another application can consume or reference by looking at the model in the database (or in another, less sticky, fashion). DataMapper has a URI field; I want to build a canonical URI, store it there, and have another application be able to access, announce, manipulate it, etc.
Basically, I have several applications that may be in different places, that need to access the same model, to do differing things with the model. I need a way to make that happen clearly without putting them all in one monster application.
I've looked at PubSubHubbub, RSS, etc., but haven't found any concrete examples of what I'm trying to do. Do I need to create a common API for the applications, etc.?
DataMapper is very flexible about using existing databases.
Many people come to DataMapper because it can create and tear down the database structures without migrations. However, you do not have to work with it in that way.
I have had good success with using a large set of models owned by a central 'housekeeping' app and then declaring a small subset of the same models in separate 'interface' apps.
Some trial and error is required to figure out what works but it can certainly be done. I'd suggest putting your models in modules and including them across apps if possible.
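As a sketch of that shared-module idea (DataMapper syntax; the model, its properties, and the connection string are all assumptions):

require 'dm-core'
require 'dm-types'   # provides the URI property type

module SharedModels
  class Post
    include DataMapper::Resource

    property :id,      Serial
    property :content, Text
    property :uri,     URI    # canonical URI stored alongside the record
  end
end

# Each app sets up its own connection and finalizes the shared models:
DataMapper.setup(:default, 'postgres://localhost/shared_db')
DataMapper.finalize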
A final point: it sounds like you want URIs/URLs to be the primary interface. If that is the case, I strongly suggest you look at Sinatra. It is entirely oriented around URLs (and I find Rails routes very obtuse).

One model - many DB connections in Rails?

(using Ruby on Rails and ActiveRecord)
In short: Is there some way how a model can use different DB schema for each request?
Why I need it: I have a working intranet application for one company, but I decided I would like to offer it to other companies. The Rails way would be to add company_id to each model and always use this scope. But making a separate DB schema for each company would make much more sense. Is there some standard way to do it?
thanks!
What would be wrong with having a separate instance of your application for each company?
Adding company_id to all the models is absolutely the way to go. What you're talking about is very difficult to manage in the long haul, and it may be tricky to ensure the correct connection is used to store the correct data.
Although layering in differentiation like that is annoying, it can be done and proven in a fairly short period of time, and after that things will be easier to manage. With named_scope it is not hard to filter using attributes like that.
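For instance, a minimal named_scope sketch of that filtering (Rails 2.x syntax; the Invoice model and current_company helper are assumptions):

class Invoice < ActiveRecord::Base
  belongs_to :company
  # Scope every query down to a single company's rows
  named_scope :for_company, lambda { |company|
    { :conditions => { :company_id => company.id } }
  }
end

# In a controller action:
@invoices = Invoice.for_company(current_company)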
The simple alternative is to deploy the application more than once, with a different database.yml for each company, where the data is isolated on the application level, not within the application.
It would be easy to do this with Passenger (mod_rails) and a bit of shell scripting.

Generate new models and schema at runtime

Let's say your app enables users to create their own tables in the database to hold their own, custom data. Each table would have its own schema. What are some good approaches?
My first stab involved dynamically creating migration files and model files, but I'd like to run this on Heroku where you can't write to the filesystem.
I'm thinking eval may be the way to go to create and run the migration class and the model class. But I want to make sure the model class exists when a new process of the app is spawned. I can probably do this by storing these class definitions with each user as they create new tables and then running through them all at startup. But now it's convoluted enough that I may be missing something obvious.
It's probably a better idea not to generate new classes at runtime. Besides all of the security risks, each process's startup time will be abominable if you ever get a significant number of users.
I would suggest rethinking your app design and aim at generic tables to hold the user's custom data. If you have examples of data structures that users can create we might be able to help.
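One generic shape that often works is a row store with a serialized hash per row instead of a physical table per user; a rough sketch (all names here are assumptions):

class CustomTable < ActiveRecord::Base
  has_many :custom_rows
end

class CustomRow < ActiveRecord::Base
  belongs_to :custom_table
  serialize :data, Hash     # arbitrary user-defined key/value pairs
end

table = CustomTable.create(:name => "inventory")
row   = table.custom_rows.create(:data => { "color" => "red", "size" => 12 })
row.data["color"]   # => "red"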
Have you thought about a non-SQL database for those tables? Look at CouchDB - there are several plugins on GitHub that integrate it with Rails. Records in the database are JSON documents with an arbitrary key-value structure. It may be perfect for a user-defined schema.
There is (was?) a cool Wiki project, called Informl. It was a Wiki, not just for web pages but for web applications. (Get it? It's informal because it's a Wiki, it's got forms because it is an application, and it's user-generated, thus Web 2.0, which means that according to an official UN resolution it is legally required to have a name which is missing a vwl.)
So, in other words, it was not just about user-generated content, but also user-generated structured data.
They did this by generating PostgreSQL-specific SQL at runtime to create new tables and then have ActiveRecord reload the schemas.
The code is up on RubyForge. It's based on Rails 1.2.3. I guess you could do much better than that today, especially with the upcoming extensibility interfaces in Rails 3.
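A rough sketch of that runtime approach, i.e. creating the table through the connection and then defining an ActiveRecord class for it (all names and the column list are assumptions; on Rails 2.x you'd use set_table_name instead of the table_name setter):

def define_user_table(table_name, columns)
  # Create the physical table directly through the connection
  ActiveRecord::Base.connection.create_table(table_name) do |t|
    columns.each { |name, type| t.column(name, type) }
  end

  # Define a model class for it and register it under a constant
  klass = Class.new(ActiveRecord::Base) do
    self.table_name = table_name
  end
  Object.const_set(table_name.to_s.camelize, klass)
end

define_user_table("user_data_42", "title" => :string, "quantity" => :integer)
UserData42.create(:title => "hello", :quantity => 1)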
