Writing to 2 or more different data layers - ruby-on-rails

I'm looking for a way in Rails to write to two or more different data layers (databases) at the same time. The first layer is the most important and should hold the request until it finishes; the others can be processed in the background.
For example, if I have a Person model and I create a new record, I want the entry to be saved in MongoDB first, with later saves to MySQL, Cassandra, and so on.
Any ideas and links are welcome.
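
One common shape for this in Rails is to save synchronously to the primary store, then fan out to the others with background jobs. A rough sketch, assuming Mongoid as the primary store and ActiveJob for the async writes (all model and job names here are illustrative):

# app/models/person.rb -- primary store is MongoDB via Mongoid (assumed setup)
class Person
  include Mongoid::Document
  field :name, type: String

  # The MongoDB write happens in the request; secondary writes are queued.
  after_save :replicate_to_secondary_stores

  private

  def replicate_to_secondary_stores
    ReplicatePersonJob.perform_later(id.to_s, attributes.except('_id'))
  end
end

# app/jobs/replicate_person_job.rb -- hypothetical background replication
class ReplicatePersonJob < ApplicationJob
  def perform(mongo_id, attrs)
    # SqlPerson would be a plain ActiveRecord model pointed at MySQL;
    # similar jobs could write to Cassandra and so on.
    SqlPerson.find_or_initialize_by(mongo_id: mongo_id).update!(attrs)
  end
end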

I am not sure about a Rails solution, but there is one Java-based ORM that can help you achieve exactly this.
You can try exploring https://github.com/impetus-opensource/Kundera. You would love it.
Kundera is a JPA 2.0-compliant object-datastore mapping library for NoSQL datastores; it currently supports Cassandra, HBase, MongoDB, and all relational datastores (internally it uses Hibernate for the relational ones).
In your case you could use your existing objects along with JPA annotations to store them in Cassandra, MongoDB, MySQL, etc.
Since this is Java, you could build a Java-based service that your Rails app calls.

Related

Rails 5 App with Elasticsearch as database instead of a relational one

Is it possible to use Elasticsearch as the database for a Rails application?
I have gone through many sites, blogs, and videos looking for an answer, but couldn't find one; this question is the closest I've found.
I am not sure how it can be done: what goes in config/database.yml, and will the schema still be generated after running migrations?
Yes, of course it is possible, but you cannot use the ActiveRecord ORM; you'll basically have to create your own adapter.
If you want a quick route, I would advise you to create the ActiveRecord models just like in any regular app, then use Searchkick and create mappings from your models.
Be aware that if you're not using a database to hold the values, you'll need to create a repository to handle the CRUD operations against Elasticsearch.
Another option is to use https://github.com/elastic/elasticsearch-rails, but in both cases you still need the Rails models.
If you really want to go Elasticsearch-only, your controllers will need to call your own hand-written repositories to fetch and save records in Elasticsearch.
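
To make the Searchkick route concrete, here is a minimal sketch, assuming a regular ActiveRecord model (names illustrative):

# Gemfile: gem 'searchkick'
class Product < ActiveRecord::Base
  searchkick   # builds and maintains an Elasticsearch index for this model
end

Product.reindex                              # index the existing rows
Product.search("laptop", fields: [:name])    # query Elasticsearch, not SQL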
No. If you want to use ActiveRecord, only relational databases such as MySQL, PostgreSQL, and SQLite are supported out of the box; there are also mappers for MongoDB and the like.
There are some Elasticsearch mappers and adapters out there, but I wouldn't touch them with a ten-foot barge pole; some things just shouldn't exist in this world.

MongoDB with PostgreSQL in One Rails App

Can I use MongoDB and PostgreSQL in one Rails app? Specifically, I will eventually want to use something like MongoHQ. So far my experiments have failed, and it concerns me that the MongoDB documentation specifically says I have to disable ActiveRecord. Any advice would be appreciated.
You don't need to disable ActiveRecord to use MongoDB. Check out Mongoid: just add the gem plus any Mongoid models alongside your existing ActiveRecord models. Note that MongoHQ is just a hosting service for MongoDB and can be used with any Object Document Mapper (ODM).
For further details check http://mongoid.org/en/mongoid/docs/installation.html. Just skip the optional 'Getting Rid of Active Record' step.
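
In practice the two ORMs simply coexist side by side; a minimal sketch (model names illustrative):

# Gemfile: gem 'mongoid', alongside the usual pg gem

# app/models/article.rb -- stored in MongoDB via Mongoid
class Article
  include Mongoid::Document
  field :title,     type: String
  field :body,      type: String
  field :author_id, type: String
end

# app/models/user.rb -- stored in PostgreSQL via ActiveRecord, as usual
class User < ActiveRecord::Base
end

# Cross-store references are manual, e.g. store the foreign id yourself:
Article.where(author_id: user.id.to_s)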
On a recent client site I worked with a production system that merged MySQL and MongoDB data in a single Java app. To be honest, it was a nightmare. Joining data across the two databases required complex Java data structures and lots of code, which is exactly what databases themselves do best.
One use case for a two-database system is to keep the pure transactional data in the SQL database and aggregate it into MongoDB for reporting and the like. In fact this had been the original plan at the client, but along the way the two databases became interrelated for transactional data.
The system has become so difficult to maintain that it is planned to be scrapped and replaced with a MongoDB-only solution (using Meteor.js).
Postgres has excellent support for JSON documents via its jsonb datatype, and it is fully supported under Rails 4.2 out of the box. I have worked with this as well and find it a breeze; I would recommend this approach.
It allows an easy mix of SQL and NoSQL-style queries, e.g.:
select id, blast_results::json#>'{"BlastOutput2","report","results","search","hits"}'
from blast_caches
where id in (
  select primer_left_blast_cache_id
  from primer3_output_pairs
  where id in (185423, 185422, 185421, 185420, 185419)
)
It doesn't offer the full set of MongoDB's data-manipulation features, but it is probably enough for most needs.
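
On the Rails side a jsonb column behaves much like a serialized hash; a minimal sketch, assuming Rails 4.2 on PostgreSQL (table and column names illustrative):

# Migration: add a schemaless jsonb column with a GIN index
class AddPayloadToEvents < ActiveRecord::Migration
  def change
    add_column :events, :payload, :jsonb, default: {}
    add_index  :events, :payload, using: :gin   # speeds up containment queries
  end
end

# Containment query with the @> operator
Event.where("payload @> ?", { status: "failed" }.to_json)

# Extract a nested value as text with #>>
Event.where("payload #>> '{report,status}' = ?", "ok")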
Some useful links here:
http://nandovieira.com/using-postgresql-and-jsonb-with-ruby-on-rails
https://dockyard.com/blog/2014/05/27/avoid-rails-when-generating-json-responses-with-postgresql
There are also reports that it can outperform MongoDB on JSON workloads:
http://www.slideshare.net/EnterpriseDB/the-nosql-way-in-postgres
Another option would be to move your Rails app entirely to MongoDB, and Rails has very good support for MongoDB.
I would not recommend running two databases, based on personal observations on how it can go bad.

How should I go about using a rdbms and mongodb in a rails app?

I'm currently testing the waters with Mongoid and have so far begun building an e-commerce store. Mongoid doesn't have transactions, but I'd still like to use it for most of the app, including authentication, authorization, product information, etc.
However, the lack of transactions necessitates a return to an RDBMS, which would be used purely to record financial transactions.
Is this possible in Rails, and has anyone done it?
I have limited experience with Rails in general, but I imagine mounting the secure part as an engine, with URLs scoped under secure.myapp.com or myapp.com/secure/, redirecting the user to SSL while Rack takes care of things like shared sessions.
Would this work? Or has anyone found a better way of implementing this?
It is possible to mix MongoDB and a traditional RDBMS, but you may have to do some extra coding if you want ActiveRecord objects to communicate with MongoDB objects, since the ORMs are different. Keep in mind that while MongoDB does not support transactions across multiple documents, it does support atomic single-document updates: if all the data you are changing lives within one document, you don't have to worry about transactions. MongoDB also supports safe writes, allowing you to verify that data has been written to n different replica servers and persisted to disk.
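
For example, if an order embeds its line items, updating both is one write to one document; a rough sketch using Mongoid (names illustrative):

class Order
  include Mongoid::Document
  field :total_cents, type: Integer, default: 0
  embeds_many :line_items
end

class LineItem
  include Mongoid::Document
  field :sku,         type: String
  field :price_cents, type: Integer
  embedded_in :order
end

# Both changes live in the same document, so this save is atomic:
order.line_items.build(sku: "ABC-1", price_cents: 500)
order.total_cents += 500
order.save!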
As for shared sessions between HTTPS and HTTP: this is not something you have to worry about. You'll define your session store as MongoDB, MySQL, Memcached or, my recommendation, cookies. As long as you define your cookie domain as '.myapp.com', the cookies will be shared across all subdomains of your application regardless of the protocol.
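
The cookie scoping itself is a one-line configuration; a sketch, assuming the cookie session store (the domain is illustrative):

# config/initializers/session_store.rb
# The leading dot shares the cookie across myapp.com and all its
# subdomains, so HTTP and HTTPS endpoints see the same session.
Rails.application.config.session_store :cookie_store,
  key:    '_myapp_session',
  domain: '.myapp.com'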
While I can't comment directly on the Rails aspect of the question: as the first poster noted, MongoDB does support atomic single-document updates. It's probably simpler to implement your entire system in Mongo, or entirely in an RDBMS.
The real question is: what is the motivation for using Mongo here? What are you hoping to gain from a document database model? Do you just want to dump RoR objects straight into Mongo?
Just a suggestion (abstractly), but you could strictly define your objects up front and represent that definition in your RDBMS. That will probably save you a lot of time if you don't have a clear motivation for using Mongo. Mongo is an awesome technology, but it's best at sorting through and cataloging data rather than representing strict data structures (not that it's incapable of doing so, but with a document database you have far more flexibility in the content of each object in your db).
Good luck!

Extensible Rails application that connects to several databases

I am implementing a Rails application that needs to aggregate search results from N independent heterogeneous databases. An example use case would be:
User queries "xpto"
The query is submitted to all the databases registered on the system
The results are transformed and combined in a predefined format
User gets the results
The application needs to provide an extensibility point for the introduction of new databases in the system. A database here can be of different kinds - a flat file, SQL database, REST interface to an SQL database, etc.
If I was working in C#/Java, ignoring speed issues, I would define a plug-in management system where each host would have a plug-in that would know how to query and transform the results of the host. New hosts would be easily introduced by defining a new plug-in and configuring the host in the system.
I am a newcomer to Rails, and I am looking for ideas, tools, or design patterns that can help me solve this problem.
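
The plug-in idea translates directly to Ruby: define a tiny interface each data source implements, register instances, and fan the query out. A rough sketch (all class names and the endpoint are hypothetical):

require 'net/http'
require 'json'
require 'cgi'

# Registry that fans a query out to every registered source and
# concatenates the results, already transformed to a common format.
class SearchRegistry
  def self.sources
    @sources ||= []
  end

  def self.register(source)
    sources << source
  end

  def self.search(query)
    sources.flat_map { |source| source.search(query) }
  end
end

# One plug-in per kind of database; this one wraps a REST interface.
class RestSource
  def initialize(url)
    @url = url
  end

  # Must return results in the predefined common format.
  def search(query)
    body = Net::HTTP.get(URI("#{@url}?q=#{CGI.escape(query)}"))
    JSON.parse(body).map { |row| { title: row['title'], source: @url } }
  end
end

SearchRegistry.register(RestSource.new('https://example.com/search'))
results = SearchRegistry.search('xpto')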
My best guess would be to write a custom ActiveRecord adapter that queries all your databases and combines the results.
From the API reference:
Connections are usually created through ActiveRecord::Base.establish_connection and retrieved by ActiveRecord::Base.connection. All classes inheriting from ActiveRecord::Base will use this connection. But you can also set a class-specific connection. For example, if Course is an ActiveRecord::Base, but resides in a different database, you can just say Course.establish_connection and Course and all of its subclasses will use this connection instead.
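
Concretely, the class-specific connection from that quote looks like this; a minimal sketch, assuming a second entry in database.yml (the key name is illustrative):

# config/database.yml (excerpt)
# courses_db:
#   adapter:  mysql2
#   database: courses
#   host:     legacy-db.internal

class Course < ActiveRecord::Base
  # Course and all of its subclasses now use this connection;
  # the rest of the app keeps the default one.
  establish_connection :courses_db
end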

Generate new models and schema at runtime

Let's say your app enables users to create their own tables in the database to hold their own custom data, each table with its own schema. What are some good approaches?
My first stab involved dynamically creating migration files and model files, but I'd like to run this on Heroku, where you can't write to the filesystem.
I'm thinking eval may be the way to go for creating and running the migration class and the model class. But I want to make sure the model class exists when a new process of the app is spawned. I could probably do this by storing the class definitions with each user as they create new tables, then running through them all at startup. But by now it's convoluted enough that I may be missing something obvious.
It's probably a better idea not to generate new classes at runtime. Besides all of the security risks, each process's startup time will be abominable if you ever get a significant number of users.
I would suggest rethinking your app design and aiming at generic tables to hold the users' custom data. If you have examples of the data structures users can create, we might be able to help.
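
One common shape for such generic tables is the entity-attribute-value pattern; a rough sketch (schema and names illustrative), with the usual caveat that EAV trades away type safety and query speed:

# Schema idea: one row per user-defined record, one row per field value.
#
# custom_records: id, user_id, record_type  (the user's "table" name)
# custom_values:  id, custom_record_id, key, value

class CustomRecord < ActiveRecord::Base
  belongs_to :user
  has_many :custom_values

  # Materialize the user-defined fields as a plain hash.
  def fields
    custom_values.each_with_object({}) { |v, h| h[v.key] = v.value }
  end
end

class CustomValue < ActiveRecord::Base
  belongs_to :custom_record
end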
Have you thought about a non-SQL database for those tables? Look at CouchDB; there are several plugins on GitHub that integrate it with Rails. Records in the database are JSON documents with an arbitrary key-value structure, which may be perfect for a user-defined schema.
There is (was?) a cool Wiki project, called Informl. It was a Wiki, not just for web pages but for web applications. (Get it? It's informal because it's a Wiki, it's got forms because it is an application, and it's user-generated, thus Web 2.0, which means that according to an official UN resolution it is legally required to have a name which is missing a vwl.)
So, in other words, it was not just about user-generated content, but also user-generated structured data.
They did this by generating PostgreSQL-specific SQL at runtime to create new tables, then having ActiveRecord reload the schemas.
The code is up on RubyForge. It's based on Rails 1.2.3. I guess you could do much better than that today, especially with the upcoming extensibility interfaces in Rails 3.
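
The same trick works without writing migration files to disk, which also sidesteps the Heroku restriction from the question: run the DDL directly and have ActiveRecord refresh its cached schema. A rough sketch against a modern ActiveRecord (table and class names illustrative):

# Create the table with raw DDL at runtime -- no migration file needed.
ActiveRecord::Base.connection.create_table :user_widgets do |t|
  t.string :name
  t.text   :settings
end

# Define the model class without eval, then expose it as a constant
# and refresh the column cache so queries see the new schema.
klass = Class.new(ActiveRecord::Base) do
  self.table_name = 'user_widgets'
end
Object.const_set(:UserWidget, klass)
UserWidget.reset_column_information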
