Using PostgreSQL and jsondb in the same web app - ruby-on-rails

My site is written in Ruby (Rails), and it's very easy to persist the results of an offsite JSON feed using jsondb, so I have an app that would benefit from this. However, I'd like to keep the rest of the site running on PostgreSQL.
Would I be better off moving everything to one database (jsondb?), or does Rails easily allow me to use multiple ORMs in the same app?
# just notes, ignore if you like because the answers are subjective
# Perhaps I should build two web apps?

Sometimes it is practical to use multiple databases.
I'd take a hard look at the tenacity gem, which was introduced recently as a way to manage multiple databases within Rails, and even relationships between them.
It doesn't look like it currently supports jsondb, but given its architecture, it should be possible to write your own adapter (... and then contribute it back?)
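If you do end up with two stores that both speak ActiveRecord, the plain Rails mechanism for this is per-model establish_connection. The sketch below assumes a hypothetical second configuration called feed_db in database.yml and an invented FeedEntry model; jsondb itself would still need an adapter (tenacity-style or otherwise) before it could be used this way:

# config/database.yml (second, hypothetical connection)
# feed_db:
#   adapter: postgresql   # whatever adapter the second store provides
#   database: feed_cache

# app/models/feed_entry.rb -- this model reads/writes the second database
class FeedEntry < ActiveRecord::Base
  establish_connection :feed_db
end

# Models that never call establish_connection keep using the default
# PostgreSQL connection, so both stores coexist in one app.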

It's really better to have only one database. If PostgreSQL suits you better, use it exclusively; otherwise use only jsondb.
With more than one database it can be hard to keep track of where your data actually lives.

Related

Rails app with test & live data access similar to Stripe

I have a Rails 4 app that exposes an API to external users. The users also get access to a web dashboard where they can see & manage data related to API calls, similar to Stripe. The Stripe dashboard also lets you switch between live & test data, and I am looking to replicate similar behavior. Are there any design recommendations or a Rails way to do this? Should I use a separate database (db_live vs db_test), or separate tables inside db_live with a *_test table naming convention to access test data inside the live database?
What's the Rails/ActiveRecord way to do this? I am using Postgres as the database.
One potential solution would be to simply add a live (or test) boolean column to the appropriate database tables and use scopes to apply the desired where condition. An index on the column would also help with performance.
The practicality of this solution depends on exactly how test data is generated and how much of it you expect there to be per user/account.
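A minimal sketch of that idea (the ApiCall model, column, and scope names here are invented for illustration):

# db/migrate/xxx_add_live_flag_to_api_calls.rb
#   add_column :api_calls, :live, :boolean, null: false, default: true
#   add_index  :api_calls, :live

class ApiCall < ActiveRecord::Base
  scope :live_data, -> { where(live: true) }
  scope :test_data, -> { where(live: false) }
end

# In the dashboard, pick the scope from the current mode:
# scope = params[:mode] == "test" ? ApiCall.test_data : ApiCall.live_data
# scope.where(account_id: current_account.id)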
I was searching for the same answer as well. So far, the best option I can think of is a multi-tenant system: set a session variable to test or live and, based on it, connect to different databases, or in the case of Postgres, to different schemas. This way all the code remains DRY and all the switching logic between the test and live systems lives in a single place.
Here's a basic idea on multi-tenant systems:
http://jerodsanto.net/2011/07/building-multi-tenant-rails-apps-with-postgresql-schemas/
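A rough sketch of the schema-per-mode idea from that article, assuming Postgres schemas named live and test already exist and session[:data_mode] holds the current mode (all of these names are illustrative):

class ApplicationController < ActionController::Base
  around_action :use_data_mode_schema

  private

  def use_data_mode_schema
    schema   = session[:data_mode] == "test" ? "test" : "live"
    original = ActiveRecord::Base.connection.schema_search_path
    # Point every query in this request at the chosen schema.
    ActiveRecord::Base.connection.schema_search_path = "#{schema},public"
    yield
  ensure
    ActiveRecord::Base.connection.schema_search_path = original
  end
end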

MongoDB with PostgreSQL in One Rails App

Can I use MongoDB and a PostgreSQL in one rails app? Specifically I will eventually want to use something like MongoHQ. So far I have failed to get this working in experimentation. And it concerns me that the MongoDB documentation specifically says I have to disable ActiveRecord. Any advice would be appreciated.
You don't need to disable ActiveRecord to use MongoDB. Check out Mongoid and just add the gem plus any models alongside your existing ActiveRecord models. You should note that MongoHQ is just a hosting service for MongoDB and can be used alongside any Object Document Mapper (ODM).
For further details check http://mongoid.org/en/mongoid/docs/installation.html. Just skip the optional 'Getting Rid of Active Record' step.
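As a feel for how the two coexist, a Mongoid document class can sit right next to an ActiveRecord model (the model names below are just examples):

# Gemfile: gem 'mongoid'

# app/models/feed_item.rb -- stored in MongoDB via Mongoid
class FeedItem
  include Mongoid::Document
  field :title,   type: String
  field :payload, type: Hash
end

# app/models/user.rb -- stays in PostgreSQL via ActiveRecord
class User < ActiveRecord::Base
end

# FeedItem.create(title: "hello", payload: { "a" => 1 })
# User.first
# Both work in the same request, each against its own database.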
On a recent client site I worked with a production system that merged MySQL and MongoDB data in a single Java app. To be honest, it was a nightmare. Joining data between the two databases required complex Java data structures and lots of code, which is exactly what databases do best.
One use case for a two-database system is to keep the pure transactional data in the SQL database and aggregate the data into MongoDB for reporting etc. In fact this had been the original plan at the client, but along the way the databases became interrelated for transactional data.
The system has become so difficult to maintain that it is planned to be scrapped and replaced with a MongoDB-only solution (using Meteor.js).
Postgres has excellent support for JSON documents via its jsonb datatype, and it is fully supported under Rails 4.2, out of the box. I have also worked with this and I find it a breeze, and I would recommend this approach.
This allows an easy mix of SQL and NoSQL transactions, e.g.:
select id, blast_results::json#>'{"BlastOutput2","report","results","search","hits"}'
from blast_caches
where id in (
  select primer_left_blast_cache_id
  from primer3_output_pairs
  where id in (185423, 185422, 185421, 185420, 185419)
)
It doesn't offer the full MongoDB data manipulation features, but it is probably enough for most needs.
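A small sketch of the ActiveRecord side in Rails 4.2, reusing the blast_caches table and blast_results column from the query above (the migration, index, and containment query are illustrative):

class AddBlastResultsToBlastCaches < ActiveRecord::Migration
  def change
    add_column :blast_caches, :blast_results, :jsonb, default: {}
    add_index  :blast_caches, :blast_results, using: :gin
  end
end

# The column comes back as a plain Ruby Hash:
# BlastCache.find(42).blast_results["BlastOutput2"]

# Containment queries use the jsonb @> operator:
# BlastCache.where("blast_results @> ?", { report: { status: "done" } }.to_json)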
Some useful links here:
http://nandovieira.com/using-postgresql-and-jsonb-with-ruby-on-rails
https://dockyard.com/blog/2014/05/27/avoid-rails-when-generating-json-responses-with-postgresql
There are also reports that it can outperform MongoDB on json:
http://www.slideshare.net/EnterpriseDB/the-nosql-way-in-postgres
Another option would be to move your Rails app entirely to MongoDB; Rails has very good support for it.
I would not recommend running two databases, based on personal observations on how it can go bad.

What's the best way to integrate a Django and Rails app sharing the same MySQL datastore?

I'm going to be collaborating with a Python developer on a web
application. I'm going to be building a part of it in Ruby and he is
going to build another part of it using Django. I don't know much about
Django.
My plan for integrating the two parts is to simply map a certain URL
path prefix (say, any request that begins with /services) to the Python
code, while leaving Rails to process other requests.
The Python and Ruby parts of the app will share and make updates to the
same MySQL datastore.
My questions:
What do people think generally of this sort of integration strategy?
Is there a better alternative (short of writing it all in one language)?
What's the best way to share sensitive session data (i.e. a logged in
user's id) across the two parts of the app?
As I see it you can't use Django's auth, you can't use Django's ORM, you can't use Django's admin, you can't use Django's sessions - all you are left with is URL mapping to views and the template system. I'd not use Django, but a simpler Python framework. Time your Python programmer expanded his world...
One possible way that should be pretty clean is to decide which one of the apps is the "main" one and have the other one communicate with it over a well-defined API, rather than directly interacting with the underlying database.
If you're doing it right, you're already building your Rails application with a RESTful API. The Django app could act as a REST client to it.
I'm sure it could work the other way around too (with the rest-client gem, for instance).
That way, things like validations and other core business logic are enforced in one place, rather than two.
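As a concrete sketch of that direction: whichever app is the client just calls the main app's HTTP API. The rest-client gem mentioned above makes the Ruby side look roughly like this (the endpoint and the shared token header are invented for the example):

require 'rest-client'
require 'json'

# Hypothetical endpoint exposed by the "main" Rails app.
response = RestClient.get(
  'https://main-app.example.com/api/users/42',
  { 'Authorization' => 'Bearer SHARED_API_TOKEN', accept: :json }
)
user = JSON.parse(response.body)
puts user['id']

The Django side would do the same thing with any Python HTTP client, so neither app ever touches the other's tables directly.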
A project, product, whatever you call it, needs a leader.
This is the first proof that you don't have one. Someone should decide whether you're doing Ruby or Python. I prefer Ruby myself, but I understand those who prefer Python.
I think starting a product by asking yourself these kinds of questions is a BAD start.
If your colleague only knows Prototype, and you only know jQuery, are you going to mix those technologies too? Same for the DB? And for testing frameworks?
This is a never-ending argument. One should decide, IMHO, if you want something good to happen. I work with a lot of teams as a consultant, Agile teams, very mature teams some of them, and that's the kind of stuff they avoid at all cost.
The exception is if one of you is going to work on some specific part of the project which REALLY needs one or the other of the technologies, while the other one remains best for the rest of the application.
Think, for example, of batch computing: you have ALL your web app in Rails or Django, and you have a script, called by cron or whatever, computing huge amounts of data outside the web app and filling a DB.
My2Cts.

Using MongoDB for a calendar web app

I have been doing web programming for a few years. All this time I have been using an RDBMS. As a side project, I would like to create a web application and use NoSQL, which I have never used before. The web app is going to be a calendar and article list shared among a project group, and I would be using Ruby on Rails. Would it be fine to use MongoDB for this web app, or do you have any other recommendation?
MongoDB should be fine. I can't think of a particularly compelling reason as to why it'd be superior to an RDBMS or one of the other key-value stores out there for this particular problem, but I can't think of a reason not to use it, either. For a learning project, it should be more than fine.
As far as interfaces go, I'm currently using MongoMapper and am happy with it, and Mongoid is picking up a lot of steam. You can even just use the Mongo driver directly - it's very usable.
Candy looks quite nice as a Ruby lib. There are others like MongoMapper and DataMapper + do_mongo, and likely more.
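To give a feel for it, a MongoMapper model for a calendar entry might look like this (the field names are just an example):

# Gemfile: gem 'mongo_mapper'
class Event
  include MongoMapper::Document

  key :title,     String
  key :starts_at, Time
  key :ends_at,   Time
  key :group_id,  ObjectId

  timestamps!
end

# Event.create(title: "Sprint review", starts_at: Time.now, ends_at: Time.now + 3600)
# Event.where(:starts_at.gte => Time.now).all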

What's the most productive frontend framework to use with SOLR as the backend?

I want to build a web app using SOLR as the only backend. Most of the data will be stored in SOLR via offline jobs, although there is some need for CRUD.
Looking at popular web frameworks today (Rails, Django, web2py, etc.), despite NoSQL the sweet spot for productivity still seems to be active-record implementations sitting on top of an RDBMS.
What's the best framework, in terms of productivity, for building web apps with SOLR as the backend?
All three of the above answers are great recommendations for development frameworks. I would flip your question around and ask "Which is the best web app framework for me?", not "Which is best with Solr?", and make a decision based on your skills, the community that you have around you, and other soft factors. Especially if you are completely agnostic on which way to go.
If you have friends who love Grails and can help you get started, then Grails might be the way to go. Have a Python group that meets regularly? Then Django has a lot to offer. I personally love Rails, and so I would recommend rails. But that is only a recommendation of "What I like" versus "what is best".
The wonderful thing about Solr is how agnostic it is to the front end. It plays nice in so many environments!
The web2py Database Abstraction Layer does not support SOLR at this time, which means you cannot use the DAL syntax for accessing SOLR and you cannot use automatically generated forms from a SOLR DB schema. You can, however, generate forms using SQLFORM.factory as if you had a normal relational database and perform the insert/update/select against SOLR manually. web2py includes libraries for parsing/writing both JSON and XML, so it will be easy to implement SOLR APIs in a few lines of code. If you bring this up on the web2py mailing list we can help with some examples.
EDIT (copied from the answer on the web2py mailing list):
Normally in web2py you define a model
db.define_table('message',Field('body'))
and then web2py generates and processes forms for you:
form = SQLFORM(db.message)
if form.accepts(request.vars):
    do_something
In your case you would not use define_table because web2py DAL does
not support SOLR and you cannot generate forms from the schema but you
can install this: http://code.google.com/p/solrpy/
and you can do
# in model
import solr
s = solr.SolrConnection('http://example.org:8083/solr')

# in controller
form = SQLFORM.factory(Field('body'))
if form.accepts(request.vars):
    s.add(body=request.vars.body)
    s.commit()
    do_something
So the difference is SQLFORM.factory instead of SQLFORM and the extra
line after accepts. That is it.
I would use Sunspot 1.2 and Rails 3.
Sunspot is commonly used as an ActiveRecord extension, but is also designed to be ORM-agnostic. Rails 3 has decoupled ActiveRecord from the framework, making it easy to go entirely ORM-free.
http://outoftime.github.com/sunspot/
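A minimal Sunspot setup on an ActiveRecord model looks roughly like this (the Post model and its fields are illustrative):

# Gemfile: gem 'sunspot_rails'
class Post < ActiveRecord::Base
  searchable do
    text :title, :body        # full-text fields indexed in Solr
    time :published_at
  end
end

# Searching goes through Solr, not SQL:
# search = Post.search do
#   fulltext 'solr'
#   with(:published_at).greater_than(1.year.ago)
# end
# search.results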
By the way, SphinxSearch is a lot faster than Solr/Lucene and has many unique features. Search accuracy is also a lot better, based on my experience and independent benchmarks.
It has a native, very easy Python API and integrates well with web2py.
But it needs an RDBMS, though. I am using web2py + SphinxSearch to build an office-files search engine.
You can give it a try too.
www.sphinxsearch.com
