I'm currently using SQLite3 with a simple post and image sharing app, similar to the Rails 3 Hartl tutorial (in terms of db structure). But I'd like to move to Mongo for future scalability/learning.
I'm also hosted on Heroku, and am using a 15 GB shared db. I attempted to install MongoHQ and MongoMapper (as per Heroku's instructions) for the transition, and according to Heroku's support this part is set up correctly. However, when I turn off the shared db, the app stops working rather than running off of Mongo.
I'm not sure what to do next: do I have to rewrite my code for MongoDB, or does MongoMapper handle all that? Do I lose my data if I switch, and if so, how do I copy it over?
Could any of you please point me to some resources or help me out? Thank you very much!!
MongoDB is not a drop-in replacement for a SQL database. There are a couple of things you need to adapt:
The models' code needs to be updated to use MongoDB. I suggest using Mongoid, an ODM, as it will ease your learning path; Mongoid closely mirrors the Active Record interface.
The current data saved in your SQL database needs to be migrated to MongoDB documents, and this is not automatic. MongoDB does not support migrations the way you are used to in the SQL world; you will need to write your own scripts for that.
I suggest you write a simple app from scratch using your MongoDB ODM of choice (MongoMapper or Mongoid) so that you get familiar with the basics of MongoDB before attempting a migration.
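To give an idea of what the rewrite looks like, here is a minimal sketch assuming a hypothetical Post model with just a title and a body; the field names, the LegacyPost ActiveRecord model and the copy loop are illustrative only, not taken from your app:

# A Mongoid document: fields are declared in the model rather than in a SQL schema.
class Post
  include Mongoid::Document

  field :title, type: String
  field :body,  type: String

  validates :title, presence: true
end

# A one-off copy script could walk the old ActiveRecord model and re-create
# each row as a document (assumes both connections are configured).
LegacyPost.find_each do |legacy|
  Post.create!(title: legacy.title, body: legacy.body)
end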
Related
I am building an e-commerce application using spree in rails. The application uses PostgreSQL as database. I want it to be changed to MySQL. How do I achieve this?
The short answer is, you don't. I've been through three Spree migrations where we were either using legacy data or switching databases, and trust me: you want nothing to do with this.
If you don't need to take the old data with you, you should be fine just running the migrations on a fresh MySQL database. If you need the legacy data, God be with you...
Converting the Spree database schema is no fun, and it's hard to pull just what you need out of the database because almost every data model depends on a foreign key from another. There are dental procedures I would go through voluntarily before attempting what you are proposing. This also raises the question: why do you need to go from Postgres to MySQL?
I have a Ruby script that downloads web pages containing financial statements for publicly traded companies, scrapes the pages for essential financial data, processes the financial data, and writes the results to a Postgres database.
I looked at the procedure for creating a Ruby gem at http://guides.rubygems.org/make-your-own-gem/, and I'm considering making my Ruby web-scraping script a Ruby gem. Unlike the Hello World exercise in the example, my script needs a Postgres database ready to go.
I am working on a Rails app (Doppler Value Investing) that displays the stock parameters. Having a Ruby gem that nicely integrates into my app would be smoother and more elegant than the setup I would otherwise use. (At the moment, I have a separate Ruby app that does the scraping work and writes the results to the Postgres database.)
The one hitch I can think of is the need to manually create a Postgres database first. Is there a way to programmatically do this, or do I simply need to include in the README a statement that says something like "You MUST create a Postgres database with the name *db_name*, or this gem will not work"?
Just include the instruction in the README. Apart from anything else, you can't know ahead of time what privileges the user of your gem is going to have, so you'd have to deal with not being able to create the database programmatically anyway. It's a one-time task, so automating it doesn't make a huge amount of sense.
Once the database is set up, creating the schema automatically does make sense.
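For the schema step, a rough sketch of what the gem's setup task could run once the user has created the database; the database name "doppler_scraper" and the table layout here are made up for illustration:

require "pg"

# Connect to the database the README told the user to create.
conn = PG.connect(dbname: "doppler_scraper")

# Create the tables the scraper writes to, if they are not there yet.
conn.exec(<<-SQL)
  CREATE TABLE IF NOT EXISTS financial_statements (
    id         SERIAL PRIMARY KEY,
    ticker     TEXT NOT NULL,
    metric     TEXT NOT NULL,
    value      NUMERIC,
    created_at TIMESTAMP DEFAULT now()
  );
SQL
conn.close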
I have a Rails app and a Sinatra app, sharing the same database. The Sinatra app uses ActiveRecord.
Can I run migrations from within each app, as if they were in the same app? Will this cause any problems?
The schema.rb file in the Rails app tracks the current migration via
ActiveRecord::Schema.define(:version => 20121108154656) do
but how does the Sinatra app know the current version of the database?
Rails 3.2.2, Ruby 1.9.3.
The version column in the schema_migrations table equates to the timestamp at the front of the Ruby migration file name, for example 20130322151805_create_customers.rb. So if two or more applications are contributing to the schema_migrations table, rollbacks will not be possible when Rails can't find the down() method (because it cannot see a migration file that lives in the other app's db/migrate/...).
I am currently in exactly this situation, and I have opted to have a master ActiveRecord app that manages migrations and data conversions as our database evolves. Keep in mind that part of the deal is keeping the models up to date as well. This has been time consuming, so we are considering breaking the DB apart into business domains and providing (JSON) APIs to query supporting data from another application. This way each application manages its own domain and is responsible for exposing data via an API.
Regards.
If you connect both applications to the same database you should be able to run migrations on it, but I strongly suggest you use another option, since you will almost surely hit a wall at one time or another:
split the database in two if possible, with each application responsible for its own database and migrations.
have one application own the "master" database and use another database for the data specific to the second application, but have that application connect to both databases (each application still only applies migrations to one database).
If you need to share data between multiple applications, another option is to implement a REST service in one and consume it from the other; have a look at the grape gem for a simple way of doing so.
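For reference, a tiny grape service is only a few lines. This is just a sketch, and the Customer model here is a stand-in for whatever data you actually need to share:

require "grape"

class SharedDataAPI < Grape::API
  format :json

  resource :customers do
    # GET /customers/:id returns the record as JSON for the other app.
    get ":id" do
      Customer.find(params[:id])
    end
  end
end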
Edit: I realize I forgot to address how ActiveRecord migrations work now. There is no longer a single "version" of the schema; ActiveRecord reads all your migration filenames, extracts their identifiers (the leading timestamp), and checks whether each has already been applied. So in theory you can run migrations from two applications against the same database, provided they don't interfere.
But if migrations from both apps act on the same tables, you will almost certainly run into big trouble at some point.
I disagree with Schmurfy: even if the options he presents are valid, it's a bit of overkill to share data through REST (granted, it's pretty easy to implement with Ruby/Rails).
If your case is simple, you could just use one database from both apps, and since you use AR in both of them you have no problem with versioning; AR takes care of that.
Also, I don't know what happens if you run db:migrate from both apps simultaneously if you use an inferior DBMS like MySQL that does not allow DDL in a transaction; certainly nothing good.
It would also bother me to have to look up which app needs which column and not have the migrations in one place. You could use a shared repository to manage the migrations for both apps.
Rails migrations store the current database version in the schema_migrations table in the database. So both of your apps will be able to check the current version.
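For example, either app can read the applied versions straight from that table once ActiveRecord is connected (a quick sketch, nothing app-specific assumed):

require "active_record"

# Every applied migration leaves its timestamp in schema_migrations.
applied = ActiveRecord::Base.connection.select_values(
  "SELECT version FROM schema_migrations ORDER BY version"
)
puts applied.last  # the most recently applied migration's timestamp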
The version numbers are timestamps, so there shouldn't be any problem with duplicate values, as it's almost impossible to generate two migrations at the exact same second. So you should be fine here.
The only problem I see is that when you rollback a migration in one app, it'll set the db to the previous known version and I'm not sure if it will pick the previous one from the db (which could be from the other app), or the number from the previous migration file. You may want to test that scenario to make sure.
I decided to put all migrations in the Rails app because:
there is only one database
Rails manages migrations
This has worked well.
This simplifies the system because all migrations are stored in one place. And the Sinatra app doesn't need to know about them anyway.
Can I use MongoDB and a PostgreSQL in one rails app? Specifically I will eventually want to use something like MongoHQ. So far I have failed to get this working in experimentation. And it concerns me that the MongoDB documentation specifically says I have to disable ActiveRecord. Any advice would be appreciated.
You don't need to disable ActiveRecord to use MongoDB. Check out Mongoid and just add the gem plus any models alongside your existing ActiveRecord models. You should note that MongoHQ is just a hosting service for MongoDB and can be used with any Object Document Mapper (ODM).
For further details check http://mongoid.org/en/mongoid/docs/installation.html. Just skip the optional 'Getting Rid of Active Record' step.
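As a quick illustration of what running them side by side looks like, here is a sketch with two made-up models, one backed by PostgreSQL and one by MongoDB (or MongoHQ):

# app/models/user.rb -- plain ActiveRecord, stored in PostgreSQL
class User < ActiveRecord::Base
end

# app/models/event_log.rb -- Mongoid document, stored in MongoDB
class EventLog
  include Mongoid::Document

  field :action,  type: String
  field :payload, type: Hash
end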
On a recent client site I worked with a production system that merged MySQL and MongoDB data in a single Java app. To be honest, it was a nightmare. Joining data between the two databases required complex Java data structures and lots of code, which is actually what databases do best.
One use case for a two-database system is to keep the pure transactional data in the SQL database and aggregate the data into MongoDB for reporting etc. In fact this had been the original plan at the client, but along the way the databases became interrelated for transactional data.
The system has become so difficult to maintain that it is planned to be scrapped and replaced with a MongoDB-only solution (using Meteor.js).
Postgres has excellent support for JSON documents via its jsonb datatype, and it is fully supported under Rails 4.2 out of the box. I have also worked with this, found it a breeze, and would recommend this approach.
This allows an easy mix of SQL and NoSQL queries, e.g.:
select id, blast_results::json#>'{"BlastOutput2","report","results","search","hits"}'
from blast_caches
where id in (
  select primer_left_blast_cache_id
  from primer3_output_pairs
  where id in (185423, 185422, 185421, 185420, 185419)
)
It doesn't offer the full set of MongoDB data manipulation features, but it is probably enough for most needs.
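From the Rails side, the jsonb column is just a normal migration plus the usual query interface. A sketch, reusing the blast_caches table above but with made-up keys in the query:

class AddBlastResultsToBlastCaches < ActiveRecord::Migration
  def change
    add_column :blast_caches, :blast_results, :jsonb, default: {}
    # A GIN index speeds up containment and path queries on the document.
    add_index :blast_caches, :blast_results, using: :gin
  end
end

# Querying into the document from ActiveRecord:
BlastCache.where("blast_results #>> '{BlastOutput2,report,program}' = ?", "blastn")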
Some useful links here:
http://nandovieira.com/using-postgresql-and-jsonb-with-ruby-on-rails
https://dockyard.com/blog/2014/05/27/avoid-rails-when-generating-json-responses-with-postgresql
There are also reports that it can outperform MongoDB on json:
http://www.slideshare.net/EnterpriseDB/the-nosql-way-in-postgres
Another option would be to move your Rails app entirely to MongoDB, and Rails has very good support for MongoDB.
I would not recommend running two databases, based on personal observations on how it can go bad.
I currently have a system in which one Rails 2.3.2 app has a database with all of my content data in it. I have since created another application, using Rails 3.1, which is very similar but with some more features (hence a changed database structure, with columns added and removed here and there).
My issue is that I'm not sure how to get the data in one of my models (I only really want three values from each of the "entries"; I don't work with databases often and so don't really know the lingo) from one database (SQLite, production) to the other (also SQLite, production).
I looked into db:seed; however, it turns out that Rails 2.3.2 doesn't support db:seed, so I cannot use it.
Any ideas on how I can do this and also easily add the missing information to these entries (such as the published_at column, which is new in the newer application and needs to be added for each entry)?
Best Regards,
Joe
If you need to migrate the whole application, check this out: Migrating from Rails 2 to Rails 3
If you only need to manage the SQLite databases, there are a lot of tools to do that. Here's the complete list
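If it really is only three values per entry, a one-off script with the sqlite3 gem may be the least painful route. A rough sketch, with made-up file names, an "entries" table, and three made-up columns; published_at is filled with a placeholder value you would adjust:

require "sqlite3"

old_db = SQLite3::Database.new("old_app/db/production.sqlite3")
new_db = SQLite3::Database.new("new_app/db/production.sqlite3")

# Copy the three columns across and fill in the new published_at column.
old_db.execute("SELECT title, body, slug FROM entries") do |title, body, slug|
  new_db.execute(
    "INSERT INTO entries (title, body, slug, published_at) VALUES (?, ?, ?, ?)",
    [title, body, slug, Time.now.to_s]
  )
end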