How to pull data from a remote server database in Rails? - ruby-on-rails

I am using tiny_tds to connect to a remote database, but it only supports Microsoft SQL Server (and Sybase). Is there any other gem available that can access any vendor's database?

You're not understanding how database access works.
We use a driver to talk to a database. The database vendors have different protocols that are used to connect, so a driver handles that.
Above that layer we'd have an API that talks to the driver. That'd be something like DBI, which knows how to talk to different drivers. We still have to write using the query language of the database but DBI gives us some advantages. It's a pain to convert from one database to another, because usually all the queries change and the inconsistencies between "standards" show up.
Above that layer we'd have something like ActiveRecord or Sequel, which are ORMs, and are mostly DBMS-agnostic. They allow us to use a consistent language to define our connections to databases, create queries, and handle interactions. If we want to talk to a different database manager, we install the driver, change the connection string, and the rest should work.
This is a huge time saver and a "very good thing". You can use SQLite for your proofs of concept, and something like PostgreSQL, MySQL, or Oracle for your production system, without changing queries. Usually only the DSN/connection string changes.
Read through Sequel's "Connecting to a database" document to get an idea of what ORMs can do, along with "Sequel: The Database Toolkit for Ruby" and its "Cheat Sheet" for an idea of what Sequel can do.
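To make the "only the connection string changes" point concrete, here is a minimal pure-Ruby sketch; the URL schemes follow the ones Sequel's connection documentation uses (sqlite://, postgres://, mysql2://), while the helper itself is hypothetical:

```ruby
# Hypothetical helper showing that switching vendors only changes the DSN.
# URL schemes follow the ones Sequel documents for its adapters.
def connection_url(adapter, db, host: "localhost", user: nil, password: nil)
  case adapter
  when :sqlite   then "sqlite://#{db}"                               # file-based, no host
  when :postgres then "postgres://#{user}:#{password}@#{host}/#{db}"
  when :mysql    then "mysql2://#{user}:#{password}@#{host}/#{db}"
  else raise ArgumentError, "unsupported adapter: #{adapter}"
  end
end

connection_url(:sqlite, "dev.db")
# => "sqlite://dev.db"
connection_url(:postgres, "app", user: "u", password: "p")
# => "postgres://u:p@localhost/app"
```

With Sequel, such a URL goes straight into Sequel.connect(url); every query after that line is adapter-agnostic.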

Related

ActiveRecord and NoSQL

I've been working with Rails for a few years and am very used to ActiveRecord, but have recently landed a task that would benefit from (some) NoSQL data storage.
A small amount of data would be best placed in a NoSQL system, but the bulk would still be in an RDBMS. Every NoSQL wrapper/gem I've looked at, though, seems to necessitate the removal of ActiveRecord from the application.
Is there a suggested method of combining the two technologies?
Not sure which NoSQL store you are looking into, but we have used MongoDB alongside Postgres for a while now. A helpful hint: people say you need to get rid of ActiveRecord, but in reality you don't. Most say that because you typically end up not setting up your database.yml and/or not running the rake commands that set up the ActiveRecord database.
Remember also that Postgres has the HStore and JSON datatypes, which give functionality similar to NoSQL datastores. Also, if the data you are looking to store outside of your AR database is not very complex, I would highly recommend looking into Redis.
If you look at the Gemfile.lock of this project, you can see that it uses ActiveRecord with Mongoid.
Even if you add other gems that don't need ActiveRecord, that doesn't mean you must drop it; keep it as long as you have a valid reason to use it.
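For reference, running the two side by side mostly comes down to the Gemfile and keeping database.yml configured for the relational side; a hypothetical sketch (gem choices are illustrative, not prescriptive):

```ruby
# Gemfile sketch: ActiveRecord (via Rails) and Mongoid side by side.
source "https://rubygems.org"

gem "rails"    # brings in ActiveRecord for the relational data
gem "pg"       # PostgreSQL driver backing the ActiveRecord models
gem "mongoid"  # ODM for the documents stored in MongoDB
```

Relational models then inherit from ActiveRecord::Base as usual, while document models include Mongoid::Document; the two don't interfere as long as database.yml stays configured.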

Linking neo4j with other databases

I am working in a research domain called knowledge management, and I am using neo4j.
I want to link my neo4j database with another database that handles physical data storage (PostgreSQL, MySQL...). Is this possible?
In general, sure; it depends on how you want to set up the linking.
Perhaps you can detail your use-case more?
Normally people sync data between other datastores and Neo4j e.g. by triggering updates or polling.
For Postgres there is also a foreign data wrapper.
You can also use an event-sourced system, where data is written to your relational database and the relationships are also written to Neo4j.
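The event-sourced approach can be sketched in plain Ruby; the stores below are in-memory stand-ins (real handlers would insert rows into Postgres and issue Cypher statements to Neo4j), and all names are hypothetical:

```ruby
# Event-sourced dual-write sketch: every domain event is applied to each
# registered store, keeping the RDBMS and the graph in sync.
class EventBus
  def initialize
    @handlers = []
  end

  def subscribe(&handler)
    @handlers << handler
  end

  def publish(event)
    @handlers.each { |h| h.call(event) }
  end
end

relational_rows = []   # stand-in for the RDBMS table
graph_edges     = []   # stand-in for Neo4j relationships

bus = EventBus.new
bus.subscribe { |e| relational_rows << [e[:id], e[:name]] }
bus.subscribe { |e| graph_edges << { from: e[:id], rel: :KNOWS, to: e[:friend_id] } }

bus.publish(id: 1, name: "alice", friend_id: 2)
relational_rows  # => [[1, "alice"]]
graph_edges      # => [{ from: 1, rel: :KNOWS, to: 2 }]
```

The same shape works for the sync-by-polling variant; the handlers just run on a schedule instead of per event.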

Dynamic database connection in a Rails App

I'm quite new to Rails, but in my current assignment I have no choice but to use RoR. My problem is that in my app I would like to create, connect to, and destroy databases automatically on user demand, but as far as I understand it is quite hard to accomplish this with ActiveRecord. It would be nice to hear some advice from more experienced RoR developers on this issue.
The problem in details:
I have a main database (which I access with ActiveRecord). In this database I store a list of my active programs (and some template data for creating new programs). I would like to create a separate database for each of these programs (whenever a user creates a new program in my app).
In the programs' databases I would like to store the state and basic info of the particular program and a huge amount of program related data (which is used to calculate the state and is necessary to have for audit reasons).
My problem is that for example I want a dashboard listing all the active programs and their state data. So first I have to get the list from my main db and after that I have to connect to all the required program databases and get the state data.
My question is what is the best practice to accomplish this? What should I use (ActiveRecord, a particular gem, etc.)?
Hi, thanks for your answers so far. I would like to add a couple of details to make my problem clearer:
First of all, I'm not confusing database and table. In my case there is a tool which processes log files. It's a legacy tool (written in Ruby 1.8.6), and before running it I have to run an SQL script which creates a database with some prefilled and some empty tables for this tool. The tool then processes the logs and inserts the calculated data into different tables in this database. The catch is that the new system should support running programs in parallel, which means I have to create different databases for different programs. (This was not an issue before, when the tool was configured by hand for each run, but now the configuration must be automated by my tool.) There is no way of changing the legacy tool: it would be too complicated in the given time frame, and it's also a validated tool. So this is the reason I cannot use different tables for different programs: my solution has to be built around the other tool.
Summing my task up:
I have to create a complex tool using RoR and Ruby 2.0.0 which:
- creates a specific database for the legacy tool every time a user wants to start a new program
- configures this old tool on a daily basis to process the required logs and insert the calculated data into the appropriate database
- accesses these databases and shows dashboards based on their data
The database I'm using is MySQL.
I cannot use another framework, because the future owner of my tool won't be able to manage/change/update it. So I have to go with RoR, which is quite painful for me right now, and I really hope some of you guys can give me a little guidance.
OK, this is certainly outside the typical use-case scenario, BUT it is very doable within Rails and ActiveRecord.
First of all, you're going to want to execute some SQL directly, which is fine, but you'll also have to take extra care, for instance if you're using user input to determine the name of the new database, and do your own escaping. (Or use one of ActiveRecord's lower-level escaping methods that we normally don't worry about.) The basic idea, though, is something like:
create_sql = <<SQL
CREATE TABLE foo ...
SQL
ActiveRecord::Base.connection.execute(create_sql)
Although now that I look at ActiveRecord::ConnectionAdapters::Mysql2Adapter, there's a #create method that might help you.
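On the escaping point: database names are identifiers, so they can't be passed as bound query parameters; a common defence is to whitelist the name before interpolating it into the SQL. A hypothetical helper (the validation rule and charset are assumptions, not requirements):

```ruby
# Hypothetical helper: whitelist a user-supplied MySQL database name
# rather than merely escaping it. MySQL identifiers max out at 64 chars.
def create_database_sql(name)
  unless name =~ /\A[A-Za-z0-9_]{1,64}\z/
    raise ArgumentError, "illegal database name: #{name.inspect}"
  end
  "CREATE DATABASE `#{name}` CHARACTER SET utf8mb4"
end

create_database_sql("program_42")
# => "CREATE DATABASE `program_42` CHARACTER SET utf8mb4"
# Then: ActiveRecord::Base.connection.execute(create_database_sql(name))
```

Rejecting anything outside [A-Za-z0-9_] sidesteps quoting questions entirely, which is usually safer than trying to escape arbitrary input in an identifier position.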
The next step is actually doing different things in the context of different databases. The key there is ActiveRecord::Base.establish_connection. Using that, and passing in the params for the database you just created, you should be able to do what you need for that particular db. If the dbs weren't being created dynamically, I'd put that line at the top of a standard ActiveRecord model so that model would always connect to that db instead of the main one. If you want to use the same class and connect it to different dbs (one at a time, of course), you would probably call remove_connection before calling establish_connection for the next one.
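A sketch of that per-program connection handling; the config builder and class names are hypothetical, and only the hash construction is shown as runnable code:

```ruby
# Hypothetical per-program connection config. The hash has the shape
# ActiveRecord::Base.establish_connection accepts for the mysql2 adapter.
BASE_DB_CONFIG = {
  adapter:  "mysql2",
  host:     "localhost",
  username: "app",
  password: "secret",
}.freeze

def program_db_config(program_id)
  BASE_DB_CONFIG.merge(database: "program_#{program_id}")
end

program_db_config(7)[:database]  # => "program_7"

# In the app (sketch):
#   class ProgramRecord < ActiveRecord::Base
#     self.abstract_class = true   # no table of its own
#   end
#   ProgramRecord.establish_connection(program_db_config(7))
#   ... query through subclasses of ProgramRecord ...
#   ProgramRecord.remove_connection  # before connecting to the next program db
```

Using an abstract base class keeps the dynamic connection isolated from the main database that the rest of the app's models use.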
I hope this points you in the right direction. Good luck!

SQLite in development, PostgreSQL in production—why not?

Heroku advises against this because of possible issues. I'm an SQL noob, can you explain the type of issues that could be encountered by using different databases?
I used sqlite3 in development and postgres in production for a while, but recently switched to postgres everywhere.
Things to note if you use both:
There are differences between sqlite3 and postgres that will bite you. A common one I ran into is that postgres is stricter about types in queries (where :string_column => <integer> will work fine in sqlite and break in postgres). If your dev database is sqlite, you definitely want a staging area that uses postgres, assuming you care whether your production app goes down because of a SQL error.
Sqlite is much easier to set up on your local machine, and it's great being able to just delete/move .sqlite files around in your db/ directory.
taps allows you to mirror your Heroku postgres data into your local sqlite db. It gets much slower as the database grows; at a few tens of tables and 100K+ rows it starts to take 20+ minutes to restore.
You won't get postgres features like ILIKE, the newer key/value stores, or full-text search
Because you have to stick to widely supported SQL features, it may be easier to migrate your app to mysql later
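The type-strictness pitfall mentioned above can be simulated in plain Ruby; the two lambdas only mimic the engines' behaviour (they are not the real coercion rules):

```ruby
# Simulation: SQLite happily compares a TEXT column against an integer by
# coercing; PostgreSQL raises an error for a query shaped like
#   Model.where(string_column: 42)
rows = [{ sku: "42" }, { sku: "7" }]

sqlite_like   = ->(stored, given) { stored.to_s == given.to_s }
postgres_like = lambda do |stored, given|
  raise TypeError, "operator does not exist: text = integer" unless given.is_a?(String)
  stored == given
end

rows.select { |r| sqlite_like.call(r[:sku], 42) }      # => [{ sku: "42" }]
rows.select { |r| postgres_like.call(r[:sku], "42") }  # => [{ sku: "42" }]
# rows.select { |r| postgres_like.call(r[:sku], 42) }  # raises TypeError
```

A test suite that only ever runs against SQLite will never exercise the third case, which is exactly why the staging area on postgres matters.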
So why did I switch? I wanted some postgres-only features, kept hitting bugs that weren't caught by testing, and needed to be able to mirror my production db faster (pg_restore takes ~1 minute vs 20+ for taps). My advice is to stay with sqlite in dev because of the simplicity, and then switch when/if you need to down the road. Switching from sqlite to postgres for development is as simple as setting up postgres - there's no added complexity from waiting.
Different databases interpret and adhere to the SQL standard differently. If you were to, say, copy-paste some code from SQLite to PostgreSQL, there's a very large chance it won't immediately work. If it's only basic queries, then maybe, but with anything particular there's a very low chance of complete compatibility.
Some databases are also more up to date with the standard. It's a similar battlefield to that of web browsers. If you've ever made some websites, you'll know compatibility is a pain in the ass, having to get things to work in older versions and Internet Explorer. Because some databases are older than others, and some even older than the standards, they each had their own way of doing things, which they can't just scrap to jump to the standard because they would lose support for their existing larger customers (this is especially the case with a database engine called Oracle). PostgreSQL is sort of like Google Chrome, quite high up there on standards compliance but still with some little quirks of its own. SQLite is, as the name suggests, a lightweight database system; you can assume it lacks some of the more advanced functionality from the standards.
The database engines also perform the same actions differently. It is worth getting to know and understand one database and how it works (deeper than just the query level) so you can make the most of that.
I was in a (kind of) similar situation. Generally it is a very bad idea to use different database engines for development and production. There are multiple reasons:
SQL syntax differences including DML, DDL statements, stored procedures, triggers etc
Performance optimizations done on one DB won't be valid on the other
SQLite is an embedded database, PostgreSQL is not
They don't support the same data types
Different syntax/commands to configure/set up the db (SQLite uses PRAGMAs, for instance)
One should stick to one db engine unless there is a really, really good reason. I can't think of any.

Using SQLite as production database, bad idea but

We are currently using PostgreSQL for our production database in Rails (a great database), but I am building the new version of our application around SQLite. We don't actually use advanced Postgres features like full-text search or PL/pgSQL. With SQLite, I love the idea of moving the database around as just one file, its simple integration in a server and in Rails, and the performance seems really good -> Benchmark
Our application's traffic is relatively high: we get something like 1,200,000 views/day. So we do a lot of reads from the database, but only a few writes.
What do you think of that? Any feedback from anyone using, or trying (like us) to use, SQLite as a production database?
If you do lots of reads and few writes, then combine SQLite with some sort of in-memory cache (memcached or Redis are really good for this). This helps minimize the number of reads hitting the database. The approach helps in any many-reads-few-writes environment, and in your specific case it also helps you avoid SQLite's deficiencies.
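The cache-aside pattern described here can be sketched in plain Ruby; the Hash stands in for memcached/Redis and the loader block for an SQLite query (all names are hypothetical):

```ruby
# Cache-aside sketch for a many-reads-few-writes setup: check the cache,
# fall back to the database on a miss, store the result for next time.
class ReadThroughCache
  def initialize(&loader)
    @store  = {}      # stand-in for memcached/Redis
    @loader = loader  # stand-in for the SQLite query
    @misses = 0
  end

  attr_reader :misses

  def fetch(key)
    return @store[key] if @store.key?(key)
    @misses += 1
    @store[key] = @loader.call(key)
  end

  def invalidate(key)  # call this on the rare writes
    @store.delete(key)
  end
end

cache = ReadThroughCache.new { |id| "row-#{id} from sqlite" }
cache.fetch(1)   # => "row-1 from sqlite"  (miss, hits the db)
cache.fetch(1)   # => "row-1 from sqlite"  (served from cache)
cache.misses     # => 1
```

Because writes are rare, invalidating on write stays cheap, and most of the 1.2M daily views never reach SQLite at all.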
SQLite is designed for embedded systems. It will work fine with a single user, but it doesn't handle concurrent requests very well, and 1.2M views per day probably means you'll get plenty of those.
For doing only reads, I think in theory it can be faster than an out-of-process database server, because you do not have to serialize data to memory or network streams; it's all accessed in-process. In practice an RDBMS could be faster; for example, MySQL has pretty good query-caching features, and for certain queries that could be an improvement because all your Rails processes would use this same cache. With SQLite they would not share a cache.
