Visual SQL query generator for Ruby on Rails

In my RoR app I need to develop a visual query generator as similar as possible to the one provided by MS Access (a sample screenshot of how this looks in MS Access was attached to the original question).
The user would be able to choose DB tables, from the tables choose fields, and then add conditions to the fields.
Is there any gem / code that you are aware of that would help me in this endeavour?

This would need a lot of TLC to end up looking like the MS Access query designer, but you could use the Ransack gem to handle the queries and nested associations.
http://railscasts.com/episodes/370-ransack
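As a sketch of how Ransack could back such a builder: Ransack encodes the field and predicate in parameter keys (e.g. `name_cont` means "name contains"), so the visual layer mostly has to emit that hash. Only `ransack`/`result` below are Ransack's actual API; the `Product` model, the `PREDICATES` mapping, and the helper methods are hypothetical illustrations.

```ruby
# In a Rails controller you would write (sketch, model name hypothetical):
#   @q       = Product.ransack(params[:q])
#   @results = @q.result(distinct: true)
#
# A visual query builder can then be reduced to generating that params hash
# from the user's field/condition choices.

# Map a UI operator to a real Ransack predicate suffix.
PREDICATES = {
  "contains" => "cont",
  "equals"   => "eq",
  "starts"   => "start",
  "greater"  => "gt",
  "less"     => "lt"
}.freeze

# One UI condition ("field", "operator", "value") -> one Ransack key/value.
def ransack_param(field, operator, value)
  { "#{field}_#{PREDICATES.fetch(operator)}" => value }
end

# Several conditions combine into a single :q hash.
def build_query(conditions)
  conditions.reduce({}) { |q, (field, op, value)| q.merge(ransack_param(field, op, value)) }
end

q = build_query([["name", "contains", "Acme"], ["price", "less", 100]])
# q => { "name_cont" => "Acme", "price_lt" => 100 }
```

The hash `q` is exactly what a form would submit as `params[:q]`, so the builder UI and a hand-written search form stay interchangeable.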

Related

Rails + Postgres: Multi-tenancy done right?

I am going to build an app using Rails, with multi-tenancy via the Apartment gem and PostgreSQL. The app will have users, each of which has an account. This implies that each user has its own PostgreSQL schema; the users table is in the default schema.
Each user has his own list of customers in his own schema. A customer with the same email (essentially the same customer) can appear in multiple schemas. I want a customer to be able to log in and see all the users he's associated with. I can't put the customers table in the default/public schema because it's related to other tables that are not in the default schema.
What I thought I would do is create a link table between customers and users in the public schema. This table would contain the email of the customer and the id of the user. My issue with that is that I don't understand how well this would work with Rails. What I would like to achieve is something like customer.users.
So the question is: How should I approach this problem?
I created this lib to help us solve this issue
https://github.com/Dandush03/pg_rls
Currently the best-known implementations are the Apartment gem (from Influitive), activerecord-multi-tenant (from Citus), and the Rails 6.1 way, DB sharding.
There are many issues with the Apartment and Rails 6.1 approaches when dealing with a huge number of schemas/databases, mainly when you must run a large migration or change default values on a table: you need to run that migration on each tenant, which is very costly. Citus's approach, meanwhile, gets expensive in the long run.
Thankfully PostgreSQL shipped a great solution in version 9 (row-level security), though it had some performance issues that were solved in version 10. This approach allows you to keep specific tables behind a 'special id', which can later be partitioned with PostgreSQL's newer tools.
My approach is mainly focused on PostgreSQL and how it recommends you implement RLS, and most of my queries are executed as SQL statements, which helps with performance when running migrations. I tried to mix the best of Rails with the best of PostgreSQL functions.
What is even better about this approach is that when you start using PostgreSQL's search functions there is no penalty, because all data is secure and in the same database. You also gain the ability to log in as a superuser and get your statistics.
I could keep going, but I think I've made my point. I encourage you to check out the gem (there are probably still some bugs to handle; right now it only handles setting the tenant from the subdomain), but it does make my life easier on my ongoing projects. If I get some more support (likes) I will keep maintaining it and upgrading it into a more generic tool.
I suggest differentiating between your users (who log in, and are not part of a tenant) and the customers (which are kept separately, located in each tenant). The users table (possibly accompanied by other tables) can hold the information for the assignment from user to schema/customer etc. I would not even use foreign keys to link the users table with the tables in the tenant, just to keep them really separate.
In short, the user table serves to authenticate and to authorize only.
Update: The question describes a multi-tenancy approach using separate database schemas for the individual tenants. In this setup, I would not link the users with the customers by database foreign keys, and I would not query them together. Just authenticate against the users and get the assigned tenant(s); after that, switch to the tenant.
If you really want to query both items (users and customers) in one run, I would not use separate schemas: One schema, create a tenant table, and put a foreign key into all other tables (customers etc.). In this scenario you could even do without a separate user table, and you could query the (single) customer table.
Update 2: To answer your query question:
You can query for schemas in PostgreSql's meta data:
select schemaname from pg_tables where tablename = 'customer'
Which gives you all schemas with a customer table.
Using that information you can dynamically build a union select:
select name from schema1.customer
union all
select name from schema2.customer
union all
[...repeat for all schemas...]
to query all tables across schemas. You could use group by to eliminate duplicates.
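The dynamic union described above can be generated in Ruby. A minimal sketch, assuming the schema names come from the pg_tables query; the table and column names are examples, and in a real app schema names should be quoted or whitelisted before interpolation:

```ruby
# Build a cross-schema UNION ALL from a list of schema names.
# Schema names come from PostgreSQL metadata, but should still be
# sanitized (e.g. via connection.quote_table_name) before use.
def union_across_schemas(schemas, table: "customer", column: "name")
  schemas
    .map { |schema| "select #{column} from #{schema}.#{table}" }
    .join("\nunion all\n")
end

sql = union_across_schemas(%w[schema1 schema2 schema3])
# In Rails you would then run it with, e.g.:
#   ActiveRecord::Base.connection.select_all(sql)
```

Swapping `union all` for `union` (or adding a `group by`) deduplicates rows, as noted above.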

Where to place sql queries in rails?

I have started a Rails project on top of a legacy database. Standard practices required by an ORM, like assigning an ID field to each table, haven't been followed, so I will not be creating models for all the tables. I need to run queries joining multiple tables using numerous conditions, and I will mostly be using Model.find_by_sql or Model.connection.select_all. Where should I put these queries? Should I stash them in one of the models I have created that is involved in the query?
What's the standard practice for such a situation?
As much as possible, you still want to insulate the rest of your application from the details of the database by putting your queries and whatnot into the model layer. So yes, "stashing" in the right model object relevant to what you're trying to do seems like the right thing.
Are you allowed to change the schema of the database? If so, you may want to use migrations to slowly make your database look more like a standard ActiveRecord backing store.
You may also want to look into alternatives to ActiveRecord such as Sequel.
It is a good idea to place the SQL queries in an sql folder under db/. You need to create the sql folder yourself.
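The "stash it in the right model" advice can be sketched as a query module whose SQL the model wraps in an intention-revealing class method. All table, column, and class names below are hypothetical legacy names, not from the original question; only `find_by_sql`'s array form (SQL plus bind values) is real Rails API:

```ruby
# Keep raw SQL behind named builders so the rest of the app never sees
# the legacy schema directly.
module LegacyOrderQueries
  # Returns [sql, bind] suitable for find_by_sql or select_all.
  def self.open_orders_sql(customer_code)
    sql = <<~SQL
      SELECT m.*, d.ITEM_CODE
      FROM ORD_MASTER m
      JOIN ORD_DETAIL d ON d.ORD_NO = m.ORD_NO
      WHERE m.CUST_CODE = ? AND m.STATUS = 'OPEN'
    SQL
    [sql, customer_code]
  end
end

# On the Rails side (sketch):
#   class LegacyOrder < ActiveRecord::Base
#     self.table_name  = "ORD_MASTER"   # legacy table name
#     self.primary_key = "ORD_NO"       # legacy key instead of `id`
#
#     def self.open_orders_for(code)
#       find_by_sql(LegacyOrderQueries.open_orders_sql(code))
#     end
#   end
```

Callers then write `LegacyOrder.open_orders_for("C100")` and stay insulated from the schema details, as the answer above recommends.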

MongoDB with PostgreSQL in One Rails App

Can I use MongoDB and a PostgreSQL in one rails app? Specifically I will eventually want to use something like MongoHQ. So far I have failed to get this working in experimentation. And it concerns me that the MongoDB documentation specifically says I have to disable ActiveRecord. Any advice would be appreciated.
You don't need to disable ActiveRecord to use MongoDB. Check out Mongoid and just add the gem plus any models along side any of your existing ActiveRecord models. You should note that MongoHQ is just a hosting service for MongoDB and can be used alongside any Object Document Mapper (ODM).
For further details check http://mongoid.org/en/mongoid/docs/installation.html. Just skip the optional 'Getting Rid of Active Record' step.
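A minimal sketch of what "alongside" means in practice, assuming the `mongoid` gem is in the Gemfile; the class and field names are illustrative, but `include Mongoid::Document` and `field` are Mongoid's actual API:

```ruby
# Gemfile:
#   gem "mongoid"

# app/models/user.rb -- ordinary ActiveRecord model backed by PostgreSQL
class User < ActiveRecord::Base
  has_many :posts
end

# app/models/event.rb -- Mongoid document stored in MongoDB
class Event
  include Mongoid::Document
  field :name,    type: String
  field :payload, type: Hash
  field :at,      type: Time
end

# Both can then be queried side by side in the same app:
#   User.where(admin: true)     # SQL via ActiveRecord
#   Event.where(name: "login")  # MongoDB via Mongoid
```

Mongoid reads its connection settings from config/mongoid.yml, so the two data stores are configured independently.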
On a recent client site I worked with a production system that merged MySQL and MongoDB data within a single Java app. To be honest, it was a nightmare. Joining data between the two databases required complex Java data structures and lots of code, which is actually what databases do best.
One use case for a two-database system is to keep the pure transactional data in the SQL database and aggregate the data into MongoDB for reporting etc. In fact this had been the original plan at the client, but along the way the databases became interrelated for transactional data.
The system has become so difficult to maintain that it is planned to be scrapped and replaced with a MongoDB-only solution (using Meteor.js).
Postgres has excellent support for JSON documents via its jsonb datatype, and it is fully supported under Rails 4.2 out of the box. I have also worked with this and I find it a breeze, and I would recommend this approach.
This allows an easy mix of SQL and NoSQL transactions, eg
select id, blast_results::json#>'{"BlastOutput2","report","results","search","hits"}'
from blast_caches
where id in (
  select primer_left_blast_cache_id
  from primer3_output_pairs
  where id in (185423, 185422, 185421, 185420, 185419)
)
It doesn't offer the full MongoDB data manipulation features, but probably is enough for most needs.
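From the Rails side, jsonb queries map cleanly onto `where` with a SQL fragment. A small sketch, assuming a hypothetical jsonb column named `data`; `@>` is PostgreSQL's real jsonb containment operator:

```ruby
require "json"

# Build a jsonb containment condition for ActiveRecord, e.g.:
#   Profile.where(*jsonb_contains(:data, "preferences" => { "newsletter" => true }))
# (Profile and :data are hypothetical names for illustration.)
def jsonb_contains(column, fragment)
  ["#{column} @> ?", JSON.generate(fragment)]
end

cond = jsonb_contains(:data, { "status" => "active" })
# cond => ["data @> ?", "{\"status\":\"active\"}"]
```

Because `@>` works on nested documents, a single condition can match deep inside the stored JSON, which covers many of the lookups people reach for MongoDB to do.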
Some useful links here:
http://nandovieira.com/using-postgresql-and-jsonb-with-ruby-on-rails
https://dockyard.com/blog/2014/05/27/avoid-rails-when-generating-json-responses-with-postgresql
There are also reports that it can outperform MongoDB on json:
http://www.slideshare.net/EnterpriseDB/the-nosql-way-in-postgres
Another option would be to move your Rails app entirely to MongoDB, and Rails has very good support for MongoDB.
I would not recommend running two databases, based on personal observations on how it can go bad.

How to build a Search functionality to search multiple models like GitHub?

I am wondering on how to implement a search functionality like Github.
Just one search box on the top header right and when searched for a keyword, displays the results for Repository, Code and User.
Is there any tutorial or example to implement this on Rails 3?
Odds are really good they're doing separate searches across the tables for the same value, then combining the results afterwards.
Use Rails to create a small form containing a text field. When it's submitted take the value of the field and do a query using that as the search term.
If you're not sure how to do queries using ActiveRecord, see "Active Record Query Interface" for a nice overview.
You will have to do several queries, one per model, and put the results together on the same view.
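The "several queries, combined on one view" idea can be sketched as a registry of per-model finders whose tagged results are merged for display. The model names and the lambda bodies are hypothetical stand-ins; in Rails each lambda would return an ActiveRecord relation, e.g. `->(q) { Repository.where("name ILIKE ?", "%#{q}%") }`:

```ruby
# One finder per searchable model; strings stand in for AR records here.
SEARCHERS = {
  repositories: ->(q) { ["repo:#{q}"] },
  users:        ->(q) { ["user:#{q}"] },
  code:         ->(q) { ["code:#{q}"] }
}.freeze

# Run every finder and tag each hit with its model type, so the view
# can group results under "Repositories", "Users", "Code" headings.
def global_search(query)
  SEARCHERS.flat_map do |type, finder|
    finder.call(query).map { |record| { type: type, record: record } }
  end
end

results = global_search("rails")
# results => [{ type: :repositories, record: "repo:rails" }, ...]
```

Grouping in the view is then just `results.group_by { |r| r[:type] }`.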
If your question is "how do I do full text searches on several activerecord models in a DRY way" then there are basically two paths:
The common solution, though a bit complex, is using a dedicated daemon on your machine, like Sphinx. Sphinx is a service (like Apache or MySQL) that indexes your content and allows you to do searches. You can use the Thinking Sphinx gem to communicate with it easily from Rails. An alternative to Sphinx is Solr (there's also a gem for it called Sunspot).
If you are using PostgreSQL, there's a simpler alternative that doesn't require external services running on your server. PostgreSQL has some full-text search capabilities built in. There's a gem called texticle that helps you use these from Rails. You can have that working very quickly.
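If you want to see what the Postgres-native route looks like without a gem, a condition builder can target PostgreSQL's real `to_tsvector`/`plainto_tsquery` functions directly; the column and model names below are hypothetical:

```ruby
# Build a full-text condition over several columns, usable as:
#   Article.where(*full_text_condition(%w[title body], params[:q]))
# (Article, title, body are illustrative names.)
def full_text_condition(columns, query)
  # Concatenate the columns into one searchable text expression.
  vector = columns.map { |c| "coalesce(#{c}, '')" }.join(" || ' ' || ")
  ["to_tsvector('english', #{vector}) @@ plainto_tsquery('english', ?)", query]
end

cond = full_text_condition(%w[title body], "rails search")
# cond.first =>
#   "to_tsvector('english', coalesce(title, '') || ' ' || coalesce(body, ''))
#    @@ plainto_tsquery('english', ?)"
```

For production use you would normally add a GIN index on the tsvector expression so the search doesn't scan the whole table.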

How to use MYOB in a Rails application

Can anyone explain how to integrate MYOB in a Rails application?
Integration to MYOB is not as simple as it might look.
In order to integrate to MYOB, you have to install the ODBC drivers, have the MYOB application installed and have direct access to the MYOB company file.
What are your integration requirements and purpose?
The solution we chose was to build a desktop app that facilitates the integration between the web app and MYOB.
Hope this helps.
You can; however, the only way to really do this is with the MYOB ODBC driver. The driver uses Myobp.exe as a middle man between ODBC and the .MYO file.
The MYOB ODBC driver presents a SQL interface to MYOB, however there are some caveats. It's not a real SQL interface to MYOB. The data you can get in and out is closely matched by the Import and Export functions that MYOB presents in the program itself.
For getting data out, with the ODBC interface you can write SQL queries, and even some SQL type functions can be in these queries, but you may occasionally come across some things that will catch you out.
For getting data in, the ODBC driver presents a schema of tables with the prefix "Import_". With these tables all you can do is import data; they will not return things like the invoice number you just imported. This also means you cannot update existing entries the same way you would expect with a real SQL interface.
However, some data imports will result in an update of existing data if some of the fields match. My experience with MYOB and its ODBC interface primarily deals with getting sales orders from a system I wrote entered into MYOB, so certain things can be done.
I did write my first version of this system in Rails, however you can't use the standard Rails models without adjusting the way they work because Rails is very opinionated on how it expects the database to act.
More recently I re-wrote it in Clojure and used handwritten queries.
For multiline inserts (for example the lines of an invoice) MYOB uses a transaction as a wrapper to indicate what lines should be on the one invoice.
So your best bet if you want to use Rails, would be to avoid using the Rails models, and perhaps write direct SQL queries and interface to that with your controllers and views.
I can't get to my old code right now, but it's something along the lines of this:
def create_myob_invoice( iso )
  # We can assume we have the job.
  # Now, for each iso line, we add an invoice line.
  latest_invoice_number = Sale.find( :first, :order => 'InvoiceNumber DESC' ).InvoiceNumber.to_i
  if latest_invoice_number
    # We need to wrap it all in a transaction so MYOB knows they're all the one invoice
    MyobDatabase.transaction do
      iso.lines.each do |line|
        new_service_line = ImportServiceSale.new
        # Map iso fields to MYOB service sale fields
        # e.g. stuff like:
        new_service_line.Description = "#{line.sDescription}"
        new_service_line.save
      end
    end
  end
end
You may also be able to find something to help over at: http://freelancing-gods.com/posts/talking_to_myob_with_ruby