Heroku Rails - Turn off Active Record Postgres connection - ruby-on-rails

I have a Rails 4 app hosted on Heroku. They give some specific advice about how to manage your DB connection pool when using a multi-threaded server (Puma): https://devcenter.heroku.com/articles/concurrency-and-database-connections
When I ran load testing for my app I got an error: can't connect to the DB. Rails initializes Active Record on every request, even when I'm not making any queries on that page or referencing any models.
My question is:
How can I make a sort of whitelist (or blacklist) so that Active Record is not initialized with a DB connection for these specific controller actions? In an initializer?
Ideally I would run the cheaper Postgres service on Heroku (40 connections) because I know my app doesn't use the DB very often. If traffic pushes past 40 connections things will start to error, which seems silly for an app that wasn't going to use the DB on those requests.
I read about how to disable Active Record for an entire app: Disable ActiveRecord for Rails 4
But how do I selectively enable it? Are there any other performance considerations here (from not eager loading these things, or any other gotchas)?

In your application_controller.rb:

    before_filter :maybe_disconnect_db

    def maybe_disconnect_db
      # Drop the Active Record connection when this request doesn't need it.
      ActiveRecord::Base.remove_connection if ActiveRecord::Base.connected?
    end

    def maybe_connect_db
      # Re-establish the connection from database.yml when it is needed.
      ActiveRecord::Base.establish_connection unless ActiveRecord::Base.connected?
    end
Then, for each controller/action that needs the DB connection, add:

    skip_before_filter :maybe_disconnect_db # add an :only or :except option as needed
    before_filter :maybe_connect_db         # add an :only or :except option as needed
This should establish the connection for any request that hits the DB and disconnect for any request that doesn't, while also treating several DB requests in a row, or several non-DB requests in a row, as no-ops.
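As a concrete illustration (controller, action, and model names here are hypothetical), a controller where only one action touches the database might look like:

```ruby
# Hypothetical controller: only :show needs Active Record.
class ReportsController < ApplicationController
  # Don't drop the connection for the DB-backed action...
  skip_before_filter :maybe_disconnect_db, only: [:show]
  # ...and make sure it is established before that action runs.
  before_filter :maybe_connect_db, only: [:show]

  def index
    # No DB connection is held while serving this action.
    render :text => "no database needed here"
  end

  def show
    @report = Report.find(params[:id])
  end
end
```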

Related

How to dynamically connect an abstract class to different databases for a single request in Rails 5?

In our application, we have several models which need to connect to different external databases that hold the same tables and columns, but are each separate and cannot be unified.
Currently the application runs on separate servers, each connected only to a specific external database. There are 10+ servers, all serving the exact same application; the only difference is the external database each connects to.
The goal is to have a single server running the application and have the application decide which database to query based on a certain parameter passed into the controller.
Our current approach is the following. We have an abstract class from which relevant models inherit, with a method to reconnect it to the specific database:

    class AbstractRecord < ApplicationRecord
      self.abstract_class = true

      def self.reconnect
        database = Thread.current[:database_name].constantize
        self.establish_connection database
      end
    end
Then, we have every controller inherit from a controller class with a before_action that sets the current database name in Thread.current and calls that method:
    class AccessController < ApplicationController
      before_action :set_current_database

      private

      def set_current_database
        Thread.current[:database_name] = current_user.database_name
        AbstractRecord.reconnect
      end
    end
Each user has the information on which database they need to connect to, and so the application reconnects the database based on the current user.
This application also serves an API, with controllers inheriting from a similar controller that also reconnects the database based on the current API user.
We know all of the databases we need to connect to and keep them in yml files, and all of them are loaded into constants inside an initializer.
This approach works for the most part. Whenever a request is made, the database is successfully reconnected to the appropriate database, and the application functions as normal.
However, issues arise when a request is sent at the same time that another request is being processed, both in development and production:
ActiveRecord::ConnectionNotEstablished (No connection pool with 'AbstractRecord' found.)
This error is raised whenever any model that needs to query the AbstractRecord database does so after a new connection has been initiated in a different request.
Given enough time to finish, requests don't seem to interfere with each other and the database reconnections work fine.
It is my understanding that Rails handles requests on individual threads for each of them, and each thread uses a different database connection, which raises the question: Why is establish_connection causing other requests to lose their connection? Is there a major misunderstanding on how threads and database connections work in Rails in this case?
Back to the main question: How can I dynamically connect my models to a specific database during a single request in this version of Rails? Is this approach correct, or is there a more adequate solution?
Rails version: 5.2.4.3
Ruby version: 2.6.3p62
#Joaquin For me this is clearly a case of multi-tenancy, where you need a central database with a customer table holding each customer's respective database connection. There are libraries that handle this elegantly, such as the ar-octopus gem.
In your case there is a concurrency failure: the key you are using in Thread.current is probably being shared by two or more simultaneous executions. One change I would make is to use a more specific Thread.current key, such as
Thread.current[:"#{table_name}_database_name"] = current_user.database_name
where, for example, the Person class would use the key Thread.current[:"#{Person.table_name}_database_name"]. This approach is not a silver bullet and certainly has flaws.
I suggest looking at the ar-octopus gem; it will bring you many benefits.
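A minimal pure-Ruby sketch of that more specific per-model key, using a hypothetical Person model standing in for any AbstractRecord subclass:

```ruby
# Sketch: scope the Thread.current key to the model's table so that
# code working with different models cannot clobber one another's
# database name. `Person` is a hypothetical stand-in for a model.
class Person
  def self.table_name
    "people"
  end

  def self.database_name=(name)
    # The key becomes :people_database_name for this model.
    Thread.current[:"#{table_name}_database_name"] = name
  end

  def self.database_name
    Thread.current[:"#{table_name}_database_name"]
  end
end

Person.database_name = "tenant_a_db"
```

Note that Thread.current is still per-thread state: each request thread must set the key itself, and a key set in one thread is invisible in another.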

Rails Controller Without Database

I am building a Rails service that uses Server-Sent Events (SSE) to stream events to the browser. There is a controller with standard RESTful endpoints for manipulating / querying data, and then another controller (inheriting from ActionController::Live) that handles the asynchronous code. In the middle I have Redis as a pub/sub.
Because I am pre-computing the messages I'd like to send in the RESTful controller, I do not use the database in the SSE controller (the auth does not require a database connection).
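A hedged sketch of that SSE side, assuming the redis-rb gem and hypothetical controller and channel names:

```ruby
# Sketch: a Live controller that streams Redis pub/sub messages as
# server-sent events without ever touching Active Record.
class EventsController < ActionController::Base
  include ActionController::Live

  def stream
    response.headers['Content-Type'] = 'text/event-stream'
    redis = Redis.new
    # Block on the channel; forward each published message as one SSE event.
    redis.subscribe('events') do |on|
      on.message do |_channel, message|
        response.stream.write("data: #{message}\n\n")
      end
    end
  ensure
    response.stream.close
  end
end
```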
The Problem:
Because a database connection is unnecessarily grabbed from the pool on every request, the number of concurrent SSE streams I can serve is limited by the number of database connections I allow.
Question:
Is there a way to skip_before_filter (or similar) to avoid requiring a database connection?
You can disable db connections by default. I think this SO post tells you how:
Rails 3 - how do I avoid database altogether?

Does a before_filter in the application controller slow down the app?

I have a few before_filter in my application controller to check 1) If the current_user is banned, 2) If the current_user has received a new message and 3) If the current_user has any pending friend requests.
This means that before every request the app will check for these things. Will this cause server issues in the future, possibly a server overload?
I wouldn't say it would create a server overload on its own; for a server overload you need many concurrent requests, and Rails has a connection pool to the database out of the box. But it will slow down the process, as you run three queries before each request even reaches the controller action it was intended for.
Facebook solved a similar problem in 2009 with what they called BigPipe. It is not a new technology so much as a way of leveraging the browser: send a few requests with fragmented parts of the page and only then compose them using some JavaScript.
You can have a read here http://www.facebook.com/note.php?note_id=389414033919.
As for the check whether the user is banned: yes, that is something you'd have to check either way, but perhaps you can keep it in a cache such as memcached or Redis so it won't hit your database directly every time.
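For example, a sketch of caching the banned flag with Rails.cache (the key name and expiry are arbitrary assumptions):

```ruby
# Sketch: serve the banned flag from the Rails cache (memcached/Redis)
# so the before_filter hits the database only on a cache miss.
def check_banned
  banned = Rails.cache.fetch("user/#{current_user.id}/banned", expires_in: 5.minutes) do
    current_user.banned? # falls through to the DB only on a miss
  end
  redirect_to banned_path if banned
end
```

Remember to delete or refresh the key when an admin bans a user, or the cached flag can be stale for up to the expiry window.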

Force Delayed Job to use a separate DB connection

My app is set up so that we use a different database connection per subdomain, using different environments. Delayed Job does what you'd expect (I guess) and uses the current environment of the request (and thus that DB connection) when inserting the job into its delayed_jobs table.
The problem is that DJ can't process jobs from all these different tables, so I'm trying to force DJ to use just one database, set up especially for it. I have tried this but it just won't work, and I've no idea what to try next.
Any pointers/suggestions would be VERY much appreciated, really at my wits end with this.
Attempted code:
    Delayed::Job.class_eval do
      establish_connection ActiveRecord::Base.configurations["delayed_job"]
    end
Connection to the DB is done in a before_filter in the ApplicationController.
The code in ApplicationController that establishes the connection per domain runs only in your application server, on each request; the Delayed Job worker never executes it.
Add a :domain attribute to your Job class and set it when you queue the job. In Job#perform, establish your DB connection.
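A sketch of that suggestion (the class name and config lookup are assumptions): the job carries the domain in its payload and reconnects inside #perform, which runs in the worker:

```ruby
# Sketch: the job stores the target domain and switches the DB
# connection inside #perform, which executes in the DJ worker.
# TenantJob and the per-domain config names are assumptions.
class TenantJob < Struct.new(:domain, :payload)
  def perform
    # Look up the per-domain connection settings loaded from database.yml.
    config = ActiveRecord::Base.configurations[domain]
    ActiveRecord::Base.establish_connection(config)
    # ... do the real work against that tenant's database ...
  end
end

# At enqueue time, capture the domain from the request, e.g.:
# Delayed::Job.enqueue(TenantJob.new(request.subdomain, data))
```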

Is it okay to authenticate with MongoDB per request?

I have a Rails controller that needs to write some data to my MongoDB. This is what it looks like at the moment.
    def index
      data = self.getCheckinData
      dbCollection = self.getCheckinsCollection
      dbCollection.insert(data)
      render(:json => data[:_id].to_s())
    end

    protected

    def getCheckinsCollection
      connection = Mongo::Connection.new('192.168.1.2', 27017)
      db = connection['local']
      db.authenticate('arman', 'arman')
      return db['checkins']
    end
Is it okay to authenticate with MongoDB per request?
It is probably unnecessarily expensive and creates a lot more connections than needed.
Take a look at the documentation:
http://www.mongodb.org/display/DOCS/Rails+3+-+Getting+Started
They connect inside an initializer. It does some connection pooling so that connections are re-used.
Is there only one user in the database?
I'd say: don't do the DB authentication. If the MongoDB server is behind a good firewall, it's pretty secure. And it should never, ever be exposed to the internet (unless you know what you're doing).
Also, don't establish a new connection per request. This is expensive. Initialize one on startup and reuse it.
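Sketched as an initializer, assuming the legacy mongo 1.x driver used in the question (option names vary between driver versions):

```ruby
# config/initializers/mongo.rb
# Sketch: connect and authenticate once at boot; the driver's built-in
# pool then reuses sockets across requests. Pool sizes are assumptions.
connection = Mongo::Connection.new('192.168.1.2', 27017,
                                   :pool_size => 5, :timeout => 5)
db = connection['local']
db.authenticate('arman', 'arman')
CHECKINS = db['checkins']
```

The controller can then write to the CHECKINS collection directly instead of calling getCheckinsCollection on every request.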
In general, this should be avoided.
If you authenticate per request and you get many requests concurrently, you could have a problem where all connections to the database are taken. Moreover, creating and destroying database connections can use up resources within your database server -- it will add a load to the server that you can easily avoid.
Finally, this approach to programming can result in problems when database connections aren't released -- eventually your database server can run out of connections.
