Workaround for Heroku 30 second timeout w/ long external query - ruby-on-rails

Note: There are going to be things in this post which are less-than-best-practices. Be warned :)
I'm working on an admin dashboard which connects to a micro-instance AWS server.
The DB has tens of millions of records.
Most queries come back within a few seconds but some take up to a minute or two to return, based on a few things outside of my control.
Due to Heroku's 30-second limit (https://devcenter.heroku.com/articles/request-timeout), I need to find some way to buy time to keep the connection open until the query returns. Heroku does say that you can buy time by sending bytes to the client in the meantime, which buys you another 55 seconds.
Anyways, just curious if you guys have a solution to stall time for Heroku. Thanks!

I have made a workaround for this. Our app is running Sinatra, and I used the EventMachine gem to keep writing \0 into the stream every 10 seconds so Heroku doesn't close the connection until the action is complete; see the example at https://gist.github.com/troex/31790323fb4a8a29c8b8cd84e50ad1e8
My example uses Puma, but it should work for Unicorn and Thin as well (you don't need EventMachine.run for Thin). For Rails I think you can use before/after_action callbacks to start and stop the event timer.
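A loose approximation of the pattern under Sinatra (the linked gist is the working version; run_slow_query is a placeholder for the long external query, and the exact server wiring varies):

    require 'sinatra'
    require 'eventmachine'

    # Under Puma/Unicorn the reactor has to be started once at boot
    # (Thin already runs one, hence no EventMachine.run needed there):
    Thread.new { EventMachine.run } unless EventMachine.reactor_running?

    get '/report' do
      stream(:keep_open) do |out|
        # Heroku resets its rolling 55-second window whenever a byte is sent,
        # so trickle NUL bytes out every 10 seconds while the query runs.
        keepalive = EventMachine.add_periodic_timer(10) { out << "\0" }

        # Run the slow query off the reactor, then send the real payload.
        EventMachine.defer(
          proc { run_slow_query },
          proc do |result|
            keepalive.cancel
            out << result
            out.close
          end
        )
      end
    end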

You could break the work into multiple queries.
You could send the query, have your AWS server respond immediately just saying that it received the query, and then once it has pulled the data, have it send that data via a POST request to your Heroku instance.

Yes, do it via AJAX: send back a response that says to ask again in a bit...
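For example, a hypothetical polling setup on the Rails side might look like this (the Report model, SlowQueryJob, and routes are all made up for illustration):

    class ReportsController < ApplicationController
      # POST /reports -- kick off the slow query in a background job and return immediately.
      def create
        report = Report.create!(status: "pending")
        SlowQueryJob.perform_later(report.id)   # hypothetical ActiveJob that runs the query
        render json: { id: report.id, status: report.status }, status: :accepted
      end

      # GET /reports/:id -- the client polls this every few seconds until the data is ready.
      def show
        report = Report.find(params[:id])
        if report.status == "done"
          render json: { status: "done", data: report.payload }
        else
          render json: { status: "pending", retry_in: 5 }
        end
      end
    end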

Related

Rails ActiveRecord/Postgres single query timeout?

I have a logging query (a simple INSERT) that happens on every single request.
For this query only (the one that happens on every page load), I want to set the timeout to 500ms so that if the database is locked/slow/down it won't affect the site by making it hang while it waits to connect/write.
Is there a way to specify a timeout on a per-query basis so I can abort the LoggedRequest.create! if it's taking too long?
I don't want to set it in my config because I have many other queries that shouldn't have timeouts that low.
I'm using Postgres 11.7
I also don't know how I feel about setting a timeout for the entire session because I don't want that connection to be shared from the pool with other queries that can't have that timeout.
Rails 6 introduces event-based triggers for notifications, logging etc. that come in very handy, provided you are using Rails 6 or can afford to migrate to it. Here's a useful post that demonstrates creating event-based triggers for notifications/logging: https://pramodbshinde.wordpress.com/2020/03/20/custom-events-tracking-with-activesupportnotifications-and-audited/
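As a rough illustration of the event-based idea (the event name, payload keys, and the LoggedRequest usage below are assumptions, not taken from the linked post):

    # config/initializers/request_logging.rb
    ActiveSupport::Notifications.subscribe("logged_request.app") do |_name, _start, _finish, _id, payload|
      begin
        LoggedRequest.create!(path: payload[:path], user_id: payload[:user_id])
      rescue ActiveRecord::ActiveRecordError => e
        # A slow or unavailable database only costs a log line, not the page.
        Rails.logger.warn("request logging skipped: #{e.message}")
      end
    end

    # In a controller, emit the event instead of writing to the DB inline:
    ActiveSupport::Notifications.instrument("logged_request.app",
                                            path: request.path, user_id: current_user&.id)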
If, for some reason, you cannot use Rails 6, perhaps this article might help you find some answers: https://evilmartians.com/chronicles/the-silence-of-the-ruby-exceptions-a-rails-postgresql-database-transaction-thriller
If I were you, I would also consider using AJAX with a fire-and-forget API request to the server for logging or anything else that is not critical to the normal functioning of the application.

Running code repeatedly in Ruby on Rails with Heroku for an indefinite period

I am attempting to build a web application with Ruby on Rails that users sign up for in order to get an email alert when a certain event happens.
As such, I need to be able to make an API call and then, based on the JSON response, send the alert, but I need a way to have this API call happen repeatedly and automatically for an indefinite amount of time. I am also using Heroku at this time, if that needs to be taken into account.
Thanks for your help.
This sounds like a cron job in plain old Linux. Heroku calls this add-on Scheduler. You have to define the task within lib/tasks/scheduler.rake.
For further information, read the Heroku docs for Scheduler here.
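A minimal sketch of such a task (the API endpoint, the Alert model, and AlertMailer are placeholders for whatever your event check and notification look like):

    # lib/tasks/scheduler.rake
    desc "Poll the external API and email users whose alert condition matched"
    task check_alerts: :environment do
      require 'net/http'
      require 'json'

      events = JSON.parse(Net::HTTP.get(URI("https://api.example.com/events")))

      Alert.where(notified_at: nil).find_each do |alert|
        next unless events.any? { |e| e["type"] == alert.event_type }
        AlertMailer.event_triggered(alert).deliver_now
        alert.update!(notified_at: Time.current)
      end
    end

Heroku Scheduler would then be configured to run rake check_alerts on its shortest interval (every 10 minutes).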

Send lots of emails as soon as possible [duplicate]

I have some questions about ActionMailer:
How does ActionMailer connect to an SMTP server?
Are the connections concurrent or parallel if the number of emails is high (> 1000)?
How will sending out emails the way Facebook does (in the thousands) as immediate emails affect the Ruby on Rails application, and how would ActionMailer handle it?
Any other solution/plugin to send out a large number of emails from a RoR application apart from ActionMailer?
Added:
We need to send out at least 1000 emails per 15 minutes. We are using a Notes Domino server as our SMTP server. What is a possible architecture for this kind of problem? We are already storing the emails in the database to send them later; what we need is the sending approach.
The usual thing is to create a background job to send email. ActionMailer is very good for single emails but does tend to run into trouble after sending multiple emails as each one can take several seconds to complete. That's why I created PostageApp to help solve those problems.
Some services on the market to help you with sending lots of email from Rails:
MailGun
SendGrid
PostmarkApp
MailChimp
Mailjet
PostageApp
All of these have ways of sending multiple messages with a single API call or SMTP transaction.
1) ActionMailer connects to your SMTP server via a set of parameters including a host, port, and protocol.
3) The effect will be a slow site as a result of the many synchronous tasks being executed.
2 & 4)
ActionMailer is a bit too slow to be sending out a ton of emails under load; remember that delivery is a synchronous operation, and as such it's not really the sort of thing you want to be doing a lot of on a busy site.
To be honest, you're better off not sending that quantity of email from your website. It's not really designed to be used in such a way. If I had to send that sort of volume I'd look at doing the work in the background; something like Delayed Job would work well here, or one of the many async Rails mailers found here would do the trick.
What you really want to look at here is the requirement you're trying to fulfil: is it absolutely necessary that the website be responsible for sending the mail in a synchronous fashion? In most cases the answer to that question is no. If you can, you'll be far better off deferring this sort of task to another part of your system; keep your site as lean and focused as you can.
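For instance, with Delayed Job the request only enqueues the work and a worker process does the actual SMTP conversation; in a current Rails app the ActiveJob equivalent is deliver_later. A sketch (UserMailer and welcome_email are placeholder names):

    # Delayed Job's mailer integration: builds and delivers the mail in a worker.
    UserMailer.delay.welcome_email(user)

    # ActiveJob / Rails 4.2+ equivalent:
    UserMailer.welcome_email(user).deliver_later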
Simple solution here for you...
Sidekiq or Resque
I'd highly recommend Sidekiq, as it's not nearly as server-intensive when running multiple workers; just be careful with concurrency issues (make sure you don't have 2 workers pick up the same job and send duplicate emails, that is).
Say you set up 20 Sidekiq workers and each sends an email every 2-4 seconds; you're looking at an easy 300-600 per minute.
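A minimal sketch of such a worker (the Email model, its columns, and the mailer are assumptions; the update_all guard is one way to address the duplicate-send concern above):

    class EmailSendWorker
      include Sidekiq::Worker
      sidekiq_options queue: :mailers, retry: 3

      def perform(email_id)
        # Atomically claim the row so two workers never send the same message.
        claimed = Email.where(id: email_id, sent_at: nil).update_all(sent_at: Time.current)
        return if claimed.zero?

        email = Email.find(email_id)
        UserMailer.bulk_message(email).deliver_now   # hypothetical mailer
      end
    end

    # Enqueue one job per pending email, e.g. from a periodic task:
    Email.where(sent_at: nil).find_each { |e| EmailSendWorker.perform_async(e.id) }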
DO NOT try to do this without background workers like Sidekiq, Resque, or DelayedJob. You will freeze your entire app if you try sending in-app with any large volume of emails. Even sending activation emails in-app and whatnot will cause you unnecessary slowdown issues.
I'd have one Worker that handles the queueing periodically and another Worker class that handles the sending. We're using Resque (6 workers maybe?) for this on an older app (pre-sidekiq) to send around 500 emails every 5 minutes with no issues.
You can always use a third party like someone mentioned; SendGrid is decent. But that wasn't the question; this is how you do it yourself simply and easily.
You define the SMTP settings in a config file; if left blank, it uses local sendmail (see the config sketch after this list).
concurrent
multiple handlers
Is there a bulk email plugin for Rails apps?
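For the config-file part, a typical setup looks roughly like this (host and credentials are placeholders):

    # config/environments/production.rb
    config.action_mailer.delivery_method = :smtp
    config.action_mailer.smtp_settings = {
      address:              "smtp.example.com",
      port:                 587,
      user_name:            ENV["SMTP_USER"],
      password:             ENV["SMTP_PASSWORD"],
      authentication:       :plain,
      enable_starttls_auto: true
    }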
You may also do 1000.times { email.deliver }, but it will probably collapse your server.

Connection time out in Rails application

I have a Rails application deployed on Apache + Passenger + Rails 2.3.8 (Ruby 1.8.7) + Linux server + MySQL 5.
I am trying to create an Excel report by getting records from the DB and downloading it.
When my report has <= 600 (approx.) records, it gets created and downloaded successfully.
But when the report contains more records, it does not get downloaded.
The query and logic processing complete in the back end and application server, but the browser starts throwing a connection time-out after some time.
I have tried increasing the KeepAlive time and also tried modifying browser settings. Nothing works for me.
As you didn't provide your code, I can only reply with a general answer.
In my opinion, letting the response time of a request get too long is never ideal, even if you can avoid the time-out in your browser. You have two better choices:
If you don't need to reply with the latest data, use a cron job to generate your Excel file ahead of time and respond with it when a request comes in. Here is a good reference.
If you have to reply with the latest data, divide the data in your database into many parts and reply with them separately (in this case, you may have to send the request many times).
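For the first option, a rough sketch of building the file in batches rather than loading every record at once (the Record model and columns are assumptions, CSV stands in for Excel, and this is written against a current Ruby/Rails; on the 1.8.7/2.3.8 stack in the question you would reach for FasterCSV and the :batch_size => hash form):

    require 'csv'

    # Each batch is a separate, fast query, and memory stays flat
    # instead of loading the whole result set for one giant response.
    CSV.open(Rails.root.join("tmp", "report.csv"), "w") do |csv|
      csv << ["id", "name", "created_at"]
      Record.find_in_batches(batch_size: 1000) do |batch|
        batch.each { |r| csv << [r.id, r.name, r.created_at] }
      end
    end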

How Rails handles database connection in the background?

I am trying to show controller-specific pages in my Rails app when the database connection goes away. I do this by catching the Mysql::Error in the rescue_action method and rendering the appropriate pages. When the MySQL service alone is stopped, I get the Mysql::Error exception really quickly and can render the pages without any delay.
But when the server itself is shut down, Rails takes 3 minutes to throw the Mysql::Error, and after 5-6 requests the whole website becomes unresponsive.
I tried to figure out which method in the Rails framework takes such a long time when the MySQL server is shut down. It was connection.real_connect (in the ActiveRecord mysql_adapter file), which took 3 minutes to return with an exception.
So I decided to time out this method using the SystemTimer gem. This monkey patch worked perfectly when I start the website with a database connection and immediately shut down the database server.
But when I start the website with the database, access the website for some time, and then shut down the database server, it doesn't work at all, and the whole website becomes unresponsive as before. I wonder what the difference is between the two scenarios.
I think I need to know in more detail how Rails handles database connections and how it reacts when the connection goes away, so that I can identify the exact places to put monkey patches and make them work for my specific requirement. I haven't seen any relevant article explaining this.
Any help will be very useful for me
Thanks,
I've not tried this, but you can add connect_timeout as one of the specified options (along with port, host, etc) for the MySQL connection in the database.yml file. That value is passed to the real_connect call to establish the connection to MySQL.
Furthermore, since you are experiencing a delay after the initial connection is made and the DB is shutdown, you may need to use the read_timeout config option.
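Something along these lines in database.yml (the exact values are illustrative; both options are in seconds):

    # config/database.yml
    production:
      adapter: mysql
      host: db.example.com
      database: myapp_production
      username: myapp
      password: secret
      connect_timeout: 5   # give up on establishing the connection after 5s
      read_timeout: 10     # give up on a read on an established connection after 10s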
