Why does my Net::HTTP.post_form timeout? - ruby-on-rails

In my Rails app controller I am posting to the app's API on the same machine. I have built this to handle posting the data to the URL:
url = "http://172.16.155.165:3000/api/jobs"
params = {
  :input => "original/video.h264",
  :output => "new/video.mp4",
  :preset => 'h264'
}
jobResults = Net::HTTP.post_form(URI.parse(url), params)
This works great when I run it through the rails console, but when I use it in my controller it gives me this error after loading for a minute or so:
Timeout::Error in SeminarsController#create
Timeout::Error
Once the timeout happens, the data is actually posted and the API does what it should. It is as if it hangs until it times out and then posts the data. The controller never goes beyond this step, though. It should write the response body to a file with jobResults.body, which would work fine if it didn't time out. If I run this in the rails console it outputs the response immediately. The API would never take a whole minute to respond.
Am I doing something to cause this to happen? How can I make it work right?
edit:
This is the code for create in app/controllers/api/jobs_controller.rb:
def create
  job = Job.from_api(params, :callback_url => lambda { |job| api_job_url(job) })
  if job.valid?
    response.headers["X-State-Changes-Location"] = api_state_changes_url(job)
    response.headers["X-Notifications-Location"] = api_notifications_url(job)
    respond_with job, :location => api_job_url(job) do |format|
      format.html { redirect_to jobs_path }
    end
  else
    respond_with job do |format|
      format.html { @job = job; render "/jobs/new" }
    end
  end
end

Yes. Ideally you should move this long-running process (and yes, this is a long-running process) into a background job. Remember that when many users start uploading videos, this process will slow down for many reasons (bandwidth, API acceptance rate, etc.). Rack::Timeout pops up whenever a request passes the threshold; it is designed to abort requests that are taking too long to respond. And it is not raised in the console.
How can I make it work right?
Move it to a background job. Or you can explicitly increase the Rack::Timeout interval by doing something like this:
# config/initializers/timeout.rb
Rack::Timeout.timeout = 30 # seconds
But I suggest not doing this. rack-timeout helps with debugging; it is mainly used on Heroku with New Relic.
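For example, with delayed_job (Resque or Sidekiq would work the same way; the class name and file path below are illustrative, not from the question), the post becomes a small job object:
# lib/post_form_job.rb
require "net/http"

PostFormJob = Struct.new(:url, :params) do
  def perform
    response = Net::HTTP.post_form(URI.parse(url), params)
    # write the response body to a file, as the controller intended
    File.write(Rails.root.join("tmp", "job_result.txt"), response.body)
  end
end
The controller then enqueues it instead of posting inline, and returns immediately:
Delayed::Job.enqueue PostFormJob.new(url, params)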

Related

Rails - multiple threads to avoid the Slack 3-second API response rule

I am working with the Slack API. My script does a bunch of external processing, and in some cases it can take around 3-6 seconds. What is happening is that the Slack API expects a 200 response within 3 seconds, and because my function is not finished within 3 seconds, it retries, and ends up posting the same automated responses 2-3 times.
I confirmed this by commenting out all the functions and I had no issue; it posted the responses to Slack fine. I then added sleep 10 and it posted the same responses 3 times, so the only thing different was that it took longer.
From what I read, I need to have threaded responses: first respond to the Slack API in one thread, then go about processing my functions.
Here is what I tried:
def events
  Thread.new do
    json = {
      "text": "Here is your 200 response immediately slack",
    }
    render(json: json)
  end
  puts "--------------------------------Json response started----------------------"
  sleep 30
  puts "--------------------------------Json response completed----------------------"
  puts "this is a successful response"
end
When I tested it the same issue happened, so I tried an online API tester: it hits the page, waits 30 seconds, and then returns the 200 response. But I need it to respond immediately with the 200, THEN process the rest, otherwise I will get duplicates.
Am I using threads properly, or is there another way to get around this Slack API 3-second response limit? I am new to both Rails and the Slack API so I'm a bit lost here.
Appreciate the eyes :)
I would recommend using ActiveJob to run the code in the background if you don't need to use the result of the code in the response. First, create an ActiveJob job by running:
bin/rails generate job do_stuff
Then open up the file created in app/jobs/do_stuff_job.rb and edit the #perform method to include your code (so the puts statements and sleep 30 in your example). Finally, from the controller action you can call DoStuffJob.perform_later and your job will run in the background! Your final controller action will look something like this:
def events
  DoStuffJob.perform_later # this schedules the job to be run later, in
                           # the background, so it returns immediately
                           # and continues to the next line
  json = {
    "text": "Here is your 200 response immediately slack",
  }
  render(json: json)
end
As an aside, I'd highly recommend never using Thread.new in Rails. It can create some really confusing behavior, especially in test scripts, for a number of reasons, but usually because of how it interacts with open connections and specifically ActiveRecord.

Sending Thousands of Requests at the Same Time with Ruby on Rails?

I need to develop an endpoint in Rails that will send (possibly) hundreds or thousands of requests, process them, then return/render the JSON to the user/client.
I've tried using a thread pool with a size of 5, but it took forever; when I increased the size to the number of requests, it threw a ThreadError: can't create Thread: Resource temporarily unavailable exception.
I don't think I can use a background job/worker for this because I need to return the result.
So what should I do?
I was thinking that I should wrap the process in a 20-second timeout so it doesn't hit the Rails 30-second limit, and if it's still not finished in 20 seconds, return the unfinished result. It goes like this:
result = Queue.new
begin
  Timeout::timeout(20) do
    elements.each do |element|
      pool.process {
        response = send_request(element)
        result << response
      }
    end
    pool.shutdown
  end
rescue Timeout::Error
  pool.shutdown
end
result = Array.new(elements.size) { result.pop }.flatten
render json: {
  data: result
}
But it's still not working; the requests keep going even after the timeout.
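One Ruby detail that may explain this (a general point about Timeout, not specific to any pool library): Timeout.timeout only raises in the thread that called it, so worker threads the pool has already started are never interrupted. A minimal sketch of the effect:
require "timeout"

worker = Thread.new do
  sleep 60                      # stands in for a slow HTTP request
  puts "worker finished anyway" # still runs if the process lives this long
end

begin
  Timeout.timeout(2) { worker.join } # Timeout::Error is raised here, in the caller...
rescue Timeout::Error
  puts "gave up waiting" # ...but the worker thread keeps running
end
To actually stop in-flight requests you would need to set timeouts on the HTTP calls themselves or shut the pool's threads down explicitly.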

Rails controller - execute action only if a Rails UJS method inside succeeds (mutually dependent methods)

Following another question (Rails controller - execute action only if the two methods inside succeed (mutually dependent methods)), I would like to ensure that, inside one of my controller's actions, if the user does not see the message displayed by a Rails UJS method, then the first methods of the controller action are not performed either.
CONTEXT
I have a controller with an action called 'example_action'. When the app goes inside 'example_action', it runs a first method (1) update_user_table, then (2) update_userdeal_table (both read and write the database), and then (3) a third step related to a Rails UJS (Ajax) call.
My issue is the following: in case of a timeout in the middle of the controller action, I want to avoid the case where the User table is updated (via method 1) and the UserDeal table is updated (via method 2) but the third step, i.e. the Ajax request that displays a message, FAILS (error, timeout... a status like 500 or 404, or canceled...).
In my app, mobile users might launch the request that goes through the 'example_action' controller, perform the first method (1) and second method (2) successfully, but then enter a subway tunnel for 60 seconds with very, very low (<5b/sec) or NO internet connection. For UX reasons, I time out the request and tell the user 'sorry it took too long, try again'. The problem is that if I could not show them the result (3), I need to be able to not execute (1) and (2).
I need the three methods (1), (2) and (3) to be "mutually dependent": if one does not succeed, the others should not be performed. That's the best way I can describe it.
Here is my code today. It's not working: I tested manually by clicking and then disconnecting the internet connection after just 2 seconds. I can see in my database that (1) and (2) were performed and the tables were updated, but I still saw the message 'sorry it took too long, try again'.
Is this the right approach? If yes, how do I do it?
If not, should I try a different angle? For example: if (1) and (2) were successful but not (3), should I store the fact that the Rails UJS XHR status was an error or timeout (and that, consequently, the modal was not effectively displayed to the user), and then show them the result/message once they get back online?
Here is the code.
HTML page for the user
The user clicks on a button that triggers a Rails UJS Ajax request that will ultimately display the modal message:
<div id="zone">
  <%= link_to image_tag(smallest_src_request),
              deal_modal_path,
              remote: true %>
</div>
This is sent to a route that points to this controller action.
Deal controller
class DealsController < ApplicationController
  def deal_modal
    Deal.transaction do
      update_user_table     # that's the (1)
      update_userdeal_table # that's the (2)
      # show_modal_message
      respond_to do |format|
        format.js
      end
    end
  end

  private

  def update_user_table
    # updates the User table, so it needs an internet connection to reach
    # the remote User table
  end

  def update_userdeal_table
    # updates the UserDeal table, so it needs an internet connection to
    # reach the remote UserDeal table
  end
end
This renders a js.erb view file:
deal_modal.js.erb
showModalMessage("Here is your result <variable and all>");
To manage the Ajax errors and timeouts (included in case it matters for the resolution of the question), I use the Rails UJS settings below.
IMPORTANT: it is here that, in case of error or timeout, I send the error/timeout modal message that appears in place of the one you normally get (see just above, "Here is your result...").
$(document).on('page:change', function () {
  $("#zone").on('ajax:error', function (event, xhr, status, error) {
    console.log('ajax call failed:', error);
    var msg;
    msg = Messenger().post({
      hideAfter: 4,
      message: "sorry it took too long, try again."
    });
  });
});

$(document).on('page:change', function () {
  // set the timeout Rails UJS ajax option that will trigger the
  // ajax:error message defined above
  $.rails.ajax = function (options) {
    if (!options.timeout) {
      options.timeout = 5000;
    }
    return $.ajax(options);
  };
});
The transaction will only roll back if an error is raised; if an unhandled error is raised, your application will crash and show a 500 error in some way.
In order to display the response to the user, on success or error, you will need to render something, so you don't want to prevent the respond_to block from executing. One way to handle this is to set a flag via an instance variable.
def deal_modal
  begin
    Deal.transaction do
      update_user_table
      update_userdeal_table
    end
    @success = true
  rescue
    @success = false
  end
  # show_modal_message
  respond_to do |format|
    format.js
  end
end
Then in deal_modal.js.erb
<% if @success %>
  showModalMessage("Here is your result <variable and all>");
<% else %>
  showModalMessage("There was a problem");
<% end %>
EDIT:
Dealing with connection issues is definitely tricky and there isn't really one ideal solution. I would generally let the database continue uninterrupted and let it return either success or failure in its own time. For lengthy transactions, you can use a gem like delayed_job or sidekiq to process the action in the background and have the Rails controller return a response saying "...pending..." or something similar. Unless you're using websockets on the frontend, this means polling the server with Ajax requests until the background process is complete.
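A rough sketch of that "return pending, poll until done" shape, written with ActiveJob for brevity (every name here is illustrative, not from the question; delayed_job and Sidekiq have equivalent APIs):
def deal_modal
  # does (1) and (2) in the background and writes "done"/"failed"
  # to the cache under its job_id when it finishes
  job = DealUpdateJob.perform_later(params[:deal_id])
  render json: { status: "pending", job_id: job.job_id }
end

# the frontend polls this endpoint until the status changes
def deal_modal_status
  render json: { status: Rails.cache.read("deal_update_#{params[:job_id]}") || "pending" }
end
The success or error modal is then shown client-side based on the final status, which also covers the tunnel case: the result is still there when the user comes back online.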

Optimal way to structure polling an external service (RoR)

I have a Rails application that has a Document model with the flag available. The document is uploaded to an external server where it is not immediately available (it takes time to propagate). What I'd like to do is poll for availability and update the model when available.
I'm looking for the most performant solution for this process (the service does not offer callbacks):
1. Document is uploaded to the app
2. The app uploads it to the external server
3. The app polls the URL (http://external.server.com/document.pdf) until it is available
4. The app updates the model: Document.available = true
I'm stuck on 3. I'm already using Sidekiq in my project. Is that an option, or should I use a completely different approach (a cron job)?
Documents will be uploaded all the time, so it seems sensible to first poll the database/redis to check for Documents which are not yet available.
See this answer: Making HTTP HEAD request with timeout in Ruby
Basically you set up a HEAD request for the known URL and then asynchronously loop until you get a 200 back (with a 5-second delay between iterations, or whatever).
Do this from your controller after the document is uploaded:
Document.delay.poll_for_finished(@document.id)
And then in your document model:
def self.poll_for_finished(document_id)
  document = Document.find(document_id)
  # make sure the document exists and should be polled for
  return unless document.continue_polling?
  if document.remote_document_exists?
    document.available = true
  else
    document.poll_attempts += 1 # assumes you care how many times you've checked; could be ignored
    Document.delay_for(5.seconds).poll_for_finished(document.id)
  end
  document.save
end
def continue_polling?
  # this can be more or less sophisticated; stop once available or after 5 attempts
  !available && poll_attempts < 5
end
def remote_document_exists?
  # note: Net::HTTP.start takes a hostname (not a URL), and the timeouts
  # must be set before the connection is opened
  Net::HTTP.start('external.server.com', 80, open_timeout: 2, read_timeout: 2) do |http|
    return "200" == http.head(path).code
  end
end
This is still a blocking operation: opening the Net::HTTP connection will block if the server you're trying to contact is slow or unresponsive. If you're worried about that, use Typhoeus. See this answer for details: What is the preferred way of performing non blocking I/O in Ruby?
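For example, a HEAD check with Typhoeus might look like the sketch below (same assumed host and path as above; Typhoeus takes its timeouts as request options, so a slow server cannot hold the thread indefinitely):
require "typhoeus"

def remote_document_exists?
  response = Typhoeus.head("http://external.server.com#{path}",
                           connecttimeout: 2, # seconds allowed to establish the connection
                           timeout: 2)        # seconds allowed for the whole request
  response.code == 200
end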

Phusion Passenger spawner Rails app causing high CPU usage

I have Asterisk and a Rails app running on the same server. All inbound calls via Asterisk trigger a "curl" to the Rails app's controller to initiate a Juggernaut publish, enabling real-time push of inbound calls to the individual logged-in user (a pop-up dialog showing caller profile details).
The problem is that the Passenger spawner of the Rails app runs at almost 100% CPU usage whenever calls start coming in. Each inbound phone call will run:
/usr/bin/curl http://parlo.local/asterisk/inbound_call?exten=8405&src_num=921187888&recordingfilename=q70001-20
In the Asterisk controller:
def inbound_call
  if params[:src_num].length > 6
    extension = AsteriskUserextension.find_by_extension(params[:exten])
    if extension.present? && extension.user.present?
      @user = extension.user
      customer = Customer.first_match(params[:src_num]).first
      customer_name = customer.present? ? customer.full_name : "Unknown Caller"
      queue = AsteriskQueue.find_by_name(params[:queue])
      @result = Asterisk::Action.response_factory("asterisk_inbound", "#{queue.try(:title)}", "OK", customer.try(:id))
      publish
    end
  end
  render :nothing => true, :status => :created
end
I believe the high inbound call rate is causing the high CPU usage. What is the best way to remedy this situation? Will pushing all the work to Resque help?
Thanks for any guidance!
You need to use FastCGI technology. Also, in Asterisk, use the CURL() function instead of the System application: every call to System creates a shell and forks a new process.
It can also be a good idea to consume Asterisk events via AMI instead of doing a curl for each call.
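On the Rails side, if you do push the work to Resque as you suggested, the general shape is sketched below (class and queue names are made up; the lookups mirror inbound_call above):
# app/jobs/inbound_call_publisher.rb
class InboundCallPublisher
  @queue = :asterisk_events

  def self.perform(exten, src_num, queue_name)
    extension = AsteriskUserextension.find_by_extension(exten)
    return unless extension.present? && extension.user.present?
    # ...do the customer lookup and the Juggernaut publish here,
    # off the web worker, exactly as in inbound_call...
  end
end
The controller then only enqueues and returns:
Resque.enqueue(InboundCallPublisher, params[:exten], params[:src_num], params[:queue])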
