Thin EventMachine Sinatra vs. Rails

I have been looking into the possibility of backgrounding some jobs with EventMachine. In Sinatra this appears to work great but Rails 3 appears to execute all ticks before rendering a view.
When I run the following code under the thin webserver it behaves as expected. The first request returns immediately and the second request is waiting for the 3 second sleep call to finish. This is the expected behavior.
    class EMSinatra < Sinatra::Base
      get "/" do
        EM.next_tick { sleep 3 }
        "Hello"
      end
    end
Whereas in Rails 3 I am trying to do the same thing (running under thin):
    class EmController < ApplicationController
      def index
        EM.next_tick {
          sleep(3)
        }
      end
    end
In Rails the sleep call happens before rendering the view to the browser. The result is that I am waiting for 3 seconds for the initial page to render.
Does anybody know why this is happening? I am not looking for comments on whether this is good practice or not. I am simply experimenting. Throwing small tasks into the reactor loop seems like an interesting thing to look into. Why should the client have to wait if I am going to make some non-blocking http-requests?

I'm not sure this is the answer you are looking for, but I did some research on this before.
Let me give you a little background information:
What we wanted to achieve was that Rails already flushed parts of the template tree (e.g. the first part of the layout) even when the controller action takes a long time to load.
The effect of this is that the user already sees something in their browser while the webserver is still doing work.
Of course the main view has to wait with rendering because it probably needs data from the controller action.
This technique is also known as BigPipe, and Facebook wrote a nice blog post about it:
http://www.facebook.com/notes/facebook-engineering/bigpipe-pipelining-web-pages-for-high-performance/389414033919
Anyway, after doing some research on achieving this for Rails 3, I found this blog post by Yehuda Katz:
http://yehudakatz.com/2010/09/07/automatic-flushing-the-rails-3-1-plan/
So for now I think you really have to stick with waiting for the controller.

Using EM.defer instead of EM.next_tick causes the sleep to happen after the response is sent back.
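The difference comes down to where each block runs: EM.next_tick schedules work on the reactor thread itself, so nothing else (including flushing the response) happens until the block finishes, while EM.defer hands the block to EventMachine's worker thread pool. A stdlib sketch of that ordering difference, with a plain Thread standing in for EM's pool (no eventmachine required):

```ruby
# "next_tick" behaviour: the work runs inline on the (reactor) thread,
# before anything scheduled after it -- including sending the response.
order = []
work = proc { sleep 0.05; order << :slow_work }
work.call
order << :response_sent
order   # => [:slow_work, :response_sent]

# "defer" behaviour: the work runs on a pool thread, so the response
# goes out immediately and the slow work finishes afterwards.
order2 = []
t = Thread.new { sleep 0.05; order2 << :slow_work }
order2 << :response_sent
t.join
order2  # => [:response_sent, :slow_work]
```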

Related

Shopify API calls not working in Rails background job

In my Rails controller action, I have a method that does a bunch of Shopify API calls. Things like:
    ShopifyAPI::Product.all()
    ShopifyAPI::Product.find(:all, params: {title: title})
    ShopifyAPI::Product.create(title: title, body_html: description, images: images, tags: tags, product_type: product_type)
All of it does what I want...very neat.
The problem is that I'm going to be uploading a CSV and using this controller method. It's fine if I have like 8 line items, but very quickly it gets slow. So, I thought, let's move it to a background worker.
I'm using Redis/Resque to get everything going and using some dummy outputs (i.e. puts 'Hi there champ!') I've confirmed that the background worker is configured properly and executing when and where it should be. Neat.
So then I put bits and pieces of my controller action in and output that. That all works until I hit my Shopify API calls. I can call .new on just about any object, but when I try to .find, .all, or .create any valid object (which worked before I abstracted it to the background job), it sort of goes dead. Super descriptive! I've tried to output what's going on via logger and puts, but I can't seem to generate much output of what's going on; I have isolated it down to the Shopify API work. I thought that, even though I have an initializer that specifies my passwords, site, API keys, secrets, etc., I might need to reinitialize my Shopify session, as per their setup docs here. I either did it wrong, or that didn't solve the issue.
At this point I'm sure I'm just missing something in the docs, but I cannot find out how to make these necessary API calls from my background job. Any thoughts on what I might be doing obviously wrong that could solve this? Anyone dealt with anything similar?
Turns out this has to do with where the Shopify Engine was mounted. In my routes.rb I have the following (in addition to other routes; these are the two pertinent ones):
    mount ShopifyApp::Engine, at: '/'
    root to: 'products#index'
This is all fine and good, but it sort of forces your Shopify API calls to be made within the context of the products controller's index action... without some changes. There are two ways to fix this, one of which is clearly the more Railsy way:
Option 1:
Include
    session = ShopifyApp::SessionRepository.retrieve(1)
    ShopifyAPI::Base.activate_session(session)
at the beginning of any file in which you want to make Shopify API calls. This sets the session (assuming you only have one store, by the way; this uses the retrieve method to fetch store 1, which is a risky assumption), authenticates to the API, and everything in life is good.
Option 2:
Class inheritance for the win. Have all your controllers that are making API calls inherit from ShopifyApp::AuthenticatedController. This makes the initializer actually work, and that's it. This is (in retrospect) the clear and obvious way to go. Have an order controller? class OrdersController < ShopifyApp::AuthenticatedController and done: order = ShopifyAPI::Order.find(params[:id]) does exactly what you'd expect it to.

Breaking rails MVC: Sending data from model directly to view via AJAX

I am creating a MUD using Rails. Here is what I have so far:
Right now I am working on a combat system, which will work like this:
current_user sees characters and non_player_characters in room
When current_user attacks another character, the other characters have 5 seconds to "deflect" the attack or they are hit. (Not fully implemented)
When current_user attacks an NPC, there is a 50% the NPC will deflect the attack
NPC will send attacks back to user and user will have to deflect attacks within the proper time interval (Not fully implemented).
In order to implement this combat system I decided I needed to use multithreading and timers:
    def initiate_attack
      Thread.new do
        sleep(5)
        hit_target
        ActiveRecord::Base.connection.close
      end
    end

    def non_player_character_failed_to_deflect
      (1 + rand(10)) < 5
    end

    def is_non_player_character?
      @attack.target_type == "NonPlayerCharacter"
    end

    def hit_target
      if is_non_player_character?
        if non_player_character_failed_to_deflect
          damage_target
        else
          puts "Deflected"
        end
      else
        "hit player"
      end
    end

    def damage_target
      @target.update(power_level: @target.power_level - 10)
    end
This works as far as pure functionality is concerned, but the problem is that I can't figure out how to get the strings back to the view so the user can see them. The user should see a message when anyone initiates an attack, and when an attack completes. I think the main issue is that multithreading breaks MVC here: my threads in the model are still running after control has returned to the controller and view.
So to summarize my question:
How do I make my view continuously update via AJAX with data coming from the model?
For more information please visit the github page for this project:
You need a way to push data to the browser. To do that you have a few options:
Use polling or long-polling. message_bus makes it very simple.
Use websockets as suggested by Justin.
Use another newer technology to push events from the server, like [server-sent events](http://www.w3schools.com/html/html5_serversentevents.asp).
I would give the message_bus gem a try.
EDIT: You might as well try [Sidekiq](http://sidekiq.org/) to run the asynchronous code - I believe you'll find code using it easier to maintain in the long run, especially compared to using threads directly.
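The flow all of these options share can be sketched without any gems: the combat code publishes messages to a channel instead of `puts`-ing them, and the transport (message_bus, a websocket, SSE) drains that channel to the browser. Here a thread-safe Queue stands in for the real bus:

```ruby
require "thread"

# A thread-safe Queue stands in for the bus/channel the browser reads from.
channel = Queue.new

combat = Thread.new do
  channel << "You attack the goblin!"  # published from initiate_attack
  sleep 0.05                           # the deflect window, shortened
  channel << "Deflected"               # published when hit_target resolves
end

combat.join
updates = []
updates << channel.pop until channel.empty?
updates  # => ["You attack the goblin!", "Deflected"]
```

In the real app the consuming side is whatever pushes to (or is polled by) the browser, rather than an in-process array.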

How to Make the Controller wait for a Delayed Job while the rest of the App continues on?

(This question is a follow-up to How do I handle long requests for a Rails App so other users are not delayed too much?)
A user submits an answer to my Rails app and it gets checked in the back-end for up to 10 seconds. This would cause delays for all other users, so I'm trying out the delayed_job gem to move the checking to a Worker process. The Worker code returns the results back to the controller. However, the controller doesn't realize it's supposed to wait patiently for the results, so it causes an error.
How do I get the controller to wait for the results and let the rest of the app handle simple requests meanwhile?
In Javascript, one would use callbacks to call the function instead of returning a value. Should I do the same thing in Ruby and call back the controller from the Worker?
Update:
Alternatively, how can I call a controller method from the Worker? Then I could just call the relevant actions when it's done.
This is the relevant code:
Controller:
    def submit
      question = Question.find params[:question]
      user_answer = params[:user_answer]
      @result, @other_stuff = SubmitWorker.new.check(question, user_answer)
      render_ajax
    end
submit_worker.rb :
    class SubmitWorker
      def check
        # lots of code...
      end
      handle_asynchronously :check
    end
Using DJ to offload the work is absolutely fine and normal, but making the controller wait for the response rather defeats the point.
You can add some form of callback to the end of your check method so that when the job finishes your user can be notified.
You can find some discussion on performing notifications in this question: push-style notifications simliar to Facebook with Rails and jQuery
Alternatively you can have your browser periodically call a controller action that checks for the results of the job - the results would ideally be an ActiveRecord object. Again you can find discussion on periodic javascript in this question: Rails 3 equivalent for periodically_call_remote
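That polling variant can be sketched without any gems; the names below are illustrative, not delayed_job's API. The worker stores its result under a job id instead of returning it to the controller, and the polled action just reports what it finds:

```ruby
require "thread"

RESULTS = {}  # job_id => result; in Rails this would be a DB column or cache

# Stand-in for the background worker: checks the answer off the request
# thread and stores the result under the job's id instead of returning it.
def check_async(job_id, user_answer)
  Thread.new do
    sleep 0.05                                     # the ~10s checking work
    RESULTS[job_id] = (user_answer.strip == "42")  # store, don't return
  end
end

# What the polled controller action would render as JSON.
def status(job_id)
  RESULTS.key?(job_id) ? { done: true, correct: RESULTS[job_id] } : { done: false }
end

worker = check_async(1, " 42 ")
status(1)   # almost certainly { done: false } while the job runs
worker.join
status(1)   # => { done: true, correct: true }
```

The browser would hit the status action on a timer until `done` is true, at which point the original `render_ajax` logic can run against the stored result.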
I think what you are trying to do here is a little contradictory: you use delayed_job when you don't want to interrupt the control flow (so your users don't have to wait until the request completes).
But if you want your controller to wait until you get the results, then you don't want to use background processes like delayed_job.
You might want to think of a different way of notifying the user after you have done your checking, while keeping the background process as it is.

Providing updates during a long Rails controller action

I have an action that takes a long time. I want to be able to provide updates during the process so the user is not confused as to whether he lost the connection or something. Can I do something like this:
    class HeavyLiftingController < ApplicationController
      def data_mine
        render_update :js => "alert('Just starting!')"
        # do some complicated find etc.
        render_update :js => "alert('Found the records!')"
        # do some processing ...
        render_update :js => "alert('Done processing')"
        # send @results to view
      end
    end
No, you can only issue ONE render within a controller action. The render does NOTHING until the controller terminates. When data_mine terminates, there will be THREE renders, which will result in an error.
UPDATE:
You'll likely have to set up a JavaScript (jquery) timer in the browser, then periodically send an AJAX request to the server to determine the current status of your long running task.
For example the long running task could write a log as it progresses, and the periodic AJAX request would read that log and create some kind of status display, and return that to the browser for display.
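That log-based approach can be sketched with plain Ruby. A file stands in here for wherever the task reports its progress (a DB column or Rails.cache work the same way), and the class name is illustrative:

```ruby
require "tmpdir"

# The long-running task writes its latest status to a shared location;
# the polled status action just reads it back.
class ProgressLog
  def initialize(path)
    @path = path
  end

  def update(status)   # called from inside data_mine as it progresses
    File.write(@path, status)
  end

  def current          # called from the polled status action
    File.exist?(@path) ? File.read(@path) : "pending"
  end
end

log = ProgressLog.new(File.join(Dir.tmpdir, "data_mine_status.txt"))
log.update("Just starting!")
log.update("Found the records!")
log.current  # => "Found the records!"
```

A status controller action would then render `log.current` as JSON, and a small jQuery timer in the page would poll it every few seconds and update the display.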
It is impossible to handle the request that way: for each request, you get just one response.
If your action takes a long time, then maybe it should be performed asynchronously. You could send the user e-mails during the process to notify them of the progress.
I suggest you take a look at the DelayedJob gem:
http://rubygems.org/gems/delayed_job
It will handle the most difficult parts of dealing with async stuff for you (serializing/deserializing your objects, storage, and so on).
Hope it helps you!

Rails 2.3.X - Execute code after request was rendered and returned?

Is it possible in Rails 2.3.x to start a new chain of commands after a request has been rendered and returned to the requestor?
I need this feature in order to work with an asynchronous API on the other side: they expect a response to their request, and after that response is sent my Rails app should send a new HTTP request to them (to post something to their API)...
What are the possibilities here? Is there something like a after_render hook?
Should I make use of threads or background tasks and how could this be done?
I would be very glad for some solutions :-)
Kind regards
UPDATE: The return code (e.g. 200) should be sent to the requestor before the other calls are executed.
The easiest thing to do is spawn a new thread. This is assuming that it is a lightweight call and you don't need advanced error logging or retry logic.
    Thread.new do
      puts "call the api"
    end
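If the API call can fail, it is worth wrapping that bare Thread.new so errors don't vanish, since an exception in a background thread won't surface anywhere by default. A minimal sketch; `fire_and_forget` and the log array are illustrative names, not a Rails API:

```ruby
# Run a block on a background thread, recording (rather than swallowing)
# any error it raises.
def fire_and_forget(log, &block)
  Thread.new do
    begin
      block.call                       # e.g. the follow-up POST to their API
    rescue => e
      log << "api callback failed: #{e.message}"  # don't let it die silently
    end
  end
end

log = []
t = fire_and_forget(log) { raise "connection refused" }
t.join
log.first  # => "api callback failed: connection refused"
```

In the app, `log` would be the Rails logger; the response has already gone out by the time the thread body runs or fails.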
The two most popular solutions for this are Delayed Job (that Lars mentioned), and Resque:
https://github.com/tobi/delayed_job
https://github.com/defunkt/resque
How about using something like Delayed Job?
I could be wrong, but I think code execution continues after a render unless you put a return. This is why you get an error if you try to render twice.
Are you rendering html? If so, maybe you can insert some javascript into the rendered page to make a new request to your controller and initiate the further action that you need to take.
