I am trying to implement a simple solution to help with some behavior. Basically I want to create a listener on a particular url, like
localhost:3000/listen
I have a callback with a 3rd party service that sends JSON as a POST request. In my Rails routes I have the route set up to accept a POST request at that path.
What I want is for some logic to run any time a new POST comes in, and for that logic to run asynchronously without disrupting the normal web service. For example, the POST request will contain some data, and if the data has a boolean "true", we need to fire off a Rails mailer. I could normally do this in a plain Rails controller action, but that doesn't seem like the right approach here.
Any thoughts on the best way to handle this? Would this be best handled with a gem like EventMachine? If anyone could suggest a simple solution, that would be great!
I would look at your background jobs. I am a Sidekiq fan, and popular alternatives are Resque and DelayedJob.
Your controller will receive the request and schedule the work to be performed in the background. That background job will send out the mail (or whatever you need it to do) asynchronously.
class CallbackController < ApplicationController
  def listen_third_party
    # Keep only the incoming parameters you care about (placeholder keys)
    data = params.slice(:boolean_field, :you, :care, :about)

    # Hand the work off to a background job; the mail goes out asynchronously
    if data[:boolean_field] == true
      CallbackMailer.perform_async(data)
    end

    head :ok
  end
end
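For completeness, here is a minimal sketch of what that background job could look like with Sidekiq. The worker name matches the call above, but NotificationMailer and its callback_email method are assumptions for illustration, not from the question:

class CallbackMailer
  include Sidekiq::Worker

  def perform(data)
    # Runs outside the request cycle; Sidekiq serializes `data` as JSON,
    # so pass simple values (strings, numbers, ids) rather than AR objects
    NotificationMailer.callback_email(data).deliver_now # use .deliver on older Rails
  end
end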
Related
I want to create a callback in my User model. After a user is created, a callback should run get_followers to fetch that user's Twitter followers (via the FullContact API).
This is all a bit new to me...
Is putting the request in a callback the correct approach, or should it be in the controller somewhere? And then how do I make the request to the endpoint in Rails, and where should I be processing the data that is returned?
EDIT... Is something like this okay?
User.rb
require 'open-uri'
require 'json'

class Customer < ActiveRecord::Base
  after_create :get_twitter

  private

  def get_twitter
    source = "url-to-parse.com"
    # Fetch the URL and parse the JSON response
    @data = JSON.parse(open(source).read)
  end
end
A few things to consider:
The callback will run for every Customer that is created, not just those created in the controller. That may or may not be desirable, depending on your specific needs. For example, you will need to handle this in your tests by mocking out the external API call.
Errors could occur in the callback if the service is down, or if a bad response is returned. You have to decide how to handle those errors.
You should consider running the code in the callback in a background process rather than in the web request, if it does not have to run immediately (see the sketch below). That way an error in the callback will not produce a 500 page, and the response can be returned without waiting for the external call to complete. In that case, the rest of the application must be able to handle a user for whom the callback has not yet finished.
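As a hedged sketch of that last point (the worker name and the use of Sidekiq are illustrative choices, not part of the question), the callback could simply enqueue a job:

require 'open-uri'
require 'json'

class Customer < ActiveRecord::Base
  after_create :queue_get_twitter

  private

  # Enqueue the API call instead of running it during the web request
  def queue_get_twitter
    TwitterFollowersWorker.perform_async(id)
  end
end

class TwitterFollowersWorker
  include Sidekiq::Worker

  def perform(customer_id)
    customer = Customer.find(customer_id)
    source   = "url-to-parse.com" # placeholder URL from the question
    data     = JSON.parse(open(source).read)
    # ... persist the followers on `customer` here ...
  end
end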
(This question is a follow-up to "How do I handle long requests for a Rails App so other users are not delayed too much?")
A user submits an answer to my Rails app and it gets checked in the back-end for up to 10 seconds. This would cause delays for all other users, so I'm trying out the delayed_job gem to move the checking to a Worker process. The Worker code returns the results back to the controller. However, the controller doesn't realize it's supposed to wait patiently for the results, so it causes an error.
How do I get the controller to wait for the results and let the rest of the app handle simple requests meanwhile?
In JavaScript, one would use callbacks to call the function instead of returning a value. Should I do the same thing in Ruby and call back the controller from the Worker?
Update:
Alternatively, how can I call a controller method from the Worker? Then I could just call the relevant actions when it's done.
This is the relevant code:
Controller:
def submit
  question = Question.find(params[:question])
  user_answer = params[:user_answer]
  @result, @other_stuff = SubmitWorker.new.check(question, user_answer)
  render_ajax
end
submit_worker.rb:
class SubmitWorker
  def check(question, user_answer)
    # lots of code...
  end
  handle_asynchronously :check
end
Using DJ to offload the work is absolutely fine and normal, but making the controller wait for the response rather defeats the point.
You can add some form of callback to the end of your check method so that when the job finishes your user can be notified.
You can find some discussion on performing notifications in this question: push-style notifications simliar to Facebook with Rails and jQuery
Alternatively you can have your browser periodically call a controller action that checks for the results of the job - the results would ideally be an ActiveRecord object. Again you can find discussion on periodic javascript in this question: Rails 3 equivalent for periodically_call_remote
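As a rough sketch of that polling endpoint (the model and controller names are assumptions, not from the question), the action only has to report whether the job has saved its result yet:

class AnswerChecksController < ApplicationController
  # Sketch only: assumes the background job stores its output in an AnswerCheck record
  def show
    check = AnswerCheck.find(params[:id])
    if check.completed?
      render json: { status: 'done', result: check.result }
    else
      render json: { status: 'pending' }
    end
  end
end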
I think what you are trying to do here is a little contradictory, because you use delayed_job when you don't want to block the control flow (so your users don't have to wait until the request completes).
But if you want your controller to wait until you get the results, then you don't want to use a background process like delayed_job.
You might want to think of a different way of notifying the user once the checking is done, while keeping the background process as it is.
I have an API-centric application (/api/v1/users); it simply returns all users RESTfully in JSON format.
My problem is that if I call that route from a controller, it returns "Timeout::Error".
What is the problem?
class BaseController < ApplicationController
  def index
    # This issues an HTTP request back to the same application
    render json: HTTParty.get('http://localhost:3000/api/v1/users').body
  end
end
Update
users_controller.rb (/api/v1/users)
application_controller.rb
https://gist.github.com/4359591
Logs
http://pastie.org/5565618
If I understand correctly, you have an API endpoint at /api/v1/users, and your BaseController#index is calling that endpoint?
If that is correct, and this is all happening inside the same Rails process in development mode (as I can tell from your URL), then you only have a single server process running, which can handle only one request at a time. So when BaseController#index makes a request to your own test server, that server is still busy with the original request, and the nested request simply waits until it times out.
If you want to test your API, I would look at a client tool such as Postman.
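If the goal is for BaseController#index to actually use that data inside the same app, a simpler sketch (assuming the API action really does just serialize all users, as described above) is to skip the HTTP round-trip entirely:

class BaseController < ApplicationController
  def index
    # No request back to our own server: reuse the same data the API endpoint serves
    render json: User.all
  end
end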
HTH.
I have an action that takes a long time. I want to be able to provide updates during the process so the user is not confused as to whether he lost the connection or something. Can I do something like this:
class HeavyLiftingController < ApplicationController
def data_mine
render_update :js=>"alert('Just starting!')"
# do some complicated find etc.
render_update :js=>"alert('Found the records!')"
# do some processing ...
render_update :js=>"alert('Done processing')"
# send @results to view
end
end
No, you can only issue ONE render within a controller action. The render does NOTHING until the controller terminates. When data_mine terminates, there will be THREE renders, which will result in an error.
UPDATE:
You'll likely have to set up a JavaScript (jquery) timer in the browser, then periodically send an AJAX request to the server to determine the current status of your long running task.
For example the long running task could write a log as it progresses, and the periodic AJAX request would read that log and create some kind of status display, and return that to the browser for display.
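A minimal sketch of that idea, assuming the long-running task can write its progress to a status column on some record (the Report model and column names are invented for illustration):

class DataMineJob
  def perform(report_id)
    report = Report.find(report_id)
    report.update(status: 'finding records')
    # ... do some complicated find etc. ...
    report.update(status: 'processing')
    # ... do some processing ...
    report.update(status: 'done')
  end
end

The periodic AJAX request then only has to read that status field and render it.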
It is impossible to handle the request that way: for each request, you get exactly one response.
If your action takes a long time, then maybe it should be performed asynchronously. You could send the user e-mails during the process to notify them of the progress.
I suggest you take a look at the DelayedJob gem:
http://rubygems.org/gems/delayed_job
It will handle the most difficult parts of dealing with async work for you (serializing/deserializing your objects, storage, and so on).
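For example, applied to the earlier submit action, the controller only enqueues the slow check and responds right away (a sketch; it assumes SubmitWorker#check persists its own results somewhere the app can read later):

def submit
  question = Question.find(params[:question])
  # With handle_asynchronously, this call only enqueues a Delayed::Job;
  # it does not return the check results to the controller.
  SubmitWorker.new.check(question, params[:user_answer])
  render_ajax # respond immediately; fetch the results later (e.g. by polling)
end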
Hope it helps you!
I want a /plan method to return a JSON object that is itself returned by another local URL (belonging to another web app, in Java; let's call it /plan2 for the sake of this question).
What I want is not a redirect, but really to have /plan return the data as returned by /plan2, to which I'm appending a bunch of other keys. Since the request is local, would using Net::HTTP be overkill? What are my options, considering I'd also like to send an HTTP Accept header along?
Shooting in the dark here, but I am assuming that /plan belongs to your public Rails app and /plan2 is a URL from another web app (maybe not even Rails) on the same server, accessible by the Rails app but not publicly available. If this is the case, then yes, you can fetch the response, but I would suggest OpenURI or Mechanize rather than Net::HTTP. If you render it inside a respond_to JSON format block, the Accept header should be fine as well.
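A rough sketch of what that could look like with OpenURI (the /plan2 URL and the extra keys are placeholders):

require 'open-uri'
require 'json'

class PlansController < ApplicationController
  def plan
    # Placeholder URL for the local Java app that serves /plan2;
    # string-keyed options are sent as request headers by open-uri
    body = open('http://localhost:8080/plan2', 'Accept' => 'application/json').read
    data = JSON.parse(body)

    respond_to do |format|
      # Append whatever extra keys /plan is responsible for
      format.json { render json: data.merge('extra_key' => 'value') }
    end
  end
end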
Are you speaking of re-using the functionality of another controller method?
There is no standard way of doing this in Rails. You could either put the common functionality into a module and include it, or use inheritance if the functionality lives in another controller. If it's the same controller class, you can just call the method.
If it's a "private" URL, how are you going to call it via HTTP?!
I would suggest encapsulating whatever functionality /plan2 uses and simply re-using that in /plan1.
But if you just want to get it to work...
class PlanController < ApplicationController
  def plan1
    # Reuse plan2's logic directly, passing along the extra parameters
    plan2(extra_parameter_1: true, extra_parameter_2: 'hello')
  end

  def plan2(extra = {})
    params.merge!(extra)
    # Whatever your code was before...
  end
end