Can a server request be routed to a daemon in Rails 5?

Yesterday I was reading about daemons and considering using one with a Rails app. I want a Ruby server (the daemon) to handle a specific request when it receives one, so it continuously waits for requests in the background (I am not sure whether this is a proper use case for a daemon, so correct me if I am wrong).
Is there a way to use routes.rb in Rails 5 to route a request to a daemon?
P.S. Please don't suggest that I use a standard controller action to handle the request; there is a requirement I need to fulfil that prevents me from using one. Bottom line: I just want that specific request to be handled by a daemon instead of by the main Rails app.

Related

Rails http request itself in tests hangs

Problem
Making an HTTP request from a model to a route on the same app results in a request timeout.
Background
Why would an app make an HTTP request to itself rather than just call a method or something?
Here is my story: there is a Rails app A (let's call it shop) and a Rails app B (let's call it warehouse) that talk to each other over HTTP.
I'd like to be able to run both of them in a single system test to test the end-to-end workflow. Rails only runs a single service, but one can mount app B as a Rails engine into app A, effectively having two apps in a single service. However, they still talk to each other over HTTP, and that's the bit that does not work.
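For context, mounting an engine is a one-liner in the host app's routes file. A minimal sketch, assuming app B is packaged as an engine exposing Warehouse::Engine (the name is illustrative):

    # config/routes.rb of app A (the shop)
    Rails.application.routes.draw do
      # all of app B's routes become reachable under /warehouse
      mount Warehouse::Engine => "/warehouse"
    end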
Thoughts
It looks as if the second request hits some kind of thread lock around Active Record. The reason I suspect Active Record is that I was able to make an HTTP call to the app itself from the controller (that is, before any Active Record-related code kicked in).
Question
Is it possible to work around that?

How to prevent Rails controller from hanging when making a web service call

I have a Rails controller that I'm calling when a user loads a specific page. The controller makes a call to a 3rd party web service. However, when the web service is down, my Rails controller just hangs. I'm not able to navigate to another page, log out, or refresh the page...all of these tasks wait for the web service call to complete before being executed. In the event that the web service call never completes, I have to restart my Rails app in order for it to be functional again.
Is there a standard way of preventing this from happening? I am using the Faraday gem to make web service calls. I suppose I could set a timeout value when making my web service call. However, ideally I would like any user action, such as navigating to another page, to halt this web service call immediately. Is this possible?
I believe this is happening because you are probably using a Rack server that can only handle one request at a time per process. Unicorn is like that: each worker process handles a single request at a time, so one slow upstream call blocks everything behind it. You should think about fixing this first with a timeout. So if you are using Faraday, you can do something like req.options.timeout = 5 to get a five-second timeout.
Then I recommend using Puma. If that's not an option, you should adjust your server settings to allow more than one connection at a time. For Unicorn, the relevant setting is worker_processes.
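For reference, a minimal sketch of configuring Faraday timeouts as suggested above (the URL is a placeholder, and the exact error classes may vary across Faraday versions):

    require "faraday"

    conn = Faraday.new(url: "https://thirdparty.example.com") do |f|
      f.options.timeout = 5        # give up if the whole request takes longer than 5s
      f.options.open_timeout = 2   # give up if opening the connection takes longer than 2s
    end

    begin
      response = conn.get("/status")
    rescue Faraday::TimeoutError, Faraday::ConnectionFailed
      response = nil  # degrade gracefully instead of hanging the worker
    end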

In Rails 3, how do I call some code via a controller but completely after the Request/Response cycle is done?

I have a very weird situation: I have a system where a client app (Client) makes an HTTP GET call to my Rails server, and that controller does some handling and then needs to make a separate call to the Client via a different pathway (i.e. it actually goes via Rabbit to a proxy and the proxy calls the Client). I can't change the pathway for that different call and I can't change the Client at all (it's a 3rd party system).
However, the issue is that the call via the different pathway fails UNLESS the HTTP GET from the Client has completed.
So I'm trying to figure out: is there a way to have Rails finish the HTTP GET response and then make this additional call?
I've tried:
1) after_filter: this doesn't work because the after filter is apparently still within the request/response cycle, so the TCP/HTTP response back to the Client hasn't completed.
2) enqueuing a worker: this works, but it is not ideal, because if the workers are backed up this callback to the Client may not happen right away, and it really does need to happen right after the Client calls the Rails app.
3) starting a separate thread: this may work, but it makes me nervous: adding threading explicitly in Rails could be fraught with peril (see the sketch after this list).
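For what it's worth, a minimal sketch of what option 3 might look like with explicit Active Record connection handling (the helper name is hypothetical; note the caveat in the comments):

    # inside the controller action, after the response body is set;
    # NOTE: the thread still starts before the response is flushed to
    # the client, which is exactly the limitation described above
    Thread.new do
      # check out a connection for this thread and return it afterwards,
      # so the thread doesn't starve the AR connection pool
      ActiveRecord::Base.connection_pool.with_connection do
        notify_client_via_proxy  # hypothetical out-of-band callback
      end
    end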
I welcome any ideas/suggestions.
Again, in short, the goal is: process the HTTP GET call to the Rails app and return a 200 OK to the Client, completely finishing the HTTP request/response cycle, and then run some extra code.
I can provide further details if that would help. I've found both #1 and #2 recommended elsewhere, but neither is quite what I need.
Ideally, there would be some "after_response" callback in Rails that allows some code to run but after the full request/response cycle is done.
Possibly use an around filter? Around filters allow us to define methods that wrap around every action that Rails calls. So if I had an around filter for the above controller, I could control the execution of every action: execute code before and after calling the action, and even completely skip calling the action under certain circumstances if I wanted to.
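A minimal sketch in Rails 3 syntax (controller and method names are illustrative). The caveat: like an after_filter, the code after yield still runs before the HTTP response is sent, so this alone doesn't solve the problem above:

    class CallbacksController < ApplicationController
      around_filter :wrap_action, only: :receive

      private

      def wrap_action
        # runs before the action
        yield  # invokes the action (and the rest of the filter chain)
        # runs after the action, but still inside the request/response cycle
      end
    end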
So what I ended up doing was using a gem that I had long ago helped with: Spawnling
It turns out that this works well, although it required a tweak to get it working with Rails 3.2. It allows me to spawn a thread to do the extra, out-of-band callback to the Client while letting the normal controller process complete. And I don't have to worry about thread management or AR connection management; Spawnling handles that.
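Roughly, the usage looks like this (a sketch based on Spawnling's block API; the helper name is hypothetical):

    # in the controller action, after the normal response work is done
    Spawnling.new do
      # runs in a forked process by default (pass :method => :thread for a thread);
      # Spawnling re-establishes the AR connection for the child
      notify_client_via_proxy  # hypothetical out-of-band callback to the Client
    end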
It's still not ideal, but pretty close. And it's slightly better than enqueuing a Resque/Sidekiq worker as there's no risk of worker backlog causing an unexpected delay.
I still wish there was an "after_response_sent" callback or something, but I guess this is too unusual a request.

Callback (or equivalent) when container w/ Rails app finishes booting?

I have a Docker container containing a Rails app. Running the container starts a script similar to this: https://github.com/defunkt/unicorn/blob/master/examples/init.sh, which does some busy work and then hands off to a unicorn.rb config similar to this: https://github.com/defunkt/unicorn/blob/master/examples/unicorn.conf.rb.
I have a Clojure web app that can tell this container to run. The request to do this is non-blocking, and the user of the site will somehow be notified when the Rails app is ready to receive requests.
I can think of various hacky ways to do this, but is there an idiomatic way to have the container let me know when the Unicorn Rails app is ready to receive web requests?
I'd like to have it hit some callback url in my app but I'm open to other options. Thanks!
I don't get why you need to do it that way. Couldn't you just perform HTTP requests to the Rails part (e.g. http://my_page.com/status) and handle the response accordingly?
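A minimal sketch of that polling approach (written in Ruby here for consistency, though the caller is a Clojure app; the URL, retry limits, and notify_user helper are placeholders):

    require "net/http"
    require "uri"

    def wait_until_ready(url, attempts: 30, delay: 2)
      attempts.times do
        begin
          return true if Net::HTTP.get_response(URI(url)).is_a?(Net::HTTPSuccess)
        rescue Errno::ECONNREFUSED, Errno::EHOSTUNREACH, SocketError
          # container is up but Unicorn isn't accepting connections yet
        end
        sleep delay
      end
      false
    end

    notify_user if wait_until_ready("http://my_page.com/status")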

Rails 3.1 - Firing a specific event with EventMachine

I would like to use the plugin em-eventsource ( https://github.com/AF83/em-eventsource ) for server-sent events in a Rails 3.1 project. My problem is that it only explains how to listen for events and receive messages, not how to fire a specific event and send the message. I would like to produce the event in an Active Record observer. Am I right in thinking that I have to defer an operation with EventMachine to produce this event, or how else can I solve this?
And yes, it has to be Ruby on Rails. If I don't get this to work with EventMachine, I'll try to bypass the whole Ruby part with node.js.
Actually I worked on this library a little with the maintainer. I think you mixed up the client part with the server one. em-eventsource is a client library which you can use to consume a Server-Sent Events API; it's not meant to fire SSE.
On the server side, it doesn't really matter whether you are using Rails or any other stack (node.js, PHP…) as long as the server you are running supports streaming. The default web server shipped with Rails (WEBrick) does not, but there are many others that do: Thin, Puma, Goliath…
In order to fire SSE from Rails, you would have to use a streaming-capable server among those cited and abide by the SSE specification. It mostly comes down to, first, responding with the proper Content-Type header ("text/event-stream") so that the client (browser) knows it should keep the connection open, and then streaming on the socket. That latter part is the one not easily possible today in Rails 3 (yet not impossible!); Rails 4 now supports streaming in an easy way, with a clean and simple internal API, so it's definitely coming.
In the meantime, you'd either:
mess with Rack's API in Rails (using EventMachine, I guess; there are some examples in the wild)
or be smart about it and make use of the streaming feature provided by Sinatra, built on top of Rack (see https://gist.github.com/1476463 for an example of a Sinatra app that can be mounted in a Rails one!)
or use an external service such as Pusher
or leverage an entirely different stack…
A good overview: http://blog.phusion.nl/2012/08/03/why-rails-4-live-streaming-is-a-big-deal/
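For reference, here is roughly what the Rails 4 approach mentioned above looks like with ActionController::Live (controller and event names are illustrative, and it still requires a streaming-capable server such as Puma or Thin):

    class EventsController < ApplicationController
      include ActionController::Live

      def stream
        response.headers["Content-Type"] = "text/event-stream"
        10.times do |i|
          # SSE wire format: an optional "event:" line, a "data:" line, then a blank line
          response.stream.write("event: tick\n")
          response.stream.write("data: #{i}\n\n")
          sleep 1
        end
      rescue IOError
        # client disconnected
      ensure
        response.stream.close
      end
    end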
Maybe I'm wrong, but IIRC Rails can't support long polling. Rails blocks the whole server (or a thread, if you have more than one running inside the server) for each request, and can't reuse it until the whole response has been sent. That's why you should set up a reverse proxy (like nginx) in front of a Rails application if you suspect there could be many concurrent connections: the proxy buffers slow client requests and hands them to Rails only once the whole request has been received. That's just how Rack works; there's probably not much you can do about it.
