Get response of message queue inside Rails controller

I have a rails controller (say in application A) whose response is dependent on data from another application (say application B).
I am using RabbitMQ for inter-application communication.
I cannot render a response from the controller until the queue worker gets a response from application B. So currently, when I get an HTTP call on application A, I publish to application B through a RabbitMQ queue to fetch the required data. I am listening for application B's response on a queue created by the 'sneakers' gem. I want to receive this fetched data from the 'sneakers' queue inside the controller of application A.
So the question is: how can I wait for the RabbitMQ queue response inside the controller?
And if I am able to wait for the response inside the controller, how will I figure out which queue response belongs to which HTTP call?

To address the second issue, you can send a randomly generated string (essentially a correlation id) along with the request you send to application B. Application B, while responding, sends back the same string it received, so controller A knows which request the response belongs to.
Coming to the first question, I think RabbitMQ is not the right tool for this. Even if you could wait for the message, it would be a very slow affair. A better approach would be to expose application B as an API; it will make the application many times faster.
If an API is not an option, you can look at this link on how to create a consumer.
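If you do go the RabbitMQ route anyway, a minimal sketch of the correlation-id idea using the Bunny gem might look like the following. The app_b_requests queue name and the JSON payload are assumptions for the example, and opening a connection and blocking per HTTP request like this is exactly why it will be slow:

require "bunny"
require "json"
require "securerandom"

conn = Bunny.new
conn.start
channel = conn.create_channel

# exclusive, server-named queue that application B will reply to
reply_queue    = channel.queue("", exclusive: true)
correlation_id = SecureRandom.uuid

# publish the request for application B, tagged with the correlation id
channel.default_exchange.publish(
  { user_id: 42 }.to_json,              # hypothetical payload
  routing_key:    "app_b_requests",     # assumed queue name on application B's side
  correlation_id: correlation_id,
  reply_to:       reply_queue.name
)

# block until the matching reply arrives (this is the slow, controller-blocking part)
response = nil
reply_queue.subscribe(block: true) do |delivery_info, properties, payload|
  if properties.correlation_id == correlation_id
    response = payload
    delivery_info.consumer.cancel
  end
end
conn.close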

Related

Rails API, microservices, async/deferred responses

I have a Rails API which can handle requests from the clients. Clients use that API to perform analysis of their data. A client POSTs the data to the API, and the API checks whether that data has been analyzed before. If so, the API just responds with the analysis result. If the data hasn't been analyzed before, the API:
Tells the client that analysis has started.
Establishes a connection with the analyzing microservice.
Performs an asynchronous (or deferred, or I don't know) request to the analyzing microservice and waits for the response. The analysis takes a lot of time, so neither the API nor the microservice should be blocked while doing it.
When the response from the analyzing microservice is returned, the API hands it to the client.
The main issue for me is to set things up in such a way that the client receives the message "Your data has been sent for analysis" right after making the request, and then receives the result once the analysis is done.
The question is: what approach do I have to use in that case? Async responses, deferred responses, something else? And what known solutions could help me with that? Any gems?
I'm new to this stuff, so I'm really sorry if I ask dumb questions.
If using HTTP you can only have one response to every request. To send multiple responses, i.e. "work in progress" and later the "results", you would need to use a different protocol, e.g. WebSockets.
Since HTTP is so very common I'd stick with that, in combination with background jobs. A couple of options spring to mind.
Polling: The API kicks off a background job (to call the microservice) and responds to the client with a URL which the client can poll periodically for the result. That URL responds with some kind of "work in progress" status until the result is actually ready, and it needs to include some kind of id so the API can look up the background job.
The API would potentially have two URLs: /api/jobs/new and /api/jobs/<ID>. In Rails they would map to a controller's new and show actions.
Webhooks: Have the client include a URL of its own in the request. Once the result is available, have the background job hit the given URL with the result.
Either way, if using HTTP, you will not be able to handle the whole thing within one request/response; you will have to use some kind of background processing (so the request to the microservice happens in a different process). You could look at Sidekiq, for example.
Here is an example for polling:
URL: example.com/api/jobs/new
web app receives client request
generates a unique id for the request, SecureRandom.uuid.
starts a background job (Sidekiq) passing in the uuid and any other parameters needed
respond with a URL such as example.com/api/jobs/<UUID>
--
background job
sends request to microservice API and waits for response
saves result to database with uuid
--
URL: example.com/api/jobs/UUID
look in the database for the UUID; if not found, respond that the job is "in progress". If found, return the result from the database.
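A rough sketch of those two endpoints plus the worker, assuming a jobs table with uuid and result columns, Sidekiq, and a made-up AnalysisWorker / microservice URL:

# config/routes.rb (assumed):
#   namespace(:api) { resources :jobs, only: [:new, :show] }

module Api
  class JobsController < ApplicationController
    # GET /api/jobs/new -- kicks off the background job
    def new
      uuid = SecureRandom.uuid
      AnalysisWorker.perform_async(uuid, params[:data])
      render json: { status_url: "/api/jobs/#{uuid}" }, status: :accepted
    end

    # GET /api/jobs/:id -- polled by the client
    def show
      job = Job.find_by(uuid: params[:id])
      if job
        render json: { status: "done", result: job.result }
      else
        render json: { status: "in progress" }
      end
    end
  end
end

require "net/http"
require "json"

class AnalysisWorker
  include Sidekiq::Worker

  def perform(uuid, data)
    # call the microservice and wait for its (slow) response
    result = Net::HTTP.post(URI("https://microservice.example.com/analyze"),
                            { data: data }.to_json,
                            "Content-Type" => "application/json").body
    Job.create!(uuid: uuid, result: result)
  end
end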
It depends on what kind of API you use; I assume your clients interact via HTTP.
If you want to build an asynchronous API over HTTP, the first thing you should do is accept the request, create a job, handle it in the background, and return immediately.
For the client to get the response you have 2 options:
Implement a status endpoint where clients can periodically poll the status of the job
Implement a callback via webhooks, i.e. the client provides a URL which you call once you're done.
A good starting point for background processing is the sidekiq gem or, more generally, ActiveJob, which ships with Rails.
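For the webhook option, a minimal ActiveJob sketch might look like this; AnalysisJob, the analyze helper, and the callback_url parameter are all assumptions for the example:

require "net/http"
require "json"

class AnalysisJob < ApplicationJob
  queue_as :default

  def perform(data, callback_url)
    result = analyze(data)
    # webhook: push the finished result to the URL the client provided
    Net::HTTP.post(URI(callback_url),
                   { result: result }.to_json,
                   "Content-Type" => "application/json")
  end

  private

  def analyze(data)
    data.reverse # placeholder for the real, long-running analysis
  end
end

# In the controller: accept, enqueue, and return immediately, e.g.
#   AnalysisJob.perform_later(params[:data], params[:callback_url])
#   head :accepted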

Wait for HTTP request to complete in rails controller

I have a scenario where my Rails controller action has to make an API request to a backend business-logic server which does a lot of computation and returns the result to me.
I'm thinking of showing a loading page to the user, making the call asynchronous using Faye or some other option, and redirecting the user when the call is complete.
But even if I make the call asynchronous, the HTTP request still needs to wait for the server to return the data after processing, which would take around 20 seconds.
I would like to know the best way to make such calls in Rails.
I faced a similar situation; below is the route that I took:
When the controller action is triggered:
a. I fired off an 'async' request to the API using a worker (I used Sidekiq).
b. Loaded an 'AJAX' spinner GIF on top of a modal.
The worker handling the API request runs on another thread; it calls the API synchronously and waits for the result.
When the processing is done, the worker fires off a notification via Faye, which removes the modal and populates the data.
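A rough sketch of that worker, assuming Sidekiq, a Faye server mounted at http://localhost:9292/faye, and a browser subscribed to a per-user channel (all names here are made up):

require "net/http"
require "json"

class BackendCallWorker
  include Sidekiq::Worker

  def perform(user_id, payload)
    # synchronous call to the slow backend business-logic server (~20 seconds)
    result = Net::HTTP.post(URI("https://backend.example.com/compute"),
                            payload.to_json,
                            "Content-Type" => "application/json").body
    publish_to_faye(user_id, result)
  end

  private

  # Faye accepts publishes as an HTTP POST with a JSON 'message' parameter
  def publish_to_faye(user_id, result)
    message = { channel: "/results/#{user_id}", data: { status: "done", result: result } }
    Net::HTTP.post_form(URI("http://localhost:9292/faye"), message: message.to_json)
  end
end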
Return an HTTP response with status 202 Accepted (for requests that take long to process) and start making AJAX requests (to a URL, e.g. /jobs/1) to check the status of the background job. Once your job has finished, update its status so that your JavaScript (AJAX) can handle the result of that background job.
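A bare-bones version of that pattern might look like this (the Job model, JobWorker, and job_path helper are assumptions):

class JobsController < ApplicationController
  # POST /jobs -- 202 Accepted: the work has been queued but is not done yet
  def create
    job = Job.create!(status: "queued")
    JobWorker.perform_async(job.id)
    render json: { status_url: job_path(job) }, status: :accepted
  end

  # GET /jobs/:id -- polled via AJAX until the status becomes "done"
  def show
    job = Job.find(params[:id])
    render json: { status: job.status, result: job.result }
  end
end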

Process job using workers while client waits and return response when complete

I'm building an API using Rails where requests come in and they need to be executed by a cluster of workers running on a different server (these workers call remote APIs and parse the data, etc...). I'm going to be using Sidekiq or Resque to handle the queueing/processing of that.
My issue is the client needs to wait while this is happening and the controller needs to return the response to the client once it's complete. How would I handle this in the controller? We're using a redis backend, so I was thinking something along the lines of subscribing to a pub/sub channel and waiting for the worker to publish a status message. The controller would wait for a set time period and then return a 'check back later' response to the client if it doesn't receive a message in time. What would be the best way to implement that, or is there a better solution?
Do not make your clients wait! There are a lot of issues if you make the controller block for a long-running job:
Other programs may assume the request timed out (proxies, browsers, scripts, etc.)
It turns your API endpoints into a source of denial of service
It requires you to put more engineering work into your web servers (since a Rails process can't handle another web request while it's handling the blocking call)
Part of the reason for using Sidekiq or Resque is to avoid controllers that do heavy lifting during the HTTP request.
Instead, background jobs should report their status to the database. The web server should then query the database and return the latest status to the client.
If clients need more immediate feedback, you can:
make clients poll constantly
POST a request to the client (if the API consumer is another web server)
use another protocol/mechanism (e.g. WebSockets).
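A sketch of that "report status to the database" recommendation, assuming Sidekiq and a Job model with status, result, and error columns (all names are illustrative):

require "net/http"

class RemoteClusterWorker
  include Sidekiq::Worker

  def perform(job_id)
    job = Job.find(job_id)
    job.update!(status: "running")
    # the long-running call to the worker cluster / remote APIs
    result = Net::HTTP.get(URI("https://cluster.example.com/process?id=#{job.id}"))
    job.update!(status: "done", result: result)
  rescue StandardError => e
    job&.update!(status: "failed", error: e.message)
    raise
  end
end

# The controller never blocks: it enqueues the job and returns the job id right away,
# and a separate status action simply reads the latest state from the database.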

Respond multiple times to one request?

One place in my Rails app requires loading a number of responses from an external server, which currently looks like this:
User makes an AJAX request to the server. "Loading data..." is displayed.
5-30 seconds later, the Rails app sends its response (assuming the data has not been cached).
It would be much better if I could keep the user informed during that long waiting period with messages about the progress of the request, such as:
User makes request (as before).
Message "Retrieving ABC" displayed
Message "Retrieving XYZ" displayed
Message "Processing data" displayed
Full response as normal.
How can I go about doing this? I don't think sending back multiple JavaScript responses to one request is possible, but I have no idea what the correct way of doing this is!
This is tricky, but Rails supports the notion of streaming a response.
But you probably have to do a lot of work in your project to make this work.
Tenderlove (Aaron Patterson) posted an intro to how streaming works in Rails, and I believe there is a Railscast on this topic.
Probably a simpler solution would be to split this into multiple requests.
So the main request (assuming it's an AJAX request) takes forever to complete.
Meanwhile you poll the status via a different AJAX request; the main action updates the database with its progress, so the other request can retrieve that status and send back the appropriate response (i.e. where in the process the main request currently is).
So I'd assign each request something like a request id and then have a database table for those requests and their statuses (it could be as simple as having only id:integer and status:string).
You assign the request id on the client (use some random data to create a hash or something) and start the long request with that id.
The client then polls another endpoint with that same id to get the status back.
The long-running request, in the meantime, updates the status table with the id it was given and where it currently is in processing that request.
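A minimal sketch of that idea, assuming a request_statuses table with request_uid and status string columns; fetch_abc, fetch_xyz, and process_data are placeholders for the real external calls:

class ExternalDataController < ApplicationController
  # Long-running action: updates its status row as it works through each step
  def load_data
    status = RequestStatus.create!(request_uid: params[:request_uid], status: "Retrieving ABC")
    abc = fetch_abc
    status.update!(status: "Retrieving XYZ")
    xyz = fetch_xyz
    status.update!(status: "Processing data")
    render json: process_data(abc, xyz)
  ensure
    status&.update!(status: "done")
  end

  # Polled by a second AJAX request carrying the same request_uid
  def status
    record = RequestStatus.find_by(request_uid: params[:request_uid])
    render json: { status: record&.status || "unknown" }
  end

  private

  def fetch_abc
    { abc: true } # placeholder for the real external call
  end

  def fetch_xyz
    { xyz: true } # placeholder for the real external call
  end

  def process_data(abc, xyz)
    abc.merge(xyz) # placeholder for the real processing
  end
end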

Rails application design: Queueing, Resque, Background Services, and Redis

I am designing a Rails app that takes in requests, uses data within the request to call a 3rd party web service, processes the reply, and then sends out a response to the original requestor while also issuing a PUT request to yet another service.
I am trying to wrap my head around how to design this Rails app as it's different from the canonical Rails structure.
The objects are Lists and Tasks. Each List has many Tasks, and each Task belongs to a List.
The request I would get is something like:
http://myrailsapp.heroku.com/v1/lists?id=1&from=2012-02-12&to=2012-02-14&priority=high
In this example, I am requesting tasks from 2/12/2012 to 2/14/2012 with a high priority in List #1.
I would then issue a 3rd party web service call like this:
http://thirdpartywebservice.com/v1/lists?id=4128&from=2012-02-12&to=2012-02-14&priority=high
As you can see, some processing was done on the data (the id was changed in this case).
The results are then sent back to the requestor and to another web service via PUT.
My question is, how do I set up the Rails app to handle these types of behaviors? How does the controller structure change? This looks like a good use case for queues; how do I distribute multiple concurrent requests among queues?
For one thing, I don't need data persistence (the data can be discarded after the response is sent out), and the data structure design is simplified. (I don't think I need Ruby objects; simple dictionaries or hashes representing these would be lighter weight and quicker to implement.)
Edit
So I broke down the workflow of the app into these components:
Parse incoming request
Construct the 3rd party web service request
Send 3rd party request
Enqueue a worker to process the expected response
Process the response once it arrives
Send the parsed result back as a response
Which of the standard Rails controllers handle each of these steps? What models are needed besides Lists and Tasks?
You should still use a database because passing data to Resque is messy. Rather, you should store it in the database, pass the id to the workers, have them fetch the data, and then commit any new data or delete the record. It's really up to you, but this method is cleaner. You can also use a push service like Faye to let the user know when the processing is complete.
If you expect to have many concurrent requests, I would recommend Sidekiq as it's less of a memory hog. Having 4-5 Resque workers can already take up about 512 MB. The controller structure should not change. Please comment on anything you need clarified and I'll be happy to update my answer.
EDIT
You would want to use a separate database store, such as Postgres. I'm not sure it matters exactly what models you need, but essentially this is what should be happening.
In your controller, create a Request object which contains the query params you want to send to the 3rd party service. Then enqueue a job to be handled by Sidekiq/Resque (let's call it ThirdPartyRequest), passing in the id of the Request object you just created as an argument. Then render a view showing the Request object. At this point Request#response is still empty because it hasn't been processed yet, so let the user know it's still processing.
A worker then handles your ThirdPartyRequest job. ThirdPartyRequest fetches the Request object and obtains the query params needed to contact the third party service, makes the call, and gets a response. It then updates the Request object with this response and saves it.
class ThirdPartyRequest
  @queue = :third_party_requests # queue name (needed if this is a Resque job)

  def self.perform(request_id)
    request = Request.find(request_id)
    # contact the third party service and store its response on the Request
    request.response = ...
    request.save
  end
end
The user can continually refresh the page to check on their Request object. Once it gets updated with the response, they will know it's complete. If you want the page to refresh automatically, look into faye/juggernaut/private_pub or a SaaS solution like Pusher.
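The controller side of that flow might look roughly like this; the RequestsController name, the query_params column, and the permitted params are assumptions (request_record avoids clashing with Rails' own request object):

class RequestsController < ApplicationController
  def create
    # store the incoming query params so the worker can fetch them by id
    request_record = Request.create!(query_params: params.permit(:id, :from, :to, :priority).to_h)
    Resque.enqueue(ThirdPartyRequest, request_record.id) # Resque-style enqueue, matching the worker above
    # render/redirect to a view of the Request; its response stays empty until the worker finishes
    redirect_to request_path(request_record)
  end
end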
