user feedback for asynchronous tasks in rails - ruby-on-rails

I have an action that executes a possibly long-running task. Possibly, because it makes a request to a remote server, and network latency can block the user interface and introduce a small delay for the user.
My question is not about "how to send long tasks to the background", but about how to push a notification to the user. My idea was that the user clicks the button, it fires a task in the background, the web interface is unblocked, and the user can do whatever he wants; when the task is done, he receives a flash message. I can do it with AJAX by polling the server (a specific action that reports the status of my task, for example), but is there any pattern to do it event-based? Kudos for answers with a proof of concept or prototype.

No proof of concept here, but you could use something like spawn or delayed_job to fire off your Rails task and unblock the interface, and then communicate back to the client with node.js or something similar. Depending on what you want to do, however, long-polling may be more practical than setting up more server software.
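For the long-polling fallback the answer mentions, the mechanics can be sketched without any framework at all. This is a toy, in-process stand-in (names like `flash_message` are illustrative); in the real app the loop would be AJAX calls against a status endpoint:

```ruby
# Toy, framework-free simulation of the polling approach: the "server"
# runs the slow task in a background thread and exposes its status,
# which the "client" checks until the task reports completion.
status = { done: false, message: nil }
lock = Mutex.new

Thread.new do
  sleep 0.1 # stand-in for the slow remote request
  lock.synchronize { status[:done] = true; status[:message] = "task finished" }
end

# In the browser this loop would be a setInterval firing AJAX requests
# at a status action; here it is just a sleep-and-check loop.
sleep 0.02 until lock.synchronize { status[:done] }
flash_message = lock.synchronize { status[:message] }
puts flash_message
```

The mutex is what a database row or cache key would give you for free in a real Rails app: a single place both sides can safely read and write the task's state.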

Related

How to process a request with Rails while not locking browser

I've built a CRUD app that allows clients to scrape links. When the client clicks a button, Rails goes to the controller and runs the script (I can see all the activity in the terminal), but there is no feedback on the frontend. Also, the user can't visit any other pages on the website while the script is running.
The script can take a long time, so I want the client to be able to click a button, be redirected to another page, and have the process start. The user can leave the page if he wants.
I would also like some way to send an email to the user after it's completed.
My backend would be able to run many tasks at once, right?
You need a background worker. The idea is to initiate the work when the user clicks a button, then let the background worker perform the hard work while the user continues browsing.
At the end the user is notified (by email or other means) and the job result is accessible.
Of course, several workers can run at the same time.
Have a look at sidekiq, resque or delayed_job.
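The pattern those gems implement can be sketched in a few lines of plain Ruby. This is a toy, in-process illustration only; a real worker like Sidekiq runs in a separate process and survives restarts:

```ruby
# Minimal in-process illustration of the background-worker pattern:
# a queue of jobs drained by a dedicated thread, so the caller
# (the "controller") returns immediately.
class TinyWorker
  def initialize
    @queue = Queue.new
    @thread = Thread.new { loop { @queue.pop.call } }
  end

  # Returns immediately; the hard work happens on the worker thread.
  def enqueue(&job)
    @queue << job
  end
end

results = Queue.new
worker = TinyWorker.new
worker.enqueue { results << "scraped 42 links" } # stand-in for the long scrape
result = results.pop # blocks only here, where we explicitly wait
puts result
```

Because many such workers can drain the same queue, the "many tasks at once" part of the question falls out of the design for free; Sidekiq, for instance, runs a configurable number of worker threads per process.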
You can try EventMachine, if you are using a server that supports it (Thin, for example):
EM.next_tick{ p "hello, it's next tick!"}
will be printed asynchronously.

Pusher in background job is outpacing my app - missing JS notifications

I'm moving imports of excel files (often large, often small) into the background using Sidekiq, and alerting the user when the importing is complete using Pusher.
Currently, in the synchronous flow (this is a Rails app), the app kicks off the Excel import and then redirects to a dashboard page, which receives notifications from Pusher when the importing is done.
The problem is the Sidekiq-Pusher flow sometimes goes faster than the redirect to the dashboard page. My JavaScript subscriber won't be initialized in time to receive the message that is being published from within the background process. So the client gets nothing.
Does Pusher offer a way to delay publishing a message until there is a subscriber? Or does Pusher offer a way to 'stockpile' messages until the subscriber springs to life to consume them? Or maybe there is a simpler solution here I have not thought of?
FYI, I don't want the background job to sleep for a few seconds to make sure the client is ready, and I don't want to use Pusher to trigger a refresh (i.e. save something in a DB, then refresh to display it).
I am happy to provide code samples if desired.
EDIT:
I'm certainly open to using something else besides Pusher if something else can solve my problem.
Schedule the worker to run 1 or 2 seconds from the current time, so that the job runs and the message is shown after the user has been redirected to the landing page:
TestProcess.perform_at(2.seconds.from_now, parameters)
It will work as expected. Hope it helps :)
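What `perform_at` buys you can be illustrated with a toy, thread-based stand-in (Sidekiq's real scheduler is Redis-backed and far more robust than this sketch):

```ruby
# Toy stand-in for Sidekiq's perform_at: run the job after a delay so
# the browser has time to redirect and subscribe before the
# notification fires.
def perform_at(delay, &job)
  Thread.new do
    sleep delay
    job.call
  end
end

notifications = Queue.new
perform_at(0.1) { notifications << "import finished" } # "publish" after 0.1s

# Meanwhile the redirect completes and the subscriber comes online...
message = notifications.pop # blocks until the scheduled job publishes
puts message
```

Note this only narrows the race rather than eliminating it; a slow page load can still miss the message, which is why persisting the job's final state somewhere the page can query on load is the more airtight fix.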

Suggestions for how to write a service in Rails 3

I am building an application which will send status requests to users (via email & sms) on a regular basis. I want to execute the service each hour which will:
Query the database for all requests that need to be sent (based on some logic)
Send the requests through Amazon's Simple Email Service (this is already working)
Write a record of the status request notification back to the data store
I am considering wrapping up this series of operations into a single controller with an end point that can be called remotely to kick off the process within the rails app.
Longer term, I will break this process out into an app that can be run independently of my rails app, but for now I'm just trying to keep it simple.
My first inclination is to build the following:
Controller with the following elements:
A method which will orchestrate the steps outlined above (and can be called externally)
A call to the status_request model which will bring back a collection of requests needing to be sent
A loop to iterate through the pending requests, which will:
Make a call to my AWS Simple Email Service module to actually send the email, and
Make a call to the status_request model to log the request back to the database
Model:
A method on my status_request model which will bring back a collection of requests that need to be sent
A method in my status_request model which will log that a notification was sent
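A plain-Ruby sketch of that controller/model split (the `StatusRequest` API and the `Notifier` module are illustrative stand-ins, not real gems or Rails classes):

```ruby
class StatusRequest
  attr_reader :id, :email
  attr_accessor :sent_at

  def initialize(id, email)
    @id, @email = id, email
  end

  # Model method: requests that still need to be sent.
  def self.pending(all)
    all.reject(&:sent_at)
  end

  # Model method: log that the notification went out.
  def mark_sent!
    self.sent_at = Time.now
  end
end

module Notifier
  # Stand-in for the AWS Simple Email Service call.
  def self.deliver(request)
    "sent status request #{request.id} to #{request.email}"
  end
end

# Controller/orchestrator: fetch pending requests, send each, log it.
def run_status_requests(all)
  StatusRequest.pending(all).map do |req|
    msg = Notifier.deliver(req)
    req.mark_sent!
    msg
  end
end

requests = [StatusRequest.new(1, "a@example.com"),
            StatusRequest.new(2, "b@example.com")]
requests.last.mark_sent! # pretend this one was already handled
log = run_status_requests(requests)
puts log
```

Keeping the query (`pending`) and the logging (`mark_sent!`) on the model, with the controller only orchestrating, is exactly the split the question proposes, and it makes the loop trivial to move into a rake task or background worker later.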
Since this will behave as a service that gets called periodically from an outside scheduler I don't think I'll need a view for this operation. (Will, of course, need views to show users and admins what requests have been sent, but that's later...).
As someone new to Rails, I'm asking for review of this approach and any suggestions you may have.
Thanks!
Instead of a controller, which as Jeff pointed out exposes a security risk, you may just want to expose a rake task and use cron to invoke it on an hourly basis.
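A minimal sketch of the rake-plus-cron approach (the task name and crontab line are hypothetical, and the `:environment` task is stubbed here so the snippet runs outside Rails):

```ruby
require "rake"
extend Rake::DSL

# In a real app this lives in lib/tasks/status_requests.rake and an
# hourly crontab entry invokes it, e.g.:
#   0 * * * * cd /path/to/app && bundle exec rake status_requests:send RAILS_ENV=production
task :environment # stub for the Rails environment dependency

namespace :status_requests do
  desc "Send all pending status request notifications"
  task send: :environment do
    $log = "sending pending status requests" # real query/send/log work goes here
  end
end

Rake::Task["status_requests:send"].invoke
puts $log
```

Because the task never goes through the HTTP stack, there is no endpoint to secure; cron's access to the box is the only credential involved.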
If you are still interested in building a controller, look at devise gem and its single access token, token_authenticatable, for securing the methods you are exposing.
You may also want to look at delayed_job or resque to offload the status_request call and the AWS Simple Email Service loop to a background worker process.
You may want a separate controller and view for the log file so you can review progress on demand.
And if you want to get real fancy use Amazon SNS to send you alerts when the service reaches some unacceptable level of failures, backlog, etc.
Since you are trying to invoke this from an outside process, your approach should work. You could also have a worker process that processes tasks as they arrive.
You will need routes to expose your service, and you may want to also make security decisions. How will the service that invokes your application authenticate so all others can't hit it at will?
Another consideration is how many emails you are sending. If the volume is high enough, this sort of loop can become extremely top-heavy and may affect users on the current system if it's a web application.
In the end, there are many ways to do this. I would focus on the performance/usage you expect as well as security. There's never one perfect way to solve a problem like this, and your way should just be aware of the variables it will need to be operating within.
Resque and Redis might be helpful to you in scheduling and performing these operations. They are simple and super fast; [here](http://railscasts.com/episodes/271-resque) is a simple tutorial on the same.

How to do an ajax callback after a Delayed Job has finished in Ruby on Rails?

I allow users on my site to rotate their photos. I accomplish this by an ajax call to a Delayed_Job process (via Heroku) that rotates the photo. After they press "rotate photo", I show a loading spinner. But my question is this: what is the best way for my page to know when the Delayed_Job is complete, so I can load the new photo?
Do I need to have a continuous ajax polling of my server to determine if the Delayed Job is complete? Or is there any way I can implement an ajax callback to my page that will notify my page when the Delayed Job has finished?
Thanks in advance.
There's a bunch of ways to deal with this kind of thing. You could do ajax polling as you've mentioned, you could use the comet approach where you essentially leave a connection open until whatever it is on the server has completed, or you could even go all out and use web sockets (probably a bit overkill for this task though).
Without sockets, there's currently no way to have your server send a message to the client, without the client requesting it.
In any case, you should decide whether backgrounding the task warrants all the extra work of dealing with polling/comet/sockets. Rotating an image shouldn't take long at all. Depending on whether you can afford to lock up a server process, it'd be a lot simpler to just do the image manipulation in the foreground (not delayed_job). Then, when the ajax request to that action has completed, you know the task is completed.

How to send many emails via ASP.NET without delaying response

Following a specific action the user takes on my website, a number of messages must be sent to different emails. Is it possible to have a separate thread or worker take care of sending multiple emails so as to avoid having the response from the server take a while to return if there are a lot of emails to send?
I would like to avoid using system processes, scheduled tasks, or email queues.
You can definitely spawn off a background thread in your controller to handle the emails asynchronously.
I know you want to avoid queues, but another thing I have done in the past is write a Windows service that pulls email from a DB queue and processes it at certain intervals. This way you can separate the two applications if there is a lot of email to be sent.
This can be done in many different ways, depending on how large your application is and what kind of reliability you want. Any of these ways should help you achieve what you want (in ascending order based on complexity):
If you're using IIS SMTP Server or another mail server that supports a pickup directory option, you can go with that. With this option, instead of sending the emails directly, they are saved first in the pickup directory. Your call will immediately return after the email is saved in the pickup directory, so the user won't have to wait until the email is sent. On the other hand, the server will try to send the email as soon as it's saved in the pickup directory so it's almost immediate (just without blocking the call).
You can use a background thread like described in other answers. You'll need to be careful with this option as the thread can end unexpectedly before it finishes its job. You'll need to add some code to make sure this works reliably (personally, I'd prefer not to use this option).
Using a messaging queue server like MSMQ. This is more work and you probably should only look into this if you have a large scale application or have good reasons not to use the first option with the pickup directory.
There are a few ways you could do this.
You could store enough details about the message in the database, and write a windows service to loop through them and send the email. When the user submits the form it just inserts the required data about the message and trusts the service will pick it up. Almost an email queue which you said you didn't want, but you're going to end up in a queue situation with almost any solution.
Another option would be to drop in NServiceBus. Use that for these kinds of tasks.
I typically compile the message body and store that in a table in the db along with the from and to addresses, a subject, and a timestamp indicating when the email was sent. Then I have a background task check the table periodically and pull any that haven't been sent. This task attempts to send each email and updates the timestamp accordingly. One advantage of storing the compiled message body up front is that the background task doesn't have to do any processing of context-specific data, and therefore can be pretty darn simple.
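That pattern is language-agnostic; here is a Ruby sketch of the same outbox idea, with a `Struct` standing in for the database table and a string for the actual SMTP delivery:

```ruby
# "Outbox" pattern: the web request only inserts a row with the
# pre-compiled message body; a background task drains unsent rows.
Outbox = Struct.new(:to, :subject, :body, :sent_at)
outbox = []

# Web request: compile the body up front, insert a row, return at once.
outbox << Outbox.new("a@example.com", "Welcome", "Hi there!", nil)

# Background task: periodically pull unsent rows, send, stamp the time.
delivered = outbox.reject(&:sent_at).map do |mail|
  mail.sent_at = Time.now # mark as sent so it is not picked up again
  "delivered '#{mail.subject}' to #{mail.to}"
end
puts delivered.first
```

Storing the compiled body up front is what keeps the sender simple: it never needs the request context that produced the email, only the rows in front of it.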
Whenever an operation like this hinges on an event, there is always the possibility something will go wrong.
In ASP.NET you can spawn multiple threads and have those threads do the action. Make sure you tell the thread it's a background thread, otherwise ASP.NET might wait for the thread to finish before rendering your page:
myThread.IsBackground = true;
I know you said you didn't want to use system processes or scheduled tasks, but a Windows service would be a viable approach to this as well. The approach would be to use MSMQ, or to save the actions needing to be done in a database table, then have a Windows service check every minute or so and perform those actions.
This way, if something fails (Email server down) those emails / actions can still be done.
They will also be recorded for audits (which is very nice to have).
This method allows your web site to function as a website while offloading these tasks to another service. The last thing you need is for multiple ASP.NET processes to be tied up waiting for emails to send. Let something else handle that.
