A certain action in my controller takes a long time to process (heavy DB work), so when a user clicks "submit" on the form they have to wait for the whole thing to finish, which takes quite a while. Is there any way that, on submitting, the user is redirected to the next view without delay while the processing continues in the back end?
Thanks & cheers!
When the user's request is made, queue up the job and then redirect the request where you want it.
There are two popular Ruby Gems for job processing:
Delayed Job
Resque
Delayed Job is probably the easier of the two to set up, since it does not require Redis.
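A rough sketch of the Delayed Job route (Report and process_heavy_data are hypothetical names; the delay call comes from the delayed_job gem): enqueue the slow work and redirect right away.

class ReportsController < ApplicationController
  def create
    report = Report.create!(params[:report])
    # Enqueue the heavy work; it runs later in a worker process,
    # so this request returns immediately.
    report.delay.process_heavy_data
    redirect_to report_path(report), :notice => "We're processing your report."
  end
end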
For things like this, I usually dump the work into a database queue and then use a cron job to actually run it.
For instance, say I had to send out an email to all the clients using the software. I'd put the message into a database table, along with some information about who should get it, and then a cron job would actually do the sending.
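A rough sketch of that idea, with assumed names (a QueuedEmail model and an emails:deliver_queued rake task): the controller only records what should go out, and cron does the sending later.

class MessagesController < ApplicationController
  def create
    # Record what needs to go out; nothing is actually sent during this request.
    params[:client_ids].each do |client_id|
      QueuedEmail.create!(:client_id => client_id, :body => params[:body])
    end
    redirect_to messages_path, :notice => "Queued for delivery."
  end
end

# crontab entry that drains the table every five minutes:
# */5 * * * * cd /path/to/app && RAILS_ENV=production rake emails:deliver_queued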
It sounds to me like you need to fork the process that takes so long.
For example:
fork { "this code is being run in the background" }
The problem is that this code won't play nicely with SQL, since the database connection isn't persistent across the fork. To handle this problem I've been using the spawn plugin for a while, with excellent results.
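If you do roll your own fork, the usual fix for the connection problem is to open a fresh database connection in the child. A rough sketch (HeavyCalculation is a hypothetical stand-in for the slow work; the spawn plugin handles this bookkeeping for you):

class ReportsController < ApplicationController
  def create
    pid = fork do
      # The child must not reuse the parent's DB socket, so open a fresh one.
      ActiveRecord::Base.establish_connection
      HeavyCalculation.run(params[:id])
      exit!   # skip at_exit hooks in the child
    end
    Process.detach(pid)   # don't leave a zombie process behind
    redirect_to reports_path
  end
end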
Related
I'm using Resque in my application to run background jobs. The background jobs take a considerable amount of time to complete, and that's why I want to display the status of the jobs to the end user, so they know roughly when the tasks will be completed. I'm having a difficult time finding a solution to this problem; any help would be highly appreciated. Thanks!
Have you looked into the resque-status gem? The gem will give you a hash that you can query for the status of the job. Next, you'll need to figure out the best way to notify the user.
Personally, I think the most straightforward method would be to just send an email when the job is complete. If you want to notify the user in their web browser, you'll probably need to implement some sort of pub/sub system that fires off a notification to alert the browser. This is reasonably complicated, so just sending an email is probably your best option.
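A rough sketch of how resque-status is typically used (ImportJob and row_count are made-up names): the worker reports progress with at, and you look the job up later by the id returned from create.

class ImportJob
  include Resque::Plugins::Status

  def perform
    total = options['row_count']
    total.times do |i|
      # ... process one row ...
      at(i + 1, total, "Processed #{i + 1} of #{total}")   # record progress
    end
    completed("Import finished")
  end
end

# Enqueue and hold on to the returned job id (e.g. in the session):
job_id = ImportJob.create('row_count' => 100)

# Later, look the job up to report progress to the user:
status = Resque::Plugins::Status::Hash.get(job_id)
status.pct_complete   # => 0..100
status.status         # => "queued", "working", "completed", ...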
I am new to Rails. I have a background job that runs and takes about a minute. I want to display a message in the view after the job is complete. How would I do that?
Unfortunately there is no really simple solution for this one; I'll give you a few ideas for how you could handle the problem.
The simplest solution would be to just send the user an email when the job finishes. I know this is not what you asked for, but it is a quick and easy way to let the user know some long-running process is done.
You could make an API endpoint that returns the state of the task and then use JavaScript to poll that endpoint every X seconds (a rough sketch of the server side follows these options). The exact implementation varies depending on what the background job is.
You could use something like websocket-rails to open a two-way connection with the browser. This way you could send a message to the browser to update the view once the background job is done.
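For the polling option, a minimal sketch of the server side (JobRecord, done? and progress are assumed names); the client would then hit it with something like $.getJSON('/jobs/123') every few seconds:

class JobsController < ApplicationController
  def show
    job = JobRecord.find(params[:id])
    render :json => { :done => job.done?, :progress => job.progress }
  end
end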
I'm transferring a Ruby app I once made into Rails.
Now the app does some calculations that take a while (up to infinity (in theory) if you like :p).
To show a user the status of everything, I previously used the console. Now, obviously, I want my browser to show this.
Does anyone have any pointers on where to start reading / examples / gems / ideas?
I'm pretty new to web development, but I've heard of jQuery; could that possibly do the trick?
If your computations take a long time you will want to pass them off to a background job processor. There are several gems that can help you do this. Here are a few, with tutorials on how to use them with Rails (a minimal Sidekiq sketch follows the list of links below).
Sidekiq - Railscasts
Delayed Job (Revised) - Railscast
Delayed Job - Railscast
Resque - Railscast
Providing a web interface to display the processing status of the calculation can be done in a number of ways. One way might be with polling.
Polling for Changes - Railscast
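As a minimal sketch of the Sidekiq option (Calculation and run! are assumed names), the controller enqueues the work and returns immediately while the worker churns through it:

# app/workers/calculation_worker.rb
class CalculationWorker
  include Sidekiq::Worker

  def perform(calculation_id)
    calculation = Calculation.find(calculation_id)
    calculation.run!   # the slow computation
  end
end

# Enqueue from a controller; this call returns immediately:
CalculationWorker.perform_async(calculation.id)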
As I understand it, you have two options for this:
1 - Use some kind of server-push method. You could use one of the following components:
Juggernaut (http://juggernaut.rubyforge.org/)
http://www.ape-project.org/
2 - Use PeriodicalUpdater with jQuery. This sends a request to the server at a given time interval.
You can populate a DB table, memcache, or any other datastore with your status and write a method that reads and returns the value; that method can then be called via Ajax.PeriodicalUpdater.
I have done this, but it kills performance because it keeps hitting the server (in my case, every 5 seconds).
Even though I haven't done it personally, I think the server-push option is the more methodical way to go.
HTH
cheers
sameera
Rails 4 now supports live streaming. You can use background task processing as Jason R recommended and then, at the end of the task, push the results onto an open live stream, for example using Redis pub/sub to return async results from the workers to the live-stream controller.
It's better than polling the server with PeriodicalUpdater because it removes unneeded requests from the client, but it requires a free socket for every connected client.
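A minimal Rails 4 live-streaming sketch, assuming the redis gem and that your workers publish progress messages to a 'job_status' channel (both of those are assumptions, not part of Rails itself):

class JobEventsController < ApplicationController
  include ActionController::Live

  def index
    response.headers['Content-Type'] = 'text/event-stream'
    redis = Redis.new
    redis.subscribe('job_status') do |on|
      on.message do |_channel, message|
        response.stream.write("data: #{message}\n\n")   # push to the browser as an SSE event
      end
    end
  rescue IOError
    # client disconnected
  ensure
    redis.quit
    response.stream.close
  end
end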
I just found a super tool :) Add this script to your project:
<script src='https://gist.githubusercontent.com/vitalyp/9441352/raw/5be994fbc78bd2bcc7ad31192f095c888d02f819/myconsole.js'></script>
and somewhere in document.ready (or from the browser console), invoke the function:
pop_console();
It displays a window with console.log(...) strings.
I have a page with a long list of items, each with a checkbox next to it. There's a jQuery check-all function, but when I submit all of them at once, the request times out because it's doing a bunch of queries and inserting a bunch of records in the MySQL database for each item. If it didn't time out, it would probably take about 20 minutes. Instead, I just submit about 30 at a time.
I want to be able to just check all and submit and then just go on doing other work. My coworker (1) said I should just write a rake task. I did that, but I ended up duplicating code, and I prefer the user interface because what if I want to un-check a few? The rake task just submits them all.
Another coworker (2) recommended I use fork. He said that would spawn a new process that would run on the server but allow the server to respond before it's done. Then, since an item disappears after it's been submitted, I could just refresh the page to check if they're done.
I tried this on my local machine; however, it still seems that Rails waits for the process to finish before it responds to the POST request sent by the HTML form. The code I used looks like this:
def bulk_apply
  pid = fork do
    params[:ids].each do |id|
      Item.find(id).apply # takes a LONG time, esp. x 100
    end
  end
  Process.detach(pid) # reap child process automatically; don't leave it running
  flash[:notice] = "Applying... Please wait... Then, refresh page. Only submit once. PID: #{pid}"
  redirect_to :back
end
Coworker 1 said that generally you don't want to fork Rails because fork creates a child process that is basically a copy of the Rails process. He said if you want to do it through the web GUI, use BackgroundJob (Bj) (because we're already using that in our Rails app). So, I'm looking into BackgroundJob now, but what do you recommend?
I've had good success using Background Job. If you need Rails, you will be using script/runner, which still starts up a new process with Rails loaded. The good thing is that Background Job will make sure there is never more than one job running at a time.
You can also use script/runner directly, and even kick it off in the background like so:
system " RAILS_ENV=#{RAILS_ENV} ruby #{RAILS_ROOT}/script/runner 'CompositeGrid.calculate_values(#{self.id})' & " unless RAILS_ENV == "test"
The ampersand tells the shell to run the command in the background. Be careful, because you probably don't want a bunch of these running at the same time. I would definitely take advantage of Background Job if it is already available.
You should check out IronWorker. It would be super easy to do what you want, and it doesn't matter how long it takes.
In your action you'd just instantiate a worker which has the code that's doing all your database queries. Example worker:
Item.find(id).apply # takes a LONG time, esp. x 100
And here's how you'd queue up those jobs to run in parallel:
require 'iron_worker_ng'

client = IronWorkerNG::Client.new
ids.each do |id|
  client.tasks.create("MyWorker", "id" => id)   # queue one task per item, run in parallel
end
That's all you'd need to do and IronWorker takes care of the rest.
Try the delayed_job gem. It is a database-backed background job gem. We used it on an e-commerce website; for example, sending the order-confirmation email to the user is an ideal candidate for a delayed job.
Additionally, you can try multi-threading, which Ruby supports. This could make things run faster, since forking an entire process tends to be expensive due to memory usage.
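A rough sketch of the delayed_job approach for the order-confirmation example (Order, OrderMailer and send_confirmation_email are assumed names):

class Order < ActiveRecord::Base
  def send_confirmation_email
    OrderMailer.confirmation(self).deliver
  end
  # Run the method above in a delayed_job worker instead of in the request:
  handle_asynchronously :send_confirmation_email
end

# Or, without the declaration, enqueue a single call ad hoc:
# order.delay.send_confirmation_email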
How would I go about sending an email to a user, say, 48 hours after they sign up, in Ruby on Rails? Thanks!
As Joseph Daigle mentioned, you obviously need to record the exact date and time the user registered. After that, you need a cron job running every so often (every hour, for example) that checks for users who registered more than 48 hours ago, sends the mail to each of them, and marks them as already emailed so you don't email them again.
As per the actual mail sending, check out the following documentation page:
http://wiki.rubyonrails.org/rails/pages/HowToSendEmailsWithActionMailer
It has all you need to know to send mails with RoR.
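A rough sketch of the cron side, using modern ActiveRecord query syntax and an assumed follow_up_sent_at column to mark users who have already been emailed:

# lib/tasks/follow_up.rake -- run from cron, e.g. hourly:
#   0 * * * * cd /path/to/app && RAILS_ENV=production rake emails:follow_up
namespace :emails do
  task :follow_up => :environment do
    User.where("created_at <= ? AND follow_up_sent_at IS NULL", 48.hours.ago).find_each do |user|
      UserMailer.follow_up(user).deliver
      user.update_column(:follow_up_sent_at, Time.now)
    end
  end
end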
I recommend that you use the latest version of BackgrounDRb to handle this. You can read about BackgrounDRb here: http://backgroundrb.rubyforge.org/
In order to queue a message for later delivery, the BackgrounDRb client code (in your application model's after_create callback, maybe) could look something like this:
MiddleMan(:email_worker).enq_send_email_task(:message => @message,
                                             :job_key => "notify1",
                                             :scheduled_at => Time.now + 48.hours)
You'd have to build a BackgrounDRb worker to handle sending the email:
# RAILS_ROOT/lib/workers/email_worker.rb
class EmailWorker < BackgrounDRb::MetaWorker
  set_worker_name :email_worker

  def send_email_task(message)
    # ... Code to send the email message
  end
end
Note that in order to use BackgrounDRb in this way, you have to use persistent job queues, so make sure you run the migration included with BackgrounDRb to set up the persistence table in your application.
BackgrounDRb is started separately from Rails (mongrel, apache, etc) using 'script/backgroundrb start', so make sure that you add the daemon to whatever process monitoring you're using (god, monit, etc) or that you create an /etc/init.d script for it.
First you're going to need a running daemon or background service that can poll your queue (probably stored in a database) every few minutes.
The algorithm is pretty simple: record the time of the user event in the queue; when the daemon checks that item and the time difference is greater than 48 hours, prepare and send the e-mail.
You can queue jobs with a delay using async observer. Ideally, anything you have that isn't known to be instant (or very close to it) all the time should pass through something like that.
I wrote a plugin called acts_as_scheduled that may help you out.
acts_as_scheduled allows you to manage scheduled events for your models. A good example of this is scheduling the update of RSS feeds in a background process using Cron or BackgroundRB.
With acts_as_scheduled your schedule manager can simply call "Model.find_next_scheduled()" to grab the next item from the database.
How I would approach this is by creating a scheduling controller that queries the database for the next scheduled item and then uses a mailer to send the message. Then you set up a cron job to call the controller periodically using wget or curl. The advantage of the Cron/Controller approach is that no further infrastructure or configuration is required on the server, and you avoid complicated threading code.
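A rough sketch of that Cron/Controller approach (Feed and refresh! are assumed names; find_next_scheduled is assumed to return nil once nothing is due):

class SchedulerController < ApplicationController
  def run
    while item = Feed.find_next_scheduled
      item.refresh!   # whatever work the scheduled item needs
    end
    head :ok
  end
end

# crontab entry hitting the controller every 15 minutes:
# */15 * * * * curl -s http://localhost:3000/scheduler/run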
I think I'd be inclined to store the need for the email and the earliest time after which it should be sent, somewhere separate, then have my things-to-do task look at that. That way I only have to process as many records as there are emails to be sent, rather than examine every user every time, which would either get tedious or require an otherwise probably unnecessary index. As a bonus, if I had other tasks to be performed on some sort of a diarised basis, the same construct would be useful with little modification.