Spinning Background Tasks in Rails

What is the preferred way to create a background task for a Rails application? I've heard of Starling/Workling and the good ol' script/runner, but I am curious which is becoming the de facto way to manage this need?
Thanks!
Clarification: I like the idea of Rake in Background, but the problem is, I need something that is running constantly or every 10 hours. I am not going to have the luxury of sitting on a web request; it will need to be started by the server, asynchronously to the activities occurring on my site.

Ryan Bates created three great screencasts that might really help you:
Rake in Background
Starling and Workling
Custom Daemon
He talks about the various pros and cons for using each one. This should help you get started.

It depends on your needs.
Try out delayed_job, which was created by Tobi, a Shopify founder (his original repository was last updated in 2011).
There are forks by DHH (last updated 2008) and by collectiveidea (last updated 20 days ago as of 6/28/2018).

I usually rely on cron job scheduling, as it gives flexibility without having to write separate code to schedule things. Anything that can be executed from a shell can be scheduled, whether it's a Ruby script, a rake task, a Python or bash script, or anything else you like.
If running on Windows, you can use Scheduled Tasks.
Hope this helps.

async_observer is the best. It doesn't busy-wait or lose jobs on worker crashes the way Starling does, there's no DB polling, and it integrates into Rails remarkably well.
I push tons of jobs through it and it pretty much doesn't care.

Most of the plugins that have been mentioned will do the job, but if all you need is a Rake task run on a set schedule, then there's really no need to start throwing more architecture at it.
Just add a cron job which executes
"cd /path/to/rails/app; RAILS_ENV=production rake run:my:task"
Why reinvent the wheel, when Unix like operating systems have been running tasks on a schedule for decades?
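For the asker's "every 10 hours" case, the crontab entry might look roughly like this (paths and task name are placeholders, and note that */10 in the hour field fires at 00:00, 10:00 and 20:00 each day, which is close to but not exactly every 10 hours):

0 */10 * * * cd /path/to/rails/app && RAILS_ENV=production rake run:my:task >> log/cron.log 2>&1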

I have used the daemons plugin in the past.

While I don't know if it is becoming a standard, I have had great success with BackgroundRB. I have several workers, some are long running tasks triggered by a user action while others are started on a schedule.

Have a look at Taskr. It's basically like cron, but with a RESTful web interface. You can use it to schedule tasks to periodically connect to your Rails app and trigger arbitrary code (via the Taskr4rails plugin). It's meant to fit nicely into a system built around RESTful services, plus it can notify you if a task returns an error, fails to run, etc.

Related

Rails Background Process (Heroku Rails 3+)

I'm going to set up some functionality for my app, which is Rails 3.2.3 and on Heroku. The idea is to have a task, or job (or whatever you want to call it), run every day to make sure user information from the external API is up to date with the user information in my db. I'm curious: what is the best way to set this up? Should it be a cron job that runs a rake task?
Seems like there are quite a few ways to do this and I'm interested in how others are doing it. The only way I can think to do it is to run a rake task in a cron job, but I would love to figure out what the best practices are, or the simplest way to do it. Seems like there are a lot of ways to skin this cat... lots of different tools out there too.
If there was a pure rails way to do this, I think that would be better so I don't have to screw around with every system I place my app onto.
For a simple sync job that runs once a day, I believe a cron job would be sufficient and likely more stable in the long run.
Honestly, solutions such as Resque and Sidekiq are a bit overkill for your needs, in my opinion. You're still required to use a scheduler to send messages to these systems.
Check out the gem 'whenever' if you're looking at making the deployment and writing of crontabs easier: https://github.com/javan/whenever/
Railscasts regarding 'whenever': http://railscasts.com/episodes/164-cron-in-ruby
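As a sketch, the config/schedule.rb for whenever could describe the daily sync like this (the rake task name is just an assumed example):

every 1.day, at: '4:30 am' do
  rake "users:sync_from_api"   # hypothetical task that refreshes user data from the external API
end

Running `whenever --update-crontab` then writes the corresponding crontab entry for you.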
There are two options that are better than the ones you mentioned in your question:
Resque.
Sidekiq.
Try the latter. It is faster and lightweight because it is based on multithreading, so it doesn't interfere with the rest of the system as much. For processing every day you'll need to look into each gem's scheduler extension; a minimal Sidekiq worker is sketched below.
Hope this helps!
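If you do go with Sidekiq, a minimal worker is just a class with a perform method (the class name and sync logic here are illustrative); the recurring trigger would still come from a scheduler extension such as sidekiq-scheduler or plain cron:

class UserSyncWorker
  include Sidekiq::Worker

  def perform(user_id)
    user = User.find(user_id)
    # hypothetical call to the external API mentioned in the question
    user.update_attributes(profile: ExternalApi.fetch_profile(user))
  end
end

# enqueue from anywhere in the app:
UserSyncWorker.perform_async(user.id)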
Use the Heroku Scheduler add-on to handle the scheduling itself. You can have it run a rake task, a resque job, or whatever.
Here are a few to choose from:
resque (with resque-scheduler, but you have to run Redis alongside it)
rufus-scheduler (if you want something simple; resque-scheduler itself uses rufus-scheduler, and there's a sketch of it below)
You may also try delayed_job with a few tricks like this one. It's not that great for scheduling, but it can use your application database.
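Of those, rufus-scheduler is the lightest to sketch; it just needs a long-running Ruby process (the UserSync call and interval are made up, and this assumes a recent rufus-scheduler):

require 'rufus-scheduler'

scheduler = Rufus::Scheduler.new

scheduler.every '24h' do
  UserSync.run   # hypothetical: refresh user info from the external API
end

scheduler.join   # keep the process alive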

Best current rails background task method?

I am trying to find out the best way to run scripts in the background. I have been looking around and found plenty of options, but many/most seem to have become inactive in the past few years. Let me describe my needs.
The rails app is basically a front-end to configure when and how these scripts will be run. The scripts run and generate reports and send email alerts. So the user must be able to configure the start times and how often these scripts will run dynamically. The scripts themselves should have access to the rails environment in order to save the resulting reports in the DB.
Just trying to figure out the best method from the myriad of options.
I think you're looking for a background job queuing system.
For that, you're either looking for resque or delayed_job. Both support scheduling tasks at some point in the future -- delayed_job does this natively, whereas resque has a plugin for it called resque_scheduler.
You would enqueue jobs in the background with parameters that you specify, and then at the time you selected they'll be executed. You can set jobs to recur indefinitely or a fixed number of times (at least with resque-scheduler, not sure about delayed_job).
delayed_job is easier to set up since it saves everything in the database. resque is more robust but requires you to have redis in your stack -- but if you do already it's pretty much the ideal solution for your problem.
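To make that concrete, a plain Resque job plus a resque-scheduler entry might look roughly like this (class name, queue and timing are illustrative):

class ReportJob
  @queue = :reports

  def self.perform(report_id)
    Report.find(report_id).generate!   # hypothetical work with full access to the Rails models
  end
end

# e.g. in an initializer, assuming resque-scheduler is loaded:
Resque.schedule = {
  'nightly_report' => {
    'cron'        => '0 2 * * *',
    'class'       => 'ReportJob',
    'args'        => [1],
    'description' => 'Generate the nightly report'
  }
}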
I recently learned about Sidekiq, and I think it is really great.
There's also a RailsCast about it - Sidekiq.
Take a look at the gem whenever at https://github.com/javan/whenever.
It allows you to schedule tasks like cron jobs.
Works very well under Linux, and the last commit was 14 days ago. A friend of mine used it in a project and was pretty satisfied with it.
Edit: take a look at the delayed_job gem as well; it is good for executing long tasks in the background, and useful when a cron job exists only to kick off other tasks.

Background process in Rails 3

I am writing a Web app that will need to run a background process that will poll a web service every minute or so and then query my Rails db and send out alerts to users of the app via Twitter. I have researched this a lot but I feel I am just going around in circles. I have come across delayed_job, background_job and a few other options like creating a custom daemon suggested in a Railscast. Does anyone have any suggestions for the best way to do this? The process will have to run constantly in the background and won't be triggered by an event in the front end. Any help or guidance would be appreciated.
Why don't you just create a rake task and add it to your CRON execution?
You can even use Whenever to configure this for you.
I used Beanstalkd for this and can recommend it.
http://railscasts.com/episodes/243-beanstalkd-and-stalker
You can simply use cron for tasks that have to be executed every X minutes, hours, etc.
The whenever gem is useful for setting this up with Rails: https://github.com/javan/whenever
I don't know much about delayed_job. But you can check out some tutorials, for example this article on heroku: http://devcenter.heroku.com/articles/delayed-job
I used delayed_job for our application.
While working on this, we researched many sites and finally got it working.
We wrote up our experience at the following link:
http://www.kyybaventures.com/blog/rails-delayed-job#more-2916
Hope this helps you get started with background processes in Rails 3.
We can either use backgroundrb or unix crontab.
Crontab will do the job if you don't need to push heavy processing out of the application's request cycle to run asynchronously.
BackgroundRb consumes a lot of memory and CPU in production if any of its processes hangs, and you also need to configure a monitoring tool to make sure the background process keeps running.

delayed_job, daemons or other gem for recurring background jobs

I need to build a background job that goes through a list of RSS feeds and analyze them say every 10 minutes.
I have been using delayed_job for handling background jobs and I liked it a lot. I believe, though, that it's not built for recurring background jobs. I guess I could auto-schedule a new background job at the end of each run (maybe wrapped in begin..rescue just to ensure it gets executed), or pre-schedule, say, a month's worth of jobs in advance and have another job that reschedules them every month, etc.
This raised some concerns for me as I started asking myself: what if the server goes down in the middle of execution and the next jobs never get scheduled?
I have also looked at the daemons gem, which seems to run simple Ruby scripts with start/stop commands. I like the way delayed_job schedules and handles retries.
What do you recommend using in this case? What do you think the best way to design such a system with recurring background jobs? Also do you know a way I can monitor that background process and get notified if it stops?
I just implemented delayed_job for a similar task (using :run_at => 2.days.from_now) and found it to be a perfect fit. The easiest way to handle your concern about a process failing is to make the first step of the job be creating the next job. Also, you can create a has_many relationship to the delayed_job model, which would allow you to access the :last_error. Or look at the "Hooks" section of the README; it has a perfect example for handling failure.
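A sketch of that "re-enqueue yourself first" pattern with a custom delayed_job payload (the job class and the 10-minute interval are illustrative, and the options-hash form of enqueue assumes a reasonably recent delayed_job):

class FeedCheckJob < Struct.new(:feed_id)
  def perform
    # First step: schedule the next run, so a crash later in this job
    # doesn't break the recurrence.
    Delayed::Job.enqueue(FeedCheckJob.new(feed_id), run_at: 10.minutes.from_now)

    Feed.find(feed_id).analyze!   # hypothetical analysis of the RSS feed
  end
end

# kick off the first one:
Delayed::Job.enqueue(FeedCheckJob.new(feed.id), run_at: 10.minutes.from_now)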
I think that this was a similar question: A cron job for rails: best practices? - not only are there answers, but also links to railscasts about background jobs in rails.
I used cron + delayed_job, but my scheduled tasks were only supposed to run a few times a day, mostly just once.
Take a look at SimpleWorker. It's an elastic scheduling and background processing worker queue. It's cloud based and has persistence and redundancy so you don't need to worry if your servers go down or are restarted.
Very flexible in terms of scheduling, provides great introspection of jobs in the queue as well as notifications on status and errors.
Full disclosure: I work at SimpleWorker.

Best practice for Rails App to run a long task in the background?

I have a Rails application that, unfortunately, after a request to a controller has to do some crunching that takes a while. What are the best practices in Rails for providing feedback or progress on a long running task or request? These controller methods usually last 60+ seconds.
I'm not concerned with the client side... I was planning on having an Ajax request every second or so and displaying a progress indicator. I'm just not sure on the Rails best practice, do I create an additional controller? Is there something clever I can do? I want answers to focus on the server side using Rails only.
Thanks in advance for your help.
Edit:
If it matters, the HTTP requests are for PDFs. I then have Rails, in conjunction with Ruport, generate these PDFs. The problem is, these PDFs are very large and contain a lot of data. Does it still make sense to use a background task? Let's assume an average PDF takes one to two minutes; will this make my Rails application unresponsive to any other server requests during this time?
Edit 2:
Ok, after further investigation, it seems my Rails application is indeed unresponsive to any other HTTP requests after a request comes in for a large PDF. So, I guess the question now becomes: What is the best threading/background mechanism to use? It must be stable and maintained. I'm very surprised Rails doesn't have something like this built in.
Edit 3:
I have read this page: http://wiki.rubyonrails.org/rails/pages/HowToRunBackgroundJobsInRails. I would love to read about various experiences with these tools.
Edit 4:
I'm using Phusion Passenger (mod_rails), if it matters.
Edit 5:
I'm using Windows Vista 64 bit for my development machine; however, my production machine is Ubuntu 8.04 LTS. Should I consider switching to Linux for my development machine? Will the solutions presented work on both?
The Workling plugin allows you to schedule background tasks in a queue (they would perform the lengthy task). As of version 0.3 you can ask a worker for its status, which would allow you to display some nifty progress bars.
Another cool feature of Workling is that the asynchronous backend can be switched: you can use DelayedJob, Spawn (classic fork), Starling...
I have a very large volume site that generates lots of large CSV files. These sometimes take several minutes to complete. I do the following:
I have a jobs table with details of the requested file. When the user requests a file, the request goes in that table and the user is taken to a "jobs status" page that lists all of their jobs.
I have a rake task that runs all outstanding jobs (a class method on the Job model).
I have a separate install of rails on another box that handles these jobs. This box just does jobs, and is not accessible to the outside world.
On this separate box, a cron job runs all outstanding jobs every 60 seconds, unless jobs are still running from the last invocation.
The user's job status page auto-refreshes to show the status of the job (which is updated by the jobs box as the job is started, running, then finished). Once the job is done, a link appears to the results file.
It may be too heavy-duty if you just plan to have one or two running at a time, but if you want to scale... :)
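A rough sketch of what the rake-task side of that setup could look like, assuming a Job model with a status column (all names here are hypothetical, written in Rails 3-style ActiveRecord syntax):

namespace :jobs do
  desc 'Run all outstanding jobs, unless a previous run is still going'
  task :run_outstanding => :environment do
    next if Job.where(:status => 'running').exists?    # last invocation still busy

    Job.where(:status => 'pending').find_each do |job|
      job.update_attributes(:status => 'running')
      job.generate_csv                                  # hypothetical long-running work
      job.update_attributes(:status => 'done')
    end
  end
end

The cron entry on the jobs box then just calls rake jobs:run_outstanding every minute.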
Calling ./script/runner in the background worked best for me. (I was also doing PDF generation.) It seems like the lowest common denominator, while also being the simplest to implement. Here's a write-up of my experience.
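In case it's useful, the pattern there is essentially just shelling out from the controller and detaching, for example (the PdfGenerator call is made up; script/runner is the Rails 2-era command):

# the trailing '&' detaches the process so the request can return immediately
system("RAILS_ENV=production nohup ./script/runner 'PdfGenerator.generate(#{report.id})' >> log/runner.log 2>&1 &")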
A simple solution that doesn't require any extra Gems or plugins would be to create a custom Rake task for handling the PDF generation. You could model the PDF generation process as a state machine with states such as submitted, processing and complete that are stored in the model's database table. The initial HTTP request to the Rails application would simply add a record to the table with a submitted state and return.
There would be a cron job that runs your custom Rake task as a separate Ruby process, so the main Rails application is unaffected. The Rake task can use ActiveRecord to find all the models that have the submitted state, change the state to processing and then generate the associated PDFs. Finally, it should set the state to complete. This enables your AJAX calls within the Rails app to monitor the state of the PDF generation process.
If you put your Rake task within your_rails_app/lib/tasks then it has access to the models within your Rails application. The skeleton of such a pdf_generator.rake would look like this:
namespace :pdfgenerator do
  desc 'Generates PDFs etc.'
  task :run => :environment do
    # Code goes here...
  end
end
As noted in the wiki, there are a few downsides to this approach. You'll be using cron to regularly create a fairly heavyweight Ruby process and the timing of your cron jobs would need careful tuning to ensure that each one has sufficient time to complete before the next one comes along. However, the approach is simple and should meet your needs.
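For illustration, the body of that task might be filled in along these lines, assuming a PdfRequest model with a state column (the model, its methods, and the extra failed state are all assumptions, not part of the original answer):

namespace :pdfgenerator do
  desc 'Generates PDFs etc.'
  task :run => :environment do
    PdfRequest.where(:state => 'submitted').find_each do |request|
      request.update_attributes(:state => 'processing')
      begin
        request.generate_pdf!                              # hypothetical Ruport-based generation
        request.update_attributes(:state => 'complete')
      rescue => e
        Rails.logger.error("PDF generation failed: #{e.message}")
        request.update_attributes(:state => 'failed')      # optional extra state for errors
      end
    end
  end
end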
This looks like quite an old thread. However, what I did in my app, which needed to run multiple countdown timers for different pages, was to use Ruby threads. The timers had to keep running even if the page was closed by users. Ruby makes it easy to write multi-threaded programs with the Thread class, and Ruby threads are a lightweight and efficient way to achieve parallelism in your code. I hope this helps other wanderers looking to achieve background parallelism/concurrency in their app. Likewise, Ajax makes it a lot easier to call a specific custom Rails action every second.
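A bare-bones version of that idea, using nothing but the standard Thread class (the cache key and the expiry hook are made up, and note that the thread only lives as long as the Rails process itself):

Thread.new do
  60.downto(1) do |seconds_left|
    Rails.cache.write("countdown_42", seconds_left)   # an Ajax action can poll this key every second
    sleep 1
  end
  Notification.create!(:message => "Countdown 42 finished")   # hypothetical hook when the timer expires
end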
This really does sound like something you should have a background process running for, rather than an application instance (Passenger/Mongrel, whichever you use), as that way your application can keep doing what it's supposed to be doing, serving requests, while a background task of some kind (Workling is good) handles the number crunching. I know that this doesn't deal with the issue of progress, but unless it is absolutely essential I think that is a small price to pay.
You could have the user click the action required, have that action pass the request to the Workling queue, and have it send some kind of notification to the user when it is completed, maybe an email or something. I'm not sure about the practicality of that, just thinking out loud, but my point is that it really seems like this should be a background task of some kind.
"I'm using Windows Vista 64 bit for my development machine; however, my production machine is Ubuntu 8.04 LTS. Should I consider switching to Linux for my development machine? Will the solutions presented work on both?"
Have you considered running Linux in a VM on top of Vista?
I recommend using the Resque gem with its resque-status plug-in for your heavy background processes.
Resque
Resque is a Redis-backed Ruby library for creating background jobs,
placing them on multiple queues, and processing them later.
Resque-status
resque-status is an extension to the resque queue system that provides
simple trackable jobs.
Once you run a job on a Resque worker using the resque-status extension, you will be able to get info about its ongoing progress and kill a specific process very easily. See these examples:
status.pct_complete #=> 0
status.status #=> 'queued'
status.queued? #=> true
status.working? #=> false
status.time #=> Time object
status.message #=> "Created at ..."
Also, resque and resque-status have a nice web interface to interact with your jobs, which is very handy.
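For completeness, defining such a trackable job roughly follows this shape (the PDF-related class and methods are illustrative; the at() progress call is what drives the status values shown above):

class GeneratePdfJob
  include Resque::Plugins::Status

  def perform
    report = Report.find(options['report_id'])   # options are whatever was passed at create time
    pages  = report.page_count
    pages.times do |i|
      at(i + 1, pages, "Rendering page #{i + 1} of #{pages}")   # updates pct_complete and message
      report.render_page(i)                                     # hypothetical rendering step
    end
  end
end

# enqueue and keep the uuid used for the status lookups:
job_id = GeneratePdfJob.create('report_id' => 42)
status = Resque::Plugins::Status::Hash.get(job_id)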
There is the brand new Growl4Rails ... that is for this specific use case (among others as well).
http://www.writebetterbits.com/2009/01/update-to-growl4rails.html
I use Background Job (http://codeforpeople.rubyforge.org/svn/bj/trunk/README) to schedule tasks. I am building a small administration site that allows Site Admins to run all sorts of things you and I would run from the command line from a nice web interface.
I know you said you were not worried about the client side but I thought you might find this interesting: Growl4Rails - Growl style notifications that were developed for pretty much what you are doing judging by the example they use.
I've used spawn before and definitely would recommend it.
Incredibly simple to set up (which many other solutions aren't), and works well.
Check out BackgrounDRb, it is designed for exactly the scenario you are describing.
I think it has been around for a while and is pretty mature. You can monitor the status of the workers.
It's a pretty good idea to develop on the same development platform as your production environment, especially when working with Rails. The suggestion to run Linux in a VM is a good one. Check out Sun xVM for Open Source virtualization software.
I personally use the active_messaging plugin with an ActiveMQ server (STOMP or REST protocol). This has been extremely stable for us, processing millions of messages a month.
