Why does Rails action controller not use threads?

My Rails application has a route that takes a long time to process, and while it runs the entire site freezes.
Why does this happen? Is it Rails itself, or third-party gems, that are not thread-safe?
Is there any way to work around this? I'm considering a process pool: like a thread pool, except heavier and more memory-hungry, but still cheaper than halting the whole app.

First thing to notice: your Rails actions should not be heavyweight. When a user requests a page, you should respond right away.
There are cases where the user does need to wait for a result; for those you can use WebSockets or HTTP streaming.
Ruby and Rails also have a well-known problem with threads, which you can read about in "Parallelism is a Myth in Ruby."
One solution in Rails is to use a server like Unicorn, which forks as many worker processes as you want, each handling requests independently of the others, or Puma, which serves requests on multiple threads.
If an action does heavy processing, you may want to push the work off to a job queue such as delayed_job. You can even build a small UI with JavaScript that polls for the job's status and shows progress to the user. Alternatively, you can publish tasks to RabbitMQ, where another process in the background listens for new messages, acts on them, and can even send back a response.
Keep in mind that most web servers have a client timeout, and you don't want the user waiting a minute or more without any response. So it's always nice to stream a response that gives some feedback right away while the action completes, or to return some JavaScript that keeps polling the server for the task's status, or to use a WebSocket if required.
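To make the delayed_job suggestion concrete, here is a minimal, hedged sketch of pushing the heavy work into a background worker; CrunchJob and Record are hypothetical names, but the Struct-with-perform pattern and Delayed::Job.enqueue are delayed_job's documented API:

class CrunchJob < Struct.new(:record_id)
  def perform
    record = Record.find(record_id)
    # ... the slow work happens here, in the worker process,
    # not in the request/response cycle ...
  end
end

# In the controller action: enqueue and return immediately.
Delayed::Job.enqueue(CrunchJob.new(record.id))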

Rails wraps the entire request in a mutex (Rack::Lock) in the middleware stack, so a Rails process only ever handles one request at a time.
However, you can disable this by enabling the config.threadsafe! option AND using a multithreaded server, such as Puma.
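For reference, a minimal sketch of where that flag lives (Rails 2.3/3.x era; the comment summarizes its effect as I understand it):

# config/environments/production.rb
config.threadsafe!   # drops Rack::Lock so requests are no longer serialized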
Then there is the roadblock of MRI itself, which never lets two threads run Ruby code at the same time; the global VM lock is only released while a thread is blocked on I/O.
For truly concurrent threads you would need a Ruby implementation that supports them, such as Rubinius or JRuby.

Related

Recurring job to check if url exists

I want to build a service that notifies me when a URL returns status 200. I'm currently using a Sidekiq worker: if the status is 200, it updates my database (row.available = true); if not, it raises an exception and Sidekiq retries the worker in n seconds, up to n times.
Though this works, it doesn't feel efficient or scalable (thousands of checks would result in thousands of exceptions, and on certain platforms that's bad news -- JRuby), and I'm sure there is a way to build an internal service to manage this URL monitoring that doesn't rely on Sidekiq (perhaps in Go, or another, more suitable Ruby gem). However, I have no idea where to begin, so I'd appreciate some general direction.
Writing and running a simple link checker is easy. Doing it for thousands of links quickly, without redundancy, and handling dead and slow-responding links without bogging down your entire system is harder.
I'd use three threads, plus two queues:
A dispatcher thread that only reads from the database. It is responsible for finding URLs to check and pushing them onto a "to be checked" queue.
A worker thread that consumes from the first queue, performs the checks, and pushes results onto an "updated URL results" queue.
An updater/consumer thread that takes results from the worker in #2 and updates the database.
Ruby has some built-in classes to help:
Thread
Queue
I'd highly recommend Typhoeus and Hydra for use in the middle thread. The documentation for those two classes covers a lot of what you need to know about handling multiple requests running in parallel.
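A minimal sketch of that pipeline, using only the built-in Thread and Queue (with Net::HTTP standing in for Typhoeus, and a static URL list standing in for the database):

require "net/http"
require "uri"

urls = ["https://example.com/", "https://example.org/"]  # stand-in for DB rows

to_check = Queue.new   # the "to be checked" queue
results  = Queue.new   # the "updated URL results" queue

# 1. Dispatcher: finds URLs and queues them for checking.
dispatcher = Thread.new do
  urls.each { |u| to_check << u }
  to_check << :done
end

# 2. Worker: consumes URLs, performs the HTTP check, pushes results.
worker = Thread.new do
  while (url = to_check.pop) != :done
    status = begin
      Net::HTTP.get_response(URI(url)).code.to_i
    rescue StandardError
      nil   # dead or unreachable link
    end
    results << [url, status]
  end
  results << :done
end

# 3. Updater: the only thread that writes results back (to the DB, in real life).
updater = Thread.new do
  while (item = results.pop) != :done
    url, status = item
    puts "#{url} -> #{status.inspect}"   # e.g. row.available = (status == 200)
  end
end

[dispatcher, worker, updater].each(&:join)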
I wouldn't write this code as part of a Rails application. Rails adds no value here, nor is it necessary. I would either require Active Record directly and piggyback on the existing database.yml settings and models, or use Rails' "runner" to run the code as an adjunct to the Rails code.
Or I'd write a small, application-specific piece of code that runs on a different server, to avoid bogging down the Rails server. Using the MySQL or PostgreSQL driver directly would let you talk to the same database Rails uses. In that case I'd use the Sequel gem as the ORM, but that's because I prefer it over Active Record.
There are a lot of things to consider as you write this code, including retrying failed URLs, detecting redirects and updating the source URLs to match so you don't waste time re-fetching the old ones, and not hammering the hosting servers and getting yourself banned.
I've written several apps for this purpose over the years, and doing it right takes a lot of forethought, so think out your design up front; otherwise you could end up with some major rewrites later on.

Best way to run rails with long delays

I'm writing a Rails web service that interacts with various pieces of hardware scattered throughout the country.
When a call is made to the web service, the Rails app then attempts to contact the appropriate piece of hardware, get the needed information, and reply to the web client. The time between the client's call and the reply may be up to 10 seconds, depending upon lots of factors.
I do not want to split the web service call in two (ask for information, answer immediately with a pending reply, then force another api call to get the actual results).
I basically see two options: either run JRuby and use multithreading, or run several regular Ruby instances and hope that not many people try to use the service at once. JRuby seems like the much better solution, but it still doesn't seem to be mainstream, with out-of-the-box support at Heroku and Engine Yard. The multiple-instance solution seems like a total kludge.
1) Am I right about my two options? Is there a better one I'm missing?
2) Is there an easy deployment option for JRuby?
I do not want to split the web service call in two (ask for information, answer immediately with a pending reply, then force another api call to get the actual results).
From an engineering perspective, this seems like it would be the best alternative.
Why don't you want to do it?
There's a third option: if you host your Rails app with Passenger and enable global queuing, you can do this transparently. I have some actions that take several minutes, with no issues (caveat: some browsers may time out, but that may not be a concern for you).
If you're worried about browser timeouts, or you cannot control the deployment environment, you may want to process the request in the background (a controller sketch follows these steps):
User requests data
You enter request into a queue
Your web service returns a "ticket" identifier to check the progress
A background process processes the jobs in the queue
The user polls back, referencing the "ticket" id
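A hedged sketch of that ticket pattern; HardwareJob, HardwareWorker, and the routes are all hypothetical names standing in for your model, queue, and URL scheme:

class HardwareRequestsController < ApplicationController
  # POST /hardware_requests -- enqueue the slow hardware call, hand back a ticket
  def create
    job = HardwareJob.create!(:status => "queued", :query => params[:query])
    HardwareWorker.enqueue(job.id)   # whatever queue you use: delayed_job, Resque...
    render :json => { :ticket => job.id }
  end

  # GET /hardware_requests/:id -- the client polls back with the ticket id
  def show
    job = HardwareJob.find(params[:id])
    render :json => { :status => job.status, :result => job.result }
  end
end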
As far as hosting JRuby goes, I've deployed a couple of small internal applications using the glassfish gem, but I'm not sure how much I would trust it for customer-facing apps. Just make sure you enable config.threadsafe! in production.rb. I've heard good things about Trinidad, too.
You can also run the web-service call in a delayed background job so that it isn't hogging a web server, and it can even run on a separate physical box; this is also a much more scalable approach. If you make the call using AJAX, you can ping the server every second or two to see if the results are ready. That way your client is not held in limbo while the results are being calculated, and the request does not time out.

Is it worth using a daemon?

Hey guys, I have a program that uses AJAX to send a post to multiple social networks via their APIs, based on user form input. I was wondering whether this process (which takes no more than 2-3 seconds when I test it myself) is worth daemonizing with something like BackgrounDRb? In other words, if this program were used by 100+ people, would the simple call to an action via AJAX slow the entire application down?
Yeah, I'd recommend using DelayedJob to accomplish this. You want to avoid tying up your app with unnecessary HTTP requests; a DelayedJob worker connects to your database and makes the third-party connections itself, without initiating any HTTP requests to your app.
I wouldn't recommend BackgrounDRb.
Short answer: you have to go into the background; use delayed_job.
Longer answer:
The problem is that although the call takes only 2-3 seconds, it completely locks the application server while it runs. So if you have, let's say, five Mongrels or Passenger app servers running, and five people trigger this action within a 2-3 second interval, no other requests can be processed.
So while it's OK to do it inline during development, it's a must to move it to the background in production.
I wouldn't recommend BackgrounDRb. For what you need, delayed_job seems right.
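For instance, a hedged sketch (SocialPoster and post_everywhere are hypothetical names): with delayed_job installed, any method call can be deferred to the worker with its .delay proxy:

class SocialPoster
  def self.post_everywhere(user_id, text)
    # calls to each social network's API go here
  end
end

# In the controller action: enqueue instead of calling inline.
SocialPoster.delay.post_everywhere(current_user.id, params[:message])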
There are several solutions for this:
bj
delayed_job
resque

Best practice for Rails App to run a long task in the background?

I have a Rails application that, unfortunately, has to do some crunching after a request to a controller, and it takes a while. What are the best practices in Rails for providing feedback or progress on a long-running task or request? These controller methods usually run for 60+ seconds.
I'm not concerned with the client side; I was planning on having an AJAX request every second or so and displaying a progress indicator. I'm just not sure about the Rails best practice. Do I create an additional controller? Is there something clever I can do? I want answers to focus on the server side, using Rails only.
Thanks in advance for your help.
Edit:
If it matters, the HTTP requests are for PDFs, which I have Rails generate in conjunction with Ruport. The problem is that these PDFs are very large and contain a lot of data. Does it still make sense to use a background task? Let's assume an average PDF takes one to two minutes: will this make my Rails application unresponsive to any other server requests during that time?
Edit 2:
OK, after further investigation, it seems my Rails application is indeed unresponsive to other HTTP requests while a request for a large PDF is in flight. So I guess the question now becomes: what is the best threading/background mechanism to use? It must be stable and maintained. I'm very surprised Rails doesn't have something like this built in.
Edit 3:
I have read this page: http://wiki.rubyonrails.org/rails/pages/HowToRunBackgroundJobsInRails. I would love to read about various experiences with these tools.
Edit 4:
I'm using Phusion Passenger (mod_rails), if it matters.
Edit 5:
I'm using Windows Vista 64 bit for my development machine; however, my production machine is Ubuntu 8.04 LTS. Should I consider switching to Linux for my development machine? Will the solutions presented work on both?
The Workling plugin allows you to schedule background tasks in a queue; the workers perform the lengthy task. As of version 0.3 you can ask a worker for its status, which lets you display some nifty progress bars.
Another cool feature of Workling is that the asynchronous backend can be swapped out: you can use DelayedJob, Spawn (classic fork), Starling...
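A hedged sketch of a Workling worker (PdfWorker, crunch, and report_id are hypothetical; Workling's convention is a class under app/workers whose methods are invoked with an asynch_ prefix):

# app/workers/pdf_worker.rb
class PdfWorker < Workling::Base
  def crunch(options)
    # ... generate the PDF for options[:report_id] here ...
  end
end

# From a controller: returns immediately; the work runs in the background.
PdfWorker.asynch_crunch(:report_id => report.id)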
I have a very large volume site that generates lots of large CSV files. These sometimes take several minutes to complete. I do the following:
I have a jobs table with the details of each requested file. When the user requests a file, the request goes into that table and the user is taken to a "job status" page that lists all of their jobs.
I have a rake task that runs all outstanding jobs (a class method on the Job model).
I have a separate install of Rails on another box that handles these jobs. That box only does jobs and is not accessible to the outside world.
On this separate box, a cron job runs all outstanding jobs every 60 seconds, unless jobs are still running from the last invocation (see the lock sketch after this list).
The user's job status page auto-refreshes to show the status of each job (updated by the jobs box as the job is started, running, then finished). Once a job is done, a link to the results file appears.
It may be too heavy-duty if you just plan to have one or two running at a time, but if you want to scale... :)
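One hedged way to implement the "unless jobs are still running" guard is a non-blocking file lock at the top of the cron-driven runner script (the lock path and the Job.run_outstanding! class method are hypothetical names):

# Skip this cron tick if the previous invocation is still working.
lockfile = File.open("/tmp/job_runner.lock", "w")
exit unless lockfile.flock(File::LOCK_EX | File::LOCK_NB)

Job.run_outstanding!   # the class method that processes all pending rows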
Calling ./script/runner in the background worked best for me (I was also doing PDF generation). It seems like the lowest common denominator, while also being the simplest to implement. Here's a write-up of my experience.
A simple solution that doesn't require any extra gems or plugins would be to create a custom Rake task for handling the PDF generation. You could model the PDF generation process as a state machine, with states such as submitted, processing and complete stored in the model's database table. The initial HTTP request to the Rails application would simply add a record to the table in the submitted state and return.
A cron job would then run your custom Rake task as a separate Ruby process, so the main Rails application is unaffected. The Rake task can use ActiveRecord to find all the models in the submitted state, change their state to processing, and then generate the associated PDFs. Finally, it sets the state to complete. This lets the AJAX calls within your Rails app monitor the state of the PDF generation process.
If you put your Rake task within your_rails_app/lib/tasks then it has access to the models within your Rails application. The skeleton of such a pdf_generator.rake would look like this:
namespace :pdfgenerator do
  desc 'Generates PDFs etc.'
  task :run => :environment do
    # Code goes here...
  end
end
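Filled in along the state-machine lines described above, the body might look like this; a hedged sketch in modern ActiveRecord syntax, where the Report model, its state column, and generate_pdf! are all hypothetical:

namespace :pdfgenerator do
  desc 'Generates PDFs etc.'
  task :run => :environment do
    Report.where(:state => 'submitted').find_each do |report|
      report.update_attribute(:state, 'processing')
      begin
        report.generate_pdf!   # the slow Ruport work
        report.update_attribute(:state, 'complete')
      rescue StandardError
        report.update_attribute(:state, 'failed')
      end
    end
  end
end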
As noted in the wiki, there are a few downsides to this approach: you'll be using cron to regularly spawn a fairly heavyweight Ruby process, and the timing of your cron jobs needs careful tuning so that each one has sufficient time to complete before the next comes along. However, the approach is simple and should meet your needs.
This looks like quite an old thread; however, what I did in my app, which needed multiple countdown timers running for different pages, was to use Ruby's Thread class. The timers had to keep running even after a user closed the page. Ruby makes it easy to write multi-threaded programs with Thread, and threads are a lightweight, efficient way to achieve concurrency in your code. I hope this helps other wanderers looking to run background or concurrent services in their app. Likewise, AJAX makes it easy to call a specific custom Rails action every second.
This really does sound like something you should run in a background process rather than in an application instance (Passenger or Mongrel, whichever you use). That way your application can keep doing what it's supposed to be doing, serving requests, while a background task of some kind (Workling is good) handles the number crunching. I know this doesn't deal with the issue of progress, but unless that's absolutely essential, I think it's a small price to pay.
You could have the user click the action required, have that action push the request onto the Workling queue, and have it send the user some kind of notification when it's completed, maybe an email. I'm not sure about the practicality of that, just thinking out loud, but my point is that this really seems like it should be a background task of some kind.
I'm using Windows Vista 64 bit for my development machine; however, my production machine is Ubuntu 8.04 LTS. Should I consider switching to Linux for my development machine? Will the solutions presented work on both?
Have you considered running Linux in a VM on top of Vista?
I recommend using the Resque gem with its resque-status plug-in for your heavy background processes.
Resque
Resque is a Redis-backed Ruby library for creating background jobs,
placing them on multiple queues, and processing them later.
Resque-status
resque-status is an extension to the resque queue system that provides
simple trackable jobs.
Once you run a job on a Resque worker using the resque-status extension, you can get information about its ongoing progress and kill a specific job very easily. See examples:
status.pct_complete #=> 0
status.status #=> 'queued'
status.queued? #=> true
status.working? #=> false
status.time #=> Time object
status.message #=> "Created at ..."
Resque and resque-status also ship with a nice web interface for interacting with your jobs, which is pretty cool.
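Defining such a trackable job is short. A hedged sketch follows; HeavyJob and its loop are hypothetical, while include Resque::Plugins::Status and the at() progress call are the plugin's documented API:

class HeavyJob
  include Resque::Plugins::Status

  def perform
    total = 100
    total.times do |i|
      at(i + 1, total, "Crunching chunk #{i + 1} of #{total}")  # report progress
      # ... one chunk of the heavy work ...
    end
  end
end

job_id = HeavyJob.create(:user_id => 42)            # enqueue; returns a UUID
status = Resque::Plugins::Status::Hash.get(job_id)  # the status object shown above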
There is also the brand-new Growl4Rails, which is for this specific use case (among others).
http://www.writebetterbits.com/2009/01/update-to-growl4rails.html
I use Background Job (http://codeforpeople.rubyforge.org/svn/bj/trunk/README) to schedule tasks. I am building a small administration site that allows site admins to run, from a nice web interface, all sorts of things you and I would run from the command line.
I know you said you were not worried about the client side, but I thought you might find this interesting: Growl4Rails provides Growl-style notifications that were developed for pretty much what you are doing, judging by the example they use.
I've used Spawn before and would definitely recommend it.
It's incredibly simple to set up (which many other solutions aren't), and it works well.
Check out BackgrounDRb; it is designed for exactly the scenario you are describing.
It has been around for a while, is pretty mature, and lets you monitor the status of its workers.
It's a pretty good idea to develop on the same platform as your production environment, especially when working with Rails, so the suggestion to run Linux in a VM is a good one. Check out Sun xVM for open-source virtualization software.
I personally use the active_messaging plugin with an ActiveMQ server (Stomp or REST protocol). This has been extremely stable for us, processing millions of messages a month.
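For a sense of the shape of that setup, a hedged sketch of an active_messaging processor (OrderProcessor and the :orders destination are hypothetical; subscribes_to and on_message are the plugin's API as I recall it):

# app/processors/order_processor.rb
class OrderProcessor < ApplicationProcessor
  subscribes_to :orders

  def on_message(message)
    logger.debug "OrderProcessor received: #{message}"
    # ... parse the message and do the slow work here ...
  end
end

# A sender (model or controller) would, roughly, declare the destination and publish:
#   include ActiveMessaging::MessageSender
#   publishes_to :orders
#   publish :orders, {:order_id => 42}.to_json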

Using Thread.new to send email on rails

I've been sending emails in my application (Ruby 1.8.7, Rails 2.3.2) like this:
Thread.new{UserMailer.deliver_signup_notification(user)}
Since Ruby uses green threads, is there any performance advantage to doing this, or can I just use
UserMailer.deliver_signup_notification(user)
?
Thanks
The global VM lock will still almost certainly apply while sending that email, meaning no difference.
You should not start threads in the request/response cycle. You should not start threads at all unless you can watch them from create to join; even then, it is rarely worth the trouble it creates.
Rails is not thread-safe, and is not meant to be from within your controller actions. Only since Rails 2.3 has dispatching itself been thread-safe, and only if you turn it on in environment.rb with config.threadsafe!.
This article explains in more detail. If you want to send your message asynchronously, use BackgrounDRb or one of its analogs.
In general, using green threads to run background tasks asynchronously means your application can respond to the user before the mail is sent. You're not concerned with exploiting multiple CPUs; you only want to off-load the work and return a web page as soon as possible.
And from examining the Rails documentation, it looks like deliver_signup_notification will block long enough to get the mail queued (although I may be wrong). So using a thread here might make your application seem more responsive, depending on how your mailer is configured.
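For illustration, a hedged sketch of what the fire-and-forget thread buys you, plus the one safeguard it should have (the rescue is my addition, not part of the original snippet). While the thread waits on the SMTP socket, the green-thread scheduler lets the request thread finish rendering, so the response goes out sooner even without real parallelism:

Thread.new do
  begin
    UserMailer.deliver_signup_notification(user)
  rescue StandardError => e
    # Without this, any SMTP error dies silently with the thread.
    Rails.logger.error("signup mail failed: #{e.message}")
  end
end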
Unfortunately, it's not clear to me that deliver_signup_notification is necessarily thread-safe. I'd want to read the documentation carefully before relying on that.
Note also that you're making assumptions about the lifetime of a Rails process once a request has been served. Many Rails applications use DRb (or a similar tool) to off-load these background tasks onto an entirely separate worker process. The easiest way to do this changes fairly often; see Google for a number of popular libraries.
I have used your exact strategy, and our applications are currently running it in production (on Rails 2.2.2). I've kept a close eye on it, and our load has been relatively low (fewer than 20 emails sent per day on average, with peaks of around 150/day).
So far we have noticed no problems, and this appears to have resolved several performance issues we were having when using Google's mail server.
If you need something in a hurry, give it a shot; it has been working for us.
They'll be the same as far as I know.
