Rails / delayed_job - want to load newest version of job class

I'm using the delayed_job plugin in Rails to do background processing, and I'm experiencing a hiccup in the 'agile development' I've been enjoying so far in Rails...
Usually in Rails, if I hit an error or want to add some new functionality, I just add some code, refresh the page, and the new code runs.
With delayed_job, it seems like the job class isn't being reloaded... if a job fails and I go and fix the error and fire the job again, the old code runs again.
Is there any way to make delayed_job load the newest version of the job class before invoking it?
Just in case this has anything to do with it - I know delayed_job offers a few different ways to declare and run jobs:
My job class is in the lib directory of Rails and is declared like:
class FooJob < Struct.new(:foo_id)
and I'm invoking the job like this from the controller:
Delayed::Job.enqueue(FooJob.new(params[:id]))

There is nothing built in to do this. Generally you are responsible for managing and reloading your workers. This is probably just as well: Rails development reloading is good but not perfect, and attempting to auto-reload a delayed job would potentially run into all sorts of subtle issues that would be pretty opaque to debug inside a worker process. Also, if it automatically reloaded the environment for every job, a lot of use cases would get tremendously slow in dev mode.
My suggestion is just to get used to running rake jobs:work and hitting Ctrl-C when you make changes. Alternatively you can create a script that just runs the jobs manually on an ad-hoc basis (taken from the delayed_job docs):
#!/usr/bin/env ruby
# Boot the full Rails environment, then work the queue in the foreground
require File.dirname(__FILE__) + '/../config/environment'
Delayed::Worker.new.start

I use this hack, which seems to work quite nicely, but be aware that it's probably very Rails and delayed_job version specific, so you may need to change some things. Tested with Rails 3.2.0 and delayed_job 2.1.4.
Put this into e.g. script/delayed_job_development and run it from the Rails root.
#!/usr/bin/env ruby
require File.expand_path('../config/environment', File.dirname(__FILE__))
require 'delayed/worker'
require "rails/console/app"

# Borrow the Rails console's reload! helper so application code can be
# reloaded between job runs, just like in the console.
class DummyConsoleClass
  include Rails::ConsoleMethods
end

dummy_console = DummyConsoleClass.new
worker = Delayed::Worker.new({:quiet => false})

puts "Waiting for jobs..."
loop do
  if Delayed::Job.find_available(worker.name).count > 0
    puts "Found jobs"
    # Reload application code before working the queue
    dummy_console.reload!
    loop do
      break if worker.work_off.sum == 0
    end
    puts "Done, waiting for jobs..."
  end
  sleep(2)
end
Please comment if you know this is a very bad idea or if there are things to be aware of. I mainly use it when editing and testing jobs that run immediately, not jobs scheduled to run far in the future.

Related

Ruby on Rails, callback, run a method later

I need some kind of callback so that a method is called 5 minutes after the create method.
My situation:
The user logs in to my web page and uploads some files (the create method is invoked); 5 minutes later the files should be on their way to be analyzed (i.e. in 5 minutes it should call the method, which just takes the whole folder where the files are stored and analyzes it). That is why things like typing rake jobs:work, or using the daemons gem and typing "RAILS_ENV=production script/delayed_job start" on the command line, do not suit me.
I want to start my application as usual with rails s, log in, upload the files, and have the analysis happen automatically.
As I understand it, once the workers are started they keep running? I don't need that. I just need a method to run 5 minutes after the create method.
All this stuff with gem 'delayed_job_active_record' to queue the jobs and the daemons gem to start the workers seems too complicated for such a simple task.
So, is it possible, using gem 'delayed_job_active_record' and the daemons gem, to start my application with rails s and have everything done automatically in the background, without me stopping the application and typing things on the command line to run the delayed jobs?
Or is it possible to do this without all that complicated stuff?
I have already asked about delayed_jobs here and here.
Many thanks in advance.
Here is a post describing how to set up scheduling with delayed_job
Update 2015-07-06: link's broken and I can't find a cached version - see update below
If you can, I recommend looking into sidekiq, which is a great message queue and even has built-in scheduling. It does use Redis though, so unless you already have Redis deployed it will be a tiny bit of work.
Update
Here is a gist with a simple solution for scheduled and recurring jobs with delayed_job
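For the original "analyze 5 minutes after create" case, delayed_job itself can schedule a single job in the future via the :run_at option. A minimal sketch, assuming delayed_job_active_record 3.x and a worker running somewhere (the AnalyzeFolderJob class and folder_path attribute are made up for illustration):

# lib/analyze_folder_job.rb - hypothetical payload object
class AnalyzeFolderJob < Struct.new(:folder_path)
  def perform
    # whatever actually analyzes the uploaded files
    system("/path/to/analyze.sh", folder_path)
  end
end

# In the controller's create action, enqueue it to run 5 minutes from now:
Delayed::Job.enqueue(AnalyzeFolderJob.new(@upload.folder_path),
                     :run_at => 5.minutes.from_now)

A worker (rake jobs:work, or the daemons-based script/delayed_job start) still has to be running somewhere to pick the job up when its run_at time arrives.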

possible to debug/pry from methadone's App class?

I started a blank project with methadone, an awesome framework for building command line apps. The only problem is that I am unable to debug from within the App class that's in bin/my_app
The App class is a file created when you run methadone. Here's how I'm trying to use pry
#!/usr/bin/env ruby
require 'optparse'
require 'methadone'
require 'my_app'
require 'pry'
class App
include Methadone::Main
include Methadone::CLILogging
main do
binding.pry # <= pry here
end
...
When I run rake features I can tell the running process is trying to do something with pry since it pauses for a few seconds. I get the following error and then rake/cucumber is aborted.
process still alive after 3 seconds (ChildProcess::TimeoutError)
I can use pry just fine from the cucumber steps, rspec, or any other place, just not from anywhere in this App class.
One very interesting thing is that if I run my command line app from the console it WILL stop where pry is. It just won't pop into pry when using cucumber.
How can I get the app to work with pry when I'm running rake features?
Update
Sorry, I should clarify that methadone comes with aruba. So my cucumber scenario would look like this
When I successfully run `my_app arg1`
However, it WILL go into debug/pry if I run it with
bundle exec bin/my_app
Use pry-remote to connect to a pry session in the Aruba-managed subprocess.
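A rough sketch of that approach, assuming the pry-remote gem is available (everything else is the generated methadone app shown above):

require 'pry-remote'   # added to the requires at the top of bin/my_app

main do
  binding.remote_pry   # starts a DRb server and waits for a client to attach
end

While the scenario is paused at that line, run pry-remote from another terminal to attach to the waiting session. You will probably also need to raise Aruba's three-second process timeout while you debug, or the step will still be killed before you finish.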
(Disclosure: I paired with @Dty to come to this solution)
Aruba runs the app in a totally separate process, so I would guess what's happening is that when aruba runs your app, pry starts up at a prompt and waits for input. Since it doesn't get any, aruba times out after three seconds (the default time it will wait for an app to complete). This is why you see the "process still alive" issue.
I'm not 100% sure how you could get the standard input of your shell that's running rake features to connect to your app's standard input so you could issue pry commands, but I don't think aruba was designed to allow this.
You have a couple of options:
Tag your scenario with @announce, and use When I run interactively... followed by several When I type steps - these commands should go to the interactive pry console that's waiting. Kind of kludgy, but it might work.
Execute a unit test of your App class. You'll need to replace the call to go! with something like go! if $0 == __FILE__ so that you can require your executable in a test and manipulate App directly (see the sketch below).
I have not tried either of these, but the second option feels a bit better and could also be improved with support from the library, if you can figure out a good way to do this.
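As a rough sketch of that second option (the test file name and assertion are made up for illustration):

# bin/my_app - only self-execute when run directly, not when loaded by a test
go! if $0 == __FILE__

# test/app_test.rb - hypothetical unit test that drives App directly
require 'minitest/autorun'
load File.expand_path('../../bin/my_app', __FILE__) # bin/my_app has no .rb extension

class AppTest < Minitest::Test
  def test_app_is_defined
    # binding.pry works here as usual, because this runs in the test process
    assert defined?(App)
  end
end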

run ruby script in rails application

This may be a stupid question, but I was wondering where, or if, it's possible to run a ruby script which is kind of unrelated to the Rails application I would like it to run in. To clarify: I am working on an automation test suite that is written mainly in bash, but I want to create a front end (my Rails application) that allows other users to run automated tests without going through the command line. So basically I want a user to select certain parameters, from a database or form fields, and then pass those parameters to a ruby script which calls my bash automation script.
I hope this is clear. Thanks!
If you want to call a script from a Rails app it gets complex. You would want to use a background job or some sort of queue to run these jobs, because they block the server: your users would be left waiting for the call to complete and the results to load, most likely hitting a timeout.
See delayed_job
and you might want to try creating a small wrapper script in ruby that can interface with your application.
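As a very rough sketch of that wrapper idea (the job class, script path, and parameters are all made up for illustration), a delayed_job payload could shell out to the bash suite:

# lib/run_suite_job.rb - hypothetical delayed_job payload wrapping the bash suite
class RunSuiteJob < Struct.new(:suite, :options)
  def perform
    # system blocks this worker process (not the web process) until the script exits
    system("/path/to/automation/run_tests.sh", suite, *Array(options))
  end
end

# In the controller, after collecting the user's parameters:
Delayed::Job.enqueue(RunSuiteJob.new(params[:suite], params[:options]))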
Good luck!
For short tasks you should use system or popen.
For longer tasks, delayed_job is still the way to go.
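For the short-task case, something like this (the script path and argument are illustrative) runs the script synchronously and captures its output:

require 'open3'

# Run the bash script and capture its stdout and exit status
output, status = Open3.capture2("/path/to/automation/run_tests.sh", "smoke")
Rails.logger.info "suite finished with #{status.exitstatus}:\n#{output}"

Remember that this happens inside the web request, so it is only suitable for scripts that finish quickly.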
You can add a script to the script folder in the root of your Rails app; the script can be named [name here].rb.
Start your script like this - we load the environment so the script can use your Rails models and other Rails-related things:
#!/usr/bin/env ruby
ENV['RAILS_ENV'] = "production" # Set to your desired Rails environment name
require '/[path to your rails app on your server]/config/environment.rb'
require 'active_record'
If you want to run this on your server, you have to edit the crontab on your server, or you can use the whenever gem (which I'm having trouble with, but the rest of the universe isn't). Alternatively, if you're on Heroku, the Heroku Scheduler makes running scripts like this easy.
You can run Ruby code with rails runner.
… let us suppose that you have a model called “Report”. The Report model has a class method called generate_rankings, which you can call from the command line using
$ rails runner 'Report.generate_rankings'
Since we have access to all of Rails, we can even use the Active Record finder method to extract data from our application.
$ rails runner 'User.pluck(:email).each { |e| puts e }'
charles.quinn@highgroove.com
me@seebq.com
bill.gates@microsoft.com
obie@obiefernandez.com
Example taken from The Rails 5 Way by Obie Fernandez.

Recurring tasks in a Ruby On Rails application: Cron or other?

I am currently writing an application that pulls new information from RSS sources and has to update those RSS sources in a certain frequency. Currently I am pulling only when the user requests a feed but I want to change that behavior to automatic periodic fetching.
I was writing a shell script that would interact with the database and get started periodically via cron - but this is a lot of duplicated effort, so I was wondering what the "Rails way" or "Ruby way" to do this would be. I am using Ubuntu, Apache and Passenger. Can you suggest better methods, ideally ones included in the application itself, so I can easily deploy the app to another machine without having to fiddle with cron?
I would suggest doing something like a rake task and using the whenever gem to generate your cron job to run the rake task.
Check out http://railscasts.com/episodes/164-cron-in-ruby for more information on the whenever gem.
The main benefit of the whenever gem is that it keeps your application requirements (i.e. the cron job that runs every x hours) inside your application, increasing the application's portability.
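As a minimal sketch of what that looks like (the rake task and model method here are made up for illustration), whenever reads a config/schedule.rb like this and writes the corresponding crontab entries when you run whenever --update-crontab:

# config/schedule.rb
every 30.minutes do
  rake "feeds:update"            # hypothetical task that refreshes the RSS sources
end

every 1.day, :at => '4:30 am' do
  runner "Feed.purge_stale"      # hypothetical model method, run via rails runner
end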
I recommend a combination of the two above. You want a rake task, even if you have a direct method already created. This is because server admin stuff that you'd want to run in cron, you might also want to run from the command line occasionally, and this is what rake tasks are good for.
The whenever plugin sounds cool, although I can't vouch for it. Of course, it's good to know how to do things from scratch, then use plugins to make your life easier. Here's the from-scratch way.
Create a new file, lib/tasks/admin.rake
Inside, create the task itself:
namespace :admin do
  desc "Updates all RSS feeds"
  task :rss => :environment do
    RssFeed.update_all
  end
end
This assumes you have an RssFeed class, and the update_all method does what you'd expect. You can call this from the command line:
rake admin:rss
And you can add this to cron by running crontab -e as the web user and adding this line:
10 0 * * * cd /path/to/rails/app && rake RAILS_ENV=production admin:rss
There are a variety of solutions. For the simplest setup, you can use script/runner in your crontab something like so:
10 0 * * * /home/myuser/myproject/script/runner -e production ModelName.methodname
Methodname must be a class method on your model. You need to reference the project by its full path, otherwise it will most likely not be found in the cron environment. Check the crontab man page for info on the syntax if you're not familiar with it. The above, for example, runs the script at the 10th minute of the 0th hour of every day (at 12:10am, in short).
If you need a more powerful solution, you could use BackgroundRB. BackgroundRB runs a daemon and supports scheduled tasks, and it can put results in a database. It even has a simple communication protocol that lets your web processes request that a task be completed and then retrieve the result. This allows you to control background jobs right from the web interface, rather than from a crontab which just "happens".
There is a good bit more setup needed for BackgroundRB to work, but it may be worth it if jobs need to be controlled.
Try using whenever. Even though in the end it creates a cron entry, the scheduling definition is written inside your application using a Ruby DSL.
For small teams and personal projects, the whenever gem is great. But if your company has an ops team separate from the development team, it might not be ideal.
At my last job, the ops team needed to be able to see the cron we were installing so they could be confident it wouldn't have any side effects for the system. So a DSL solution wasn't going to work. But we (the developers) wanted the cron scripts in version control.
So as a compromise, we checked in text files with the raw cron lines, similar to this:
10 0 * * * cd /path/to/rails/app && rake RAILS_ENV=production admin:rss
And we added a step to the Capistrano script that installed those lines into the crontab as part of the deploy.
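A rough sketch of what that deploy hook might look like with Capistrano 2 (the file path and task name are illustrative):

# config/deploy.rb
namespace :deploy do
  desc "Install the checked-in cron lines for the deploy user"
  task :update_crontab, :roles => :app do
    run "crontab #{current_path}/config/crontab.txt"
  end
end

after "deploy:update_code", "deploy:update_crontab"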
Try setting up Webmin on your server, if your hosting provider supports it. Go to the URL mentioned below; it's easy to set up and user friendly.
URL:
http://your_ip_address:10000/
I have used this in many of my applications and it has worked well for scheduling cron jobs.

I have a Rails task: should I use script/runner or rake?

For ad hoc Rails tasks we have a few implementation alternatives, chief among which would seem to be:
script/runner some_useful_thing
and:
rake some:other_useful_thing
Which option should I prefer? If there's a clear favourite then when, if ever, should I consider using the other? If never, then why would you suppose it's still present in the framework without deprecation warnings?
The difference between them is that script/runner boots Rails whereas a Rake task doesn't unless you tell it to by making the task depend on :environment, like this:
task :some_useful_task => :environment do
  # do some useful task
end
Since booting Rails is expensive, it might be worth skipping if you can avoid it.
Other than that, they are roughly equivalent. I use both, but lately I've been using script/runner to execute a separate script more often.
Passing parameters to a rake task is a pain in the butt, to say the least. You either need to resort to environment variables or a very hackish parameter system that is not intuitive and has lots of caveats.
If your task needs to handle command line arguments gracefully then writing a script is the way to go.
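For comparison, here is roughly what the two approaches look like with a reasonably recent Rake (the task, script, and argument names are made up for illustration):

# lib/tasks/greet.rake - rake's bracketed-argument syntax, invoked as: rake greet[Alice]
task :greet, [:name] => :environment do |t, args|
  puts "Hello, #{args[:name]}"
end

#!/usr/bin/env ruby
# script/greet - a plain script just reads ARGV, invoked as: script/greet Alice
require File.expand_path('../../config/environment', __FILE__)
puts "Hello, #{ARGV[0]}"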
Luke Francl mentions script/runner booting up Rails. That's true. But if you don't want to boot up Rails then just run the script as is, without script/runner. So the only real difference between scripts and rake tasks is their aesthetics. Choose whatever feels right to you.
I use rake tasks for little tasks (one or two lines). Anything more complicated goes into the script/ directory. I'll break this rule if I think other developers will expect the code to live in one place over another.
FWIW there seems to be some movement away from using script runner in favor of rake:
Update (4/25/2009): I recommend using rake tasks as opposed to script/runner for recurring tasks.
Also, as per this post you can use rake for recurring tasks just fine:
If I then wanted this to run nightly on my production database at midnight, I might write a cronjob that looks something like this:
0 0 * * * cd /var/www/apps/rails_app/ && /usr/local/bin/rake RAILS_ENV=production utils:send_expire_soon_emails
Corrected based on comment 2 down. Give them the karma!
FWIW - Rails 3.0+ changes how you initialize the Rails system in a standalone script.
require File.dirname(__FILE__) + '/config/environment'
As mentioned above you can also do:
rails runner script/<script name>
Or put all the code in a Rake task, but I have a lot of legacy code from Rails 2, so I didn't want to go down that path immediately.
Each has its advantages and disadvantages.
One thing I've done is just write normal ruby scripts and put them in the script/maintenance directory.
All you need to do to load Rails and get access to all your models, etc., is to put require '../../config/environment.rb' at the top of your file, and then you're away.
For one off commands script/runner can be fine. For anything repeated, a rake task is easier in the long-run, and has a summary if you forget what it does.
In Rails 3.0+, the config/environment.rb requires the config/application.rb, that requires the config/boot.rb.
So, to load an app in Rails 3, you still only have to require the environment.rb
I got the impression script/runner was primarily for periodic tasks. E.g., a cron job that runs:
SomeClass.update_from_web('http://www.sourcefordata.gov/')
