How can I execute a job properly with Rails?

The route looks like this: get '/hi', to: 'tables#execute'
I have this controller, but I don't know how to execute the job:
class TableController < ApplicationController
  def execute
    CurrentJob.set(wait: 2.minutes).perform_later(self)
    render plain: 'OK'
  end
end
I am getting the following error:
ActiveJob::SerializationError in TableController#execute
Unsupported argument type: TableController
Note: I have modified my controller like this:
def execute
  CurrentJob.perform_later params[:name]
  render plain: 'OK'
end
Let me know how to execute it.

perform_later takes the arguments you want to pass along to the job; these are handed to the job's perform method. It only accepts certain types that it knows how to serialize.
By default the arguments must be either String, Integer, Float, NilClass, TrueClass, FalseClass, BigDecimal, Symbol, Date, Time, DateTime, ActiveSupport::TimeWithZone, ActiveSupport::Duration, Hash, ActiveSupport::HashWithIndifferentAccess, Array, Range, or GlobalID::Identification instances, although this can be extended by adding custom serializers.
A Controller object is not allowed, and it also doesn't make much sense to pass one to a background job. It's the controller's job to decide what to pass along, probably something from params.
See Creating A Job in Active Job Basics for more.
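For example, here is a minimal sketch using the names from the question (the body of perform is hypothetical, since the question doesn't show what CurrentJob actually does):
class CurrentJob < ApplicationJob
  queue_as :default

  def perform(name)
    # do the job's real work here, using only the plain value that was passed in
    Rails.logger.info "Running CurrentJob for #{name}"
  end
end

class TableController < ApplicationController
  def execute
    # pass a serializable value from params, not the controller itself
    CurrentJob.set(wait: 2.minutes).perform_later(params[:name])
    render plain: 'OK'
  end
end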

Related

Rails multiple record update

The following hash of boolean attributes for multiple records
{"utf8"=>"✓","_method"=>"patch", "authenticity_token"=>"...",
"ts"=>
{"1"=>{"go"=>"0", "pickup"=>"0", "delivery"=>"1"},
"2"=>{"go"=>"0", "pickup"=>"0", "delivery"=>"1"},
"3"=>{"go"=>"0", "pickup"=>"0", "delivery"=>"1"},
[...]},
"commit"=>"Save changes"}
is being posted from one controller to a child controller with the following action, which has unconventional naming for the parameters.
def update_all
  params[:ts].keys.each do |id|
    @daystruttimeslot = Daystruttimeslot.find(id.to_i)
    @daystruttimeslot.update(ts_params)
  end
end
This hits the error undefined local variable or method 'ts_params' for #<DaystruttimeslotsController:0x00007fa118f262f8> Did you mean? to_param params @_params
How can these parameters be properly processed by this action?
def update_all
  ts = params.require(:ts)
  @daystruttimeslots = Daystruttimeslot.where(id: ts.keys)
  @daystruttimeslots.each do |d|
    d.update(ts.fetch(d.id.to_s).permit(:go, :pickup, :delivery))
  end
end
This does a single read operation instead of fetching each record separately, and it also provides an ivar that actually makes sense instead of whatever happens to be left at the end of the loop.
If you need to validate that all the ids are correct, compare ts.keys.length to @daystruttimeslots.size. You also might want to consider wrapping this in a transaction so that the changes are rolled back if any of the updates fail, instead of just leaving the job half done (see the sketch below).
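For example, a sketch of the same action wrapped in a transaction, using update! so a failed update raises and rolls back the whole batch (names come from the code above):
def update_all
  ts = params.require(:ts)
  @daystruttimeslots = Daystruttimeslot.where(id: ts.keys)

  ActiveRecord::Base.transaction do
    @daystruttimeslots.each do |d|
      # update! raises ActiveRecord::RecordInvalid on failure, which rolls back every change in the block
      d.update!(ts.fetch(d.id.to_s).permit(:go, :pickup, :delivery))
    end
  end
end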

Sending email as a job on ruby on rails

Good day everyone,
I created a mailer to send an email to my client. As of right now I'm still testing it, but I couldn't make it work. I've read up on Redis, Sidekiq, and rails_mailer, and still nothing. I can see that the mail is in the queue in the Sidekiq UI, but I can't receive the email.
Here's the flow of my code.
The user checks the text box on the view if they want to send an email to a client.
Then a method is triggered on the controller. Here's my code.
def send_workorder_message
  if params.has_key?(:to_send_email)
    WorkorderMessage::WorkorderMessageJob.perform_in(10.seconds, @curr_user, params[:message])
  end
end
Then a workorder job is created. Here's the code.
class WorkorderMessage::WorkorderMessageJob
  # include SuckerPunch::Job
  include Sidekiq::Worker
  sidekiq_options queue: 'mailers'

  def perform(user, message)
    Spree::WorkorderMailer.workorder_send_to_email(user, message).deliver_now
    # ActiveRecord::Base.connection_pool.with_connection do
    # end
  end
end
After that it triggers the WorkorderMailer. Here's the code.
class WorkorderMailer < BaseMailer
  def workorder_send_to_email(to_user, message)
    ActiveRecord::Base.connection_pool.with_connection do
      subject = "sample message mailer"
      @message = message
      @user = to_user
      mail(
        to: @user.email,
        # 'reply-to': Spree::Store.current.support_address,
        from: Spree::Store.current.support_address,
        subject: subject
      )
    end
  end
end
When I use the mailer preview I can see the UI working fine.
Also, I've noticed that in the Sidekiq view I see this User object. Is that normal?
According to the Sidekiq documentation, the arguments you pass must be primitives that cleanly serialize to JSON, and not full Ruby objects, like the user you are passing here:
Complex Ruby objects do not convert to JSON, by default it will convert with to_s and look like #<Quote:0x0000000006e57288>. Even if they did serialize correctly, what happens if your queue backs up and that quote object changes in the meantime? Don't save state to Sidekiq, save simple identifiers. Look up the objects once you actually need them in your perform method.
The arguments you pass to perform_async must be composed of simple JSON datatypes: string, integer, float, boolean, null(nil), array and hash. This means you must not use ruby symbols as arguments. The Sidekiq client API uses JSON.dump to send the data to Redis. The Sidekiq server pulls that JSON data from Redis and uses JSON.load to convert the data back into Ruby types to pass to your perform method. Don't pass symbols, named parameters or complex Ruby objects (like Date or Time!) as those will not survive the dump/load round trip correctly.
I would suggest you change it to look up the User by ID within the job, and only pass the ID instead of the entire user object.
# pass @curr_user.id instead of @curr_user
WorkorderMessage::WorkorderMessageJob.perform_in(10.seconds, @curr_user.id, params[:message])

# accept the ID instead of the user here
def perform(user_id, message)
  # get the user object here
  user = User.find(user_id)
  # then send the mail with the freshly loaded user
  Spree::WorkorderMailer.workorder_send_to_email(user, message).deliver_now
end
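Put together, the worker could look something like this (a sketch only; the class and mailer names come from the question, and the user lookup assumes a plain User model, so adjust it to whatever your app uses):
class WorkorderMessage::WorkorderMessageJob
  include Sidekiq::Worker
  sidekiq_options queue: 'mailers'

  def perform(user_id, message)
    # load the user fresh inside the job instead of serializing the whole object
    user = User.find(user_id)
    Spree::WorkorderMailer.workorder_send_to_email(user, message).deliver_now
  end
end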

Can I call delayed_job with max attempts of 1?

I have a method that I run asynchronously
User.delay(queue: 'users').grab_third_party_info(user.id)
In case this fails, I want it to not retry. My default retries are 3, which I cannot change. I just want to have this only try once. The following doesn't seem to work:
User.delay(queue: 'users', attempts: 3).grab_third_party_info(user.id)
Any ideas?
This isn't my favorite solution, but if you need to use the delay method, you can set attempts: to one less than your max attempts. So in your case the following should work:
User.delay(queue: 'users', attempts: 2).grab_third_party_info(user.id)
Better yet, you could make it safer by using Delayed::Worker.max_attempts:
User.delay(queue: 'users', attempts: Delayed::Worker.max_attempts - 1).grab_third_party_info(user.id)
This would enter it into your delayed_jobs table as if it had already run twice, so when it runs once more it will be at the max attempts.
From https://github.com/collectiveidea/delayed_job#custom-jobs
To set a per-job max attempts that overrides the Delayed::Worker.max_attempts you can define a max_attempts method on the job:
NewsletterJob = Struct.new(:text, :emails) do
  def perform
    emails.each { |e| NewsletterMailer.deliver_text_to_email(text, e) }
  end

  def max_attempts
    3
  end
end
Does this help you?
You have to use a Custom Job.
Just like @lazzi showed, you have to create a custom job in order to override the max_attempts.
As you can see in the README here, the only params that the .delay method takes are:
priority
run_at
queue
And if you think about it, a value for max_attempts is not stored in the delayed_jobs table, only the attempts are stored, so there's no way for it to be persisted.
The only way to do it is to create a custom job that gets re-instantiated when the delayed job worker processes the job. It then reads the value from the max_attempts method and uses that to determine if the current attempts in the table record equals or exceeds the max_attempts value.
In your case, the simplest way to do it would be something like this:
# Inside your user.rb
class User < ApplicationRecord
  FetchThirdPartyInfoJob = Struct.new( :user ) do
    def perform
      User.grab_third_party_info(user.id) # REFACTOR: Make this an instance method so you don't need to pass the User's id to it.
    end

    def queue_name
      "users"
    end

    def max_attempts
      1 # only one attempt, since you don't want any retries
    end
  end
end
Then run it wherever you need to by using enqueue, like this:
Delayed::Job.enqueue( User::FetchThirdPartyInfoJob.new( user ) )
I also added a little REFACTOR comment on your code because User.grab_third_party_info(user.id) looks to be incorrectly set up as a class method that you then pass the instance id to, instead of just calling it directly on the user instance. I can't think of a reason why you would want this, but if there is, please leave it in the comments so we can all learn.
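For reference, here is a sketch of what that refactor could look like while keeping the same custom job (the body of grab_third_party_info is hypothetical, since the original implementation isn't shown):
# Inside your user.rb
class User < ApplicationRecord
  def grab_third_party_info
    # ... fetch the third-party data for this user and store it ...
  end

  FetchThirdPartyInfoJob = Struct.new( :user ) do
    def perform
      user.grab_third_party_info # call the instance method directly, no id passing needed
    end

    def queue_name
      "users"
    end

    def max_attempts
      1
    end
  end
end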

delayed_job: how to check for presence of a particular job based on a triggered method

I have a method that goes through an array of different APIs and launches a delayed_job instance for every API found, like this:
def refresh_users_list
  apis_array.each do |api|
    api.myclass.new.delay.get_and_create_or_update_users
  end
end
I have an after_filter on the users#index controller action to trigger this method. This creates many jobs, which will eventually cause too-many-connections problems on Heroku.
I'm wondering if there's a way I can check for the presence of a job in the database for each of the APIs the array iterates over. This would be very helpful so I can only trigger a particular refresh if that API wasn't updated within a given time.
Any idea how to do this?
In config/application.rb, add the following
config.autoload_paths += Dir["#{config.root}/app/jobs/**/"]
Create a new directory at app/jobs/.
Create a file at app/jobs/api_job.rb that looks like
class ApiJob < Struct.new(:attr1, :attr2, :attr3)
  attr_accessor :token

  def initialize(*attrs)
    super(*attrs) # set the struct members first so the token can be built from them
    self.token = self.class.token(attr1, attr2, attr3)
  end

  def display_name
    self.class.token(attr1, attr2, attr3)
  end

  #
  # Class methods
  #
  def self.token(attr1, attr2, attr3)
    [name.parameterize, attr1.id, attr2.id, attr3.id].join("/")
  end

  def self.find_by_token(token)
    Delayed::Job.where("handler like ?", "%token: #{token}%")
  end
end
Note: You will replace attr1, attr2, and attr3 with whatever number of attributes you need (if any) to pass to the ApiJob to perform the queued task. More on how to call this in a moment.
For each of your APIs that you queue a get_and_create_or_update_users method for, you'll create another job. For example, if I have some Facebook API model, I might have a class at app/jobs/facebook_api_job.rb that looks like:
class FacebookApiJob < ApiJob
  def perform
    FacebookApi.new.get_and_create_or_update_users(attr1, attr2, attr3)
  end
end
Note: In your Question you did not pass any attributes to get_and_create_or_update_users. I am just showing you where you would do this if you need the job to have attributes passed to it.
Finally, wherever your refresh_users_list is defined, define something like this job_exists? method
def job_exists?(tokens)
  tokens = [tokens] if !tokens.is_a?(Array) # allows a String or Array of tokens to be passed
  tokens.each do |token|
    return true unless ApiJob.find_by_token(token).empty?
  end
  false
end
Now, within your refresh_users_list loop, you can build tokens and call job_exists? to check whether you already have queued jobs for the API. For example:
# Build a token for each API and skip it if a matching job is already queued
def refresh_users_list
  apis_array.each do |api|
    token = ApiJob.token(attr1, attr2, attr3)
    next if job_exists?(token)
    api.myclass.new.delay.get_and_create_or_update_users
  end
end
Note: Again I want to point out, you won't be able to just drop in the code above and have it work. You must tailor it to your application and the jobs you're running.
Why is this so complicated?
From my research, there's no way to "tag" or uniquely identify a queued job through what delayed_job provides. Sure, each job has a unique :id attribute. You could store the ID values for each created job in some hash somewhere
{
"FacebookApi": [1, 4, 12],
"TwitterApi": [3, 193, 44],
# ...
}
and then check the corresponding hash key for an ID, but I find this limiting and not always sufficient for the problem. When you need to identify a specific job by multiple attributes, as above, you must create a way to find these jobs (without loading every job into memory and looping over them to see if one matches your criteria).
How is this working?
ApiJob has a :token attribute (added via attr_accessor on top of the Struct it extends). This token is based on the attributes passed (attr1, attr2, attr3) and is built when a new object of a class extending ApiJob is instantiated.
The find_by_token class method simply searches the string representation of the job in the delayed_job queue for a match based on a token built using the same token class method.
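As a hypothetical usage example, assuming the attributes are records that respond to id (which is what the token class method expects):
# build the same token that was built when the job was enqueued...
token = FacebookApiJob.token(attr1, attr2, attr3)
# ...and check whether a matching job is still sitting in the delayed_jobs table
ApiJob.find_by_token(token).exists?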

DelayedJob Job instance treats all passed in objects as nil

I have been wrestling with DelayedJob for the last day and a half. I'm trying to create a simple job class that accepts a large string and an ActiveRecord object. But no matter what I pass in to the job when enqueuing it, it is treated as nil. I've tried many different strategies to make this work; I have tried passing in just the id of the ActiveRecord object (treated as nil), and I've tried adding an initializer to the job object (rather than having it inherit from an instance of Struct)... nothing works.
I've simplified my job class into something ridiculous, and it still doesn't work:
class SimpleJob < Struct.new(:owner_id)
  def perform
    @owner = Owner.find(owner_id)
    puts @owner.full_name
  end
end
And in my controller:
def test_job
  Delayed::Job.enqueue(SimpleJob.new(@owner.id))
  redirect_to :action => 'index', :controller => 'owner'
end
The error is, of course, that Owner can't be found with an id of nil (before you ask: yes, @owner is instantiated and working; a before_filter ensures this).
I'm using Rails 2.3.5, DelayedJob version 2.0.7. My job object is located in the lib folder, if that makes a difference.
Is there some part of the configuration I'm missing?
Your call to delayed_job is set up correctly. The first thing to check is that @owner.id is not nil to start with, as it's more than likely the issue.
Have a look at the DB, in the delayed_jobs table, and check whether the object is correctly serialized.
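For example, one quick way to check this, assuming the standard ActiveRecord backend for Delayed::Job:
job = Delayed::Job.last
puts job.handler # the YAML handler should contain owner_id with the value you passed in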
