When using Rails Action Mailer, you have the option of deliver_now to send the email immediately, or deliver_later to send it asynchronously through Active Job. If no adapter is specified for Active Job, it uses an in-process thread pool, so queued jobs would not survive a server restart. Alternatively, I can create a job to manage my e-mails and call it with perform_now or perform_later, which as I understand it is more or less the same thing as deliver_now and deliver_later.
My question is: if I specify an adapter, say Sidekiq, with a backing store for my jobs, why would I create a job to handle my e-mails? Is there any additional benefit, or is it an unnecessary step? On a slightly different note, if I did want to create a job for the process, would my mailer call inside it need deliver_now, or nothing at all? And if it used deliver_later instead, would that knock the email back to the end of the queue and force it to wait again before it is sent?
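For reference, my understanding is that specifying the adapter is just a one-line configuration, something like:
# config/application.rb
config.active_job.queue_adapter = :sidekiq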
To illustrate, if I have no Job set up to handle my emails, I could simply have:
class UserMailer < ActionMailer::Base
  def send_email(user)
    mail(to: user.email, subject: "My Subject")
  end
end
To call, I would use:
UserMailer.send_email(my_user).deliver_later
However, if I wanted to add a job, and both options were called with UserJob.perform_later(user), would my setup be:
class UserJob < ActiveJob::Base
  def perform(user)
    UserMailer.send_email(user).deliver_now
  end
end
Or
def perform(user)
  UserMailer.send_email(user)
end
Lastly, I don't think this makes sense, but what would happen if I used:
def perform(user)
  UserMailer.send_email(user).deliver_later
end
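For what it's worth, my rough mental model of the two deliver calls (which may be wrong) is:
message = UserMailer.send_email(user)  # builds the mail object; nothing is sent yet
message.deliver_now                    # renders and sends the email immediately, in-process
message.deliver_later                  # enqueues ActionMailer's own delivery job via Active Job,
                                       # which calls deliver_now when the job runs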
Related
Just wondering what's the best way to go about structuring asynchronous mailers in my Rails app (using Sidekiq)? I have one ActionMailer class with multiple methods/emails...
notifier.rb:
class Notifier < ActionMailer::Base
  default from: "\"Company Name\" <notify@domain.com>"
  default_url_options[:host] = Rails.env.production? ? 'domain.com' : 'localhost:5000'

  def welcome_email(user)
    @user = user
    mail to: @user.email, subject: "Thanks for signing up!"
  end

  ...

  def password_reset(user)
    @user = user
    @edit_password_reset_url = edit_password_reset_url(user.perishable_token)
    mail to: @user.email, subject: "Password Reset"
  end
end
Then for example, the password_reset mail is sent in my User model by doing...
user.rb:
def deliver_password_reset_instructions!
  reset_perishable_token!
  NotifierWorker.perform_async(self)
end
notifier_worker.rb:
class NotifierWorker
  include Sidekiq::Worker
  sidekiq_options queue: "mail"

  def perform(user)
    Notifier.password_reset(user).deliver
  end
end
So I guess I'm wondering a couple things here...
Is it possible to define many "perform" actions in one single worker? By doing so I could keep things simple (one notifier/mail worker) as I have it and send many different emails through it. Or should I create many workers? One for each mailer (e.g. WelcomeEmailWorker, PasswordResetWorker, etc) and just assign them all to use the same "mail" queue with Sidekiq.
I know it works as it is, but should I break out each of those mail methods (welcome_email, password_reset, etc.) into individual mailer classes, or is it OK to have them all under one class like Notifier?
Really appreciate any advice here. Thanks!
As discussed here, Sidekiq supports delayed mailers by default, so there is no need to create separate workers:
Notifier.delay.password_reset(user)
I am not sure, but I think it's not a good idea to pass an instance to a mailer action when you're using delay, so it may be better to change the code above to:
Notifier.delay.password_reset(user.id)
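If only the id is queued, the mailer action would then need to load the record itself; a rough sketch, reusing the Notifier class from the question:
class Notifier < ActionMailer::Base
  def password_reset(user_id)
    @user = User.find(user_id)
    @edit_password_reset_url = edit_password_reset_url(@user.perishable_token)
    mail to: @user.email, subject: "Password Reset"
  end
end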
How would one convert the following ActiveRecord::Observer to a Service object (or maybe multiple objects)?
The PushService notifies all connected browsers of every change via WebSocket. It does this by POST-ing to an external process. Since migrating from Thread.new to Sidekiq, the observer broke: a Sidekiq job started in an :after_create callback can run before the transaction is actually committed, so it raises an ActiveRecord::RecordNotFound error.
It is recommended to use an :after_commit hook instead, but then information needed by the PushService, such as record.changes, will not be available.
The interesting use case this observer fulfills is that when a new message is created as a reply to another message, it automatically runs two callbacks: an :after_create for the reply message and an :after_touch for the thread message.
I am interested to see how this behavior can be reproduced using an explicit service object.
class PushObserver < ActiveRecord::Observer
  observe :message

  def after_create(record)
    Rails.logger.info "[Observe] create #{record.inspect}"
    PushService.new(:create, record).publish
  end

  def after_update(record)
    Rails.logger.info "[Observe] update #{record.changed.inspect}"
    PushService.new(:update, record).publish
  end

  def after_touch(record)
    Rails.logger.info "[Observe] touched #{record.changes.inspect}"
    PushService.new(:touch, record).publish
  end

  def after_destroy(record)
    Rails.logger.info "[Observe] destroy #{record.inspect}"
    PushService.new(:destroy, record).publish
  end
end
You can set up publishers and listeners using a gem called wisper.
In your PushService class, you would include Wisper::Publisher. There is an example on the GitHub page of using this in a service object. This way, the firing of events is taken care of by the objects themselves.
So whenever PushService does something that requires updating connected browsers:
def some_action
  # some code
  publish(:done_something, self)
end
Then any interested object can listen for this event by registering itself with the subscribe method.
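For example, a listener could look something like this (PushListener is just a hypothetical name):
class PushListener
  # Wisper calls a method named after the published event
  def done_something(push_service)
    Rails.logger.info "[Wisper] done_something from #{push_service.inspect}"
  end
end

# e.g. in the code that replaces the observer callbacks:
service = PushService.new(:create, record)  # record as in the observer above
service.subscribe(PushListener.new)         # per-instance subscription
service.some_action                         # publish(:done_something, self) reaches the listener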
In my app I have certain events that trigger a lot of emails (~100). Obviously sending them immediately is not an option, so I'm using DelayedJob to queue them up and send them after the request is processed. I've now found that the logic to determine WHICH 100 people to email is heavy enough that it takes a while to run, so I'd like to DelayedJob that process as well. Where should this logic go? (model? mailer?) Sending mail from the model just feels wrong. Is there a best practice here?
You should write a class that represents the job. Not a model class, not a controller class: a job class.
# app/jobs/mail_job.rb
class MailJob
  attr_accessor :first_option, :second_option

  def initialize(first_option, second_option)
    self.first_option = first_option
    self.second_option = second_option
  end

  def perform
    accounts = Account.where("some_key" => first_option).to_a
    # more complicated stuff goes here
    accounts.each do |account|
      AccountMailer.hello_message(account).deliver
      account.mark_hello_delivered!
    end
  end
end
job = MailJob.new(params["first"], params["second"])
Delayed::Job.enqueue(job)
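If you need more control over when the job runs, enqueue also accepts options (see the delayed_job README); a sketch with illustrative values:
Delayed::Job.enqueue(job, :priority => 0, :run_at => 5.minutes.from_now, :queue => "mail")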
I am using Rails 3.0.9 and I have the following code to send an email when a comment is posted.
class Mailer < ActionMailer::Base
  def comment_notification(comment)
    User.active.each do |user|
      @user = user
      mail(:to => @user.email, :subject => subject)
    end
  end
end
If there are no active users then User.active is empty and the code inside the block does not get executed. However, the view is still rendered, and it fails because @user is missing.
The above code is invoked by an observer:
Mailer.comment_notification(comment).deliver
One way to fix this problem would be to change the code in the observer to something like this:
User.active.each do |recipient|
  Mailer.comment_notification(comment, recipient).deliver
end
Is this the right way to fix it? I would like my observer to be as thin as possible.
Yes, your observer fix is correct. You should loop through and send emails one by one; the mailer should just send one email at a time. This is a job best left to Delayed Job, though. You don't want to be waiting around while an email sends.
Here is a tutorial on Delayed Job: http://railscasts.com/episodes/171-delayed-job
Be sure to check the Readme for Delayed Job as well, paying special attention to the "Rails 3 Mailers" section: http://github.com/collectiveidea/delayed_job
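For example, with delayed_job's mailer support the observer loop can queue each message through the delay proxy; note that .deliver is dropped, because delayed_job delivers the message when the job runs:
User.active.each do |recipient|
  Mailer.delay.comment_notification(comment, recipient)
end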
I'm trying to implement an ActionMailer function that will send out a newsletter to a specific user. I want to make sure that the newsletter is only sent to subscribed users. I tried implementing it like so:
class UserMailer < ActionMailer::Base
  def newsletter(user)
    return unless user.subscribed # This still renders my mailer view
    mail(:to => user.email, :subject => "Newsletter")
  end
end
The problem is that even with the return unless user.subscribed line, the mailer view still appears to be rendered and the email is still sent by the calling code (from a cron job):
task :cron => :environment do
  User.where(:subscribed => true).each do |user|
    UserMailer.newsletter(user).deliver
  end
end
Note that I do have that subscription logic in my cron job as well for performance reasons (I shouldn't have to iterate over ALL users, only those that are subscribed). However, it feels like the UserMailer class is the right place for this logic to exist (otherwise any other location that calls the newsletter method will need to check the subscribed flag as well).
The Mailer, IMHO, is the wrong place for this logic. The mailer should do nothing but format and send messages. The logic to decide whether or not to send should live in the calling block of code. It may not be the most elegant way, but something as simple as this works:
UserMailer.newsletter(user).deliver if user.subscribed?
Alternatively, as you mentioned, you shouldn't have to iterate over all users, just the subscribed ones. So with a scope in the User model called subscribed:
User.subscribed.each do |user|
  UserMailer.newsletter(user).deliver
end
This way you don't need to test on a per-user basis; only the subscribed users are included, and the logic is in the calling block, not in the mailer.
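For reference, the subscribed scope itself could be as simple as:
class User < ActiveRecord::Base
  scope :subscribed, -> { where(subscribed: true) }
end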