Say I rescue from an Exception and I do:
begin
raise StandardError
rescue StandardError => ex
ExceptionNotifier.notify_exception(ex)
end
How can I make that ExceptionNotifier email be sent from a queue, so that it is asynchronous to the rest of the application?
In the docs I can see how to send an ExceptionNotifier email if the error happened within a worker, but not how to enqueue the sending itself.
The queue aspect of Rails has to be handled by a third-party semi-persistent data store; we use Redis & Resque.
--
Here is a good tutorial on this:
Initializer
#config/initializers/redis.rb
require 'resque/server' #-> allows processing of jobs
require 'resque_scheduler' #-> allows for scheduling
uri = URI.parse(ENV["REDISCLOUD_URL"] || "redis://localhost:6379")
Resque.redis = Redis.new(:host => uri.host, :port => uri.port, :password => uri.password)
-
Resque
This will allow you to send data to Redis, using your Resque queue to handle it:
def your_action
Resque.enqueue(SendEmail, [[data ref]])
end
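The SendEmail constant above is a Resque worker class which the tutorial doesn't show. Here is a minimal sketch (class name, queue name and argument list are assumptions, not part of the original) that performs the ExceptionNotifier delivery off the request cycle; Resque serializes arguments as JSON, so pass plain data rather than the exception object itself:
#app/workers/send_email.rb
class SendEmail
  @queue = :email_queue

  def self.perform(exception_class, message, backtrace)
    # rebuild a lightweight exception so ExceptionNotifier can format it
    exception = exception_class.constantize.new(message)
    exception.set_backtrace(backtrace)
    ExceptionNotifier.notify_exception(exception)
  end
end
With a worker like this, the rescue block from the question would enqueue instead of notifying inline, e.g. Resque.enqueue(SendEmail, ex.class.name, ex.message, ex.backtrace).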
-
Queue
Then you can use resque to run through the Redis queue & send the emails:
$ rake resque:work QUEUE='*'
Quite a vague description, I know; but hopefully it will give you an idea as to how to use a third-party queue-based system to handle sending emails for you.
Related
I have two Rails applications on Heroku, App1 and App2 (with the CloudAMQP gem added). App1 produces a message when a button is clicked.
App1
class Publisher
def publish
# Start a communication session with RabbitMQ
connection = Bunny.new(:host => "chimpanzee.rmq.cloudamqp.com", :vhost => "test", :user => "test", :password => "password")
connection.start
# open a channel
channel = connection.create_channel
# declare a queue
queue = channel.queue("test1")
# publish a message to the default exchange which then gets routed to this queue
queue.publish("Hello, everybody!")
end
end
So in App2 I have to consume all the messages without any button click and hand them to Sidekiq to process the data, but I am stuck on how to automatically read from that queue. I know the code for reading values from a queue; people are saying the sneakers gem, but I am a bit confused between Sidekiq and sneakers. Any idea how we can do it on Heroku?
To read in App2 the messages you publish from App1, you are going to need sneakers (https://github.com/jondot/sneakers).
Your reader would do something like:
class Reader
include Sneakers::Worker
from_queue 'test1'
def work(message)
# your code
ack!
end
end
and you need to configure your environment, you can take a look at https://github.com/jondot/sneakers/wiki/Configuration
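On Heroku the configuration mostly comes down to pointing sneakers at the CloudAMQP connection string and running the workers in their own process. A rough sketch, where the initializer path and the option values are assumptions based on the wiki rather than a tested setup:
#config/initializers/sneakers.rb
require 'sneakers'
Sneakers.configure(
  amqp: ENV['CLOUDAMQP_URL'], # the CloudAMQP add-on URL already contains the vhost
  workers: 1,
  threads: 4,
  prefetch: 4
)
sneakers ships a sneakers:run rake task, so the consumer would run as something like WORKERS=Reader bundle exec rake sneakers:run, which on Heroku belongs on a worker: line in App2's Procfile.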
I am following Ryan Bates' tutorial on ActionController::Live and deploying the app to Heroku. All works fine except the events action, where Ryan said we should reopen the Redis connection, and I can't get that to work. I am using RedisToGo for Redis on Heroku.
Here my events controller action:
def events
response.headers["Content-Type"] = "text/event-stream"
redis = Redis.new(:url => uri)
redis.psubscribe('messages.*') do |on|
on.pmessage do |pattern, event, data|
response.stream.write("event: #{event}\n")
response.stream.write("data: #{data}\n\n")
end
end
rescue IOError
logger.info "Stream closed"
ensure
redis.quit
response.stream.close
end
Also, here is the Redis initializer:
uri = URI.parse(ENV["REDISTOGO_URL"])
REDIS = Redis.new(:url => uri)
Can someone help me?
EDIT
I got it all to work by initializing the client using Redis.new(url: ENV["REDISTOGO_URL"]) instead of parsing the URI in the events controller action.
replace this:
redis = Redis.new(:url => uri)
redis.psubscribe
with this:
REDIS.psubscribe
anywhere you have 'redis' above, replace with the REDIS global.
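For reference, the version that ended up working per the asker's EDIT keeps a per-request client in the action but builds it straight from the env var. A sketch (psubscribe blocks for the life of the stream, which is why this connection is not shared with the rest of the app):
def events
  response.headers["Content-Type"] = "text/event-stream"
  # dedicated connection for this stream; psubscribe blocks until the client disconnects
  redis = Redis.new(url: ENV["REDISTOGO_URL"])
  redis.psubscribe('messages.*') do |on|
    on.pmessage do |pattern, event, data|
      response.stream.write("event: #{event}\n")
      response.stream.write("data: #{data}\n\n")
    end
  end
rescue IOError
  logger.info "Stream closed"
ensure
  redis.quit
  response.stream.close
end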
I am on Heroku with a custom domain, and I have the Redis add-on. I need help understanding how to create a background worker for email notifications. Users can send each other inbox messages, and I would like to send an email notification to the user for each new message received. I have the notifications working in development, but I am not good at creating background jobs, which are required on Heroku, otherwise the server would time out.
Messages Controller:
def create
@recipient = User.find(params[:user])
current_user.send_message(@recipient, params[:body], params[:subject])
flash[:notice] = "Message has been sent!"
if request.xhr?
render :json => {:notice => flash[:notice]}
else
redirect_to :conversations
end
end
User model:
def mailboxer_email(object)
if self.no_email
email
else
nil
end
end
Mailboxer.rb:
Mailboxer.setup do |config|
#Configures whether your application uses email sending for Notifications and Messages
config.uses_emails = false
#Configures the default from for the email sent for Messages and Notifications of Mailboxer
config.default_from = "no-reply@domain.com"
#Configures the methods needed by mailboxer
config.email_method = :mailboxer_email
config.name_method = :name
#Configures if you use or not a search engine and which one are you using
#Supported engines: [:solr,:sphinx]
config.search_enabled = false
config.search_engine = :sphinx
end
Sidekiq is definitely the way to go with Heroku. I don't think mailboxer supports background configuration out of the box. Thankfully, it's still really easy with sidekiq's queueing process.
Add gem 'sidekiq' to your Gemfile and run bundle.
Create a worker file app/workers/message_worker.rb.
class MessageWorker
include Sidekiq::Worker
def perform(sender_id, recipient_id, body, subject)
sender = User.find(sender_id)
recipient = User.find(recipient_id)
sender.send_message(recipient, body, subject)
end
end
Update your Controller to Queue Up the Worker
Remove: current_user.send_message(@recipient, params[:body], params[:subject])
Add: MessageWorker.perform_async(current_user.id, @recipient.id, params[:body], params[:subject])
Note: You should never pass ActiveRecord objects to workers. That's why I set up this method to pass the User ids and look them up in the worker's perform method, instead of passing the entire objects.
Finally, restart your server and run bundle exec sidekiq. Now your app should be sending the email in the background.
When you deploy, you will need a separate dyno for the worker, which should look like this: worker: bundle exec sidekiq. You will also need Heroku's Redis add-on.
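For example, a minimal Procfile for this setup might look like the following (the web line is an assumption; only the worker line comes from the answer above):
web: bundle exec rails server -p $PORT
worker: bundle exec sidekiq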
Sounds like an H12 Request Timeout:
An HTTP request took longer than 30 seconds to complete.
To create a background worker for this in Rails, you should grab Resque, a Redis-backed background queueing library. Here is a demo. Another demo. And another demo.
To learn more about using Resque on Heroku, you can also read the Heroku article here. Or this tutorial (it's an old one though). Another great tutorial.
There is also a resque_mailer gem that will speed things up for you.
gem install resque_mailer #or add it to your Gemfile & use bundler
It is fairly straightforward. Here is a snippet from a working demo by the author:
class Notifier < ActionMailer::Base
include Resque::Mailer
default :from => "from@example.com"
def test(data={})
data.symbolize_keys!
Rails.logger.info "sending test mail"
Rails.logger.info "params: #{data.keys.join(',')}"
Rails.logger.info ""
@subject = data[:subject] || "Testing mail"
mail(:to => "nap@localhost.local",
:subject => @subject)
end
end
Doing Notifier.test.deliver will deliver the mail.
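Keep in mind the deliver call above only enqueues the job; a Resque worker still has to drain the mailer queue, e.g. something like rake resque:work QUEUE=mailer (the queue name shown is resque_mailer's default and is configurable).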
You can also consider using mail delivery services like SES.
Sidekiq is an option that you could consider. To get it working you can add something like RedisToGo, then configure an initializer for Redis. Then on Heroku you can add something like worker: bundle exec sidekiq ... to your Procfile.
https://github.com/mperham/sidekiq/wiki/Getting-Started
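The Redis initializer mentioned above can be very small. A sketch, assuming the RedisToGo add-on exposes REDISTOGO_URL (newer Sidekiq versions also pick up a REDIS_URL variable automatically):
#config/initializers/sidekiq.rb
Sidekiq.configure_server do |config|
  config.redis = { url: ENV["REDISTOGO_URL"] }
end
Sidekiq.configure_client do |config|
  config.redis = { url: ENV["REDISTOGO_URL"] }
end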
It also has a dashboard for monitoring.
https://github.com/mperham/sidekiq/wiki/Monitoring
I am using Resque and Resque Scheduler to start a job that has to be run immediately on application start. Other scheduled jobs are loaded every 30 seconds.
This is the code for my config/initializers/redis.rb
require 'rake'
require 'resque'
require 'resque/server'
require 'resque_scheduler/tasks'
# This will make the tabs show up.
require 'resque_scheduler'
require 'resque_scheduler/server'
uri = URI.parse(ENV["REDISTOGO_URL"])
REDIS = Redis.new(:host => uri.host, :port => uri.port, :password => uri.password)
Resque.redis = REDIS
Dir["#{Rails.root}/app/workers/*.rb"].each { |file| require file }
Resque.enqueue(AllMessageRetriever)
Resque.schedule = YAML.load_file(Rails.root.join('config', 'schedule.yml'))
When the application is started up, AllMessageRetriever gets run 2-3 times rather than only once. Do the initializers get called more than once? This happens both on Heroku and in my local environment.
Is it possible to set a delayed job in Resque-Scheduler which will only get executed once and immediately on runtime?
The code for AllMessageRetriever is below. Basically it loops over a table, calls an external API to get data and then updates the table. This entire task happens 2-3 times if I add the enqueue call in the initializer file.
require 'socialcast'
module AllMessageRetriever
@queue = :message_queue
def self.perform()
Watchedgroup.all.each do |group|
puts "Running group #{group.name}"
continueLoading=true
page=1
per_page=500
while(continueLoading == true)
User.first.refresh_token_if_expired
token = User.first.token
puts "ContinueLoading: #{continueLoading}"
@test = Socialcast.get_all_messages(group.name,token,page,per_page)
messagesArray = ActiveSupport::JSON.decode(@test)["messages"]
puts "Message Count: #{messagesArray.count}"
if messagesArray.count == 0
puts 'count is zero now'
continueLoading = false
else
messagesArray.each do |message|
if not Message.exists?(message["id"])
Message.create_with_socialcast(message, group.id)
else
Message.update_with_socialcast(message)
end
end
end
page += 1
end
Resqueaudit.create({:watchedgroup_id => group.id,:timecompleted => DateTime.now})
end
# Do anything here, like access models, etc
puts "Doing my job"
end
end
Rake
Firstly, why are you trying to queue on init?
You'd be much better off delegating to a rake task which is called from the initializer.
This will remove the dependency on the initialization process, which should clear things up a lot. I wouldn't put this in the initializer itself, as it will be better handled elsewhere (modularity); a rough sketch follows below.
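For instance, a minimal sketch of that separation (task name and file path are assumptions):
#lib/tasks/messages.rake
namespace :messages do
  desc "Enqueue the one-off AllMessageRetriever job"
  task retrieve_all: :environment do
    Resque.enqueue(AllMessageRetriever)
  end
end
The initializer (or a deploy/startup hook) would then invoke this task once instead of enqueuing directly.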
Problem
I think this line is causing the issue:
Resque.enqueue(AllMessageRetriever)
Without seeing the contents of AllMessageRetriever, I'd surmise that your AllMessageRetriever (module / class?) will be returning the results 2-3 times, causing Resque to add the data-set to the queue 2-3 times.
I could be wrong, but it would make sense, and would mean your issue is not with Resque / initializers but with your AllMessageRetriever class.
It would be a big help if you showed it!
I am writing a Rails app which needs to track users' status to see if they are available, busy or offline. I'm using the private_pub gem, which uses Faye underneath. When a user signs in, he subscribes to a channel /user/[:user_id]. I want to update the user's status to ONLINE when they subscribe, using Faye's subscribe event listener. I added this code at the end of the private_pub.ru file:
server = PrivatePub.faye_app
server.bind :subscribe do |client_id, channel|
if /\/user\/*/.match(channel)
m = /\/user\/(?<user_id>\d+)/.match(channel)
user_id = m[:user_id]
end
user = User.find(user_id)
user.status = 1 # 1 means online
end
run server
The problem is that every time a user subscribes, the thin server reports:
[ERROR] [Faye::RackAdapter] uninitialized constant User
I guess I need to require certain files to be able to use ActiveRecord models in the rackup file, but I don't know how.
Thanks for any help.
In our project we decided to use Redis for a similar case.
Gemfile:
gem 'redis-objects'
Faye: use redis-rb for writing status
require 'redis'
Redis.current = Redis.new(:host => '127.0.0.1', :port => 6379)
# init faye server
...
server.bind(:subscribe) do |client_id, channel|
if /\/user\/*/.match(channel)
m = /\/user\/(?<user_id>\d+)/.match(channel)
Redis.current.set("user:#{m[:user_id]}:online_status", "1")
end
end
Rails: use the redis-objects gem for reading it in the User model.
class User < ActiveRecord::Base
include Redis::Objects
value :online_status
end
@user.online_status # returns "1" if channel is connected
Hope this helps.