In my Rails app I'm using the SendGrid Parse API, which posts incoming mail to my server. Every now and then the Parse API submits the same email twice.
When I receive a posted mail I store it in the IncomingMail model. To prevent this double-submission issue, when processing each IncomingMail I check the table for a duplicate created within the last minute. That tested great in development: it caught all the double submissions.
Then I pushed it live to Heroku, where I have 2+ dynos, and it didn't work. My guess is that it has something to do with replication. That being the case, how do scalable sites with multiple servers deal with something like this?
Thanks
You should look at using a background job queue. Heroku has "Workers" (which was previously called Delayed Job). Rather than sending the email immediately, you push it onto the queue. One or more Heroku workers then need to be added to your account, and each one will pull jobs off the queue in sequence. This means there can be a short delay (depending on load) before the email is sent, but the delay is not presented to the user, and should there be a lot of email to send you just add more workers.
Waiting on an external service like an email provider during each user action is dangerous, because any network problem can take down your site: several users end up waiting for their HTTP requests to be answered while Heroku is blocked on these third-party calls.
With workers, each such job would simply fail, be retried, and eventually succeed.
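As a minimal sketch of what that looks like with Delayed Job (the mailer and method names here are hypothetical):

```ruby
class NotificationMailer < ActionMailer::Base
  def incoming_mail_receipt(user_id)
    @user = User.find(user_id)  # look the record up inside the job, not before
    mail(to: @user.email, subject: "Mail received")
  end
end

# In the controller: enqueue instead of sending inline. delayed_job's
# .delay extension queues the job and delivers the mail when a worker
# dyno picks it up, so the request returns immediately.
NotificationMailer.delay.incoming_mail_receipt(user.id)
```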
This sounds like it could be a transaction issue. If you have multiple workers running simultaneously, their operations may be interleaved. For instance, this sequence of events would result in two mails being sent:
Worker A: checks for an existing record and doesn't find one
Worker B: checks for an existing record and doesn't find one
Worker A: posts to SendGrid
Worker B: posts to SendGrid
You could wrap everything in a transaction to keep this from happening; note that a plain transaction only helps if the duplicate check takes a lock (or the table has a unique constraint), since by default two transactions can still run the check concurrently. Something like this is the shape of it:
```ruby
class IncomingMail < ActiveRecord::Base
  def check_and_send(email_address)
    transaction do
      # your existing code for preventing duplicates and sending
    end
  end
end
```
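An alternative that holds up across multiple dynos is to let the database arbitrate with a unique index. A sketch, assuming the posted mail carries a usable Message-ID header (the column and method names are assumptions):

```ruby
# Migration: enforce uniqueness at the database level so two dynos
# can't both insert the same message.
add_index :incoming_mails, :message_id, unique: true

class IncomingMail < ActiveRecord::Base
  def self.record(params)
    create!(message_id: params[:message_id], body: params[:body])
  rescue ActiveRecord::RecordNotUnique
    nil # duplicate submission from SendGrid; safe to ignore
  end
end
```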
Related
I have some questions about ActionMailer:
How does ActionMailer connect to an SMTP server?
Are the connections concurrent or parallel when the number of emails is high (> 1000)?
How will sending out emails the way Facebook does (thousands of them, as immediate emails) affect the Ruby on Rails application, and how would ActionMailer handle it?
Is there any other solution/plugin to send out large numbers of emails from a RoR application, apart from ActionMailer?
Added:
We need to send out at least 1000 emails per 15 minutes. We are using a Notes Domino server as our SMTP server. What is a possible architecture for this kind of problem? We are already storing the emails in the database to send them later; what we need is the sending approach.
The usual thing is to create a background job to send email. ActionMailer is very good for single emails, but it does tend to run into trouble when sending multiple emails, as each one can take several seconds to complete. That's why I created PostageApp, to help solve those problems.
Some services on the market to help you with sending lots of email from Rails:
MailGun
SendGrid
PostmarkApp
MailChimp
Mailjet
PostageApp
All of these have ways of sending multiple messages with a single API call or SMTP transaction.
1) ActionMailer connects to your SMTP server via a set of parameters, including a host, port and protocol (see the config sketch at the end of this answer).
3) The effect will be a slow site as a result of the many synchronous tasks being executed.
2 & 4)
ActionMailer is a bit too slow to be sending out a ton of emails under load. Remember that it is a synchronous operation, and as such it's not really the sort of thing you want to be doing a lot of on a busy site.
To be honest, you're better off not sending that quantity of email from your website; it's not really designed to be used in such a way. If I had to send that sort of volume I'd look at doing the work in the background: something like Delayed Job would work well here, or one of the many async Rails mailer plugins would do the trick.
What you really want to examine here is the requirement you're trying to fulfil: is it absolutely necessary that the website be responsible for sending the mail synchronously? In most cases the answer is no. If you can, you'll be far better off deferring this sort of task to another part of your system; keep your site as lean and focused as you can.
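For 1), here is where those connection parameters typically live; the host and credentials below are placeholders:

```ruby
# config/environments/production.rb
config.action_mailer.delivery_method = :smtp
config.action_mailer.smtp_settings = {
  address:              "smtp.example.com",   # placeholder host
  port:                 587,
  user_name:            ENV["SMTP_USER"],
  password:             ENV["SMTP_PASSWORD"],
  authentication:       :plain,
  enable_starttls_auto: true
}
```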
Simple solution here for you...
Sidekiq or Resque
I'd highly recommend Sidekiq, as it's nowhere near as server-intensive when running multiple workers. Just be careful with concurrency issues (that is, make sure you don't have two workers pick up the same job and send duplicate emails).
Say you set up 20 Sidekiq workers, each able to send an email every 2-4 seconds: you're looking at an easy 300-600 per minute.
DO NOT try to do this without background workers like Sidekiq, Resque, or DelayedJob. You will freeze your entire app if you try sending any large amount of email in-process. Even sending activation emails and the like in-process will cause you unnecessary slowdown issues.
I'd have one worker class that handles the queueing periodically and another worker class that handles the sending. We're using Resque (6 workers, maybe?) for this on an older app (pre-Sidekiq) to send around 500 emails every 5 minutes with no issues.
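A rough Sidekiq sketch of that queuer/sender split (the class, model, and column names are all illustrative):

```ruby
class EmailQueuerWorker
  include Sidekiq::Worker

  # Run periodically (e.g. from a scheduler); fans out one job per message
  # and marks each record so the next periodic run doesn't enqueue it again.
  def perform
    PendingEmail.where(queued: false).find_each do |email|
      email.update_attributes!(queued: true)
      EmailSenderWorker.perform_async(email.id)
    end
  end
end

class EmailSenderWorker
  include Sidekiq::Worker

  def perform(email_id)
    email = PendingEmail.find(email_id)
    UserMailer.notification(email.user_id).deliver
  end
end
```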
You can always use a third party like someone mentioned; SendGrid is decent. But that wasn't the question. This is how you do it yourself, simply and easily.
You define the SMTP settings in a config file; if left blank, it uses sendmail locally.
Concurrent.
Multiple handlers.
Is there a bulk email plugin for Rails apps?
You could also do `1000.times { email.deliver }`, but it will probably collapse your server.
I've read most of the other answers on this topic, but a lot of them relate either to third-party services like MailChimp (which I'm not necessarily opposed to) or to how not to upset the host's email server.
I believe this case is unique enough that it'll contribute...
I have my own DigitalOcean droplet running a Rails app. I need to send out 100-1000 emails every so often, each with a unique message (a link I'm using for tracking clicks originating from the email).
I'm also operating my own iRedMail server.
Can someone recommend how best to handle this task? I was going to simply cycle through the list of emails and use the template.html.erb to drop in my link, but what types of problems might I run into?
Thank you!
You should decouple your Rails app from the mail sending so that you don't have to wait in your view for the mails to be sent (assuming that clicking something triggers the start of the sending). Use something like delayed_job or another queueing mechanism that Rails offers, and only queue up the job of sending the emails. When the queue comes to execute the particular job, you can customize each message with an HTML part and a text part, or whatever else you need, and pass them on individually to your MTA.
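For example, with delayed_job you can queue a plain Ruby object that responds to perform; everything named here (NewsletterMailer, tracking_token) is illustrative:

```ruby
class NewsletterJob < Struct.new(:user_ids)
  def perform
    User.where(id: user_ids).find_each do |user|
      # Each message gets its own tracking link rendered by the template.
      NewsletterMailer.campaign(user.id, user.tracking_token).deliver
    end
  end
end

# Enqueue from a controller or the console; the request returns immediately.
Delayed::Job.enqueue NewsletterJob.new(User.pluck(:id))
```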
Sorry for the basic question about Sidekiq's delaying of ActionMailer. As per this article, Sidekiq can delay sending out emails by just saying UserMailer.delay_for(1.hour)...
Does this mean the sending is now handled in the background, or does it simply delay sending the email for an hour, so that once the hour comes the email is sent like a regular ActionMailer call, which slows down response time?
Or, if I truly want this done in the background, do I have to do the other Sidekiq stuff, like putting it in a specific Worker and firing it up that way?
Also, separately: if I do just do it via UserMailer.delay..., I presume I won't need a worker dyno on Heroku, which would save some money, correct?
Thanks for the help!
Yes, for emails you don't need to do anything else; it's like calling the mailer 1 hour later. You just need to make sure you don't pass any complex objects into the mailer (for example a user object): pass only the user_id, because the arguments are stored in Redis. In the mailer, fetch the user object with the given id.
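A minimal sketch of that pattern (the mailer and method names are hypothetical; delay_for comes from Sidekiq's delayed extensions):

```ruby
class UserMailer < ActionMailer::Base
  def reminder(user_id)
    @user = User.find(user_id)  # rehydrated at send time, not at enqueue time
    mail(to: @user.email, subject: "Reminder")
  end
end

# Only the integer id is serialized into Redis; a Sidekiq worker process
# picks the job up an hour later and sends the mail.
UserMailer.delay_for(1.hour).reminder(user.id)
```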
I am building an application which will send status requests to users (via email & SMS) on a regular basis. I want to execute the service each hour, which will:
Query the database for all requests that need to be sent (based on some logic)
Send the requests through Amazon's Simple Email Service (this is already working)
Write a record of the status request notification back to the data store
I am considering wrapping this series of operations up into a single controller with an endpoint that can be called remotely to kick off the process within the Rails app.
Longer term, I will break this process out into an app that can be run independently of my rails app, but for now I'm just trying to keep it simple.
My first inclination is to build the following:
Controller with the following elements:
A method which will orchestrate the steps outlined above (and can be called externally)
A call to the status_request model which will bring back a collection of requests needing to be sent
A loop to iterate through the pending requests, which will:
Make a call to my AWS Simple Email Service module to actually send the email, and
Make a call to the status_request model to log the request back to the database
Model:
A method on my status_request model which will bring back a collection of requests that need to be sent
A method in my status_request model which will log that a notification was sent
Since this will behave as a service that gets called periodically from an outside scheduler, I don't think I'll need a view for this operation. (I will, of course, need views to show users and admins which requests have been sent, but that's for later...)
As someone new to Rails, I'm asking for review of this approach and any suggestions you may have.
Thanks!
Instead of a controller, which (as Jeff pointed out) exposes a security risk, you may just want to expose a rake task and use cron to invoke it on an hourly basis.
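A sketch of that setup; the task, scope, and method names are illustrative:

```ruby
# lib/tasks/status_requests.rake
namespace :status_requests do
  desc "Send pending status request notifications"
  task send_pending: :environment do
    StatusRequest.pending.find_each do |request|
      request.deliver_notification!  # your SES call + logging
    end
  end
end

# crontab entry, run at the top of every hour:
# 0 * * * * cd /path/to/app && RAILS_ENV=production bundle exec rake status_requests:send_pending
```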
If you are still interested in building a controller, look at the devise gem and its single access token, token_authenticatable, for securing the methods you are exposing.
You may also want to look at delayed_job or resque to offload the status_request call and the AWS Simple Email Service loop to a background worker process.
You may want a separate controller and view for the log file so you can review progress on demand.
And if you want to get real fancy, use Amazon SNS to send you alerts when the service reaches some unacceptable level of failures, backlog, etc.
Since you are trying to invoke this from an outside process, your approach should work. You could also have a worker process that handles tasks as they arrive.
You will need routes to expose your service, and you may also want to make some security decisions: how will the service that invokes your application authenticate, so that all others can't hit it at will?
Another consideration should be how many emails you are sending. If there are enough, you may want to consider that this sort of loop is going to be extremely top-heavy, and may affect users on the current system if it's a web application.
In the end, there are many ways to do this. I would focus on the performance and usage you expect, as well as security. There's never one perfect way to solve a problem like this; your way should just be aware of the variables it will need to operate within.
Resque and Redis might be helpful to you in scheduling and performing these operations. They are simple and super fast; [here](http://railscasts.com/episodes/271-resque) is a simple tutorial on the same.
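For reference, the basic Resque worker shape (the queue and class names are illustrative):

```ruby
class SendEmailJob
  @queue = :mailer  # Resque reads the queue name from this class ivar

  def self.perform(user_id)
    UserMailer.notification(user_id).deliver
  end
end

# Enqueue a job:  Resque.enqueue(SendEmailJob, user.id)
# Run a worker:   QUEUE=mailer rake resque:work
```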
I'm trying to create a mailing list feature in Rails based on delayed_job. For now I send mails by iterating over a users table and calling .deliver on a mail to every address.
How can I integrate this with delayed_job so it sends 50 mails every 5 minutes and remembers which addresses are already done? Do I need to make a separate table where I store all sent mails and check back every time I send another 50?
Thanks in advance.
You will probably want to have table entries for sent emails. That way they serve as an audit trail if processes go down or somehow fail.
I suggest you look at doing this with an elastic cloud database like MongoLab, MongoHQ or SimpleDB. (Mongo-based services make it easy to extend the schema for new email entries.)
If you do that, then a cloud worker queue like SimpleWorker can make it easy to send out lots of emails concurrently or in batches, to get around any rate limits. (Full disclosure: I work at Iron.io/SimpleWorker.)
You're taking a good approach by bundling multiple email sends into a single worker task to amortize the worker setup costs. With an elastic cloud worker system, you could have master workers wake up on a schedule and then queue up a number of slave worker tasks, each with a set of users to send to.
With table entries, you can then go back through the data tables and address any emails that failed or didn't go through.
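A sketch of what such an audit table and a follow-up sweep might look like (the column names are assumptions):

```ruby
# Migration
create_table :sent_emails do |t|
  t.integer  :user_id
  t.string   :status, default: "pending"  # pending / sent / failed
  t.datetime :sent_at
  t.timestamps
end

# Later, sweep up anything that never went out and re-enqueue it:
SentEmail.where(status: ["pending", "failed"]).find_each do |record|
  # re-enqueue or resend here
end
```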
50 emails is not really that many; they can be sent in seconds, I think. Use foreverb to send emails every 5 minutes.
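A sketch using foreverb's DSL as I understand it from the gem's README (the Mailing model and sent flag are illustrative; check the current docs before relying on this):

```ruby
require "forever"

Forever.run do
  every 5.minutes do
    # Send the next batch of 50 and mark each one so it isn't re-sent.
    Mailing.where(sent: false).limit(50).each do |record|
      Mailer.send_newsletter(record.user).deliver
      record.update_attributes!(sent: true)
    end
  end
end
```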
Let delayed_job do all the work:
```ruby
User.all.each_with_index do |user, index|
  # Schedule in batches of 50: batch n runs n * 5 minutes from now.
  Mailer.delay(run_at: ((index / 50) * 5).minutes.from_now).send_newsletter(user)
end
```
This should work, but it's untested.