Rails 3 & MailChimp - Speeding things up

Currently I have a Rails 3 app that subscribes new users up to MailChimp. As part of my user model, I have this:
after_create :add_user_to_mailchimp
before_destroy :remove_user_from_mailchimp
before_save :update_mailchimp_values
Then, each of those three actions are some variation on this:
def add_user_to_mailchimp
  mailchimp = Hominid::API.new(MAILCHIMP_API_KEY)
  list_id = mailchimp.find_list_id_by_name MAILCHIMP_LIST_NAME
  info = { }
  mailchimp.list_subscribe(list_id, self.email, info, 'html', false, true, false, false)
end
The problem is that this is slowing down the registration process... It can take 3 or 4 seconds to return, and I'm worried that once the floodgates open on the site (later today, probably), it'll be ridiculously out of hand.
Is there an easy way to make this faster, or do I need to set up something like delayed_job?

Because you're relying on the response time of their API, it would be best to use delayed_job to handle the processing. That way you can return control to the user and the site straight away. The same applies to sending emails and anything else that needs to establish a connection to a third party.
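For instance, a minimal sketch assuming the delayed_job gem is installed; handle_asynchronously is the macro it provides for running an existing method in a background worker:
#app/models/user.rb
class User < ActiveRecord::Base
  after_create :add_user_to_mailchimp

  def add_user_to_mailchimp
    mailchimp = Hominid::API.new(MAILCHIMP_API_KEY)
    list_id = mailchimp.find_list_id_by_name MAILCHIMP_LIST_NAME
    mailchimp.list_subscribe(list_id, self.email, {}, 'html', false, true, false, false)
  end
  # run the callback in a background worker instead of inline,
  # so the registration request returns as soon as the record is saved
  handle_asynchronously :add_user_to_mailchimp
end
The other two callbacks can be queued the same way; alternatively you can call self.delay.add_user_to_mailchimp from the callback and leave the method itself untouched.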

Related

Rails 5 - Best way to prevent emails from being sent to unsubscribed users

I am using Rails 5.
I have an Affiliate model, with a boolean attribute email_notifications_on.
I am building a quite robust email drip system for affiliates and can't figure out where the best place is to check if the affiliate has email notifications on before delivering the email.
Most of my emails are being sent from Resque BG jobs, a few others from controllers.
Here is an example of how I am checking the subscribe status from a BG job:
class NewAffiliateLinkEmailer
  @queue = :email_queue

  def self.perform(aff_id)
    affiliate = Affiliate.find(aff_id)
    if affiliate.email_notifications_on?
      AffiliateMailer.send_links(affiliate).deliver_now
    end
  end
end
It seems like writing if affiliate.email_notifications_on? in 10+ areas is not the right way to do this, especially if I need another condition to be met in the future. Or is this fine?
I thought maybe some sort of callback in the AffiliateMailer would work, but saw many people advising against business logic in the Mailer.
Any thoughts/advice would be appreciated.
To be honest, I don't think there is any better way than creating a method on the Affiliate model, as follows:
def should_send_email?
  # All business logic comes here.
  # To start with it is just the attribute check;
  # later you can add `&&` and further conditions.
  email_notifications_on?
end
You can use this method instead of the attribute. It is more reusable and extendable, though you will still have to call it in every job or controller. If you prefer one-liners, you can wrap the check in a lambda.
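For example, each guard then becomes a single line (using the names from the question):
AffiliateMailer.send_links(affiliate).deliver_now if affiliate.should_send_email?
When more conditions appear later (e.g. a global opt-out), only should_send_email? has to change.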

Webhook firing multiple times, causing heavy API calls

My app has some heavy callback validations when I create a new customer. Basically I check multiple APIs to see if there's a match before creating a new customer record. I don't want this to happen after create, because I'd rather not save the record in the first place if there aren't any matches.
I have a webhook setup that creates a new customer. The problem is that, because my customer validations take so long, the webhook continues to fire because it doesn't get the immediate response.
Here's my Customer model:
validates :shopify_id, uniqueness: true, if: 'shopify_id.present?'
before_validation :get_external_data, :on => :create

def get_external_data
  ## heavy API calls that I don't want to perform multiple times
end
My hook:
customer = shop.customers.new(:first_name => first_name, :last_name => last_name, :email => email, :shopify_url => shopify_url, :shopify_id => id)
customer.save
head :ok
customer.save is taking about 20 seconds.
To clarify, here's the issue:
Webhook is fired
Heavy API Calls are made
Second Webhook is fired (API calls still being made from first webhook). Runs Heavy API Calls
Third Webhook is fired
This happens until finally the first record is saved so that I can now check to make sure shopify_id is unique
Is there a way around this? How can I defensively program to make sure no duplicate records start to get processed?
What an interesting question, thank you.
Asynchronicity
The main issue here is the dependency on external API calls.
The latency of those calls will not only impact your save times, but also tie up your server so it cannot handle other requests (unless you're using some sort of multiprocessing).
It's generally not a good idea to have your flow depend on more than one external resource, but in this case it's legitimate.
The only real suggestion I have is to make it an asynchronous flow...
--
Asynchronous vs synchronous execution, what does it really mean?
When you execute something synchronously, you wait for it to finish before moving on to another task. When you execute something asynchronously, you can move on to another task before it finishes.
In JS, the most famous example of making something asynchronous is to use an Ajax callback... i.e. sending a request through Ajax, using some sort of "waiting" process to keep the user updated, then handling the response.
I would propose implementing this for the front-end. The back-end would have to ensure the server's hands are not tied whilst processing the external API calls. This would either have to be done using some other part of the system (not requiring the use of the web server process), or separating the functionality into some other format.
Ajax
I would most definitely use Ajax on the front-end, or another asynchronous technology (web sockets?).
Either way, when a user creates an account, I would show a "pending" screen. Using Ajax is the simplest example of this; however, it is massively limited in scope (i.e. if the user refreshes the page, they lose the connection).
Maybe someone could suggest a way to regain state in an asynchronous system?
You could handle it with Ajax callbacks:
#app/views/users/new.html.erb
<%= form_for @user, remote: true do |f| %>
  <%= f.text_field ... %>
  <%= f.submit %>
<% end %>
#app/assets/javascripts/application.js
$(document).on("ajax:beforeSend", "#new_user", function(event, xhr, settings){
  // start "pending" screen
}).on("ajax:send", "#new_user", function(event, xhr){
  // keep user updated somehow
}).on("ajax:success", "#new_user", function(event, data, status, xhr){
  // Remove "pending" screen, show response
});
This gives you a front-end flow that does not lock up the page: the user can still do "stuff" on the page whilst the request is processing (the server-side part is covered below).
--
Queueing
The second part of this will be to do with how your server processes the request.
Specifically, how it deals with the external API calls, as they are what is causing the delay.
The only way I can think of at present is to queue up the requests and have a separate process work through them. The main benefit is that it makes your Rails app's handling of the webhook asynchronous, instead of having the web request wait around for the API responses.
You could use a gem such as Resque to queue the requests (it uses Redis), allowing you to send the request to the Resque queue & capture its response. This response will then form your response to your ajax request.
You'd probably have to set up a temporary user before doing this:
#app/models/user.rb
class User < ActiveRecord::Base
  after_create :check_shopify_id

  private

  def check_shopify_id
    #send to resque/redis
  end
end
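As a rough sketch of what that comment could expand into (the worker class, queue name and shopify_checked column are my own placeholders, not from the question):
def check_shopify_id
  Resque.enqueue(ShopifyIdCheck, id)
end

#app/workers/shopify_id_check.rb
class ShopifyIdCheck
  @queue = :shopify_checks

  def self.perform(user_id)
    user = User.find(user_id)
    # run the heavy external API calls here, outside the web request,
    # then flag the record once the check has finished
    user.update_column(:shopify_checked, true)
  end
end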
Of course, this is a very high level suggestion. Hopefully it gives you some better perspective.
This is a tricky issue, since your customer creation is dependent on an expensive validation. I see a few ways you can mitigate this, but it will be a "lesser of evils" type decision:
Can you pre-call/pre-load the customer list? If so, you can cache the list of customers and validate against that instead of querying on each create. This would require a cron job to keep the cached list up to date.
Create the customer first and then perform the check as a deferred "validation" step. That is, save the customer with a pending flag and run the check once in a background task; if the customer already exists, merge with the existing record, otherwise mark it as valid (see the sketch below).
Either choice will require workarounds to avoid the expensive calls.
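A minimal sketch of the second option, assuming a validated boolean column and a hypothetical CustomerCheck Resque worker, and assuming get_external_data is moved out of the before_validation callback and into that worker:
#app/controllers/webhooks_controller.rb -- respond straight away, defer the heavy work
customer = shop.customers.where(shopify_id: id).first_or_initialize
if customer.new_record?
  customer.assign_attributes(first_name: first_name, last_name: last_name,
                             email: email, shopify_url: shopify_url, validated: false)
  customer.save
  Resque.enqueue(CustomerCheck, customer.id)
end
head :ok
Because the lookup is keyed on shopify_id, a second delivery of the same webhook finds the existing (pending) record and simply returns head :ok instead of kicking off the API calls again.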

Breaking rails MVC: Sending data from model directly to view via AJAX

I am creating a MUD using Rails. Here is what I got so far:
Right now I am working on a combat system. My combat system will work like this:
current_user sees characters and non_player_characters in room
When current_user attacks another character, the other characters have 5 seconds to "deflect" the attack or they are hit. (Not fully implemented)
When current_user attacks an NPC, there is a 50% chance the NPC will deflect the attack
NPC will send attacks back to user and user will have to deflect attacks within the proper time interval (Not fully implemented).
In order to implement this combat system I decided I needed to use multithreading and timers:
def initiate_attack
  Thread.new do
    sleep(5)
    hit_target
    ActiveRecord::Base.connection.close
  end
end

def non_player_character_failed_to_deflect
  (1 + rand(10)) < 5
end

def is_non_player_character?
  @attack.target_type == "NonPlayerCharacter"
end

def hit_target
  if is_non_player_character?
    if non_player_character_failed_to_deflect
      damage_target
    else
      puts "Deflected"
    end
  else
    "hit player"
  end
end

def damage_target
  @target.update(power_level: @target.power_level - 10)
end
This works as far as pure functionality is concerned, but the problem is I can't figure out how to get the strings back to the view so the user can see them. The user should see a message when anyone initiates an attack, and when an attack completes. I think the main issue is that using multithreading breaks MVC: my threads in the model are still running after control has been returned to the controller and view.
So to summarize my question:
1) How do I make it so my view is continuously updated via AJAX with data coming from the model?
For more information please visit the github page for this project:
You need a way to push data to the browser. To do that you have a few options:
Use polling or long-polling. message_bus makes it very simple.
Use websockets as suggested by Justin.
Use another new technology to push events from the server, such as server-sent events (http://www.w3schools.com/html/html5_serversentevents.asp).
I would give the message_bus gem a try.
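As a rough sketch of how message_bus could push the combat messages (the channel name, payload and DOM ids are my own assumptions):
# server side (Ruby), e.g. at the end of hit_target
MessageBus.publish("/combat/#{room_id}", message: "#{attacker_name} hits #{target_name}!")

// client side (JavaScript), with message-bus.js included in the layout
MessageBus.start();
MessageBus.subscribe("/combat/" + roomId, function (data) {
  $("#combat-log").append("<p>" + data.message + "</p>");
});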
EDIT: You might as well try Sidekiq (http://sidekiq.org/) to run the asynchronous code - I believe you'll find your code easier to maintain in the long run, especially compared to the approach of using threads directly.
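For example, the sleeping thread could become a scheduled Sidekiq job (the worker and the Attack model name here are guesses to illustrate the shape):
#app/workers/hit_target_worker.rb
class HitTargetWorker
  include Sidekiq::Worker

  def perform(attack_id)
    attack = Attack.find(attack_id)
    # same logic as the model's hit_target, but running in the Sidekiq process
    attack.hit_target
  end
end

# instead of Thread.new { sleep(5); hit_target }, schedule the job 5 seconds out:
HitTargetWorker.perform_in(5.seconds, attack.id)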

How to handle Shopify API connection with Shopify gem?

Hi I'm using the Shopify gem in my Shopify app and I'm looking for suggestions on how to handle the API connection to Shopify.
I'm using webhooks and delayed_jobs so I need a way to open the connection outside of the controller.
At the moment I added this method to my Shop model:
def connect_to_store
  session = ShopifyAPI::Session.new(self.url, self.access_token)
  session.valid?
  ShopifyAPI::Base.activate_session(session)
end
So I can open the connection very easily, for example:
Shop.find(1).connect_to_store
ShopifyAPI::Shop.current.name
The problem is that, inside my Product module, I need the connection open inside several methods but I end up calling the connect_to_store method several times and I'm worried about opening several connections to the same store, without a real need.
Is there a way to check if a connection is already opened and open a new one only if another one is not found?
Thanks,
Augusto
------------------- UPDATE -------------------
Let me explain my issue in more detail.
Let's say that in my Product model I want to check whether a given product has a compare_at_price greater than its price and, in that case, add a "sale" tag to the Shopify product.
In my Product model I have:
class Product < ActiveRecord::Base
  belongs_to :shop

  def get_from_shopify
    self.shop.connect_to_store
    shopify_p = ShopifyAPI::Product.find(self.shopify_id)
  end

  def add_tag(tag)
    shopify_p = self.get_from_shopify
    shopify_p_tags = shopify_p.tags.split(",")
    shopify_p_tags.collect{|x| x.strip!}
    unless shopify_p_tags.include?(tag)
      shopify_p_tags << tag
      shopify_p.tags = shopify_p_tags.join(",")
      shopify_p.save
    end
  end

  def on_sale?
    shopify_p = self.get_from_shopify
    sale = false
    shopify_p.variants.each do |v|
      unless v.compare_at_price.nil?
        if v.compare_at_price > v.price
          sale = true
        end
      end
    end
    return sale
  end

  def update_sale_tag
    if self.on_sale?
      self.add_tag("sale")
    end
  end
end
My problem is that if I call:
p.update_sale_tag
Shop#connect_to_store is called several times, and I authenticate again each time even though I'm already authenticated.
How would you refactor this code?
I approach this by storing the OAuth token that is returned by Shopify with the store (you should be doing this anyway). All you need to access the API is the token, so in your shop model you would have a method like:
def shopify_api_path
  "https://#{Rails.configuration.shopify_api_key}:#{self.shopify_token}@#{self.shopify_domain}/admin"
end
Then if you want to access the API for a particular store in a Delayed Job worker, you would simply:
begin
  ShopifyAPI::Base.site = shop.shopify_api_path
  # Make whatever calls to the API that you want here.
  products = ShopifyAPI::Product.all
ensure
  ShopifyAPI::Base.site = nil
end
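If several workers need this, one option (my own suggestion, not part of the shopify_api gem) is to wrap that begin/ensure in a small helper on Shop:
#app/models/shop.rb
def with_shopify_api
  ShopifyAPI::Base.site = shopify_api_path
  yield
ensure
  ShopifyAPI::Base.site = nil
end

# usage in a Delayed Job worker:
shop.with_shopify_api do
  products = ShopifyAPI::Product.all
end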
Hopefully that helps a little. I find working with Sessions outside of controllers to be a bit messy, particularly when setting the base site directly is this straightforward.
Once your application has authenticated once, you can hold on to that computed password – it’s good until the app is uninstalled for that particular store.
In other words, authenticate just the once when the merchant first installs the app, save the password to a db, and load it up whenever you need it. Your self.shop.connect_to_store call should then just set the ShopifyAPI::Session instance.
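For instance, connect_to_store could memoise the session so repeated calls on the same Shop instance (as in update_sale_tag above) don't rebuild it - a sketch based on the code in the question:
#app/models/shop.rb
def connect_to_store
  return @shopify_session if @shopify_session
  @shopify_session = ShopifyAPI::Session.new(self.url, self.access_token)
  ShopifyAPI::Base.activate_session(@shopify_session)
  @shopify_session
end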
I think there is some misunderstanding here. You do know that you are really just using Active Resource for all your API work? And therefore when you authenticate, you are probably authenticating a session? And that once authenticated, no matter how many times you actually use the API, you're not actually opening "new" connections.
You are doing it wrong if you are constantly authenticating in a single session to do more than one API call.
If you happen to be in a block of code that has no authentication (for example your app may process a WebHook from N shops, or a Delayed Job), simply pass the myshopify_domain string to those code blocks, look up the Shop in your DB, find the auth token, authenticate (once)... and away you go... it's really quite simple.

Rails - ping user without authenticating?

So I'm writing a Facebook clone for a school project using Rails and I need some way to keep track of which users are logged in. At the moment, I'm a bit time-pressed, so I decided just to update the User model every time they visit a page with a last_seen attribute.
Problem is, the user model requires revalidation to successfully update_attributes. So I'm wondering two things:
Is there a better way to do this that I'm missing?
If not (or if it would take too long) is there a way to bypass the validation?
To 1.: I can't give you an exact answer, but I think it would be better to deal with this problem using JavaScript on the client side: a timer that sends an Ajax request every xxx seconds, and an action that receives these requests and saves the timestamp in a separate table associated with the User.
To 2.: Yes, there are some ways to bypass validations. The most pragmatic way is to pass the :validate => false option when saving the object, but then you can't use update_attributes:
object.save(:validate => false)
There is also the possibility to use conditional validations that only run when a specific condition is met. There is a Railscast about that: http://railscasts.com/episodes/41-conditional-validations.
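For the last_seen use case specifically, here is a minimal sketch (the column name and filter are assumptions, and before_action is before_filter on Rails 3). update_column writes straight to the database and skips validations and callbacks entirely:
#app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  before_action :track_last_seen

  private

  def track_last_seen
    # skips validations, callbacks and updated_at
    current_user.update_column(:last_seen, Time.current) if current_user
  end
end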
