I've developed an application that allows our customers to create their own membership protected websites. My application then connects to an outside API service (customer specific api_key/api_url) to sync/update/add data to this other service. Well, I've had an API wrapper written for this other service that has worked up to this point. However, I'm now seeing very random drops where the connection is nil. Here is how I'm currently using the connection:
I have an XML-RPC connection class:
class ApiConnection
  attr_accessor :api_url, :api_key, :retry_count

  def initialize(url, key)
    @api_url = url
    @api_key = key
    @retry_count = 1
  end

  def api_perform(class_type, method, *args)
    server = XMLRPC::Client.new3({'host' => @api_url, 'path' => "/api/xmlrpc", 'port' => 443, 'use_ssl' => true})
    result = server.call("#{class_type}.#{method}", @api_key, *args)
    return result
  end
end
I also have a module that I can include in my models to access and call the API methods:
module ApiService
  # Set account specific ApiConnection obj
  def self.set_account_api_conn(url, key)
    if ac = Thread.current[:api_conn]
      ac.api_url, ac.api_key = url, key
    else
      Thread.current[:api_conn] = ApiConnection.new(url, key)
    end
  end

  ########################
  ###  Email Service   ###
  ########################
  def api_email_optin(email, reason)
    # Enables you to opt contacts in
    Thread.current[:api_conn].api_perform('APIEmailService', 'optIn', email, reason)
  end

  ### more methods here ###
end
Then in the application controller I create a new ApiConnection object on every request using a before filter, which sets Thread.current[:api_conn]. This is because I have hundreds of customers, each with their own api_key and api_url, using the application at the same time.
# In before_filter of application controller
def set_api_connection
  Thread.current[:api_conn] = ApiService.set_account_api_conn(url, key)
end
My question is that I've read that using Thread.current is not the most ideal way of handling this, and I'm wondering if that is the cause of the ApiConnection being nil on random requests. So I would like to know how I could set up this wrapper better.
Answer 1
I'd expect the problem is that the next request comes in before the previous one has finished, and the before_filter then overwrites the connection that is still in use. I'd try to stay away from threads. Forking off is easier, but it has caveats of its own, especially regarding performance.
I'd move logic like this into a background job of some sort. A common solution is delayed_job (https://github.com/collectiveidea/delayed_job); that way you don't have to mess with threads, and it's more robust and easier to debug. You can then start background jobs to sync the service asynchronously whenever somebody logs in.
@account.delay.optin_via_email(email, user)
This will serialize the account and save it to the job queue, where it will be picked up by a delayed_job worker, deserialized, and the method named after delay will be called on it. You can have any number of background workers, and even dedicate some queues to certain types of actions via job priorities (say, two workers for high-priority jobs and one for low-priority jobs).
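A minimal sketch of how that could fit this app, assuming each account record stores its own api_url and api_key (the Account class and the optin_via_email name are illustrative, not from the original code):

class Account < ActiveRecord::Base
  # Runs inside the delayed_job worker, so it builds its own connection
  # instead of relying on anything stored in Thread.current.
  def optin_via_email(email, reason)
    conn = ApiConnection.new(api_url, api_key)
    conn.api_perform('APIEmailService', 'optIn', email, reason)
  end
end

# Enqueue it wherever the opt-in is triggered:
# @account.delay.optin_via_email(email, reason)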
Answer 2
Just make it an instance variable instead:
def before_filter
  @api_connection = ApiConnection.new(url, key)
end
then you can use that connection in your controller methods
def show
  # just use it straight off
  @api_connection.api_perform('APIEmailService', 'optIn', email, reason)
  # or send the connection as a parameter to some other class
  ApiService.do_stuff(@api_connection)
end
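If you go this route, the module methods from the question can take the connection as an argument instead of reading Thread.current. A sketch (do_stuff above is just this answer's placeholder name; here the method is written as a module function rather than an included instance method):

module ApiService
  module_function

  # The caller passes in the per-request connection explicitly.
  def api_email_optin(conn, email, reason)
    conn.api_perform('APIEmailService', 'optIn', email, reason)
  end
end

# In a controller action:
# ApiService.api_email_optin(@api_connection, email, reason)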
Answer 3
The easiest solution might just be to create the API connection whenever you need it:
class User < ActiveRecord::Base
  def api_connection
    # Cache the connection on the object. Doing this makes taking a block a
    # little pointless, but having methods take blocks makes the scope of
    # incoming variables more explicit and looks better imho. It might be
    # just as good not to keep @conn as an instance variable.
    @conn ||= ApiConnection.new(url, key)
    if block_given?
      yield(@conn)
    else
      @conn
    end
  end
end
That way you can just forget about creating the connection and always have one handy. There might be performance penalties with this, but I suspect they are insignificant unless creating the connection involves an extra login request.
@user.api_connection { |conn| conn.optin_via_email(email, user) }
Related
I have two controllers which have
before_action :get_user
and a private method in both of those controllers
def get_user
#user = User.find(params[:id])
#user.create_profile if #user.profile.nil?
end
The create_profile method makes an API call to the third-party service to create the profile. When two concurrent requests from the same user hit these two controllers, it makes duplicate API calls (two) when the profile is nil in the database. How can I make sure that only a single request is made here instead of two?
Ideally the API requests should be idempotent, so doing it twice will not create two profiles but will e.g. just return the profile if it already exists.
However, if this API is not under your control and you need to make sure the request is only done once there are several ways to do this.
One way could be to use a database lock: find_or_create the profile record before you make the request, lock it, and flag it as externally created after the response.
Something like
class User
  has_one :profile

  def create_profile
    profile = self.profile || build_profile.tap(&:save!)
    return if profile.external_created_at

    profile.with_lock do
      return if profile.external_created_at
      # make the external API call to create the profile here
      profile.update_attributes!(external_created_at: Time.now)
    end
  end
end
Otherwise you can e.g. use a distributed lock with Redis.
https://github.com/leandromoreira/redlock-rb
https://api.rubyonrails.org/classes/ActiveRecord/Locking/Pessimistic.html
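For the Redis route, a rough sketch with redlock-rb might look like this (the Redis URL, lock key, and TTL are placeholders, and user is the record loaded in get_user):

require 'redlock'

lock_manager = Redlock::Client.new(['redis://localhost:6379'])

# Hold the lock for up to 2 seconds while the external profile is created.
lock_manager.lock("profile-creation-#{user.id}", 2000) do |locked|
  if locked
    user.create_profile if user.profile.nil?
  else
    # Another request holds the lock; skip, or retry shortly.
  end
end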
I want to create an ActiveRecord-like interface for Salesforce, such that I can call
class Account < Salesforce::Model
end
and be able to call methods like Account.find_by_FirstName() using my own method_missing function.
However, connecting to Salesforce can be done in two ways: username and password, or OAuth. If a username/password is used, I can have it defined in a salesforce.yml file and loaded automatically. But with OAuth I can't do that, since each user will have their own credentials. I don't want to initialize a class with Account.new('oauth', oauth_params) or Account.new('username', 'password', 'sec_token'), but have the model determine which to use based on rules and on whether one or the other is present.
Is there a way to implement this? In other words, is there a way for the model to know whether the current user has an OAuth token or a username/password defined?
Additionally, if I were to use this in a Rails app, users would log in after the app started, so the OAuth token would only be defined then, and it would be different for each user. For example, let's say I call Account.find_by_FirstName('John') in AccountController#show. I want the Account model to use the OAuth token or username/password without having to be told which. I also don't want to establish the connection directly in my show method in the controller. I have two questions:
How would I implement this? Should I use a before_filter in the controller, or is there a way to implement this application-wide?
If I have multiple users connecting to Salesforce, would this cause issues in my application? In other words, would I have to worry about a connection being used by another user since the connection is dynamic?
Your need is no different from ActiveRecord::Base connection establishment: you establish the connection using ActiveRecord::Base.establish_connection, and every model you use afterwards knows which connection to use, because the connection is memoized at the superclass level.
For Salesforce you can use the same concept:
class Salesforce::Model
  def self.oauth_params
    @oauth_params
  end

  def self.establish_connection(oauth_params)
    @oauth_params = oauth_params
  end

  def self.find(id)
    # use oauth_params here
  end
end
class Account < Salesforce::Model
end
Now you can do something like
Salesforce::Model.establish_connection ['username', 'password']
Account.find 2 # without specifying authentication params
Since you only know the authentication params once you know the logged-in user, you can establish the connection after the user has logged in:
def sign_user
  # user = ...
  oauth_params = get_oauth_params(user)
  Salesforce::Model.establish_connection(oauth_params)
end
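To make this application-wide rather than per controller, the callback can be registered in ApplicationController (a sketch, assuming sign_user is defined there or in an included concern):

class ApplicationController < ActionController::Base
  # Establish the Salesforce connection for every request,
  # so individual controllers don't have to repeat it.
  before_action :sign_user
end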
Concurrency (threads)
If I have multiple users connecting to Salesforce, would this cause issues in my application? In other words, would I have to worry about a connection being used by another user since the connection is dynamic?
Legitimate question. If you run the Rails application in a threaded environment (a threaded application server such as Puma, or a multi-threaded Ruby implementation such as JRuby or Rubinius) AND Rails is configured as threadsafe (config.threadsafe!), you could have concurrency problems (the explanation is not trivial).
If this is your case you can scope the @oauth_params variable accessor to Thread.current:
class Salesforce::Model
  @oauth_params = { Thread.current => nil }

  def self.oauth_params
    @oauth_params[Thread.current]
  end

  def self.establish_connection(oauth_params)
    @oauth_params[Thread.current] = oauth_params
  end
end
Would it be possible that the thread for the current user changes?
It is possible, if some code you execute runs inside a new thread. For example:
Salesforce::Model.establish_connection(oauth_params)
Thread.new { p Salesforce::Model.oauth_params }.join #=> prints nil
In this case you have to re-establish the connection in the new thread, as sketched below.
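With the Thread-keyed storage above, re-establishing inside the new thread would look roughly like this:

params_from_parent = Salesforce::Model.oauth_params

Thread.new do
  # The storage is keyed by Thread.current, so the child thread
  # has to store its own copy of the params before using the model.
  Salesforce::Model.establish_connection(params_from_parent)
  p Salesforce::Model.oauth_params #=> the params, no longer nil
end.join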
I could request something on thread 1 and complete that request, but afterwards, someone else uses thread 1 and I have to use thread 2. Is this possible?
Thinking about it, you need to reset the variable at the beginning of the request so that the next request cannot reuse the params set by a previous one:
before_action :reset_connection, :sign_user
def reset_connection
  Salesforce::Model.establish_connection(nil)
end
def sign_user
# ...
I want to save information about requests to a certain action in a model named Impression.
I assume it's beneficial for the visitor's response time to save this info in an after_filter, e.g.:
after_filter :save_impression

private

def save_impression
  Impression.create!(ip_address: request.remote_ip, controller_name: params[:controller], action_name: params[:action], referer: request.referer)
end
Can this code be optimized or am I doing it right?
A good solution for that would typically involve using a worker. Anything that is not mission critical to the request and that involves complex computing can be deferred and run later by a background job.
Two common implementations of workers are delayed_job and resque.
For example, with resque, you would have a job class in app/jobs/impression_job.rb containing something like this:
class ImpressionJob
  @queue = :impression

  def self.perform(attrs)
    Impression.create!(attrs)
  end
end
And you can call it in your controller like this:
after_filter :save_impression

private

def save_impression
  Resque.enqueue(ImpressionJob, ip_address: request.remote_ip, controller_name: params[:controller], action_name: params[:action], referer: request.referer)
end
This ensures the request itself is handled quickly (it just pushes data into Redis); the job is then processed by a background process (see the Resque documentation for how to set it up and start workers).
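For comparison, a delayed_job version of the same deferral could be as small as this (a sketch, assuming delayed_job is installed and workers are running):

def save_impression
  # .delay serializes the call and runs Impression.create! in a worker.
  Impression.delay.create!(
    ip_address: request.remote_ip,
    controller_name: params[:controller],
    action_name: params[:action],
    referer: request.referer
  )
end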
Please note that this will only be useful in your case in two situations:
Your app is always under heavy load or needs especially good response times
You do big computations in Impression#before_create or other callbacks
If you don't match one of those conditions, it's probably more effective to keep your impression creation in a controller filter: accessing the database has a cost, but not so much that a user will notice a single insert.
Note that this will still run before the response is sent to the client. To run code after the render/redirect has gone out, you need to spawn a separate thread.
See this question
Hi I'm using the Shopify gem in my Shopify app and I'm looking for suggestions on how to handle the API connection to Shopify.
I'm using webhooks and delayed_jobs so I need a way to open the connection outside of the controller.
At the moment I added this method to my Shop model:
def connect_to_store
  session = ShopifyAPI::Session.new(self.url, self.access_token)
  session.valid?
  ShopifyAPI::Base.activate_session(session)
end
So I can open the connection very easily, for example:
Shop.find(1).connect_to_store
ShopifyAPI::Shop.current.name
The problem is that inside my Product model I need the connection open in several methods, so I end up calling the connect_to_store method several times, and I'm worried about opening several connections to the same store without a real need.
Is there a way to check if a connection is already opened and open a new one only if another one is not found?
Thanks,
Augusto
------------------- UPDATE -------------------
Let me explain my issue in more detail.
Let's say that in my Product model I want to see if a given product has a compare_at_price greater than its price and, in this case, I want to add a "sale" tag to the Shopify product.
In my Product model I have:
class Product < ActiveRecord::Base
  belongs_to :shop

  def get_from_shopify
    self.shop.connect_to_store
    @shopify_p = ShopifyAPI::Product.find(self.shopify_id)
  end

  def add_tag(tag)
    @shopify_p = self.get_from_shopify
    shopify_p_tags = @shopify_p.tags.split(",").collect(&:strip)
    unless shopify_p_tags.include?(tag)
      shopify_p_tags << tag
      @shopify_p.tags = shopify_p_tags.join(",")
      @shopify_p.save
    end
  end

  def on_sale?
    @shopify_p = self.get_from_shopify
    sale = false
    @shopify_p.variants.each do |v|
      unless v.compare_at_price.nil?
        if v.compare_at_price > v.price
          sale = true
        end
      end
    end
    return sale
  end

  def update_sale_tag
    if self.on_sale?
      self.add_tag("sale")
    end
  end
end
My problem is that if I call:
p.update_sale_tag
the shop.connect_to_store method is called several times, and I authenticate repeatedly even though I'm already authenticated.
How would you refactor this code?
I approach this by storing the OAuth token that is returned by Shopify with the store (you should be doing this anyway). All you need to access the API is the token, so in your shop model you would have a method like:
def shopify_api_path
  "https://#{Rails.configuration.shopify_api_key}:#{self.shopify_token}@#{self.shopify_domain}/admin"
end
Then if you want to access the API for a particular store in a Delayed Job worker, you would simply:
begin
  ShopifyAPI::Base.site = shop.shopify_api_path

  # Make whatever calls to the API that you want here.
  products = ShopifyAPI::Product.all
ensure
  ShopifyAPI::Base.site = nil
end
Hopefully that helps a little. I find working with Sessions outside of controllers to be a bit messy, particularly since this is nice and easy.
Once your application has authenticated once, you can hold on to that computed password – it’s good until the app is uninstalled for that particular store.
In other words, authenticate just the once when the merchant first installs the app, save the password to a db, and load it up whenever you need it. Your self.shop.connect_to_store call should then just set the ShopifyAPI::Session instance.
I think there is some misunderstanding here. You do know that you are really just using Active Resource for all your API work? And therefore when you authenticate, you are probably authenticating a session? And that once authenticated, no matter how many times you actually use the API, you're not actually opening "new" connections.
You are doing it wrong if you are constantly authenticating in a single session to do more than one API call.
If you happen to be in a block of code that has no authentication (for example your app may process a webhook from N shops, or a delayed job), simply pass the myshopify_domain string to that code, look up the Shop in your DB, find the auth token, authenticate (once)... and away you go. It really is quite simple (see the sketch below).
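As a concrete illustration of authenticating once per unit of work, the Product model from the question could memoize the fetched product so connect_to_store only runs on the first call (one possible refactor, not the only one):

class Product < ActiveRecord::Base
  belongs_to :shop

  # Fetch the Shopify product once and reuse it; the session is only
  # activated the first time this is called on a given instance.
  def shopify_product
    @shopify_p ||= begin
      shop.connect_to_store
      ShopifyAPI::Product.find(shopify_id)
    end
  end
end

add_tag, on_sale? and update_sale_tag can then call shopify_product instead of get_from_shopify, so a call chain like p.update_sale_tag authenticates only once.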
I decided to use the singleton design pattern while creating a view helper class. This got me thinking; will the singleton instance survive across requests? This led to another question, Which variables (if any) survive across web requests and does that change depending on deployment? (Fastcgi, Mongrel, Passenger, ...)
I know that Controller instance variables aren't persisted. I know Constants are persisted (or reloaded?). But I don't know about class variables, instance variables on a class, Eigenclasses, ...
The simple answer is none. Each request is treated as an independent event and no state information is carried over apart from what is stored in the user session and any external databases, caches, or file stores. It is best that you design your application with this in mind and not expect things to persist just because you've set them.
The more complicated story is that some things do persist. For example, you can create a class variable on a controller and this will be carried from one request to the next as you might expect. The catch is that this only applies to the singular instance of that controller, as contained within that process, and will not apply to requests served by other processes. If you need caching, make use of the Rails.cache infrastructure and avoid hacking in your own.
A typical production environment is a complicated, ever-changing thing, where processes are created and destroyed constantly and there is no way to determine in advance which process will ultimately end up serving a particular request. As many deployments involve not only multiple processes on a single machine, but multiple machines, there really is no practical way to create application-wide singleton objects.
The best thing you can do is build a layer on top of the caching engine where your singleton object is merely a wrapper to functions that fetch and write from the cache. This gives you the appearance of a singleton object while maintaining inter-process consistency.
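A sketch of that wrapper idea; the SiteCounter class and its cache key are made up for illustration:

class SiteCounter
  CACHE_KEY = 'site_counter'.freeze

  # Reads and writes go through the shared cache store, so every
  # process and server sees the same value.
  def self.value
    Rails.cache.fetch(CACHE_KEY) { 0 }
  end

  def self.increment
    # Not atomic; with a store that supports counters you could use
    # Rails.cache.increment instead.
    Rails.cache.write(CACHE_KEY, value + 1)
  end
end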
I know that this post is old, but for anyone looking for a solution, it's possible to use Rails caching, like this:
class TestEventsController < ApplicationController
  require 'httparty'

  @@cache = ActiveSupport::Cache::MemoryStore.new(expires_in: 5.minutes)

  before_action :get_data, only: [:get]
  before_action :get_response, only: [:set]

  def get
    uri = "https://hooks.zapier.com/hooks/catch/zap_id/"
    event_id = event_id_generate
    @@cache.write(event_id, "")
    result = HTTParty.post(uri.to_str,
                           :body => { id: event_id, data: @data }.to_json,
                           :headers => { 'Content-Type' => 'application/json' })
    sleep 2
    render json: { 'value': @@cache.read(event_id) }, status: 200
  end

  def set
    @@cache.write(@id, @value)
    render json: { 'value': @@cache.read(@id) }, status: 200
  end

  def get_data
    @data = params["data"]
  end

  def get_response
    @id = params["id"]
    @value = params["value"]
  end

  def event_id_generate
    token = SecureRandom.urlsafe_base64(10, false)
  end
end
What I'm doing is receiving a request on one route, sending a POST to Zapier, and waiting for the answer on another route. Rails serves each request in its own thread, so I keep the data in memory as key/value pairs.
Rails.cache in different threads not caching?
The web is a stateless medium. Unless you purposely save data in a session or pass it in a get or post, each web request starts with a blank slate. Any objects created with the current request are destroyed once the page is delivered to the web browser.