Using Cassandra with a Rails app

I would like to use Cassandra with my Rails application. There are a few questions on my mind:
* How can I pool connections for the Cassandra clients?
* How can I store the Cassandra client object in a place that is shared among all my model objects for the duration of a request? Of course, if there is a connection pool, I need to return the object to the pool at the end of request processing.

I found the solution:
I should use Thread.current[] to ensure the Cassandra client is not recreated on every request.
Something like:
Thread.current[:cassandra_client] ||= Cassandra.new(keyspace, servers)
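Fleshed out a bit (a sketch; the keyspace and server address are placeholders, and this assumes the cassandra gem's Cassandra.new(keyspace, servers) API from above):

# Hypothetical helper: memoizes one client per thread. Worker threads
# are reused across requests, so each thread creates its client once.
class CassandraClient
  def self.current
    Thread.current[:cassandra_client] ||=
      Cassandra.new('MyKeyspace', ['127.0.0.1:9160'])
  end
end

# Usage from any model or controller:
CassandraClient.current.insert(:Checkins, '1', 'user' => 'behrang')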

Related

Do constants stay the same for ALL users?

I have a web app that I built. It communicates with the Salesforce API. I have users and administrators. All connections to the API use the same credentials.
I am concerned that my API connection is going to be created multiple times because each admin that is logged in has their own instance of the connection.
If I hold the API connection in a constant, do all other sessions/users have access to that exact connection, or do I have to connect for each user? How can I share one single API connection for ALL users?
A stateless API has no persistent application-level connection to hold, so there's no use in keeping one in a constant. Each HTTP request is logically independent, even when the underlying TCP connection is reused via keep-alive.
It's only things like database or WebSocket connections that persist, and if you need to manage those, you need a connection pool, not a simple constant. If a connection ever fails it needs to be replaced, and if more than one thread may require it, you have to handle acquisition and locking properly.
Create your API connectors as necessary. Unless you have a measurable performance problem, don't worry about it.
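If you do later find you need shared, thread-safe clients, a small pool is the usual pattern. A sketch using the connection_pool gem with the Restforce Salesforce client (both are assumptions, not something from the question):

require 'connection_pool'
require 'restforce'

# At most `size` clients exist at once; `with` checks one out and
# returns it to the pool when the block finishes.
SALESFORCE = ConnectionPool.new(size: 5, timeout: 5) do
  Restforce.new(username:       ENV['SF_USER'],
                password:       ENV['SF_PASS'],
                security_token: ENV['SF_TOKEN'],
                client_id:      ENV['SF_CLIENT_ID'],
                client_secret:  ENV['SF_CLIENT_SECRET'])
end

# Per request:
SALESFORCE.with do |client|
  client.query('SELECT Id FROM Account LIMIT 1')
end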
A Ruby constant is like a variable, except that its value is supposed to remain constant for the duration of the program. The Ruby interpreter does not actually enforce the constancy of constants, but it does issue a warning if a program changes the value of a constant.
Reference: http://rubylearning.com/satishtalim/ruby_constants.html
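A quick illustration of that warning:

API_VERSION = 'v1'
API_VERSION = 'v2' # warning: already initialized constant API_VERSION
puts API_VERSION   # => "v2" (the reassignment still takes effect)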

Ruby/Rails db connection pool implementation

I have a Ruby on Rails application that takes a user's HTTP request, connects to the database, and sends back the response. To make the application faster, I would like to use a DB connection pool to avoid creating a new connection every time. I tried looking into the connection pool library, but did not fully grasp how to use it. Any help or pointers would be highly appreciated.
ActiveRecord is the default ORM library that Rails uses, and it automatically handles connection pooling for you, so unless you're using some other library you don't need to do anything.
Some of the pool options are configurable if you feel you need to adjust them, though you probably won't: http://api.rubyonrails.org/classes/ActiveRecord/ConnectionAdapters/ConnectionPool.html
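For reference, the pool size lives in config/database.yml (values here are illustrative):

production:
  adapter: postgresql
  database: myapp_production
  pool: 5 # max connections ActiveRecord will hand out concurrently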

Is it okay to authenticate with MongoDB per request?

I have a Rails controller that needs to write some data to my MongoDB. This is what it looks like at the moment.
def index
  data = self.getCheckinData
  dbCollection = self.getCheckinsCollection
  dbCollection.insert(data)
  render(:json => data[:_id].to_s())
end

protected

def getCheckinsCollection
  connection = Mongo::Connection.new('192.168.1.2', 27017)
  db = connection['local']
  db.authenticate('arman', 'arman')
  return db['checkins']
end
Is it okay to authenticate with MongoDB per request?
It is probably unnecessarily expensive and creating a lot more connections than needed.
Take a look at the documentation:
http://www.mongodb.org/display/DOCS/Rails+3+-+Getting+Started
They connect inside an initializer. It does some connection pooling so that connections are re-used.
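A sketch of that approach, reusing the connection details from the question (the legacy driver's Mongo::Connection accepts a :pool_size option; treat the exact options as assumptions for your driver version):

# config/initializers/mongo.rb (runs once at boot)
connection = Mongo::Connection.new('192.168.1.2', 27017,
                                   :pool_size => 5, :timeout => 5)
MONGO_DB = connection['local']
MONGO_DB.authenticate('arman', 'arman')

# The controller helper then shrinks to:
def getCheckinsCollection
  MONGO_DB['checkins']
end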
Is there only one user in the database?
I'd say: don't do the DB authentication. If the MongoDB server is behind a good firewall, it's pretty secure. And it should never, ever be exposed to the internet (unless you know what you're doing).
Also, don't establish a new connection per request. This is expensive. Initialize one on startup and reuse it.
In general, this should be avoided.
If you authenticate per request and you get many requests concurrently, you could have a problem where all connections to the database are taken. Moreover, creating and destroying database connections can use up resources within your database server -- it will add a load to the server that you can easily avoid.
Finally, this approach to programming can result in problems when database connections aren't released -- eventually your database server can run out of connections.

Rails+PostgreSQL: search_path depending on subdomain

In our Rails 2.x application, the search_path of the database connection depends on the subdomain through which the application is contacted (basically search_path = "production_" + subdomain). Because the search_path is defined per connection, and database connections are shared across requests, even concurrently, this is a problem. I would rather not restrict concurrency to serving one request at a time, for obvious reasons.
So is there a way to group the database connections in the connection pool and set some kind of policy so that only a fitting connection is used for a given request? Or is there a way to use one connection pool per subdomain (where the pools are automatically discarded after a timeout)? Starting a Rails instance for each subdomain is not an option, because there might be many idling subdomains (it's a kind of pro account where you get a subdomain and your own "world" that differs from the rest of the site in some tables).
What would be the best solution for this problem?
You can just set connection.search_path at the beginning of the request, before any objects are loaded, and you'll be fine. In our case we have a Rack app that wraps our rails app and does this setup for us based on the incoming domain.
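The same idea as a Rails 2.x before_filter, if you'd rather not add a Rack layer (a sketch; note the subdomain is whitelisted so it can't inject SQL):

class ApplicationController < ActionController::Base
  before_filter :set_search_path

  private

  def set_search_path
    # Keep only safe identifier characters from the subdomain.
    subdomain = request.subdomains.first.to_s.gsub(/[^a-z0-9_]/, '')
    ActiveRecord::Base.connection.schema_search_path =
      "production_#{subdomain}" unless subdomain.empty?
  end
end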

How to configure login when using multiple servers running a distributed service (HAProxy, Apache, Ruby on Rails)

I have 3 servers running a website. I now need to implement a login system, and I am having problems with it, as a user gets different behavior (logged in or logged out) depending on the server they connect to.
I am using memcached as the session store in Rails:
config.action_controller.session_store = :mem_cache_store
ActiveSupport::Cache::MemCacheStore.new("server1","server2","server3")
I thought the second line would keep the caches in sync, or something like that...
Each server has its own DB with 1 master and 2 slaves. I tried going the route of storing sessions in an SQL store, but that really hurts the SQL servers, and the replication load becomes very heavy.
Is there an easy way to say: use this one memcached cluster for session storage on all 3 servers? Will that solve my problem?
I haven't used memcached to store sessions before (I feel like Redis is a better solution), but I think as long as you have the
ActiveSupport::Cache::MemCacheStore.new("server1","server2","server3")
line on each of your application servers, your sessions should stay synced up.
I've had a lot of success with just using regular cookie sessions using the same setup you've described.
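For the cookie-store route, the one thing that must be identical on all three servers is the session secret, otherwise each server rejects cookies signed by the others (Rails 2.x style; values are placeholders):

# config/environment.rb (the same on every server)
config.action_controller.session = {
  :session_key => '_myapp_session',
  :secret      => 'same-long-random-secret-on-all-three-servers'
}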
