What is the accepted method for a global state? - ruby-on-rails

I was wondering what would you guys consider the best way to go about having some simple stuff stored across sessions without using the DB.
I'm looking to have something like 'modes' for a website. So it can be in mode A or B, and depending on the mode, buttons would do different things.
Would using Rails.cache.read and Rails.cache.write be the best option? I've heard it has issues on Heroku if you leave the cache store as the filesystem, and that memcache has problems because of multi-threading?
I'm really trying to avoid having a whole DB table that gets hit on every request just so users can check the global state of the site.

In order to have a "global" state, you need a single dependency shared by every instance of your application.
In fact, you can't rely on cookies or sessions, as they are client-oriented and not shared between clients.
The database is the most common approach. You may be able to use the file system in some cases, but not for Heroku as there may be several different instances of your app running under different file systems.
Any solution that can easily be shared across instances will work:
Memory database like Redis
SQL or NoSQL database
Cache systems, as long as they are not specific to one instance. Memcached may work, but it's not persistent (hence you may lose the state)
External storage (such as Amazon S3)
To me, a relational database or a memory database such as Redis seems to be the most plausible solution.

If you want per-user setting - consider storing in session (which can be stored in cookies), or directly in cookies.
Both methods end up storing some data (but not lots of it, because cookies are passed by browser with each request) inside clients' browsers.

You could put it in a table just so that you have it, and then make the value available via the ApplicationController, with a simple cache method in between. So something like this (not tested!):
def get_mode
  if @mode_timeout.nil? || @mode_timeout < Time.now
    @mode = ModeModel.first.mode
    @mode_timeout = Time.now + 60.seconds
  end
  @mode
end
You'll have to create a model, or you could update it via a controller with a set_mode method instead, but that would be more transient.
Then you can just call get_mode from your controller.
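For reference, the model backing get_mode could be as small as the following sketch (table and column names are assumptions, not from the question):

```ruby
# db/migrate/20130101000000_create_mode_models.rb -- hypothetical migration
class CreateModeModels < ActiveRecord::Migration
  def change
    create_table :mode_models do |t|
      t.string :mode, :null => false, :default => "a"
      t.timestamps
    end
  end
end

# app/models/mode_model.rb
class ModeModel < ActiveRecord::Base
end
```

Seed a single row and get_mode will pick it up via ModeModel.first.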

Related

Rails, handle two sites with different url and design but with the same db

I'm looking for the best way to solve a problem.
At this moment I have a site for a customer, example.domain.com
My customer asked me to create another website with some changes in design, but with the same content as the first website. I don't want to duplicate the website, because every feature I add to site A must also be deployed to site B, and I'm looking for a smart way to handle the situation.
I need to keep two different domains and I need also custom mailers and other small tweaks in the controllers (and maybe in some models).
My idea is to put a before filter like this in the application controller:
before_action :detect_domain

private

def detect_domain
  case request.env['HTTP_HOST']
  when "example.domain.com"
    request.variant = :host1
  when "example1.domain.com"
    request.variant = :host2
  end
end
Then I use the variant with some conditional to choose the mailer, to customize the views and to apply some code changes.
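For what it's worth, once request.variant is set, Rails (4.1+) resolves variant-specific templates on its own; the paths and names below are illustrative, not from the question:

```ruby
# app/views/products/show.html.erb        -- default template
# app/views/products/show.html+host1.erb  -- used when request.variant = :host1
#
# You can also branch explicitly inside an action:
def show
  respond_to do |format|
    format.html do |variant|
      variant.host1                    # renders show.html+host1.erb
      variant.host2 { render "other" } # or run custom code per variant
      variant.none                     # fallback when no variant is set
    end
  end
end
```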
Any other idea?
Using a before filter and a per-request variable like your proposal will work, with a couple caveats that I'll mention below. I'd recommend a tool like the request_store gem to actually store the per-request value of which "skin" is selected.
Now, for the caveats. First, the main problem with per-request variables is that your Rails app does not always exist in the context of a request. Background jobs and console sessions operate outside of the usual request/response flow of your app. You will need to think about what happens when your models or other non-controller/view code is executed when that variable isn't set. I would suggest simply not having your models depend on RequestStore at all -- have the controllers pass any request-specific information down into the models, if needed.
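To make that caveat concrete: request_store is, roughly, a per-thread hash that gets cleared after every request (raw Thread.current data would otherwise leak between requests on a thread-reusing server). A toy pure-Ruby sketch of the idea, with invented names:

```ruby
# A toy version of what request_store provides: per-thread storage
# plus an explicit reset between requests.
module SkinStore
  def self.store
    Thread.current[:skin_store] ||= {}
  end

  def self.clear!
    Thread.current[:skin_store] = {}
  end
end

SkinStore.store[:skin] = :host1
SkinStore.store[:skin]  # => :host1
SkinStore.clear!        # in the real gem, a Rack middleware does this
SkinStore.store[:skin]  # => nil
```

Code that runs outside a request (jobs, console) simply sees an empty store, which is exactly why the models shouldn't depend on it.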
Secondly, it's not clear from your description if you want any data or logical separation between the two domains, or if you just want different look-and-feels. If the former, you might consider the apartment gem, which aims to make database multi-tenancy easier.
EDIT: I also want to mention that, as an alternative to the multi-tenant solution above, you have the option of a multi-instance solution: use an environment variable to indicate which version of the site should be displayed, and spin up multiple instances of your app (either on the same server behind a reverse proxy, or on separate servers with separate DNS entries or a reverse proxy). The downside is increased infrastructure cost, but the context problem I mentioned above no longer exists (everything always has access to environment variables).

How can I cache external API requests to the SQL database using Rails cache API?

I'm just wondering how I can cache slow requests to external APIs in the database, as I do not want to maintain a memcache service, but I do want a distributed system that will work across multiple dynos and workers on heroku.
I can do it by building my own cache table, but I'm wondering if there's an easier way, especially one that works with the existing caching syntax.
Thanks!
You can cache just about anything in Rails via a call to Rails.cache.fetch. You pass in a key for the cache to look up, and if there's a value in the cache (a "cache hit") then it will get used instead of your slow code.
Let's say we've got an API that takes two airport codes and a date, and returns the best price it can find. This can be a slow lookup, so it's a good candidate for caching:
def find_best_price(start_airport, end_airport, date)
  Rails.cache.fetch(cache_key_for(start_airport, end_airport, date)) do
    AirportPriceAPI.find_best_price(start_airport, end_airport, date)
  end
end

# Different routes & dates will have different prices, so we
# need to have different cache keys for them.
def cache_key_for(start_airport, end_airport, date)
  "best_price:#{start_airport}:#{end_airport}:#{date}"
end
You can configure the data store for your cache in config/application.rb by setting config.cache_store. Rails has several kinds built in, but not all are suitable for Heroku. You can't use the FileStore because dynos don't have persistent storage and it's not shared between dynos. MemoryStore isn't shared across dynos and will be wiped out if your dyno restarts.
Your best bet for Heroku is the MemCacheStore. It's supported out-of-the-box in Rails, and Heroku will give you 30 MB of memcache space for free. It's fast and well suited to sharing between multiple dynos.
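On Rails 3/4 that's a one-line configuration change; the example below assumes the dalli client and Heroku's MemCachier add-on environment variables, which may differ from your setup:

```ruby
# config/environments/production.rb
config.cache_store = :dalli_store,
                     (ENV["MEMCACHIER_SERVERS"] || "").split(","),
                     { :username => ENV["MEMCACHIER_USERNAME"],
                       :password => ENV["MEMCACHIER_PASSWORD"] }
```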
But, if you did want to cache values in a database, you can provide your app with a custom cache class. All you have to do is extend ActiveSupport::Cache::Store and be sure to implement read, write, exist?, delete, and fetch. And someone's already packaged a DB-backed cache store as a gem, so you don't even have to implement the low-level details yourself. :)
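To show the shape of that interface without pulling in Rails, here is a plain-Ruby sketch in which a Hash stands in for the database table (class and method names are illustrative; a real store subclasses ActiveSupport::Cache::Store):

```ruby
# Sketch of the cache-store interface the answer describes. An
# in-memory Hash stands in for the DB table a real store would use.
class DbCacheStoreSketch
  def initialize
    @table = {}  # would be an ActiveRecord model in a real store
  end

  def read(key)
    @table[key]
  end

  def write(key, value)
    @table[key] = value
  end

  def exist?(key)
    @table.key?(key)
  end

  def delete(key)
    @table.delete(key)
  end

  # fetch: return a hit, or compute, store, and return the block's value
  def fetch(key)
    return read(key) if exist?(key)
    write(key, yield)
  end
end
```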

Is this an ok design decision? is there a better way?

So, for the sake of performance, I'm using database sessions. I figure that while the sessions are server-side, I might as well store commonly accessed objects in the session. So I'm storing serialized versions of the current_user, current_account, and the current_user's permissions.
The User model handles a lot of the permissions methods (things like user.can_do_whatever), but since I'm trying to be more efficient and store commonly accessed things in the session (this allows for far fewer DB accesses), does it make sense / break any design standards to store the session in an instance variable in the current_user upon each request?
As of right now, I can't think of any alternatives.
Rails applications have a RESTful design by default. One rule of REST is statelessness: each request from client to server must contain all of the information necessary to understand the request, and cannot take advantage of any stored context on the server.
If you have trouble with database performance, use a cache system like memcached, which is already integrated into Rails (Caching with Rails).
I found a couple of references warning against storing non-primitive data types in the session, but they were all just warnings, and boiled down to: Storing complex objects is "Expecially discouraged" [sic], but if you decide you need to... well, just be careful.
Anyway, I'm kinda taken by the idea of having the users table double as the sessions table, but serialization still seems a bit sketchy. If you're just trying to cut down the number of DB requests, what about storing IDs and using :joins when looking up your user (might require a bit of hackery to get that worked into the default session loading). That avoids synchronization problems and serialization sketchiness, and still only generates a single DB query. Just make sure to use :joins and not :include, as the latter generates a query for each table.
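In Rails 2/3-era finder syntax, that suggestion looks roughly like this (model and association names are invented for illustration):

```ruby
# Store only the id in the session...
session[:user_id] = user.id

# ...then look the user up with a single JOINed query.
# (As noted above, prefer :joins over :include here, since
# :include issues a query per table on these Rails versions.)
def current_user
  @current_user ||= User.find(session[:user_id],
                              :joins => :permissions)
end
```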
Hope that helps!

Keep value in memory across requests and across users in Rails controller? Use class variable?

We're on Rails 3.0.6.
We maintain a list of numbers that changes only once a month, but nearly every page request requires access to this list.
We store the list in the database.
Instead of hitting the database on every request and grabbing the list, we would like to grab the data once and stash it in memory for efficient access.
If we store the list in each user session, we still need to hit the database for each session.
Is there a way to only hit the database once and let the values persist in memory across all users and all sessions? We need access to the list from the controller. Should we define a class variable in the controller?
Thanks!
I think Rails.cache is the answer to your problem here. It's a simple interface with multiple backends, the default stores the cache in memory, but if you're already using Memcached, Redis or similar in your app you can plug it into those instead.
Try throwing something similar to this in your ApplicationController
def list_of_numbers
  @list_of_numbers ||= Rails.cache.fetch(:list_of_numbers, :expires_in => 24.hours) do
    # Read from database
  end
end
It will try to read from the cache, but if it doesn't find the value, it will do the intensive work and store the result for next time.
The pattern you're looking for is memoization, a simple way to cache stuff that doesn't change over time. For example, you'll often see something like this in application_controller.rb -- your code always calls the method:
def current_user(user_id)
  @current_user ||= User.find(user_id)
end
When it does, it checks the instance variable @current_user and returns it if not nil; otherwise it does the database lookup and assigns the result to the instance variable, which it returns.
Your problem is similar, but broader, since it applies to all instances.
One solution is with a class variable, which is documented here http://www.ruby-doc.org/docs/ProgrammingRuby/html/tut_classes.html#S3 -- a similar solution to the one above applies here.
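A pure-Ruby sketch of that class-variable approach (names invented; in the app, the hard-coded array would be a database read):

```ruby
class NumberList
  @@numbers = nil  # class variable shared by all instances in this process

  def self.numbers
    # Load once per process; every later call reuses the cached array.
    @@numbers ||= load_from_database
  end

  def self.reset!
    @@numbers = nil  # call this when the monthly list changes
  end

  def self.load_from_database
    # stand-in for a DB query such as NumberRecord.pluck(:value)
    [3, 7, 42]
  end
end
```

The cache lives only in the current process, which is exactly the multi-instance limitation discussed next.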
This might be a good solution in your case, but it has some issues. Specifically (assuming this is a web app), depending on your configuration, you may have multiple instances of Rails loaded in different processes, and class variables only apply to their specific instance. The popular Passenger module (for Apache and Nginx) can be configured to allow class variables to be accessible to all of its instances... which works great if you have only one server.
But when you have multiple servers, things get a little tricky. Sure, you could use a class variable and accept that you'll have to make one hit to the database for each server. This works great except for when the variable... varies! You'll need some way of invalidating it across all servers. Depending on how critical it is, this could create some very gnarly and difficult-to-track-down errors (I learned the hard way :-).
Enter memcached. This is a wonderful tool that is a general purpose caching tool. It's very lightweight, and very, very smart. In particular, it can create distributed caches across a cluster of servers -- the value is only ever stored once (thus avoiding the synchronization problem noted above) and each server knows which server to look on to find any given cache key. It even handles when servers go down and all sorts of other unpleasantries.
Setup is remarkably easy, Rails almost assumes you'll use it for your various caching needs, and the Rails integration makes it as simple as pie.
On the assumption that there will be other opportunities to cache stuff that might not be as simple as a value you can store in a class variable, that's probably the first place to start.

Store Selected User Info in Database

Using Symfony 1.4.x (with Propel), I've been given a task that requires me to share specific user info with multiple external systems. This is currently stored as a session (in memory) attribute, but I need to get it into a database so that I can create an API that will provide that info to authorized consumers.
I'd rather not overhaul the system to store all session data in the database (unless it's trivial and can handle namespaces), but I can't find any information on a recommended way for the myUser class to write data to the database. Is it possible to do this since the class doesn't have a model, per se (that I'm aware of)? Are there any recommended solutions or best practices for doing this?
Thanks.
UPDATE
If I were to boil this all the way down to its bare essentials, I guess the key question is this: What's the "best" way to read from/write to a database from the myUser class? Or, alternatively, is there another path that's recommended to accomplish the same end result?
Will storing the result of json_encode-ing or serialize-ing
$myUserInstance->getAttributeHolder()->getAll()
do the job?
In the absence of a good means of accessing a database from the myUser class, I opted for a slightly different path. I installed memcached in a location accessible by both apps and extended PHP's Memcached class to apply a few customizations. The apps can now share information by writing specially formatted keys to the memcached instance.
I opted not to overhaul my existing cache storage mechanism (why upset the apple cart?) and am reading from/writing to memcached selectively for information that truly needs to be shared.
