Persistent resources in a rails app - ruby-on-rails

First, this may not be the best title, but it seems to make sense at this time.
What I'm looking at is loading a resource which should live for the life of the web app. There may be provision at a later point for a manual refresh, but currently that is not the case.
We have a complex permission structure which resides in the database for multiple reasons. I do not want to incur the overhead of retrieving this for each page load, thus I want it to reside in memory. My first instinct is to create a singleton which I load this into and use it whenever needed to lookup a permission. I understand the hesitance towards singletons and wonder if that is a poor approach.
I do not want to go down the route of YAML or another storage mechanism; the permissions must reside in the DB for other dependencies. That said, in Rails, what would be the most appropriate way to efficiently load and read the data?
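For reference, the singleton instinct described above might look something like this rough sketch (PermissionStore and the Permission model are just placeholders):
require 'singleton'

class PermissionStore
  include Singleton

  # Loaded once per process and kept for the life of the app
  def permissions
    @permissions ||= Permission.all
  end
end

# Usage: PermissionStore.instance.permissions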

This sounds like the perfect use of the cache:
permissions = Rails.cache.fetch('permissions') do
  # Permissions don't exist yet; perform the long operation and load from the DB
  load_permissions_from_db
end
More details here.
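If the manual refresh mentioned in the question is ever needed, expiring the entry (assuming the same 'permissions' key as above) forces the next request to reload from the DB:
Rails.cache.delete('permissions')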

I'm not totally sure what you mean, but I think there are a few ways you could go:
caching (e.g. caches_page :page or caches_action :action in the controller),
or possibly storing something in a cookie or in session data. Of course, I don't totally understand the nature of this data, so I don't know what would work better, if at all.
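For what it's worth, a minimal sketch of those controller macros (PagesController and the action names are just placeholders; caches_page / caches_action shipped with Rails itself in this era):
class PagesController < ApplicationController
  caches_page :show     # writes the rendered page to public/, bypassing Rails on later hits
  caches_action :index  # stores the rendered action in Rails.cache; filters still run

  def index
  end

  def show
  end
end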

Related

What is the accepted method for a global state?

I was wondering what you would consider the best way to store some simple state across sessions without using the DB.
I'm looking to have 'modes' for a website, so it can be in mode A or mode B, and depending on the mode, buttons would do different things.
Would using Rails.cache.read and write be the best option? I've heard it has issues on Heroku if you leave the cache on the filesystem, and then has problems as memcache because of multi-threading?
I'm really trying to avoid a whole DB table getting hit by users checking the global state of the site on each request.
In order to have a "global" state, then you need to create a singke dependency for each instance of your application.
In fact, you can't rely on cookies or sessions, are they are client-oriented and they are not shared between clients.
The database is the most common approach. You may be able to use the file system in some cases, but not on Heroku, as there may be several instances of your app running on different file systems.
Any solution that can easily be shared across instances will work:
Memory database like Redis
SQL or NoSQL database
Cache systems, as long as they are not specific to one instance. Memcached may work, but it's not persistent (hence you may lose the state)
External storage (such as Amazon S3)
To me, a relational database or a memory database such as Redis seems to be the most plausible solution.
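As a rough sketch of the Redis option (assuming the redis gem and a REDIS_URL reachable from every instance):
require 'redis'

redis = Redis.new(:url => ENV['REDIS_URL'])

redis.set('site_mode', 'b')    # flip the global mode from any instance
mode = redis.get('site_mode')  # every instance reads the same value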
If you want a per-user setting, consider storing it in the session (which can be stored in cookies), or directly in cookies.
Both methods end up storing some data inside clients' browsers (not lots of it, though, because cookies are sent by the browser with each request).
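A sketch of the per-user variant using the session (controller, action and parameter names are just placeholders):
class ModesController < ApplicationController
  def update
    session[:mode] = params[:mode]  # e.g. "a" or "b", kept in the cookie-backed session
    redirect_to root_path
  end
end

# Elsewhere: session[:mode] || 'a'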
You could put it in a table, just so that you have it, but then make the value available via the ApplicationController, with a simple cache method in between.
So something like this (not tested!):
def get_mode
  if @mode_timeout.nil? || @mode_timeout < Time.now
    @mode = ModeModel.first.mode
    @mode_timeout = Time.now + 60.seconds
  end
  @mode
end
You'll have to create a model, or you could update it via a controller with a set_mode method instead, but that would be more transient.
Then you can just call get_mode from your controller.
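A hypothetical set_mode counterpart, if you go the controller route (again untested, and ModeModel is assumed):
def set_mode
  ModeModel.first.update_attribute(:mode, params[:mode])
  @mode_timeout = nil  # force get_mode to re-read on the next call in this process
  redirect_to root_path
end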

SaaS approach to App_GlobalResources

We are building an app where multiple websites are powered by a single site on IIS.
We have a web-based tool where webmaster can edit the "resx" files, like:
/App_GlobalResources/es/Backend.es.resx
However, there are two problems with this:
Changing these files affects all sites.
It also causes the entire IIS site to restart.
Is there another approach to this?
I think storing the strings in a DB may be a bad idea, as it will cause hundreds of SQL lookups per page.
Use a database-driven resource provider that supports caching. And you are in luck, because someone else has already done it.
Do you provide an interface to edit the resx files? If so, cache them in the Application scope and expire the cache when they are updated. Then, store them in the database. This way, you'll have both speed and flexibility.
Just because the data is in the database doesn't mean it'll be slow. Caching is the solution. Of course, the first lookup will be slow, but subsequent lookups will be as fast as you can get.

Is this an ok design decision? is there a better way?

So, for the sake of performance, I'm using database sessions. I figure that since the sessions are server-side, I might as well store commonly accessed objects in the session. So I'm storing serialized versions of the current_user, current_account, and the current_user's permissions.
The User model handles a lot of the permission methods (things like user.can_do_whatever), but since I'm trying to be more efficient and store commonly accessed things in the session (this allows for far fewer DB accesses), does it make sense, or break any design standards, to store the session in an instance variable on the current_user upon each request?
As of right now, I can't think of any alternatives.
Rails applications have a RESTful design by default. One of the rules of REST is statelessness: each request from client to server must contain all of the information necessary to understand the request, and cannot take advantage of any stored context on the server.
If you have trouble with database performance, use a cache system like memcached, which is already integrated into Rails (Caching with Rails).
I found a couple of references warning against storing non-primitive data types in the session, but they were all just warnings, and boiled down to: Storing complex objects is "Expecially discouraged" [sic], but if you decide you need to... well, just be careful.
Anyway, I'm kinda taken by the idea of having the users table double as the sessions table, but serialization still seems a bit sketchy. If you're just trying to cut down the number of DB requests, what about storing IDs and using :joins when looking up your user (might require a bit of hackery to get that worked into the default session loading). That avoids synchronization problems and serialization sketchiness, and still only generates a single DB query. Just make sure to use :joins and not :include, as the latter generates a query for each table.
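A sketch of that joins-based lookup, Rails 3 syntax (the association name and session key are assumptions):
current_user = User.joins(:account).where(:id => session[:user_id]).first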
Hope that helps!

Keep value in memory across requests and across users in Rails controller? Use class variable?

We're on Rails 3.0.6.
We maintain a list of numbers that changes only once a month, but nearly every page request requires access to this list.
We store the list in the database.
Instead of hitting the database on every request and grabbing the list, we would like to grab the data once and stash it in memory for efficient access.
If we store the list in each user session, we still need to hit the database for each session.
Is there a way to only hit the database once and let the values persist in memory across all users and all sessions? We need access to the list from the controller. Should we define a class variable in the controller?
Thanks!
I think Rails.cache is the answer to your problem here. It's a simple interface with multiple backends; the default stores the cache in memory, but if you're already using Memcached, Redis or similar in your app, you can plug it into those instead.
Try throwing something similar to this in your ApplicationController
def list_of_numbers
  @list_of_numbers ||= Rails.cache.fetch(:list_of_numbers, :expires_in => 24.hours) do
    # Read from the database
  end
end
It will try to read from the cache, but if it doesn't find the value, it will do the intensive work and store the result for next time.
The pattern you're looking for is known as a singleton, which is a simple way to cache stuff that doesn't change over time. For example, you'll often see something like this in application_controller.rb (your code always calls the method):
def current_user(user_id)
  @current_user ||= User.find(user_id)
end
When it does, it checks the instance variable @current_user and returns it if it's not nil; otherwise it does the database lookup, assigns the result to the instance variable, and returns it.
Your problem is similar, but broader, since it applies to all instances.
One solution is with a class variable, which is documented here http://www.ruby-doc.org/docs/ProgrammingRuby/html/tut_classes.html#S3 -- a similar solution to the one above applies here.
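A minimal class-variable version of the memoization above (the Number model and its value column are assumptions):
class ApplicationController < ActionController::Base
  @@list_of_numbers = nil

  def list_of_numbers
    @@list_of_numbers ||= Number.all.map(&:value)  # shared by every request in this process
  end
end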
This might be a good solution in your case, but it has some issues. Specifically (assuming this is a web app), depending on your configuration, you may have multiple instances of Rails loaded in different processes, and class variables only apply to their specific instance. The popular Passenger module (for Apache and Nginx) can be configured to allow class variables to be accessible to all of its instances... which works great if you have only one server.
But when you have multiple servers, things get a little tricky. Sure, you could use a class variable and accept that you'll have to make one hit to the database for each server. This works great except for when the variable... varies! You'll need some way of invalidating it across all servers. Depending on how critical it is, this could create various gnarly and difficult-to-track-down errors (I learned the hard way :-).
Enter memcached. This is a wonderful tool that is a general purpose caching tool. It's very lightweight, and very, very smart. In particular, it can create distributed caches across a cluster of servers -- the value is only ever stored once (thus avoiding the synchronization problem noted above) and each server knows which server to look on to find any given cache key. It even handles when servers go down and all sorts of other unpleasantries.
Setup is remarkably easy, and Rails almost assumes you'll use it for your various caching needs, and the Rails gem just makes it as simple as pie.
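The Rails side is typically a single line in config/environments/production.rb (server addresses below are placeholders):
config.cache_store = :mem_cache_store, 'cache-1.example.com', 'cache-2.example.com'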
On the assumption that there will be other opportunities to cache stuff that might not be as simple as a value you can store in a class variable, that's probably the first place to start.

Cache strategy in a rails application using membase, how do I make sure I don't delete everything?

I have a rails application.
I am using membase/memcache to cache DB objects and HTML partials.
I cache db objects with the create operation and of course find operations etc...
Now, when I do User.find(1), this is cached as an object in memcache.
I have a pretty good strategy with caching these along side with the HTML content.
Now, when I deploy, one of the things my Capistrano script does is clear the cache (because of the HTML partials that change), but there's really no reason to invalidate the cached DB objects.
How can I only delete part of my cache?
Can this be done?
My cache keys look like this:
DB: user_find_by_id_10000
HTML: user_profile_home_1000
Would appreciate your help.
Thanks.
It might also be a good idea to use separate buckets for your DB cache and your HTML cache... then you can use the 'flush_all' command to clear out a whole bucket without affecting the other one.
Also, looking ahead to Couchbase Server 2.0, which will be in developer preview at the end of this week: you'll be able to create indexes and views to return just the data you're looking for, and you can then feed that through a little process to delete all the items that match a certain criterion.
Perry Krug
Solutions Architect, Couchbase Inc.
It's fairly simple to delete a cached item based on its key:
Rails.cache.delete('user_profile_home_1000')
In the code above I'm assuming you've set Rails' cache to use Memcached.
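If flushing per bucket isn't an option, another common trick is to version only the HTML keys and bump the version on deploy, leaving the DB-object keys alone (the constant and partial below are assumptions):
HTML_CACHE_VERSION = ENV.fetch('HTML_CACHE_VERSION', '1')

def html_cache_key(name)
  "html_v#{HTML_CACHE_VERSION}_#{name}"  # e.g. "html_v1_user_profile_home_1000"
end

Rails.cache.fetch(html_cache_key('user_profile_home_1000')) do
  render_to_string :partial => 'users/profile_home'
end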
