Ruby on Rails instance variable caching based on parameters - ruby-on-rails

I am storing users' miscellaneous data in a user_data table, and when I retrieve that data through the association I defined, I cache it with instance variable memoization, like this:
def user_data(user_id)
@user_data ||= User.find(user_id).data
end
But the instance variable @user_data is only assigned the first time, when it is nil. Once it holds the data for one user, say user_id 1, passing user_id 2 to this method still returns the data for user 1, because ||= never assigns a new value. So my question is: how can I cache the value based on the method's parameters?

Take a look at the Caching with Rails guide. Rails can cache data at multiple levels, from full-page caching down to fragment caching. I strongly advise you to read that whole page so you can make an informed choice.
For the low-level caching you can do this:
@user_data = Rails.cache.fetch("user_#{user_id}", expires_in: 1.hour) do
User.find(user_id).data
end
By default Rails stores the cache on disk, but you can configure it to use Memcached, an in-memory store, etc.
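For instance, switching the backing store is a one-line configuration change; a hypothetical production setting (host and namespace are illustrative, not from the question):

```ruby
# config/environments/production.rb (illustrative)
config.cache_store = :mem_cache_store, 'localhost:11211', { namespace: 'myapp' }
```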

You can use a hash for a key-based instance-variable cache. I think that does what you want.
def user_data(user_id)
@user_data ||= {}
@user_data[user_id.to_i] ||= User.find(user_id).data
end
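A plain-Ruby sketch of the same pattern (with a hypothetical stubbed lookup in place of User.find, so it runs outside Rails) shows that each id hits the data source only once:

```ruby
class UserDataCache
  def initialize
    @lookups = 0 # counts how many times the "database" is hit
  end

  attr_reader :lookups

  # Stand-in for User.find(user_id).data
  def fetch_from_db(user_id)
    @lookups += 1
    "data-for-#{user_id}"
  end

  # Per-id memoization: one hash entry per user_id
  def user_data(user_id)
    @user_data ||= {}
    @user_data[user_id.to_i] ||= fetch_from_db(user_id)
  end
end

cache = UserDataCache.new
cache.user_data(1) # => "data-for-1"
cache.user_data(2) # => "data-for-2"
cache.user_data(1) # served from the hash, no new lookup
cache.lookups      # => 2
```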

Rails 4 - Low-level Caching Still Querying Database [duplicate]

My version is:
Rails: 3.2.6
dalli: 2.1.0
My env is:
config.action_controller.perform_caching = true
config.cache_store = :dalli_store, 'localhost:11211', {:namespace => 'MyNameSpace'}
When I write:
Rails.cache.fetch(key) do
User.where('status = 1').limit(1000)
end
The user model can't be cached. If I use
Rails.cache.fetch(key) do
User.all
end
it can be cached. How can I cache the query result?
The reason is because
User.where('status = 1').limit(1000)
returns an ActiveRecord::Relation which is actually a scope, not a query. Rails caches the scope.
If you want to cache the query, you need to use a query method at the end, such as #all.
Rails.cache.fetch(key) do
User.where('status = 1').limit(1000).all
end
Please note that it's never a good idea to cache ActiveRecord objects. Caching an object may result in inconsistent states and values. You should always cache primitive objects, when applicable. In this case, consider caching the ids.
ids = Rails.cache.fetch(key) do
User.where('status = 1').limit(1000).pluck(:id)
end
User.find(ids)
You may argue that in this case the call to User.find is always executed. That's true, but a query by primary key is fast, and you avoid the problem I described before. Moreover, caching Active Record objects can be expensive, and you might quickly fill all of Memcached's memory with a single cache entry. Caching ids prevents this problem as well.
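The id-caching pattern can be sketched in plain Ruby, with a hash standing in for Rails.cache and a hypothetical in-memory "table" standing in for the users table (both are illustrative stubs, not real Rails APIs):

```ruby
# A plain hash stands in for Rails.cache, with fetch-with-block semantics.
CACHE = {}

def cache_fetch(key)
  CACHE.key?(key) ? CACHE[key] : (CACHE[key] = yield)
end

# Stubbed "users table".
USERS = {
  1 => { id: 1, status: 1 },
  2 => { id: 2, status: 0 },
  3 => { id: 3, status: 1 }
}

# Cache only the primitive ids, never the records themselves.
ids = cache_fetch('active_user_ids') do
  USERS.values.select { |u| u[:status] == 1 }.map { |u| u[:id] }
end

# Re-fetch fresh records by primary key on every request.
users = USERS.values_at(*ids)
ids # => [1, 3]
```

The cache entry stays small and the records are always loaded fresh, so a status change is visible immediately even while the id list is still cached.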
In addition to the selected answer: on Rails 4+ you should use load instead of all to force the scope to execute.
Rails.cache.fetch caches exactly what the block evaluates to.
 User.where('status = 1').limit(1000)
is just a scope, so what gets cached is just the ActiveRecord::Relation object, i.e. the query, but not its results (because the query has not been executed yet).
If you want something useful to be cached, you need to force execution of the query inside the block, for example by doing
User.where('status = 1').limit(1000).all
Note that on Rails 4, all doesn't force loading of the relation; use to_a instead.
Using
User.where("status = 1").limit(1000).all
should work.

Best practice for a big array manipulation with values that never change and will be used in more than one view

What would be the best and most efficient way in Rails to use a hash of about 300-500 integers (which will never be modified) in more than one view of the application?
Should I save the data in the database? Create the hash in each action that uses it? (This is what I do now, but the code looks ugly and inefficient.) Or is there another option?
Why don't you put it in a constant? You said it will never change, so it fits either configuration or constant.
Using the cache has the downside that it can be dropped out of cache, triggering a reload, which seems quite useless in this case.
The overhead of keeping it in memory all the time is negligible; 500 integers are a few kilobytes at most, so you are safe.
You can write the hash manually or load a YAML file (or whatever) if you prefer, your choice.
My suggestion is create a file app/models/whatever.rb and:
module Whatever
MY_HASH = {
1 => 241
}.freeze
end
This will be preloaded by rails on startup (in production) and kept in memory all the time.
You can access those values in a view with Whatever::MY_HASH[1], or you can write a wrapper method like
module Whatever
MY_HASH = {
1 => 241
}.freeze
def self.get(id)
MY_HASH.fetch(id)
end
end
And use that Whatever.get(1)
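If you'd rather keep the data out of the code, the same module can load the hash from a YAML file at boot. A sketch of that variant (in a real app you would use YAML.load_file with a path such as config/my_hash.yml; here the document is inlined so the sketch runs standalone, and the values are illustrative):

```ruby
require 'yaml'

# In a real app: RAW = YAML.load_file(Rails.root.join('config', 'my_hash.yml'))
RAW = YAML.safe_load(<<~YAML)
  1: 241
  2: 118
YAML

module Whatever
  MY_HASH = RAW.freeze

  def self.get(id)
    MY_HASH.fetch(id)
  end
end

Whatever.get(1) # => 241
```

Psych parses the unquoted keys as integers, so lookups by integer id work without any conversion.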
If the data will never be changed, why not just calculate the values before hand and write them directly into the view?
Another option would be to put the values into a singleton and cache them there.
require 'singleton'
class MyHashValues
include Singleton
def initialize
@results = calculation
end
def result_key_1
@results[:result_key_1]
end
def calculation
Hash.new
end
end
MyHashValues.instance.result_key_1
Cache it, it'll do exactly what you want and it's a standard Rails component. If you're not caching yet, check out the Rails docs on caching. If you use the memory store, your data will essentially be in RAM.
You will then be able to do this sort of thing
# The block contains the value to cache, if there's a miss
# Setting the value is done initially and after the cache
# expires or is cleared.
# put this in application controller and make it a helper method
def integer_hash
Rails.cache.fetch('integer_hash') { ... }
end
helper_method :integer_hash

Cache with expiring keys

I'm working on a mashup site and would like to limit the number of fetches to scrape the source sites. There is essentially one bit of data I need, an integer, and would like to cache it with a defined expiration period.
To clarify, I only want to cache the integer, not the entire page source.
Is there a ruby or rails feature or gem that already accomplishes this for me?
Yes, there is ActiveSupport::Cache::Store
An abstract cache store class. There are multiple cache store
implementations, each having its own additional features. See the
classes under the ActiveSupport::Cache module, e.g.
ActiveSupport::Cache::MemCacheStore. MemCacheStore is currently the
most popular cache store for large production websites.
Some implementations may not support all methods beyond the basic
cache methods of fetch, write, read, exist?, and delete.
ActiveSupport::Cache::Store can store any serializable Ruby object.
http://api.rubyonrails.org/classes/ActiveSupport/Cache/Store.html
cache = ActiveSupport::Cache::MemoryStore.new
cache.read('Chicago') # => nil
cache.write('Chicago', 2707000)
cache.read('Chicago') # => 2707000
Regarding the expiration time, this can be done by passing the time as an initialization parameter
cache = ActiveSupport::Cache::MemoryStore.new(expires_in: 5.minutes)
If you want to cache a value with a different expiration time, you can also set this when writing a value to the cache
cache.write(key, value, expires_in: 1.minute) # Set a lower value for one entry
See Caching with Rails, particularly the :expires_in option to ActiveSupport::Cache::Store.
For example, you might go:
value = Rails.cache.fetch('key', expires_in: 1.hour) do
expensive_operation_to_compute_value()
end
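The expiry mechanics behind expires_in can be sketched in plain Ruby with a tiny TTL store. This is a hypothetical minimal implementation, not the real ActiveSupport code; the clock is injectable so the sketch is deterministic (nil/false values count as misses, which is fine for a sketch caching an integer):

```ruby
# Minimal expiring key-value store; `clock` is injectable for testing.
class ExpiringCache
  Entry = Struct.new(:value, :expires_at)

  def initialize(clock: -> { Time.now })
    @clock = clock
    @data = {}
  end

  def write(key, value, expires_in:)
    @data[key] = Entry.new(value, @clock.call + expires_in)
  end

  def read(key)
    entry = @data[key]
    return nil unless entry
    return @data.delete(key) && nil if @clock.call >= entry.expires_at
    entry.value
  end

  # fetch with block, like Rails.cache.fetch
  def fetch(key, expires_in:)
    cached = read(key)
    return cached unless cached.nil?
    write(key, yield, expires_in: expires_in)
    read(key)
  end
end

now = Time.at(0)
cache = ExpiringCache.new(clock: -> { now })
cache.fetch('visits', expires_in: 60) { 42 } # => 42 (computed)
cache.fetch('visits', expires_in: 60) { 99 } # => 42 (cached)
now += 61
cache.fetch('visits', expires_in: 60) { 99 } # => 99 (expired, recomputed)
```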

how many octets are read from the cache?

I'm trying to do a simple benchmark to find out how many octets (bytes) are read from the cache on each page of my Rails site. I'm talking about data retrieved from Rails.cache.
I would like to display something like 123 KB / 145 KB at the bottom of my pages.
Does a gem exist to perform this task or perhaps is there something included in the ruby standard library?
One option is to subclass the store you're using and extend the protected read_entry method declared in ActiveSupport::Cache::Store, which FileStore and the other caches themselves subclass.
class FileStoreWithReadTracking < ActiveSupport::Cache::FileStore
def start_page
@octets_read = 0
end
def octets_read
@octets_read
end
protected
def read_entry(key, options)
entry = super(key, options)
@octets_read += entry.size if entry
entry
end
end
When starting a page, you can call start_page to zero out the octet count. Since read_entry is a low-level method used every time the cache tries to perform a read, you can intercept any data read and get its size before returning it. You might have to convert size to octets.
To set this as your custom cache store, add config.cache_store = FileStoreWithReadTracking.new('/path/to/file/store') to your environment. I think you can subclass all the stores this way.
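The interception pattern itself can be sketched without Rails: a subclass overrides the low-level read method, calls super, and accumulates sizes. The class names below are illustrative stand-ins, not ActiveSupport classes:

```ruby
# Minimal stand-in for a cache store with a low-level read hook.
class TinyStore
  def initialize
    @data = {}
  end

  def write(key, value)
    @data[key] = value
  end

  def read(key)
    read_entry(key)
  end

  protected

  # Every read funnels through this method, so a subclass can intercept it.
  def read_entry(key)
    @data[key]
  end
end

# Subclass counts the bytes of every entry returned from the store.
class TinyStoreWithReadTracking < TinyStore
  attr_reader :octets_read

  def start_page
    @octets_read = 0
  end

  protected

  def read_entry(key)
    entry = super(key)
    @octets_read += entry.bytesize if entry
    entry
  end
end

store = TinyStoreWithReadTracking.new
store.start_page
store.write('a', 'hello')
store.read('a')       # adds 5 octets
store.read('missing') # miss, adds nothing
store.octets_read     # => 5
```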

ActiveResource Caching

How would you cache an ActiveResource model? Preferably in memcached. Right now it's pulling a model from my REST API fine but it pulls dozens of records each time. Would be best to cache them.
I've been playing around with the same thing, and I think I've found a pretty simple way to check Redis for the cached object first. This will only work when you use the find method, but for my needs, I think this is sufficient.
By overriding find, I can check the checksum of the arguments to see if I already have the response saved in Redis. If I do, I can pull the JSON response out of Redis and create a new object right there. If I don't, I pass the find through to ActiveResource::Base's find and the normal action happens.
I haven't implemented saving the responses into Redis with ActiveResource yet; my plan is to populate those caches elsewhere. That way I can normally rely on my caches being there, but if they aren't, I can fall back to the API.
class MyResource < ActiveResource::Base
class << self
def find(*arguments)
checksum = Digest::MD5.hexdigest(arguments.md5key)
cached = $redis.get "cache:#{self.element_name}:#{checksum}"
if cached
return self.new JSON.parse(cached)
end
scope = arguments.slice!(0)
options = arguments.slice!(0) || {}
super scope, options
end
end
end
and a little patch so we can get an md5key for our array:
require 'digest/md5'
class Object
def md5key
to_s
end
end
class Array
def md5key
map(&:md5key).join
end
end
class Hash
def md5key
sort.map(&:md5key).join
end
end
Does that help?
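One property worth noting: because Hash#md5key sorts the pairs before joining, two option hashes with the same pairs in different insertion order produce the same checksum, so they map to the same cache key. A quick standalone check (the monkey-patches are reproduced from the answer above so the sketch runs on its own; the argument values are illustrative):

```ruby
require 'digest/md5'

class Object
  def md5key
    to_s
  end
end

class Array
  def md5key
    map(&:md5key).join
  end
end

class Hash
  # Sorting makes the key independent of insertion order.
  def md5key
    sort.map(&:md5key).join
  end
end

a = [:first, { page: 2, per_page: 10 }]
b = [:first, { per_page: 10, page: 2 }]

a.md5key # => "firstpage2per_page10"
Digest::MD5.hexdigest(a.md5key) == Digest::MD5.hexdigest(b.md5key) # => true
```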
Caching in Rails is configurable, and you can configure the cache to be backed by Memcached. Typically you cache when you retrieve. It's unclear whether you are a REST consumer or a REST service, but it's really not relevant: if you cache on read (or retrieve) and then read the cache the next time, everything will work just fine. If you are pulling the data from a database, serve from the cache, and if no cache entry is available, cache the read from the database.
I wrote a blog post about it here:
http://squarism.com/2011/08/30/memcached-with-rails-3/
However, what I wrote about is really pretty simple, just showing how to avoid an expensive operation with something similar to the ||= operator. For a better example, New Relic has a Scaling Rails episode. For example, they show how to cache the latest 10 posts:
def self.recent
Rails.cache.fetch("recent_posts", :expires_in => 30.minutes) do
self.find(:all, :limit => 10)
end
end
Rails.cache has been configured to be a memcached cache, this is the configurable part I was talking about.
I would suggest looking into https://github.com/Ahsizara/cached_resource, almost all of the work is done for you through the gem.
