Why this "fragment cache" won't work? - ruby-on-rails

I've just tried to implement caching for requests to example.com/communities?sort=popular.
My code is shown below, but the caching doesn't seem to be working: it looks like SQL is still sent every time the page reloads. What's wrong?
Also, after a user creates or edits a "Community" record, I'd like to clear all stored caches whose keys contain the string "community_index_sort_by_".
config/environments/development.rb
...
config.consider_all_requests_local = true
config.action_controller.perform_caching = true
config.cache_store = :dalli_store
...
community_controller.rb
def index
  @key = "community_index_sort_by_" + params[:sort].to_s + "_page_" + params[:page].to_s
  if params[:sort] == 'popular'
    unless read_fragment(:controller => "communities", :action => "index", :action_suffix => @key)
      @communities = Community.scoped.page(params[:page]).order("cached_votes_up DESC")
    end
  elsif params[:sort] == 'latest'
    @communities = Community.scoped.page(params[:page]).order("created_at DESC")
  end
end
I haven't touched the view.

The code you've shown only attempts to read from the cache; it never stores anything in it. If you want to populate the cache when no value is found (i.e., on a cache miss), you can use Rails.cache.fetch rather than read_fragment. fetch returns the cached value if one exists; otherwise, if a block was passed, it runs the block on a cache miss and stores its return value in the cache. For instance, the relevant part of your code snippet would become something like:
@communities = Rails.cache.fetch(["communities", "index", @key]) do
  Community.scoped.page(params[:page]).order("cached_votes_up DESC")
end
The recommended approach for expiring cached data when an object is modified is to have the cache key include some piece of data that changes whenever the object is modified. This is commonly the updated_at timestamp field, which ActiveRecord automatically updates when the object is saved. The updated_at field also has the advantage of being used automatically when the object itself is part of the cache key (e.g., passing @community as part of the key produces something like communities/1-20130116113736). This will typically require a small amount of restructuring to ensure that a relevant object is available to be used in the cache key. David Heinemeier Hansson discusses this in more detail; step 5 in particular is most relevant to what I've mentioned here.
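To make that concrete for the index action in the question, here is a minimal sketch (not from the original answer; the key layout and the decision to cache the loaded records are my own assumptions) that folds the record count and the newest updated_at into the cache key, so creating or editing a Community changes the key automatically and there is no need to sweep every key containing "community_index_sort_by_":
def index
  if params[:sort] == 'popular'
    # count and maximum(:updated_at) are cheap aggregate queries; either value
    # changes whenever a Community is created, edited or destroyed, which
    # expires the old key without any explicit sweeping.
    key = ["communities", "index", params[:sort], params[:page],
           Community.count, Community.maximum(:updated_at).to_i]
    @communities = Rails.cache.fetch(key) do
      # to_a runs the query inside the block so the loaded records (not a lazy
      # relation) are what gets stored; note that this drops Kaminari's
      # pagination metadata, so the view must not rely on it.
      Community.scoped.page(params[:page]).order("cached_votes_up DESC").to_a
    end
  elsif params[:sort] == 'latest'
    @communities = Community.scoped.page(params[:page]).order("created_at DESC")
  end
end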

Related

How to set an expiry on a cached Ruby search?

I have a function that returns a list of IDs. In the Rails caching guide I can see that an expiration can be set on cached results, but I have implemented my caching somewhat differently.
def provide_book_ids(search_param)
  @returned_ids ||= begin
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_str
    search.fetch(true)
    search.map(&:id)
  end
end
What is the recommended way to set a 10-minute cache expiry when the code is written as above?
def provide_book_ids(search_param)
  @returned_ids = Rails.cache.fetch("zendesk_ids", expires_in: 10.minutes) do
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_str
    search.fetch(true)
    search.map(&:id)
  end
end
I am assuming this code is part of a request-response cycle and not something else (for example, a long-running worker or a class that is initialized once in your app). In such a case you wouldn't want to use @returned_ids directly but would instead call provide_book_ids to get the value; but from what I understand that's not your scenario, so the approach above should work.
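One possible variation, not part of the answer above: if provide_book_ids can be called with different search terms (an assumption on my part), including the term in the cache key keeps each search's ID list separate while keeping the same 10-minute expiry:
def provide_book_ids(search_param)
  # ["zendesk_ids", search_param] gives each distinct search its own entry.
  @returned_ids = Rails.cache.fetch(["zendesk_ids", search_param], expires_in: 10.minutes) do
    # ... same search code as in the answer above ...
  end
end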

cache_unless still computes the cache key although the condition is true

I have a simple fragment cache:
- cache_unless user_signed_in?, ['show', @question, @question.user.username, @question.user.score, @question.user.avatar_url] do
The page is not being cached when I'm signed in, which is what I want; however, the cache key is still being computed, and I can see ActiveRecord queries in my log.
I was expecting that the cache key wouldn't be computed if the condition was true.
I had to fall back to simple if/else logic:
if user_signed_in?
  # render uncached resources
else
  # cache
end

Rails 3 Cache Returning Nil

I'm using the Rails memory cache like this in one controller:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    puts 'Building cache...'
    form_config = s3_read_object('form_config.js')
    return JSON.parse(form_config)
  end
end
This is working fine in the controller where it is defined. But when I try to read the value from another controller, it returns nil. Can anyone explain what might be going on? Here is how I am trying to read it in the other controller:
form_config = Rails.cache.read('form_config')
Your code doesn't actually ever cache anything: return returns from the whole method, so the part of fetch that stores values in the cache never executes, and there is nothing for your Rails.cache.read call to return.
You could either use next or no keyword at all:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    form_config = s3_read_object('form_config.js')
    JSON.parse(form_config)
  end
end
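To read the value from another controller, one option (my assumption, not something stated in the answer) is to share this fetch-based method instead of calling Rails.cache.read directly, so a cache miss repopulates the entry rather than returning nil. For example, if the method (and s3_read_object) lived in ApplicationController:
class ApplicationController < ActionController::Base
  private

  # Every controller that inherits from ApplicationController can call this;
  # fetch returns the cached value or rebuilds it on a miss.
  def form_config_cache
    Rails.cache.fetch("form_config", :expires_in => 12.hours) do
      JSON.parse(s3_read_object('form_config.js'))
    end
  end
end

# In any other controller action:
#   form_config = form_config_cache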

Rails Cache Key generated as ActiveRecord::Relation

I am attempting to generate a fragment cache (using a Dalli/Memcached store); however, the key is being generated with "#" as part of it, so Rails doesn't seem to recognize that there is a cached value and is hitting the database.
My cache key in the view looks like this:
cache([#jobs, "index"]) do
The controller has:
@jobs = @current_tenant.active_jobs
With the actual Active Record query like this:
def active_jobs
  self.jobs.where("published = ? and expiration_date >= ?", true, Date.today).order("(featured and created_at > now() - interval '" + self.pinned_time_limit.to_s + " days') desc nulls last, created_at desc")
end
Looking at the Rails server log, I see the cache read, but the SQL query still runs:
Cache read: views/#<ActiveRecord::Relation:0x007fbabef9cd58>/1-index
Read fragment views/#<ActiveRecord::Relation:0x007fbabef9cd58>/1-index (1.0ms)
(0.6ms) SELECT COUNT(*) FROM "jobs" WHERE "jobs"."tenant_id" = 1 AND (published = 't' and expiration_date >= '2013-03-03')
Job Load (1.2ms) SELECT "jobs".* FROM "jobs" WHERE "jobs"."tenant_id" = 1 AND (published = 't' and expiration_date >= '2013-03-03') ORDER BY (featured and created_at > now() - interval '7 days') desc nulls last, created_at desc
Any ideas as to what I might be doing wrong? I'm sure it has to do with the key generation and ActiveRecord::Relation, but I'm not sure how.
Background:
The problem is that the string representation of the relation is different each time your code is run:
views/#<ActiveRecord::Relation:0x007fbabef9cd58>/...
The 0x007fbabef9cd58 part comes from the relation object's default to_s and changes on every run, so you get a different cache key each time.
Besides that, it is not possible to get rid of the database queries completely (your own answer is the best one can do).
Solution:
To generate a valid key, instead of this
cache([#jobs, "index"])
do this:
cache([#jobs.to_a, "index"])
This queries the database and builds an array of the models, from which the cache_key is retrieved.
PS: I could swear using relations worked in previous versions of Rails...
We've been doing exactly what you're mentioning in production for about a year. I extracted it into a gem a few months ago:
https://github.com/cmer/scope_cache_key
Basically, it allows you to use a scope as part of your cache key. There are significant performance benefits to doing so, since you can now cache a page containing multiple records in a single cache element rather than looping over each element in the scope and retrieving caches individually. I feel that combining this with the standard "Russian Doll Caching" principles is optimal.
I have had similar problems: I have not been able to successfully pass relations to the cache function, and your @jobs variable is a relation.
I coded up a solution for cache keys that deals with this issue along with some others that I was having. It basically involves generating a cache key by iterating through the relation.
A full write-up is on my site here:
http://mark.stratmann.me/content_items/rails-caching-strategy-using-key-based-approach
In summary, I added a get_cache_key function to ActiveRecord::Base:
module CacheKeys
  extend ActiveSupport::Concern

  # Instance Methods
  def get_cache_key(prefix=nil)
    cache_key = []
    cache_key << prefix if prefix
    cache_key << self
    self.class.get_cache_key_children.each do |child|
      if child.macro == :has_many
        self.send(child.name).all.each do |child_record|
          cache_key << child_record.get_cache_key
        end
      end
      if child.macro == :belongs_to
        cache_key << self.send(child.name).get_cache_key
      end
    end
    return cache_key.flatten
  end

  # Class Methods
  module ClassMethods
    def cache_key_children(*args)
      @v_cache_key_children = []
      # validate the children
      args.each do |child|
        # is it an association?
        association = reflect_on_association(child)
        if association == nil
          raise "#{child} is not an association!"
        end
        @v_cache_key_children << association
      end
    end

    def get_cache_key_children
      return @v_cache_key_children ||= []
    end
  end
end

# include the extension
ActiveRecord::Base.send(:include, CacheKeys)
I can now create cache fragments by doing
cache(@model.get_cache_key(['textlabel'])) do
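The cache_key_children class macro above is defined but never shown being configured; a hypothetical model using it (the model and association names here are purely illustrative) might look like this:
class Post < ActiveRecord::Base
  belongs_to :author
  has_many :comments

  # Tell the CacheKeys extension which associations should be folded into the key.
  cache_key_children :author, :comments
end

# In a view, the post, its author and each of its comments then all contribute
# to the fragment cache key:
#   cache(@post.get_cache_key(['sidebar'])) do ... end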
I've done something similar to Hopsoft's approach, but using the method in the Rails Guide as a template. I used the MD5 digest to distinguish between relations (so User.active.cache_key can be differentiated from User.deactivated.cache_key), and the count and max updated_at to auto-expire the cache on updates to the relation.
require "digest/md5"
module RelationCacheKey
def cache_key
model_identifier = name.underscore.pluralize
relation_identifier = Digest::MD5.hexdigest(to_sql.downcase)
max_updated_at = maximum(:updated_at).try(:utc).try(:to_s, :number)
"#{model_identifier}/#{relation_identifier}-#{count}-#{max_updated_at}"
end
end
ActiveRecord::Relation.send :include, RelationCacheKey
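With that module included, a relation can be passed straight to fragment caching, and the key changes whenever the SQL, the row count, or the newest updated_at changes. A small usage sketch (the User model and its active scope are borrowed from the example in the paragraph above):
# Produces a key of the form "users/<md5 of the SQL>-<count>-<max updated_at>".
User.active.cache_key

# In a view, the cache helper expands the relation via its cache_key:
#   <% cache([User.active, "index"]) do %> ... <% end %>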
While I marked @mark-stratmann's response as correct, I actually resolved this by simplifying the implementation. I added touch: true to my model relationship declaration:
belongs_to :tenant, touch: true
and then set the cache key based on the tenant (with a required query param as well):
<% cache([@current_tenant, params[:query], "#{@current_tenant.id}-index"]) do %>
That way if a new Job is added, it touches the Tenant cache as well. Not sure if this is the best route, but it works and seems pretty simple.
I'm using this code:
class ActiveRecord::Base
  def self.cache_key
    pluck("concat_ws('/', '#{table_name}', group_concat(#{table_name}.id), date_format(max(#{table_name}.updated_at), '%Y%m%d%H%i%s'))").first
  end

  def self.updated_at
    maximum(:updated_at)
  end
end
Maybe this can help you out: https://github.com/casiodk/class_cacher. It generates a cache_key from the model itself, but maybe you can use some of the principles in the codebase.
As a starting point you could try something like this:
def self.cache_key
  [
    "#{model_name.cache_key}-all",
    "#{count}-#{updated_at.utc.to_s(cache_timestamp_format) rescue 'empty'}"
  ] * '/'
end

def self.updated_at
  maximum :updated_at
end
I have a normalized database where multiple models relate to the same other model; think of clients, locations, etc. all having addresses by means of a street_id.
With this solution you can generate cache_keys based on scope, e.g.
cache [@client, @client.locations] do
  # ...
end

cache [@client, @client.locations.active, 'active'] do
  # ...
end
and I could simply modify self.updated_at from above to also include associated objects (because has_many does not support touch, so if I updated the street, the change wouldn't otherwise be seen by the cache):
belongs_to :street

def cache_key
  [street.cache_key, super] * '/'
end

# ...

def self.updated_at
  [
    maximum(:updated_at),
    joins(:street).maximum('streets.updated_at')
  ].max
end
As long as you don't "undelete" records and use touch in belongs_to, you should be alright with the assumption that a cache key made of count and max updated_at is sufficient.
I'm using a simple patch on ActiveRecord::Relation to generate cache keys for relations.
require "digest/md5"
module RelationCacheKey
def cache_key
Digest::MD5.hexdigest to_sql.downcase
end
end
ActiveRecord::Relation.send :include, RelationCacheKey

Performance: minimize database hitting

I am using Ruby on Rails 3.0.7 and I am trying to minimize database hits. In order to do that, I retrieve all Article objects related to a User from the database and then perform searches on those retrieved objects.
What I do is:
stored_objects = Article.where(:user_id => <id>) # => ActiveRecord::Relation

<some_iterative_function_1>.each { |...|
  stored_object = stored_objects.where(:status => 'published').limit(1)
  ...
  # perform operation on the current 'stored_object' considered
}

<some_iterative_function_2>.each { |...|
  stored_object = stored_objects.where(:visibility => 'public').limit(1)
  ...
  # perform operation on the current 'stored_object' considered
}

<some_iterative_function_n>.each { |...|
  ...
}
Will the stored_object = stored_objects.where(:status => 'published') code really avoid hitting the database? (I ask because in my log file it seems a database query still runs for each iteration.) If not, how can I minimize database hits?
P.S.: in a few words, what I would like to do is work on the ActiveRecord::Relation (an array of Article objects), but the where method called on it seems to hit the database.
Rails has functionality to grab records from the database in chunks and then iterate over the rows without hitting the database again.
See "Retrieving Multiple Objects in Batches" for more information about find_each and find_in_batches.
Once you start iterating over stored_objects (if that's what you're doing), they'll be loaded from the database. If you want to load only the user's published articles, you could do this:
stored_objects = Article.where(:user_id => id, :status => 'published')
If you instead want to load published and unpublished articles and do something different with the published ones, you could do this:
stored_objects = Article.where(:user_id => id)
stored_objects.find_all { |a| a.status == 'published' }.each do |a|
  # ... do something with a published article
end
Or perhaps:
Article.where(:user_id => id).each do |article|
  case article.status
  when 'published'
    # ... do something with a published article
  else
    # ... do something with an article that's not published
  end
end
Each of these examples performs only one database query. Which one to choose depends on the data you actually want to work with.
