I'm using the Rails memory cache like this in one controller:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    puts 'Building cache...'
    form_config = s3_read_object('form_config.js')
    return JSON.parse(form_config)
  end
end
This works fine in the controller where it is defined, but when I try to read the value from another controller, it returns nil. Can anyone explain what might be going on? Here is how I am trying to read it in the other controller:
form_config = Rails.cache.read('form_config')
Your code doesn't actually ever cache anything: return returns from the whole method, so the part of fetch that stores values in the cache never executes, and there is nothing for your call to read to return.
You could either use next or nothing at all:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    form_config = s3_read_object('form_config.js')
    JSON.parse(form_config)
  end
end
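For completeness, the next variant behaves the same way: next ends the block and hands its value to fetch without leaving the method.
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    form_config = s3_read_object('form_config.js')
    next JSON.parse(form_config)
  end
end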
Related
I have a method that returns a list of IDs. The Rails caching guide shows that an expiration can be set on cached results, but I have implemented my caching somewhat differently.
def provide_book_ids(search_param)
  @returned_ids ||= begin
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_param
    search.fetch(true)
    search.map(&:id)
  end
end
What is the recommended way to set a 10 minute cache expiry when it is written as above?
def provide_book_ids(search_param)
  @returned_ids = Rails.cache.fetch("zendesk_ids", expires_in: 10.minutes) do
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_param
    search.fetch(true)
    search.map(&:id)
  end
end
I am assuming this code is part of a request-response cycle and not something else (for example a long-running worker, or some class that is initialized once in your app). In such a case you wouldn't want to use @returned_ids directly but would instead call provide_book_ids to get the value; from what I understand that's not your scenario, so the approach above should work.
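For illustration only, the long-running case might look something like this (BookIdsWorker is made up for the example, and it assumes provide_book_ids is available to the worker):
class BookIdsWorker
  # Hypothetical long-running worker: always call provide_book_ids, which goes
  # through Rails.cache.fetch and honours the 10 minute expiry, rather than
  # reading a memoized @returned_ids that would never refresh.
  def perform(search_param)
    ids = provide_book_ids(search_param)
    # ... work with ids
  end
end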
I want to make it so that if it's DirectReports it shouldn't get the value from the Rails cache. I tried to just return it, but that doesn't seem to work.
# don't get it from cache if DirectReports
when PerformanceManagement::V0::EmployeeFilter::DirectReports
  return
else
  filtered_results = loaded_records
end
Please don't be mean, it's my first time posting.
For such things you can create a method that just acts as a wrapper to handle the cache-or-not condition:
def main_logic_here
  # ... expensive calculation
end

def main_logic_wrapper
  if skip_cache_condition # e.g. your DirectReports check
    main_logic_here
  else
    Rails.cache.fetch(...) { main_logic_here }
  end
end
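Applied to your case, the wrapper might look something like this; filter is a placeholder for however you hold the current filter, loaded_records stands in for the expensive lookup from your snippet, and the cache key is just an example:
def filtered_results
  if filter == PerformanceManagement::V0::EmployeeFilter::DirectReports
    loaded_records # skip the cache entirely for direct reports
  else
    Rails.cache.fetch("filtered_results_#{filter}") { loaded_records }
  end
end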
I've just tried to implement caching for example.com/communities?sort=popular. My code is below, but caching doesn't seem to be working: it looks like the SQL is still sent every time the page reloads. What's wrong?
Also, after a user creates or edits a "Community" record, I'd like to clear all stored caches whose keys contain the string "community_index_sort_by_".
config/environments/development.rb
...
config.consider_all_requests_local = true
config.action_controller.perform_caching = true
config.cache_store = :dalli_store
...
community_controller.rb
def index
  @key = "community_index_sort_by_" + params[:sort].to_s + "_page_" + params[:page].to_s
  if params[:sort] == 'popular'
    unless read_fragment(:controller => "communities", :action => "index", :action_suffix => @key)
      @communities = Community.scoped.page(params[:page]).order("cached_votes_up DESC")
    end
  elsif params[:sort] == 'latest'
    @communities = Community.scoped.page(params[:page]).order("created_at DESC")
  end
end
I haven't touched the view.
The code you've shown only attempts to read from the cache; it never stores anything in it. If you want to populate the cache when no value is found (i.e., on a cache miss), you can use Rails.cache.fetch rather than read_fragment. fetch returns the cached value if one exists; otherwise, if a block was passed, the block is run on the cache miss and its return value is stored in the cache. For instance, the relevant part of your code snippet would be something like:
@communities = Rails.cache.fetch(["communities", "index", @key]) do
  Community.scoped.page(params[:page]).order("cached_votes_up DESC")
end
The recommended approach for expiring cached data when an object is modified is to have the cache key include some piece of data that changes whenever the object is modified. This is commonly an updated_at timestamp, which ActiveRecord automatically updates when the object is saved. updated_at also has the advantage of being used automatically when the object itself is part of the cache key (e.g., a cache key of @community results in something like communities/1-20130116113736). This usually requires a small amount of restructuring so that a relevant object is available to use in the cache key. David Heinemeier Hansson discusses this in more detail; step 5 in particular is most relevant to what I've mentioned here.
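Applied to your index action, a minimal sketch of that idea could look like the following. It uses Community.maximum(:updated_at) as the changing part of the key (one extra, cheap query per request), and the exact key layout is just an example:
def index
  order = params[:sort] == 'popular' ? "cached_votes_up DESC" : "created_at DESC"
  cache_key = [
    "communities", "index", params[:sort].to_s, params[:page].to_s,
    Community.maximum(:updated_at).to_i # changes whenever a Community is created or edited
  ]
  @communities = Rails.cache.fetch(cache_key) do
    Community.scoped.page(params[:page]).order(order)
  end
end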
Let's suppose I have a method like this:
def foo
  Rails.cache.fetch("cache_key", :expires_in => 60.minutes) do
    return_something
  end
end
return_something sometimes returns a nil value. When this happens, I don't want the nil value to be cached for 60 minutes. Instead, the next time I call foo, I want the block passed to fetch to be executed again.
Does Rails.cache.fetch work like this by default, or do I have to implement this functionality myself?
Update (with Answer)
Turns out, the answer was no, at least when using Memcached.
It depends on the implementation of the cache store you are using. I would say it should not cache nil values, but empty strings are fine to cache.
Look at the Dalli store implementation, for instance:
def fetch(name, options=nil)
  options ||= {}
  name = expanded_key name
  if block_given?
    unless options[:force]
      entry = instrument(:read, name, options) do |payload|
        payload[:super_operation] = :fetch if payload
        read_entry(name, options)
      end
    end
    if !entry.nil?
      instrument(:fetch_hit, name, options) { |payload| }
      entry
    else
      result = instrument(:generate, name, options) do |payload|
        yield
      end
      write(name, result, options)
      result
    end
  else
    read(name, options)
  end
end
The updated answer to this question is: by default fetch caches nil values, but with the dalli_store engine you can avoid that via the cache_nils option:
Rails.cache.fetch("cache_key", expires_in: 60.minutes, cache_nils: false) do
  return_something
end
Worth noting: the defaults for Dalli have changed in recent years, and the nil-caching flag is currently false by default. See https://github.com/petergoldstein/dalli.
It's definitely worth adding a test to check that your setup does what you expect (especially for production mode).
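For example, a minimal sketch of such a test, assuming the test environment is pointed at the same cache store you use in production (the key name here is arbitrary):
test "fetch does not cache nil results" do
  Rails.cache.delete("nil_check_key")
  calls = 0
  2.times { Rails.cache.fetch("nil_check_key") { calls += 1; nil } }
  # If nil values are not cached, the block runs on both calls;
  # if they are cached, it runs only once and this assertion fails.
  assert_equal 2, calls
end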
I have a class with this method:
def telecom_info
  Rails.cache.fetch("telecom_info_for_#{ref_num}", :expires_in => 3.hours) do
    info = Hash.new(0)
    Telecom::SERVICES.each do |source|
      results = TelecomUsage.find(:all,
        :joins => [:telecom_invoice => { :person => :org_person }],
        :conditions => "dotted_ids like '%#{ref_num}%' and telecom_usages.ruby_type = '#{source}'",
        :select => "avg(charge) #{source.upcase}_AVG_CHARGE,
                    max(charge) #{source.upcase}_MAX_CHARGE,
                    min(charge) #{source.upcase}_MIN_CHARGE,
                    sum(charge) #{source.upcase}_CHARGE,
                    avg(volume) #{source.upcase}_AVG_VOLUME,
                    max(volume) #{source.upcase}_MAX_VOLUME,
                    min(volume) #{source.upcase}_MIN_VOLUME,
                    sum(volume) #{source.upcase}_VOLUME
                   ")
      results = results.first
      ['charge', 'volume'].each do |source_type|
        info["#{source}_#{source_type}".to_sym] = results.send("#{source}_#{source_type}".downcase).to_i
        info["#{source}_min_#{source_type}".to_sym] = results.send("#{source}_min_#{source_type}".downcase).to_i
        info["#{source}_max_#{source_type}".to_sym] = results.send("#{source}_max_#{source_type}".downcase).to_i
        info["#{source}_avg_#{source_type}".to_sym] = results.send("#{source}_avg_#{source_type}".downcase).to_i
      end
    end
    return info
  end
end
As you can see, this is an expensive call, and it is called a lot for each request, so I want to cache it. The problem is that memcached does not seem to work; in the log file I am getting:
Cache read: telecom_info_for_60000000
Cache miss: telecom_info_for_60000000 ({})
The weird thing is that I know memcached is working since it does cache the results of some other functions I have in another model.
Any suggestions? I am running Rails 2.3.5 on REE 1.8.7
Replace return info with info.
Rails.cache.fetch("telecom_info_for_#{ref_num}", :expires_in => 3.hours) do
  # ...
  info
end
The return keyword always returns from the current method, which means that info is never handed back to your call to Rails.cache.fetch, nor is the rest of that method ever executed. When the last statement is simply info, that is the value given to Rails.cache.fetch, and the method can finish its job by storing the value in the cache.
Compare the following:
def my_method
  1.upto(3) do |i|
    # Calling return immediately causes Ruby to exit the current method.
    return i
  end
end

my_method
#=> 1
As a rule of thumb: always omit return unless you really mean to exit the current block and return from the current method.
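For contrast, the same loop without return runs to completion, and the method simply returns its last expression:
def my_method_without_return
  result = nil
  1.upto(3) do |i|
    result = i # no return, so all three iterations run
  end
  result
end

my_method_without_return
#=> 3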