I have a class with this method:
def telecom_info
  Rails.cache.fetch("telecom_info_for_#{ref_num}", :expires_in => 3.hours) do
    info = Hash.new(0)
    Telecom::SERVICES.each do |source|
      results = TelecomUsage.find(:all,
        :joins      => [:telecom_invoice => { :person => :org_person }],
        :conditions => "dotted_ids like '%#{ref_num}%' and telecom_usages.ruby_type = '#{source}'",
        :select     => "avg(charge) #{source.upcase}_AVG_CHARGE,
                        max(charge) #{source.upcase}_MAX_CHARGE,
                        min(charge) #{source.upcase}_MIN_CHARGE,
                        sum(charge) #{source.upcase}_CHARGE,
                        avg(volume) #{source.upcase}_AVG_VOLUME,
                        max(volume) #{source.upcase}_MAX_VOLUME,
                        min(volume) #{source.upcase}_MIN_VOLUME,
                        sum(volume) #{source.upcase}_VOLUME")
      results = results.first
      ['charge', 'volume'].each do |source_type|
        info["#{source}_#{source_type}".to_sym]     = results.send("#{source}_#{source_type}".downcase).to_i
        info["#{source}_min_#{source_type}".to_sym] = results.send("#{source}_min_#{source_type}".downcase).to_i
        info["#{source}_max_#{source_type}".to_sym] = results.send("#{source}_max_#{source_type}".downcase).to_i
        info["#{source}_avg_#{source_type}".to_sym] = results.send("#{source}_avg_#{source_type}".downcase).to_i
      end
    end
    return info
  end
end
As you can see, this is an expensive call, and it is called a lot on each request, so I want to cache it. The problem is that memcached does not seem to work; in the log file I am getting:
Cache read: telecom_info_for_60000000
Cache miss: telecom_info_for_60000000 ({})
The weird thing is that I know memcached is working since it does cache the results of some other functions I have in another model.
Any suggestions? I am running Rails 2.3.5 on REE 1.8.7
Replace return info with info.
Rails.cache.fetch("telecom_info_for_#{ref_num}", :expires_in=> 3.hours) do
# ...
info
end
The return keyword always returns from the enclosing method, which means that info is never handed back to your Rails.cache.fetch call and the rest of fetch never executes, so nothing gets stored. When the block's last expression is simply info, that value is what Rails.cache.fetch receives, and the method can finish its job by writing the value to the cache and returning it.
Compare the following:
def my_method
  1.upto(3) do |i|
    # Calling return immediately causes Ruby to exit the current method.
    return i
  end
end

my_method
#=> 1
As a rule of thumb: always omit return unless you really mean to exit the current block and return from the current method.
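For the caching case specifically, this is exactly what decides whether anything gets written to the cache. A minimal sketch (the key and method names are just illustrative):

def cached_with_return
  Rails.cache.fetch("demo_key") { return 1 } # returns from cached_with_return; fetch never writes anything
end

def cached_without_return
  Rails.cache.fetch("demo_key") { 1 } # the block's value, 1, is written to the cache and returned
end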
Related
I have a function that returns a list of IDs. In the Rails caching guide I can see that an expiry can be set on cached results, but I have implemented my caching somewhat differently.
def provide_book_ids(search_param)
  @returned_ids ||= begin
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_str
    search.fetch(true)
    search.map(&:id)
  end
end
What is the recommended way to set a 10-minute cache expiry when the code is written as above?
def provide_book_ids(search_param)
  @returned_ids = Rails.cache.fetch("zendesk_ids", expires_in: 10.minutes) do
    search = client.search(query: search_param, :reload => true)
    search.fetch
    search.options[:query] = search_str
    search.fetch(true)
    search.map(&:id)
  end
end
I am assuming this code is part of a request-response cycle and not something else (for example a long-running worker, or a class that is initialized once in your app). In such a case you wouldn't want to use @returned_ids directly but would instead call provide_book_ids to get the value; from what I understand that's not your scenario, so the approach above should work.
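One caveat, which is an assumption on my part about your data: with the fixed "zendesk_ids" key, every search_param shares the same cached list. If the IDs differ per search, you could scope the key to the parameter, for example:

@returned_ids = Rails.cache.fetch("zendesk_ids/#{search_param}", expires_in: 10.minutes) do
  # ... same search code as above ...
end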
I want to make it so that if the filter is DirectReports, it shouldn't get the results from the Rails cache. I tried to just return, but that doesn't seem to work.

# don't get it from the cache if DirectReports
when PerformanceManagement::V0::EmployeeFilter::DirectReports
  return
else
  filtered_results = loaded_records
end

Please don't be mean, it's my first time posting.
For such things you can create a method that just acts as a wrapper to handle the cache-or-not condition:
def main_logic_here
  # ... expensive calculation
end

def main_logic_wrapper
  if true # replace with your "skip the cache" condition, e.g. the DirectReports check
    main_logic_here
  else
    Rails.cache.fetch(...) { main_logic_here }
  end
end
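Applied to the filter from your question, a minimal sketch could look like this (load_filtered_records, the cache key, and the expiry are assumptions, since the surrounding code isn't shown):

def filtered_results(filter)
  if filter == PerformanceManagement::V0::EmployeeFilter::DirectReports
    # Direct reports: always compute fresh, bypassing the cache
    load_filtered_records
  else
    Rails.cache.fetch("filtered_results/#{filter}", expires_in: 1.hour) do
      load_filtered_records
    end
  end
end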
I am using the Rails memory cache like this in one controller:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    puts 'Building cache...'
    form_config = s3_read_object('form_config.js')
    return JSON.parse(form_config)
  end
end
This is working fine in the controller where it is defined, but when I try to read the value from another controller, it returns nil. Can anyone explain what might be going on? Here is how I am trying to read it in the other controller:
form_config = Rails.cache.read('form_config')
Your code doesn't actually ever cache anything: return returns from the whole method, so the part of fetch that stores values in the cache never executes, and there is nothing for your call to read to return.
You could either use next or nothing at all:
def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    form_config = s3_read_object('form_config.js')
    JSON.parse(form_config)
  end
end
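If you prefer an explicit keyword, next works where return does not, because it only ends the block and hands its value back to fetch instead of exiting the whole method:

def form_config_cache
  Rails.cache.fetch("form_config", :expires_in => 12.hours) do
    form_config = s3_read_object('form_config.js')
    next JSON.parse(form_config)
  end
end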
Let's suppose I have a method like this:
def foo
  Rails.cache.fetch("cache_key", :expires_in => 60.minutes) do
    return_something
  end
end
return_something sometimes returns a nil value. When this happens, I don't want the nil value to be cached for 60 minutes. Instead, the next time I call foo, I want the block passed to fetch to be executed again.
Does Rails.cache.fetch work like this by default, or do I have to implement this behaviour myself?
Update (with Answer)
Turns out, the answer was no, at least when using Memcached.
It depends on the implementation of the cache store that you are using. I would say that it should not cache nil values, but empty strings are OK to cache.
Look at the Dalli store implementation, for example:
def fetch(name, options = nil)
  options ||= {}
  name = expanded_key name

  if block_given?
    unless options[:force]
      entry = instrument(:read, name, options) do |payload|
        payload[:super_operation] = :fetch if payload
        read_entry(name, options)
      end
    end

    if !entry.nil?
      instrument(:fetch_hit, name, options) { |payload| }
      entry
    else
      result = instrument(:generate, name, options) do |payload|
        yield
      end
      write(name, result, options)
      result
    end
  else
    read(name, options)
  end
end
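If you need the non-nil behaviour regardless of which store you are on, one store-agnostic option is to skip fetch and only write non-nil results yourself. A rough sketch using the key and expiry from the question:

def foo
  cached = Rails.cache.read("cache_key")
  return cached unless cached.nil?

  result = return_something
  # Only write when we actually got a value, so a nil result is retried on the next call.
  Rails.cache.write("cache_key", result, :expires_in => 60.minutes) unless result.nil?
  result
end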
The updated answer to this question is: by default fetch caches nil values, but using the dalli_store engine you can avoid that with the cache_nils option:
Rails.cache.fetch("cache_key", expires_in: 60.minutes, cache_nils: false) do
return_something
end
Worth noting: the defaults for Dalli have changed in recent years; the nil-caching flag is currently false by default. See https://github.com/petergoldstein/dalli
It's definitely worth adding a test to check that your setup does what you expect (especially in production mode).
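A quick sanity check along these lines (the key name is just a probe; run it against the store your app actually uses) tells you which behaviour you have:

Rails.cache.delete("nil_probe")
Rails.cache.fetch("nil_probe") { nil }
Rails.cache.exist?("nil_probe")
# => true if your store cached the nil, false if the block will run again next time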
I currently have the following code:
events.detect do |event|
  # detect runs the block for each event until it returns true
  self.event_status(event) == "no status"
end
What this does is return the event instance (events is a collection of different models that I collectively call Events) for which the event_status method returns "no status".
I would like the output to also include the value for delay where:
delay = delay + contact.event_delay(event)
The event_delay method hasn't been written yet, but it would be similar (maybe redundantly so, but I'll deal with that later) to event_status, in that it looks at the delay between when an event was done and when it was supposed to be done.
Here is how event_status looks currently for reference:
def event_status(target)
  # check Ticket #78 for source
  target_class = target.class.name
  target_id    = target_class.foreign_key.to_sym
  assoc_name   = "contact_#{target_class.tableize}"
  r = send(assoc_name).send("find_by_#{target_id}", target.id)
  return "no status" unless r
  "sent (#{r.date_sent.to_s(:long)})"
end
My idea of the output is [event, delay] so that, for example, I can access it as array[:event] or array[:delay] to get at each value.
I was thinking maybe I should use yield in a method, but I haven't quite put the pieces together (should the block passed to the method be the delay += part, for example? I think it should be).
I am not wedded to the .detect method; it's what I started with and it appears to work, but it isn't letting me keep the tally alongside it.
It's not entirely clear what you're asking for, but it sounds like you're trying to add up a delay until you reach a certain condition, and return the record that triggered the condition at the same time.
You might approach that using Enumerable#detect like you have, but by keeping a tally on the side:
def next_event_info
  delay = 0

  # detect returns the first event with "no status"; the tally is kept on the side
  next_event = events.detect do |event|
    case self.event_status(event)
    when "no status"
      true
    else
      delay += contact.event_delay(event)
      false
    end
  end

  [next_event, delay]
end
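Calling it then looks something like this, destructuring the returned pair:

next_event, delay = next_event_info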
Update, in case you want to add up the delays for all events but also find the first event whose status is "no status":
def next_event_info
  next_event = nil
  delay = 0.0

  events.each do |event|
    case self.event_status(event)
    when "no status"
      # Only assign to next_event if it has not been previously
      # assigned in this method call.
      next_event ||= event
    end

    # Tally up the delays for all events, converting to floating
    # point to ensure they're not native DB number types.
    delay += contact.event_delay(event).to_f
  end

  {
    :event => next_event,
    :delay => delay
  }
end
This will give you a Hash in return that you can interrogate as info[:event] or info[:delay]. Keep in mind not to abuse this method, though, for example:
# Each of these makes a method call, which is somewhat expensive
next_event = next_event_info[:event]
delay_to_event = next_event_info[:delay]
This makes two calls to the method, each of which iterates over all the records and does the calculations. If you need to use it this way, you might as well write a special-purpose method for each value, or cache the result in a variable and use that:
# Make the method call once, save the results
event_info = next_event_info
# Use these results as required
next_event = event_info[:event]
delay_to_event = event_info[:delay]