Rails: Skip controller if cache fragment exists (with cache_key) - ruby-on-rails

I have been using caching for a long time and recently discovered that my fragment caching no longer stops my controller from executing its code, as it used to. I have boiled the problem down to the cache_key, which seems to be a new feature.
This is my previous solution that no longer works as expected.
Product#show view:
cache('product-' + @product.id.to_s) do
  # View stuff
end
Product#show controller:
unless fragment_exist?('product-19834') # ID obviously dynamically loaded
  # Perform slow calculations
end
The caching works fine. It writes and reads the fragment, but it still executes the controller code (skipping that is the whole reason I want to use caching). This boils down to the fact that the fragment gets an added unique digest, so the fragment created is something like:
views/product-19834/b05c4ed1bdb428f73b2c73203769b40f
So when I call fragment_exist? I am not checking for the right string (since I am checking for 'views/product-19834'). I have also tried to use:
fragment_exist?("product-#{@product.id}/#{@product.cache_key}")
but it checks against a different cache key than the one that is actually created.
I would rather use this solution than controller-caching or gems like interlock.
My question is:
- How do I, in the controller, check whether a fragment exists for a specific view, given this cache key?

As Kelseydh pointed out in the link, the solution to this is to use skip_digest: true in the cache call:
View
cache("product-" + @product.id.to_s, skip_digest: true)
Controller
fragment_exist?("product-#{@product.id}")

It might be worth pointing out that while the proposed solution (fragment_exist?) could work, it's more of a hack.
In your question, you say
It writes and reads the fragment, but it still executes the controller
(which is the whole reason I want to use caching)
So what you actually want is "controller caching". But fragment caching is "view caching":
Fragment Caching allows a fragment of view logic to be wrapped in a
cache block and served out of the cache store
(Rails Guides 5.2.3)
For "controller caching", Rails already provides some options:
Page Caching
Action Caching
Low-Level Caching
All of which are, from my point of view, better suited to your particular use case.
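For example, with low-level caching the slow work stays in the controller but only runs on a cache miss. A rough sketch (the key and the slow_calculations method are illustrative, not your actual code):

def show
  @product = Product.find(params[:id])

  # Rails.cache.fetch runs the block only on a cache miss and stores its result under the key.
  @calculations = Rails.cache.fetch([@product, "slow-calculations"]) do
    @product.slow_calculations # hypothetical expensive method
  end
end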

Related

Rails 4 / Heroku smart expire cache

We have in our application some blocks which are cached.
According to some logic we sometimes modify some of them, and in this case we have logic that expires the relevant blocks.
When we change the code, we need to expire these blocks via the console. In that case we need to detect, and be precise about, the exact logic in order to expire all modified blocks. For example, if we change the header HTML of closed streams, it will look like:
a = ActionController::Base.new
Stream.closed.each {|s| a.expire_fragment("stream_header_#{s.id}") }; nil
Actually, I think there must be a more generic way to simply compare cached blocks with how they would be rendered now, and expire only the blocks whose HTML differs from their cached version.
I wonder if there is a gem that does this task, and if not - if somebody has already written some deploy hook to do it.
============== UPDATE ============
Some thoughts:
In a rake task one can get the cached fragment, as long as you know which fragments you have.
For example, in my case I can do:
a = ActionController::Base.new
Stream.find_each do |s|
  cached_html = a.read_fragment("stream_header_#{s.id}")
  :
  :
If I could generate the non-cached HTML I could simply compare them, and expire the cached fragment if they differ.
Is it possible?
How heavy do you think this task will be?
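Roughly, the idea above could look like the following rake task. This is an untested sketch: it assumes Rails 5+'s ActionController::Base.render for rendering outside a request (on Rails 4 you would need something like the render_anywhere gem), and a hypothetical partial app/views/streams/_header.html.erb.

namespace :cache do
  desc "Expire stream header fragments whose cached HTML no longer matches"
  task expire_stale_stream_headers: :environment do
    controller = ActionController::Base.new

    Stream.closed.find_each do |stream|
      cached_html = controller.read_fragment("stream_header_#{stream.id}")
      next if cached_html.nil?

      # Render the fragment fresh, outside of a request (partial name is hypothetical).
      fresh_html = ActionController::Base.render(
        partial: "streams/header",
        locals:  { stream: stream }
      )

      # Expire only the fragments whose HTML actually changed.
      controller.expire_fragment("stream_header_#{stream.id}") if cached_html != fresh_html
    end
  end
end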
Not at all easy to answer with so little code.
You can use
cache @object do
  render something
end
So based on the hash of the object, the cache will invalidate itself. This is also true for the template being rendered, as Rails will create a hash of it as well and combine it with the hash of the object to invalidate the fragment properly. This can also work at a deeper level, and in that way it is possible to invalidate an entire branch of the render tree.
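Nested, that gives you Russian doll caching. A sketch (not your markup; it assumes a hypothetical Post parent and Stream belongs_to :post, touch: true, so a stream change also re-keys the outer fragment):

<% cache @post do %>
  <h1><%= @post.title %></h1>
  <% @post.streams.each do |stream| %>
    <% cache stream do %>
      <%= render "streams/header", stream: stream %>
    <% end %>
  <% end %>
<% end %>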
Let me point you toward the Rails documentation on Russian doll caching:
http://edgeguides.rubyonrails.org/caching_with_rails.html
There was also a great video series on caching by these guys:
https://www.codeschool.com/courses/rails-4-zombie-outlaws
They are free, but it looks like you have to register now.
I hope this is in the right direction for your needs.

Rails 4 Fragment Caching

I'm trying to increase performance for my app so I'm looking into fragment caching.
I'm trying to understand what to cache. For example, on all pages of my site I display a list of recent articles.
In my application controller I have a filter that sets:
@recent_articles = Article.get_recent
I have the following in my view/footer:
- cache(cache_key_for_recent_articles) do
  %h3 RECENT ARTICLES
  - @recent_articles.each do |article|
    .recent-article
      = link_to add_glyph_to_link("glyphicon glyphicon-chevron-right", article.name), article_path(article, recent: true)
  - if Article.count > 4
    = link_to "MORE ARTICLES", articles_path(), class: "btn btn-primary more-articles"
My question is: am I properly caching this? I'm tailing the logs, but I still see a query for the articles, so I'm assuming not. It's not clear to me what this accomplishes when I query in the controller but cache only a section of the page.
Is this a place for low level caching rather than fragment caching?
Thanks.
You're doing it right. It might seem silly, because it always has to make the db hit anyway, but the gains can be substantial. Imagine each article had threaded comments, with images. In that case, if you kept the controller exactly the same, the same caching construct would save you a tremendous amount of db effort.
So yes, if you can pull from memcached instead of running through Haml with a bunch of Rails helpers (those link_tos aren't free), you'll save a bit for sure, but the real gains come when you can subtly restructure your architecture (as lazily as possible) to really take advantage of it.
As for that initial hit on Article: your db should do a pretty good job of caching that call, so I'm not sure you would want to cache it too aggressively, in this case anyway, given the name of the method being called.
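For completeness, the cache_key_for_recent_articles helper the view relies on is usually built from a count plus the newest updated_at, so the fragment re-keys itself when the list changes. A sketch along the lines of the Rails guides (not your actual helper):

def cache_key_for_recent_articles
  count          = Article.count
  max_updated_at = Article.maximum(:updated_at).try(:utc).try(:to_s, :number)
  "recent-articles/#{count}-#{max_updated_at}"
end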

What is the right way to clear cache in Rails without sweepers

Observers and Sweepers are removed from Rails 4. Cool.
But what is the way to cache and clear the cache then?
I read about Russian doll caching. It's nice and all, but it only concerns the view rendering cache. It doesn't prevent the database from being hit.
For instance:
<% cache @product do %>
  Some HTML code here
<% end %>
You still need to get @product from the db to get its cache_key. So page or action caching can still be useful to prevent unnecessary load.
I could use some timeout to clear the cache periodically, but what for, if the records didn't change?
At least with sweepers you have control over that aspect. What is/will be the right way to do caching and to clear it?
Thanks ! :)
Welcome to one of the two hard problems in computer science, cache invalidation :)
You would have to handle that manually, since the logic for when a cached object should be invalidated (unlike a cached view, which can simply be derived from the objects it displays) is application and situation dependent.
Your go-to method for this is Rails.cache.fetch. It takes three arguments: the cache key, an options hash, and a block. It first tries to read a valid cache record based on the key; if that key exists and hasn't expired, it returns the value from the cache. If it can't find a valid record, it instead takes the return value of the block and stores it in the cache under your specified key.
For example:
@models = Rails.cache.fetch my_cache_key do
  Model.where(condition: true).to_a
end
This will cache the block and reuse the result until something (tm) invalidates the key, forcing the block to be re-evaluated. Also note the .to_a at the end of the method chain. Normally Rails would return an ActiveRecord relation object, and that relation is what would be cached; the query would then only be evaluated when you tried to use @models for the first time, neatly sidestepping the cache. The .to_a call forces Rails to load the records and ensures that it is the result we cache, not the query.
Now that you have your caching on and never talk to the database again, we have to make sure we cover the other end: invalidating the cache. This is done with the Rails.cache.delete method, which simply takes a cache key and removes it, causing a miss the next time you try to fetch it. You can also use the force: true option with fetch to force a re-evaluation of the block. Whichever suits you.
The science of it all is where to call Rails.cache.delete; in the naïve case this would be on update and delete for a single instance, and on update, delete, and create on any member for a collection. There will always be corner cases, and they are always application specific, so I can't help you much there.
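A common place to hang that delete is a model callback. A minimal sketch, assuming my_cache_key above was the literal string "models/all" (the names are illustrative):

class Model < ActiveRecord::Base
  after_commit :flush_cached_collection

  private

  # Remove the cached collection whenever a record changes, so the next
  # Rails.cache.fetch re-runs its block.
  def flush_cached_collection
    Rails.cache.delete("models/all")
  end
end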
I assume in this answer that you will set up some sane cache store, like memcached or Redis.
Also remember to add this to config/environments/development.rb:
config.cache_store = :null_store
or your development environment will cache and you will end up hairless from frustration.
For further reference read: Everyone should be using low level caching in Rails and The rails API docs
It is also worth noting that the functionality is not removed from Rails 4, merely extracted into a gem. If you need or would like the full features of the sweepers, simply add them back to your app with a gem 'rails-observers' line in your Gemfile. That gem contains both the sweepers and observers that were removed from Rails 4 core.
I hope that helps you get started.

How do I select a result set without caching in Rails

I want to run non-cached queries using Rails. Any help?
Rails is only going to cache the query result DURING the request-response cycle, so it's only going to live for the duration of the request.
You should be able to use the uncached command
uncached do
  find(blah)
end
at least within a controller action.
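uncached is also available as a class method on your models (it comes from ActiveRecord::Base), so outside a controller a sketch could look like this (Product and its scope are illustrative):

# Bypass the per-request SQL query cache for just this block.
fresh_products = Product.uncached do
  Product.where(active: true).to_a
end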
Another (BAD!) way to do it is to add a query clause that busts the cache
where("name = ? OR 0 = ?", 'smith', rand(100))
or similar, so that you get something like OR 0 = 42, which is (almost) always false and different for each request.
You really want to use uncached though :)

Rails 3: Caching to Global Variable

I'm sure "global variable" will get the hair on the back of everyone's neck standing up. What I'm trying to do is store a hierarchical menu in an acts_as_tree data table (done). In application_helper.rb, I create an html menu by querying the database and walking the tree (done). I don't want to do this for every page load.
Here's what I tried:
application.rb
config.menu = nil
application_helper.rb
def my_menu_builder
  return MyApp::Application.config.menu if MyApp::Application.config.menu

  # All the menu building code that should only run once
  MyApp::Application.config.menu = menu_html
end
menu_controller.rb
def create
  # whatever create code
  expire_menu_cache
end

protected

def expire_menu_cache
  MyApp::Application.config.menu = nil
end
Where I stand right now is that on first page load, the database is, indeed, queried and the menu built. The results are stored in the config variable and the database is never again hit for this.
It's the cache expiration part that's not working. When I reset the config.menu variable to nil, presumably the next time through my_menu_builder it should detect that change and rebuild the menu, caching the new results. That doesn't seem to happen.
Questions:
Is Application.config a good place to store stuff like this?
Does anyone see an obvious flaw in this caching strategy?
Don't say premature optimization -- that's the phase I'm in. The premature-optimization iteration :)
Thanks!
I would avoid global variables, and use Rails' caching facilities.
http://guides.rubyonrails.org/caching_with_rails.html
One way to achieve this is to set an empty hash in your application.rb file:
MY_VARS = {}
Then you can add whatever you want in this hash which is accessible everywhere.
MY_VARS[:foo] = "bar"
and elsewhere:
MY_VARS[:foo]
As you felt, this is not the Rails way to behave, even if it works. There are different ways to use caching in Rails:
simple cache in memory explained here:
Rails.cache.read("city") # => nil
Rails.cache.write("city", "Duckburgh")
Rails.cache.read("city") # => "Duckburgh"
use of a real engine like memcached
I encourage you to have a look at http://railslab.newrelic.com/scaling-rails
This is THE place to learn caching in all its shapes.
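Applied to the menu from the question, a minimal sketch with Rails.cache instead of a global (the cache key and the build_menu_html method are illustrative, not your actual code):

# application_helper.rb
def my_menu_builder
  Rails.cache.fetch("main_menu") do
    # The menu building code only runs on a cache miss
    build_menu_html # hypothetical method containing the tree walk
  end
end

# menu_controller.rb
def expire_menu_cache
  Rails.cache.delete("main_menu")
end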
