My current homepage displays all available categories and the number of posts in each category. This is having a performance hit on the website, and I was wondering if it could be cached at all?
I don't mind if the cache is a little out of date and the post counts aren't 100% accurate at every refresh, but I would like the query to run only every 30 minutes or so.
In Rails you can cache pretty much everything.
You can cache partials or queries. And you can expire them manually too.
For example
cache('categorylist') do
  render partial: 'such'
end
And in the Post model (post.rb), delete the cache entry after each post is created, forcing the fragment to be re-evaluated on the next request:
after_create {
  Rails.cache.delete('categorylist')
}
This way, your partial will only be evaluated (and written to cache) when a new post has been created. All other times, it will be fetched from cache.
ActiveRecord caches queries so that they don't have to be re-executed; however, this sort of caching does not persist between requests. View caching may work better for you - I'd look into fragment caching; the Rails docs for it can be found here: http://edgeguides.rubyonrails.org/caching_with_rails.html
The crux, though, is that you can cache objects as you render them, like this (from the doc mentioned above):
<% @products.each do |product| %>
  <% cache product do %>
    <%= render product %>
  <% end %>
<% end %>
This writes each product to a key/value store. It records the updated_at timestamp for the data so that you don't serve stale data.
Using the format from the above example, you could store the HTML listing the categories and the number of posts per category in the cache.
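For instance, a minimal sketch for the original question might look like the following; the Category model, its posts association, and the 30-minute expiry are taken from the question rather than from tested code:

<%# _categories.html.erb -- sketch only %>
<% cache('categorylist', expires_in: 30.minutes) do %>
  <ul>
    <% Category.all.each do |category| %>
      <li><%= category.name %> (<%= category.posts.size %> posts)</li>
    <% end %>
  </ul>
<% end %>

Combined with the after_create callback above, the fragment is refreshed either when a post is created or after 30 minutes, whichever comes first.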
Related
From what I understand of Russian doll caching in Rails, it would be detrimental to eager load related objects or object lists when we are doing RDC (Russian doll caching), because in RDC we just load the top-level object from the database, look up its cached rendered template, and serve it. Eager loading related object lists would be wasted work if the cache is not stale.
Is my understanding correct? If yes, how do we make sure that we eager load all the related objects on the very first call, so as not to pay the cost of N+1 queries during the very first load (when the cache is not warm)?
Correct - when loading a collection or a complicated object with many associations, the costly call to eager load all objects and associations can be avoided by making a fast, simple call first.
The Rails guide for caching does have a good example; however, it's split up a bit. Looking at the common use case of caching a collection (i.e. the index action in Rails):
<% cache("products/all-#{Product.maximum(:updated_at).try(:to_i)}") do %>
All available products:
<% Product.all.each do |p| %>
<% cache(p) do %>
<%= link_to p.name, product_url(p) %>
<% end %>
<% end %>
<% end %>
This (condensed) example does one simple DB call, Product.maximum(:updated_at), to build the cache key, avoiding the much more expensive Product.all whenever the outer fragment is still warm.
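For a sense of scale, that call compiles down to a single aggregate query; roughly, in a Rails console (exact SQL quoting depends on your database adapter):

# Rails console -- illustrative output only
Product.maximum(:updated_at)
# SELECT MAX("products"."updated_at") FROM "products"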
For a cold cache (the second question), it is important to avoid N+1s by eager-loading associated objects; we know we need to make this expensive call because the first cache read for the collection missed. In Rails, this is typically done using includes. If a Product has many Orders, then something like:
<% cache("products/all-#{Product.maximum(:updated_at).try(:to_i)}") do %>
All available products:
<% Product.includes(:orders).all.each do |p| %>
<% cache(p) do %>
<%= link_to p.name, product_url(p) %>
Bought at:
<ul>
<% p.orders.each do |o| %>
<li><%= o.created_at.to_s %></li>
<% end %>
</ul>
<% end %>
<% end %>
<% end %>
In the cold cache case we still do a cache read for the collection and for each member; however, in the partially warm cache case we will skip rendering for a portion of the members. Note that this strategy relies on a Product's associations being correctly set up to touch the product when associated objects are updated.
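With the Product/Order example above, that setup would look roughly like this (a sketch; the exact association options depend on your schema):

# order.rb -- sketch
class Order < ActiveRecord::Base
  # Creating or updating an order bumps the product's updated_at,
  # which changes the product's cache_key and expires its fragment.
  belongs_to :product, touch: true
end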
Update: This blog post describes a more complex pattern to further optimize building responses for partially cached collections. Instead of rebuilding the entire collection, it bulk fetches all available cached values then does a bulk query for the remaining values (and updates the cache). This is helpful in a couple of ways: the bulk cache read is faster than N+1 cache reads and the bulk query to the DB to build the cache is also smaller.
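As a rough sketch of the bulk-read half of that idea (assuming your Rails version has Rails.cache.fetch_multi; render_product_row is a hypothetical helper, and a full implementation would also batch the DB query for the missing records):

# Sketch only: one bulk cache read; the block runs just for the keys that missed.
def cached_product_rows(products)
  by_key = products.index_by(&:cache_key)
  Rails.cache.fetch_multi(*by_key.keys) do |missed_key|
    render_product_row(by_key[missed_key])  # hypothetical per-row renderer
  end.values
end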
Let's say we have a Post model.
At the beginning of a view I need to display the total number of posts, typically with
@posts.size
Later in the view I need to display all the posts, typically with
@posts.each do |post|
end
This results in two queries.
If I had done the queries in the opposite order, it would have resulted in one single query (presumably utilizing the query CACHE).
Is there any "trick" where I can achieve the (first) mentioned order with only one db query?
You can use, for example, fragment caching:
<% cache 'posts_size' do %>
  <%= @posts.size %>
<% end %>
Then when a Post is created you can expire the fragment
expire_fragment 'posts_size'
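For example, from the controller action that creates posts (a sketch; only expire_fragment comes from the answer, the rest of the action is assumed):

# posts_controller.rb -- sketch
def create
  @post = Post.new(post_params)
  if @post.save
    expire_fragment 'posts_size'  # drop the cached count so it is recalculated
    redirect_to @post
  else
    render :new
  end
end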
I currently use a query of the form
@recordsfound = Model.where(...)
This is located in my controller and returns all records matching the query. I am using pagination to show a limited number of records at a time. However, when I select to show the next page the query is run again. Is there a way to store the records returned in a variable other than an instance variable and therefore not require the query to be rerun?
Thanks a lot guys
I'm not entirely sure I understand what you mean, but if you want the query to only run once per page even on subsequent visits, you can use fragment caching.
Rails will lazy load, and then use the cache when it hits the query in the view.
<% results.each do |result| %>
  <% cache result do %>
    <%= result.foo %>
  <% end %>
<% end %>
If you have any dependent models, you'll have to make sure you expire the cache when they get updated if necessary:
belongs_to :result, touch: true
Note that if you are in the development environment the query will still run. You can change this in your development.rb config file. If you do change this setting, don't forget to revert it; otherwise strange things will happen and you'll waste time trying to figure out why your changes aren't visible.
config.action_controller.perform_caching = true
I'm using fragment caching, but I have inline code that is user-specific, like:
<% cache @page do %>
  stuff here
  <% if current_user %>
    user specific
  <% end %>
  more here
<% end %>
So I want to exclude the several blocks of code that are user-specific. Is there a way to do that in Rails, or should I make an if statement at the beginning and use different caches for logged-in users and regular visitors? (I will have major code duplication this way.)
For per-user fragments, you can put the models in an array:
<% cache [@page, current_user] do %>
Rails will make a cache-key out of them, like:
pages/page_id-page_timestamp/users/user_id-user_timestamp
This way your fragments will be invalidated on a user/page update, since the timestamps come from their updated_at (see cache_key for details).
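Applied to the snippet from the question, that looks like this; the whole fragment, including the user-specific branch, is then cached once per page and user:

<% cache [@page, current_user] do %>
  stuff here
  <% if current_user %>
    user specific
  <% end %>
  more here
<% end %>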
I have a home page which has some partials rendered all over the page. It also has a session header (login).
The partial contains a paginated set of books. Now I want to cache this partial, as it is only updated once a week.
Question 1: How do I cache that particular partial (without hitting the DB)?
Question 2: How do I delete (expire) the cached content when I update the Books model?
You're looking for fragment caching here, which occurs on the view layer. Fragment caching and expiration of stored contents is surprisingly easy to do. You have a list of books, so let's say your view looks a bit like this:
<ul>
  <% @books.each do |book| %>
    <li><%= book.name %></li>
  <% end %>
</ul>
To enable caching for just this bit, simply wrap it in cache:
<% cache do %>
  <ul>
    <% @books.each do |book| %>
      <li><%= book.name %></li>
    <% end %>
  </ul>
<% end %>
Of course, this doesn't name the cache or do anything really special with it... while Rails will auto-select a unique name for this cache fragment, it won't be really helpful. We can do better. Let's use DHH's key-based cache expiration technique and give the cache a name relating to its content.
<% cache ['book-list', *@books] do %>
  <ul>
    <% @books.each do |book| %>
      <li><%= book.name %></li>
    <% end %>
  </ul>
<% end %>
Passing arguments into cache builds the cache key from the supplied arguments. Strings are passed in directly -- so, here, the cache key will always be prefaced with 'book-list'. This is to prevent cache collisions with other places you might be caching the same content, but with a different view. For each member of the @books array, Rails will call cache_key: for ActiveRecord objects, this yields a string composed of its model name, ID, and, crucially, the last time the object was updated.
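For illustration, in a Rails console (the exact timestamp format varies between Rails versions):

book = Book.first
book.cache_key  # => something like "books/1-20140615151342"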
This means that when you update the object, the cache key for this fragment will change. In other words, it's automatically getting expired -- when a book is updated, this cache statement will search for a nonexistent key, conclude it doesn't exist, and populate it with new content. Old, stale content will linger in your cache store until evicted by memory or age constraints (memcached does this automatically).
I use this technique in a number of production applications and it works wonderfully. For more information, check out that 37signals post, and for general caching information in Rails, see the Ruby on Rails caching guide.
"There are only two hard problems in Computer Science:
cache invalidation and naming things."
-- Phil Karlton
Caching
The Rails Guide to caching is probably always a good entry point for the built-in caching strategies Rails has to offer. Anyway, here is my very easy approach to caching:
# _partial.html.erb
<% cache(some_key, :expires_in => 1.week) do %>
  <%# ... content %>
<% end %>
Now some_key can be any string as long as it is unique. But then again, let's try to be a bit more clever about it and make the key somehow dependent on the list of books. Say you actually pass in the array of books some query returned; Rails then calls cache_key on each of its entries and eventually constructs a unique key for this collection. So when the collection changes, the key changes. That's because cache_key is implemented in ActiveRecord::Base and is thus available on all models. Furthermore, it even uses the timestamps if available.
But then again, this will hit the DB every time a request is made.
The same in code:
# controller_method
@weekly_books = Book.where 'condition'

# _partial.html.erb
<% cache(@weekly_books) do %>
  <%# ... content %>
<% end %>
To avoid hitting the DB too often, you can also cache the query itself by wrapping the call:
# Book.rb
def self.weeklies
  Rails.cache.fetch("book_weeklies", :expires_in => 1.day) do
    Book.where 'condition'
  end
end

# controller_method
@weekly_books = Book.weeklies

# _partial.html.erb
<% cache(@weekly_books) do %>
  <%# ... content %>
<% end %>
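If you would rather refresh the cached query as soon as a book changes instead of waiting out the 1-day expiry, you could also delete the entry from the model, following the same pattern as the after_create example at the top (a sketch):

# Book.rb -- sketch
after_commit { Rails.cache.delete("book_weeklies") }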