I have a Rails app that allows users to create an unlimited number of static pages (I store these pages in a Postgres database). Because these pages are static, I would like to use page caching, but I am not sure what the limit is on the number/size of cached pages in Rails. Can I cache an unlimited number of pages?
We do this on our main company blog at https://reinteractive.net.
What you probably want to do is fragment caching on the show action of the page you want to cache. There are some pitfalls to this, the biggest of which is expiring the cache when the underlying page changes.
It works like this:
In the routes.rb file:
AppName::Application.routes.draw do
get "page/:id" => "pages#show"
end
In the controller (app/controllers/pages_controller.rb):
class PagesController < ApplicationController
  def show
    @page = Page.find(params[:id])
  end
end
Note that in the controller you are still hitting the database on every request, but this should be a really fast query with the correct index in place.
Then in the view (app/views/pages/show.html.erb):
<% cache("pages/show/#{#page.id}-#{#page.updated_at.to_i}") do %>
<%# Render your complex page here %>
<% end %>
What you gain from caching a fragment like this is that the rendered output gets stored in the cache. If the page takes a while to render (say 100ms or more for the page and all its parts), you can get significant time savings: you pay the rendering cost the first time, and after that it loads in a few milliseconds.
Note that we include the last update time of the page in the cache key. This means that if you want to expire this cache (and render it again), all you need to do is call page.touch on the page object to update its updated_at time.
If you have any other objects that could also change (such as, say, a page.header_image), then you can put the updated_at of these related objects into the cache key as well, or expire the fragment manually when you update the page.
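For example, here is a sketch of a compound key that also tracks a hypothetical page.header_image association (header_image is an assumption, not something from your app):
<% cache("pages/show/#{@page.id}-#{@page.updated_at.to_i}-#{@page.header_image.updated_at.to_i}") do %>
<%# Render your complex page here %>
<% end %>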
An alternative to adding updated_at to the cache key is expiring the fragment in the model, like this:
class Page < ActiveRecord::Base
after_save :expire_cache
def expire_cache
Rails.cache.delete("CACHE_KEY")
end
end
But this has its own challenges.
Good luck!
@MikelLindsaar's answer is good, but it's more work than necessary for something that can be handled more simply. ActiveRecord objects have a method called cache_key that automatically generates a unique key for each record; if the record gets touched or updated, the generated key changes. That key is used to build the full view caching key, which is derived from several inputs: the object's id, its updated_at (via cache_key), and a digest of the view template (so the cache is also invalidated if the view file changes, not just the database record). There's no need to worry about stale data. All you need to do is:
The controller
class PagesController < ApplicationController
  def show
    @page = Page.find(params[:id])
  end
end
The view
<% cache @page do %>
<%# Render your page here %>
<% end %>
If you have multiple views that use the same page object, you could use a compound cache key, for example:
<% cache ['page/show', @page] do %>
<%# Render your page here %>
<% end %>
and Rails will handle the generation of the caching key.
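For illustration, cache_key for a record looks roughly like this in a console (the exact format varies between Rails versions):
Page.find(1).cache_key
# => "pages/1-20180501123456000000"  # model name, record id and updated_at timestamp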
More info about this over here http://guides.rubyonrails.org/caching_with_rails.html#fragment-caching
The information in my header/footer is derived from a bunch of dictionary tables in my database, and the data is going to change extremely infrequently. I would think this is the perfect opportunity for caching, so that every page won't touch the database as far as rendering the header/footer is concerned.
The Rails guide gives an example of fragment caching, so I'm thinking my nav links would look something like this:
<% @categories.each do |category| %>
<% cache category do %>
<li class="nav-link">
<%= link_to category.name, category_path(category) %>
</li>
<% end %>
<% end %>
But I don't understand how this prevents contact with the database or optimizes anything. The controller makes the call for @categories before the view renders, which means the SQL query runs on every page request. Some HTML is being cached, but is skipping the rendering really a significant saving, even for a larger snippet? I feel like the strain on the database is what you really want to limit, especially when dealing with a lot of concurrent traffic.
What is the appropriate caching strategy for things like footer / navigation partials?
The Rails Guides example for fragment caching only caches the view generated to show an object, not the query. That's why you can't see where it caches the query: it doesn't.
You can use low-level caching to cache the @categories query: https://guides.rubyonrails.org/caching_with_rails.html#low-level-caching
It can cache any kind of information, so you could have something like:
class Category < ActiveRecord::Base
def self.for_navbar
Rails.cache.fetch("#{nav_cache_key}/categories_for_navbar", expires_in: 1.week) do
self.whatever_scope_you_need
end
end
end
What you will need to change is the navbar_cache_key value that's used to identify the cache. I'm not sure about a best practice for this, but I would set a class variable with the current timestamp the first time you need it, and update it whenever the cache should be wiped.
Something like
def self.navbar_cache_key
@@navbar_cache_key ||= Time.now
end
after_update :change_cache_key
def change_cache_key
@@navbar_cache_key = Time.now
end
That way, every time a category is updated, it will change @@navbar_cache_key for the class and the cache will be written under the new key. I'm not sure what the real conditions are under which the cache needs to be updated; maybe the after_update callback is not the best choice, or you need some extra callbacks.
This will cache only the query (I'm not sure whether you need to cache an array or whether caching the relation works the same; maybe you need a .to_a at the end). You can still use fragment caching if you want to cache the li elements too.
I suppose you could even cache the full HTML of the categories, since the low-level cache accepts any kind of data; you only need to find the right place to use Rails.cache.fetch and the right place to save/update the cache key.
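For example, here is a sketch that keys a fragment cache for the rendered links off the same navbar_cache_key (the markup mirrors the question's nav links; this is an illustration, not tested code):
<% cache ['navbar', Category.navbar_cache_key] do %>
  <% Category.for_navbar.each do |category| %>
    <li class="nav-link">
      <%= link_to category.name, category_path(category) %>
    </li>
  <% end %>
<% end %>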
I am creating a multiple choice generator that can generate multiple choice answers based on input data. The random number generators are in the page's controller. The controller gets called during a refresh and as such, the random numbers change.
I thought of storing the random numbers in sessions. But this would cause all the random numbers to stay the same even when I do want to change my input data (such as loading a new page).
Is there a way to block the controller action during page refreshes? Or another easy way to keep the random numbers the same on a page refresh, but change them when loading a new page?
From what I understood, you want:
IF page reloaded, the random number would stay the same on the page
ELSE if changed / new page, the random number would be regenerated
Then you could do something like:
your_controller.rb:
before_action :set_random_number, only: [:index]
def set_random_number
message_encryptor = ActiveSupport::MessageEncryptor.new(Rails.application.secrets.secret_key_base)
# if no random number yet for this page, then generate, then redirect to same page but now with random_number param
if params[:random_number].blank?
# we need to encrypt the random_number so that user won't be able to manipulate the value in the query params
encrypted_random_number = Base64.urlsafe_encode64(
message_encryptor.encrypt_and_sign(
some_code_of_yours_that_generates_the_random_number
)
)
redirect_to request.query_parameters.merge(random_number: encrypted_random_number)
else
@random_number = message_encryptor.decrypt_and_verify(
Base64.urlsafe_decode64(params[:random_number])
)
end
end
index.html.erb:
<%= @random_number %>
I think cookies are a good idea for your problem. Try this as an example in your controller:
def index
cookies[:number] ||= SecureRandom.hex(4)
end
Then in your view:
<%= cookies[:number] %>
Does that do what you want?
Say I have a Rails 4.2 site on Heroku with many Posts. When a post is viewed, the application hits the database to get the post's data and associated data from other models. Visitors (non-logged-in users) to my website should see Posts without any customizations, so displaying these pages should not require accessing the database. How can I set up simple page or action caching (or an equivalent) for all visitors to my site, so the database is skipped? Could I also let a CDN take over rendering these pages?
For starters, take a look at the Rails caching guide. Both page and action caching have been removed from Rails core (they are now available as separate gems).
You can try fragment caching, which will grab the generated HTML of the post and store it away, though you'll need a conditional based on whether or not there is a current_user.
A super-simple version might look like:
app/views/posts/show.html.erb
<% cache_unless(current_user, @post) do %>
<%= @post.body %>
<% end %>
The cache method is going to be a read action if a cache exists for that post, and it will be a generate and write action if that cache doesn't exist.
Caching with an ActiveRecord object creates a key that includes updated_at by default, so if a post changes, the first visitor that loads the post will have a longer wait time, and every subsequent visitor will get the benefits of caching.
As for your other question, there are many other Ruby-based tools for serving static pages, and a review of all those is beyond my knowledge, and probably beyond the scope of this question. :)
Lanny Bose's answer really looks the best, but I'll add this link:
https://devcenter.heroku.com/articles/http-caching-ruby-rails#public-requests
The whole page is useful, but in particular it talks about marking responses as public or private. They are private by default, but marking them as public allows pages to be served by an intermediary proxy cache.
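A minimal sketch of what that might look like in a controller (the 1-hour TTL is just an example):
def show
  @post = Post.find(params[:id])
  # Sets Cache-Control: public, max-age=3600 so a proxy/CDN can serve this response
  expires_in 1.hour, public: true
end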
The linked page talks about Rack::Cache + memcache as a proxy cache; I'm not sure if this still exists in Rails 4.
This second heroku page talks about action caching, which could cache a specific action for you:
https://devcenter.heroku.com/articles/caching-strategies#action-caching
You don't really want caching here. You just want a conditional in both controller and view, like
# controller
if logged_in?
@post = Post.find(params[:id])
# load more records
end
# view
<% if @post %>
<%= @post.title %>
...
<% else %>
You're not logged in, blah blah.
<% end %>
... assuming the controller does more than just load a post record. I don't necessarily agree with conditionally loading the record as above, but your use case may require it.
Alternatively, if you DO require caching for performance reasons, you can just elegantly include the current user id in the fragment cache key like
<% cache "post-#{#post.id}-#{current_user ? current_user.id : 0}" do %>
<% if logged_in? %>
... # expensive operations
<% else %>
Please login to see more..
<% end %>
<% end %>
so that the cache works with the same code for users or anons.
I have a social feed.
If the user has scrolled down a lot, it's annoying that liking/commenting redirects them to the top of the page instead of back to the part of the page they had scrolled to.
Is there any way to do this? Otherwise I'll just use paginate to make the pages smaller, which isn't ideal because that also takes away from user friendliness.
class ActivitiesController < ApplicationController
def index
#activities = Activity.order("created_at desc").where(user_id: current_user.following_ids)
end
def show
redirect_to(:back)
end
end
Assuming the user is redirected to the top of the page because the page reloads after a comment/favorite, you could try performing these actions using Ajax instead.
This way, the page won't reload and you can modify the DOM to reflect the user's actions with javascript.
Here's some more information on ajax in rails:
http://guides.rubyonrails.org/working_with_javascript_in_rails.html
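As a rough sketch of that approach (the LikesController, the nested likes route, and the DOM ids here are assumptions about your app, not existing code), the like link can be made remote and the response can update just the affected part of the page:
<%# in the activity partial: a remote link avoids a full page reload %>
<%= link_to 'Like', activity_likes_path(activity), method: :post, remote: true %>

# app/controllers/likes_controller.rb
class LikesController < ApplicationController
  def create
    @activity = Activity.find(params[:activity_id])
    @activity.likes.create(user: current_user)
    respond_to do |format|
      format.js # renders create.js.erb instead of redirecting
    end
  end
end

// app/views/likes/create.js.erb: update only the like count for this activity
$("#activity<%= @activity.id %> .likes-count").text("<%= @activity.likes.count %>");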
First generate anchors in your page by giving them ids. For example:
<div id="activity5">
..
</div>
Then in your controller, redirect to that part by using an anchor option:
redirect_to(request.env["HTTP_REFERER"] + '#activity5')
Note: redirect_to(:back) is the same as redirect_to(request.env["HTTP_REFERER"])
Having said that, using Javascript and AJAX is probably a better option.
Let's say you have a fragment of the page which displays the most recent posts, and you expire it after 30 minutes. I'm using Rails here.
<% cache("recent_posts", :expires_in => 30.minutes) do %>
...
<% end %>
Obviously you don't need to do the database lookup to get the most recent posts if the fragment exists, so you should be able to avoid that overhead too.
What I'm doing now is something like this in the controller which seems to work:
unless Rails.cache.exist? "views/recent_posts"
#posts = Post.find(:all, :limit=>20, :order=>"updated_at DESC")
end
Is this the best way? Is it safe?
One thing I don't understand is why the key is "recent_posts" for the fragment but "views/recent_posts" when checking later; I came up with this after watching memcached -vv to see what it was using. Also, I don't like the duplication of manually entering "recent_posts"; it would be better to keep that in one place.
Ideas?
Evan Weaver's Interlock Plugin solves this problem.
You can also implement something like this yourself easily if you need different behavior, such as more fine grained control. The basic idea is to wrap your controller code in a block that is only actually executed if the view needs that data:
# in FooController#show
@foo_finder = lambda { Foo.find_slow_stuff }

# in foo/show.html.erb
<% cache 'foo_slow_stuff' do %>
  <% @foo_finder.call.each do |foo| %>
    ...
  <% end %>
<% end %>
If you're familiar with the basics of ruby meta programming it's easy enough to wrap this up in a cleaner API of your taste.
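For example, here is a small sketch of such a wrapper (the find_lazily name and its placement are made up for illustration):
# app/controllers/application_controller.rb
class ApplicationController < ActionController::Base
  private

  # Stores the finder block in an instance variable the view can call on a cache miss
  def find_lazily(name, &block)
    instance_variable_set("@#{name}_finder", block)
  end
end

# in FooController#show
find_lazily(:foo) { Foo.find_slow_stuff }
# the view then calls @foo_finder.call inside its cache block, exactly as above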
This is superior to putting the finder code directly in the view:
keeps the finder code where developers expect it by convention
keeps the view ignorant of the model name/method, allowing more view reuse
I think cache_fu might have similar functionality in one of its versions/forks, but I can't recall specifically.
The advantage you get from memcached is directly related to your cache hit rate. Take care not to waste your cache capacity and cause unnecessary misses by caching the same content multiple times. For example, don't cache a set of record objects as well as their html fragment at the same time. Generally fragment caching will offer the best performance, but it really depends on the specifics of your application.
What happens if the cache expires between the time you check for it in the controller and the time it's being checked in the view rendering?
I'd make a new method in the model:
class Post
def self.recent(count)
find(:all, :limit=> count, :order=>"updated_at DESC")
end
end
then use that in the view:
<% cache("recent_posts", :expires_in => 30.minutes) do %>
<% Post.recent(20).each do |post| %>
...
<% end %>
<% end %>
For clarity, you could also consider moving the rendering of a recent post into its own partial:
<% cache("recent_posts", :expires_in => 30.minutes) do %>
<%= render :partial => "recent_post", :collection => Post.recent(20) %>
<% end %>
You may also want to look into the Fragment Cache docs, which allow you to do this:
<% cache("recent_posts", :expires_in => 30.minutes) do %>
...
<% end %>
Controller
unless fragment_exist?("recent_posts")
#posts = Post.find(:all, :limit=>20, :order=>"updated_at DESC")
end
Although I admit the DRY issue still rears its head, since you need the name of the key in two places. I usually do this similarly to how Lars suggested, but it really depends on taste. Other developers I know stick with checking whether the fragment exists.
Update:
If you look at the fragment caching source, you can see how it takes care of adding the views prefix for you, so you don't need to include it yourself:
# File vendor/rails/actionpack/lib/action_controller/caching/fragments.rb, line 33
def fragment_cache_key(key)
ActiveSupport::Cache.expand_cache_key(key.is_a?(Hash) ? url_for(key).split("://").last : key, :views)
end
Lars makes a really good point about there being a slight chance of failure using:
unless fragment_exist?("recent_posts")
because there is a gap between when you check the cache and when you use the cache.
The plugin that Jason mentions (Interlock) handles this very gracefully, by assuming that if you are checking for the existence of the fragment, then you will probably also use it, and thus it caches the content locally. I use Interlock for these very reasons.
Just a thought: in your ApplicationController, define:
def when_fragment_expired( name, time_options = nil )
# idea of avoiding race conditions
# downside: needs 2 cache lookups
# in the view we actually cache indefinitely
# but we expire with a 2nd fragment in the controller which is expired time based
return if ActionController::Base.cache_store.exist?( 'fragments/' + name ) && ActionController::Base.cache_store.exist?( fragment_cache_key( name ) )
# the time fragment cache uses different time options
time_options = time_options - Time.now if time_options.is_a?( Time )
# set an artificial fragment which expires after given time
ActionController::Base.cache_store.write("fragments/" + name, 1, :expires_in => time_options )
ActionController::Base.cache_store.delete( "views/"+name )
yield
end
Then in any action use:
def index
  when_fragment_expired "cache_key", 5.minutes do
    @object = YourObject.expensive_operations
  end
end
In the view:
<% cache "cache_key" do %>
view_code
<% end %>