I'm using the stale? method to do conditional GET caching in my JSON API.
I use
@post = Post.find(params[:post_id])
if stale? @post
  # continue
end
AFAIK this uses the post's updated_at to build the cache validators (ETag / Last-Modified). Is there any way I can expire the cache entry for this item without doing post.touch? I'm using Heroku.
Create a file in config/initializers called bust_cache.rb with the following contents:
ENV["RAILS_CACHE_ID"] = 'version1'
If in the future you wish to bust the cache again, change the value to 'version2', etc. If you wish to bust the cache with every deploy, use:
ENV["RAILS_CACHE_ID"] = Time.now.to_s
Note that this last strategy will not work with multiple dynos on Heroku or other similar setups, because each dyno evaluates Time.now at boot and ends up with a different cache namespace.
Thanks to Nathan Kontny.
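For context on why this works: as far as I understand, when you hand stale? a record, Rails runs it through ActiveSupport::Cache.expand_cache_key before digesting it into an ETag, and that method prefixes ENV["RAILS_CACHE_ID"] when it is set, so changing the value changes every ETag. If you would rather bump a single endpoint instead of the whole app, a minimal sketch (POSTS_CACHE_VERSION is a hypothetical constant you would define and bump yourself; the render call is assumed):
@post = Post.find(params[:post_id])
# Include an explicit version component in the ETag so this endpoint can be
# invalidated without calling post.touch.
if stale?(etag: [@post, POSTS_CACHE_VERSION], last_modified: @post.updated_at)
  render json: @post
end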
In order to boost the performance of my Rails 4.0.2 app, I would like to cache the output of some of my static pages:
class PagesController < ApplicationController
  def home
  end

  def about_us
  end

  def contact
  end
end
In the Rails Guide on Caching it says that "Page Caching has been removed from Rails 4" and moved into a gem. In the gem description it says, however, that it will be maintained only until Rails 4.1. Some other observers also advise against using Page Caching and endorse Russian doll caching instead.
So what's the best way to cache a bunch of static pages that will never actually hit the database and only ever change (slightly) if a user signs in?
Thanks for any suggestions.
You can still use fragment caching for your static pages, although the benefits are obviously more visible with dynamic / DB-driven pages. It's worth considering if you have a lot of partials being rendered or costly view logic. Just wrap your page's template with:
# about_us.html.erb
<% cache 'about_us' do %>
...
<% end %>
The first time you hit the page in an environment where config.action_controller.perform_caching = true, Rails will generate the fragment (which in this case is your whole page) and serve it on subsequent requests. The cache key's digest is invalidated when the template is changed:
The template digest that's added to the cache key is computed by
taking an md5 of the contents of the entire template file. This
ensures that your caches will automatically expire when you change the
template file.
http://api.rubyonrails.org/classes/ActionView/Helpers/CacheHelper.html
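Since the question mentions the pages change slightly when a user signs in, one option (a sketch, assuming a helper such as Devise's user_signed_in? is available in your views) is to fold that state into the cache key so the signed-in and signed-out variants are cached as separate fragments:
# about_us.html.erb -- sketch: one fragment per signed-in state
<% cache ['about_us', user_signed_in?] do %>
  ...
<% end %>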
I'm using Redis as my cache store, and I want to cache common data:
@links = Rails.cache.fetch "FriendsLinks" do
  Link.where(category_id: 0)
end
But the data returned from Rails.cache.fetch is a String, not an array of Link objects.
I have to delete the key from Redis, and then the next request works. But very often it breaks again.
Edit: this happens only in the development environment.
I've located the problem: the related model class hasn't been loaded yet when the data is retrieved from Redis/memcached, so Ruby cannot unmarshal it.
A simple solution is to set
config.cache_classes = true
The drawback is that you have to restart your app whenever you make code changes.
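For reference, that setting lives in the development environment config (sketch):
# config/environments/development.rb
config.cache_classes = true  # keeps model classes loaded, so Marshal can find them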
Another option is to reference the related class before calling Rails.cache, which forces it to autoload, such as:
Link
@links = Rails.cache.fetch "FriendsLinks" do
  Link.where(category_id: 0)
end
Or create an initializer under config/initializers:
if Rails.env == "development"
  Dir.glob("#{Rails.root}/app/models/**/*.rb") do |model_name|
    require_dependency model_name
  end
end
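One more thing worth checking (my own assumption, not part of the original answer): Rails.cache.fetch stores whatever the block returns, and Link.where(...) is a lazy relation. Forcing the query inside the block so a plain array of records is cached tends to behave more predictably across cache stores:
@links = Rails.cache.fetch "FriendsLinks" do
  Link.where(category_id: 0).to_a  # force the query; cache an Array, not a lazy Relation
end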
I have a Rails 3.2.11 application running on Unicorn and is set up for file_store caching to a specific folder outside of the project.
I am also using the gem rails/cache_digests for automatic expiration.
In a certain page I'm doing fragment caching without setting time expiration.
When a fragment has expired, I see the new fragment created in the cache folder, but I also still see the expired fragment. How will it be deleted from the cache folder by the cache management mechanism without doing it manually? If it doesn't get deleted, the cache folder will bloat with garbage: expired fragments that are never used again.
You can try the cleanup method (Rails.cache.cleanup) to delete all the expired entries, and set up a script to run it periodically.
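For example, a minimal sketch of a rake task you could run from cron (the task name is arbitrary):
# lib/tasks/cache.rake -- sketch; FileStore#cleanup removes expired entries
namespace :cache do
  desc "Remove expired fragments from the file store"
  task :cleanup => :environment do
    Rails.cache.cleanup
  end
end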
You can use ActionController::Caching::Sweeping to expire fragment caches. See the example below:
class ProductSweeper < ActionController::Caching::Sweeper
  observe Product

  # Expire the fragment after the model is saved (observer callbacks receive the record)
  def after_save(product)
    expire_fragment('all_available_products')
  end

  # Expire a different cache after a controller action that modifies state is called
  def after_products_update_some_state
    expire_action(:controller => 'products', :action => 'index')
  end

  # A before_<controller>_<action> callback can be used as well
  def before_products_update_some_state
    # do something before the action runs
  end
end
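Note that the sweeper only runs when it is wired into the controller whose actions should trigger it, roughly like this (a sketch):
class ProductsController < ApplicationController
  cache_sweeper :product_sweeper, :only => [:create, :update, :destroy]
end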
I have an application that uses caches_page for certain controllers/actions. To expire the cache, I use a sweeper. All in all, it's a standard solution.
However, because some changes may cause a bit of a rush of requests on the server (because push notifications are sent out and may trigger client devices to fetch new data), I'd like to be able to pre-render the cache, so it's ready before the requests roll in. I could just wait for the first request to automatically write the cache, of course, but in this case, I know that the requests will come, that there might be many, and that they may be near-simultaneous. So I'd like to have the cache ready.
To add some complexity, the updates are done via a normal web page and handled in a standard, mostly scaffolded controller, while the "page" I want to cache is the JSON response for an entirely different controller that serves as an API.
So, how do I, from a sweeper (or from the controller handling the cache-expiring update) trigger a new page cache to be written immediately?
Another way to put it might be: How do I make an internal request from one controller to another?
Edit: I ended up doing something like what you see below. It's not terribly elegant, but it is effective:
class ModelSweeper < ActionController::Caching::Sweeper
  observe Model

  def after_create(model)
    expire_pages_for(model)
  end

  def after_update(model)
    expire_pages_for(model)
  end

  def after_destroy(model)
    expire_pages_for(model)
  end

  protected

  def expire_pages_for(model)
    # expire index page
    expire_and_bake(models_url)

    # expire show page
    expire_and_bake(model_url(model))
  end

  def expire_and_bake(url)
    # extract the path from the URL
    path = url.sub(%r{\Ahttp://[^/]+}, "")

    # expire the cache
    expire_page(path)

    # request the url (writes a new cache)
    system "curl '#{url}' &> /dev/null &"
  end
end
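A small variation on the expire_and_bake helper above (my own sketch, not part of the original answer): fetching the URL with Net::HTTP instead of shelling out to curl avoids interpolating the URL into a shell command and surfaces failures in Ruby. Note that it blocks until the request completes, unlike the backgrounded curl call:
require "net/http"

def expire_and_bake(url)
  uri = URI.parse(url)
  expire_page(uri.request_uri)   # expire the stale page cache
  Net::HTTP.get_response(uri)    # re-request the URL so caches_page rewrites it
end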
Warming a server's cache may fall outside of the realm of your application logic. I have implemented a cache warming system before using a rake task that wrapped the curl command and looped through all the areas in the website.
# lib/tasks/curl.rake
desc "Warm the page cache by curling a list of URLs"
task :curl do
  # Replace these placeholders with the URLs you want to warm
  paths = %w[http://localhost:3000/ http://localhost:3000/products]
  paths.each do |path|
    `curl -s #{path}`
  end
end
You can call this task by issuing "rake curl" from inside your Rails project root.
Alternatively, you could invoke this rake task (which wraps curl) from inside your sweeper method after you expire the cache. Check out the Railscast Ryan Bates did on invoking rake tasks in the background from inside your Rails application code: http://railscasts.com/episodes/127-rake-in-background
More information on curl here: http://curl.haxx.se/docs/manpage.html
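For instance, a rough sketch of kicking the warm-up off from the sweeper after expiring (following the backgrounded-rake approach from that Railscast; adjust the task name to match yours):
# In the sweeper, call this after expiring the relevant pages
def rewarm_cache
  # Run the warm-up task in the background so the web request isn't blocked
  system "rake curl --silent &"
end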
On Redmine 1.2 / Rails 2.3.11, I'm rendering a repository Markdown file as HTML (as redmine_markdown_extra_viewer does), and now I'm trying to cache the result, which should be updated on each commit.
So I have a git hook that fetches the repo changes, and I'd like it to also clear the corresponding cache entries.
Cache generation (in a RepositoriesController::entry override):
cache_key = ['repositories_md', @project.id.to_s, @path.to_s].join('/')
puts cache_key
@content = cache_store.fetch cache_key do
  Kramdown::Document.new(@repository.cat(@path, @rev)).to_html
end
render :action => "entry_markdown"
The hook that should clear the cache, but has no effect:
# This is ok
ruby script/runner "Repository.fetch_changesets"
# This has no effect
ruby script/runner "Rails.cache.delete_matched(/repositories_md\/.*/)"
So it doesn't work, and I don't even know if I've taken the right direction to implement this. Any input much appreciated!
Which cache backend are you using?
If it's memcached or anything other than the FileStore or the MemoryStore, the delete_matched method is not supported.
You're probably better off letting entries expire and simply replacing their cached contents as they get updated.
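If you do need to clear entries from the git hook with a store that lacks delete_matched, one workaround (a sketch; changed_paths and project are hypothetical values your hook would supply, and it assumes cache_store is the default Rails.cache) is to delete the exact keys, since they are built deterministically from the project and path:
changed_paths.each do |path|
  # Mirror the key format used when the fragment was written
  Rails.cache.delete(['repositories_md', project.id.to_s, path.to_s].join('/'))
end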
The problem may be the regular expression being used as the fragment name; try using a string instead, even if it gets verbose. I had a similar problem with Dalli (with memcached), and that was the reason.