I have a Rails 5 based API app using fast_jsonapi, and after a while I observed that almost all my actions share one common pattern:
def action_name
  @some_object.perform_action_name # this returns @some_object
  render json: ControllerNameSerializer.new(@some_object).to_h
end
I do not want to write that last render line in every action; it should just work. The value returned by the action should be picked up by some hidden responder-like layer, and the serializer class can be worked out from the controller name.
Perhaps this could be achieved with a small middleware, but at first glance that does not look like a good idea/practice: in a middleware we only get the already-rendered response, whereas we need a hook before that point.
I imagine something like:
class SomeController ...
  respond_with_returned_value

  def action_name
    @some_object.perform_action_name # this returns @some_object
  end
end
Any suggestions?
Note: do not worry about error/failure cases; @some_object.errors holds them and I have a separate mechanism to handle that.
Sketched out...
class ApplicationController < ...
  def self.respond_with_returned_value
    include MyWrapperModule
  end
  ...
end

module MyWrapperModule
  def self.included(base)
    base.public_instance_methods(false).each do |method_name|
      original_method_name = "original_#{method_name}".to_sym
      base.send(:alias_method, original_method_name, method_name)
      base.send(:define_method, method_name) { render json: send(original_method_name) }
    end
  end
end
Seems like there really should be some blessed way to do this - or like someone must have already done it.
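For completeness, here is a rough sketch of how the wrapping could also be done with Module#prepend instead of aliasing, so it runs after the actions are defined. AutoRespond and the "<controller name>Serializer" lookup are illustrative names, not an existing Rails or fast_jsonapi API, so treat this as a sketch under those assumptions:
module AutoRespond
  # Wrap every action defined directly on the controller so that, unless the
  # action already rendered something, its return value is serialized and rendered.
  def self.wrap(controller_class)
    wrapper = Module.new do
      controller_class.public_instance_methods(false).each do |action|
        define_method(action) do |*args|
          result = super(*args)
          unless performed?
            serializer = "#{controller_name.camelize}Serializer".constantize
            render json: serializer.new(result).serializable_hash
          end
          result
        end
      end
    end
    controller_class.prepend(wrapper)
  end
end

class SomeController < ApplicationController
  def action_name
    @some_object.perform_action_name
  end
end

AutoRespond.wrap(SomeController) # must be called after the actions are defined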
I guess this question is common with Rails 4, but my situation is different.
I am using Sidekiq to delay the creation of records; I think this works in principle, because it works with simple data. By simple data I mean:
def perform
Foo.create(bar: "staff")
end
Here's the code that has issues:
supports_controller.rb:
def create
  params = support_params # seems to be an issue here?
  DelayedJobs.perform_in(1.minutes, current_user.id, params)
  ...
end

private

def support_params
  params.require(:support).permit(:foo1, :foo2, :foo3)
end
app/workers/delayed_jobs.rb:
class DelayedJobs
  include Sidekiq::Worker

  def perform(user_id, params)
    u = User.find(user_id)
    support = u.supports.build(params)
    Support.create(support) # create and save to db
  end
end
Via the web UI (localhost:3000/sidekiq/scheduled) I can see the job details. Great. But after a minute it goes to retries with an error. Any help on this one?
EDIT:
In the Sidekiq web UI, the job arguments are:
40, {"foo1"=>"a", "foo2"=>"b", "foo3"=>"c"}
Why is it that the user_id (40) sits outside the hash?
The problem isn't with Sidekiq; it's an ActiveRecord problem with this line:
Support.create(support)
create only takes a hash, but you're giving it a Support.
This should work:
class DelayedJobs
  include Sidekiq::Worker

  def perform(user_id, params)
    u = User.find(user_id)
    u.supports.create!(params) # `create!` will raise an error if the save fails, allowing you to catch invalid params
  end
end
Protip: you can eliminate Sidekiq as a suspect by running the body of your perform method in a Rails console. You'll see that you get the same error even when Sidekiq isn't involved.
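For example, using the argument values shown in the Sidekiq web UI above (the ID 40 is simply the one from that output):
# In a Rails console -- the same steps the worker performs, with Sidekiq out of the picture:
user = User.find(40)
support = user.supports.build("foo1" => "a", "foo2" => "b", "foo3" => "c")
Support.create(support) # reproduces the same error: create expects an attributes hash, not a Support instance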
I suggest that you call the save method on the support object: build returns a new, unsaved instance of Support, so you only need to save it.
class DelayedJobs
  include Sidekiq::Worker

  def perform(user_id, params)
    u = User.find(user_id)
    support = u.supports.build(params)
    support.save # save to db
  end
end
In your controller, try changing the params you pass to:
def create
  support_params = params[:support] # use a differently named local; `params = params[:support]` would shadow the params method
  ...
end
In Rails notifications, I am subscribing to "process_action.action_controller", and would like to add more attributes to the payload. How can I do that?
I have tried using append_info_to_payload, but this seems to do nothing.
module AppendExceptionPayload
  module ControllerRuntime
    extend ActiveSupport::Concern

    protected

    def append_info_to_payload(payload)
      super
      payload[:happy] = "HAPPY"
    end
  end
end
The subscription and above code is in a Rails engine, so this is where I make the call to add it:
require 'append_exception_payload'

module Instrument
  class Engine < ::Rails::Engine
    ActiveSupport.on_load :action_controller do
      include AppendExceptionPayload::ControllerRuntime
    end
  end
end
After putting up the bounty, I found a solution myself. Rails handles this really cleanly.
Basically, the append_info_to_payload method is meant exactly for this.
So to include session information and signed_in user information I added this to my application_controller.rb:
def append_info_to_payload(payload)
  super
  payload[:session] = request.session_options[:id] rescue ""
  payload[:user_id] = session[:user_id] rescue "unknown"
end
So I jumped in and had a look at the API for the process_action method (private) and the append_info_to_payload instance method (public). process_action calls append_info_to_payload in its code like so:
ActiveSupport::Notifications.instrument("process_action.action_controller", raw_payload) do |payload|
  result = super
  payload[:status] = response.status
  append_info_to_payload(payload)
  result
end
and append_info_to_payload works something like this:
def append_info_to_payload(payload) #:nodoc:
  payload[:view_runtime] = view_runtime
end
I can suggest trying payload[:view_runtime] instead of payload[:happy], or trying to use payload[:status].
Let me know how you get on and I will try to help more; unfortunately there is really no documentation for this stuff.
I have a gem I'm developing that is based around using filters on ApplicationController. It's basically for logging, and one of the modules defines an around filter like so:
module LogExceptionFilter
  def self.included(base)
    base.around_filter :do_a_bunch_of_logging_stuff
  end

  def do_a_bunch_of_logging_stuff
    ...
  end
end
It happens to be an around filter where I deal with exception logging, but my question would apply for any filter.
So it's supposed to be used like this
class ApplicationController
  include LogExceptionFilter
end
So what I'm worried about is if someone does:
class ApplicationController
  include LogExceptionFilter
  include LogExceptionFilter
end
I don't want to execute do_a_bunch_of_logging_stuff twice. So, first:
1) If the module is included twice, will Rails apply the filter twice?
2) Is it my responsibility to protect the user from doing this? I could do so with a class variable, something like:
module LogExceptionFilter
  class << self
    cattr_accessor :filter_loaded
  end

  def self.included(base)
    unless filter_loaded
      base.around_filter :do_a_bunch_of_logging_stuff
      self.filter_loaded = true
    end
  end

  def do_a_bunch_of_logging_stuff
    ...
  end
end
This variable is not thread-safe, so it's something I'd want to be careful about adding. But I don't want to write a library that can be easily broken. Thanks.
Here are some relevant links:
http://www.ruby-forum.com/topic/95269
http://www.ruby-forum.com/topic/164588
Basically, a module will only be included once, but the included callback may be called multiple times.
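A quick standalone check of that behaviour (Tracker and FakeController are illustrative names only):
module Tracker
  def self.included(base)
    puts "included hook fired for #{base}"
  end
end

class FakeController
  include Tracker
  include Tracker # the hook body runs a second time here
end

p FakeController.ancestors.count(Tracker) # => 1, the module itself appears only once
So anything you do inside the included hook (such as registering a filter) is executed once per include call, which is why a guard like the one you sketched can make sense.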
I have an expensive (time-consuming) external request to another web service I need to make, and I'd like to cache it. So I attempted to use this idiom, by putting the following in the application controller:
def get_listings
  cache(:get_listings!)
end

def get_listings!
  return Hpricot.XML(open(xml_feed))
end
When I call get_listings! in my controller everything is cool, but when I call get_listings, Rails complains that no block was given. When I look up that method I see that it does indeed expect a block, and additionally it looks like it is only meant for use in views. So I'm guessing that, although it wasn't stated, the example is just pseudocode.
So my question is, how do I cache something like this? I tried various other ways but couldn't figure it out. Thanks!
An in-code approach could look something like this:
def get_listings
  @listings ||= get_listings!
end

def get_listings!
  Hpricot.XML(open(xml_feed))
end
This will cache the result on a per-request basis (there is a new controller instance per request), though you may like to look at the 'memoize' helpers as an API option.
If you want to share data across requests, don't store it on class objects: your app will not be thread-safe unless you're good at concurrent programming and make sure the threads don't interfere with each other's access to the shared variable.
The "Rails way" to cache across requests is the Rails.cache store. Memcached gets used a lot, but you might find the file or memory stores fit your needs. It really depends on how you're deploying and whether you want to prioritise cache hits, response time, storage (RAM), or use a hosted solution, e.g. a Heroku add-on.
As nruth suggests, Rails' built-in cache store is probably what you want.
Try:
def get_listings
  Rails.cache.fetch(:listings) { get_listings! }
end

def get_listings!
  Hpricot.XML(open(xml_feed))
end
fetch() retrieves the cached value for the specified key, or writes the result of the block to the cache if it doesn't exist.
By default, the Rails cache uses file store, but in a production environment, memcached is the preferred option.
See section 2 of http://guides.rubyonrails.org/caching_with_rails.html for more details.
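For example, switching the production cache store to memcached might look like this (the host and options are placeholders):
# config/environments/production.rb
config.cache_store = :mem_cache_store, "localhost:11211", { namespace: "listings", expires_in: 1.hour }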
You can use the cache_method gem:
gem install cache_method
require 'cache_method'
In your code:
def get_listings
Hpricot.XML(open(xml_feed))
end
cache_method :get_listings
You might notice I got rid of get_listings!. If you need a way to refresh the data manually, I suggest:
def refresh
clear_method_cache :get_listings
end
Here's another tidbit:
def get_listings
Hpricot.XML(open(xml_feed))
end
cache_method :get_listings, (60*60) # automatically expire cache after an hour
You can also use the cachethod gem (https://github.com/reneklacan/cachethod):
gem 'cachethod'
Then it is dead simple to cache a method's result:
class Dog
  cache_method :some_method, expires_in: 1.minutes

  def some_method(arg1)
    ..
  end
end
It also supports argument-level caching.
The cache_method gem was suggested above, but it's pretty heavy. If you only need to call the method without arguments, the solution is very simple:
Object.class_eval do
  def self.cache_method(method_name)
    original_method_name = "_original_#{method_name}"
    alias_method original_method_name, method_name

    define_method method_name do
      @cache ||= {}
      @cache[method_name] = send(original_method_name) unless @cache.key?(method_name)
      @cache[method_name]
    end
  end
end
then you can use it in any class:
def get_listings
Hpricot.XML(open(xml_feed))
end
cache_method :get_listings
Note - this will also cache nil, which is the only reason to use it instead of @cached_value ||=
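To illustrate the difference (expensive_call stands in for whatever slow work you are memoizing):
# ||= re-evaluates whenever the memoized value is nil or false
def listings_with_or_equals
  @listings ||= expensive_call # expensive_call runs again on every call if it returned nil
end

# the hash-key check runs it exactly once, even when the result is nil
def listings_with_key_check
  @cache ||= {}
  @cache[:listings] = expensive_call unless @cache.key?(:listings)
  @cache[:listings]
end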
Late to the party, but in case someone arrives here searching.
I tend to carry this little module around from project to project; I find it convenient and extensible enough without adding an extra gem. It uses the Rails.cache backend, so please use it only if you have one configured.
# lib/active_record/cache_method.rb
module ActiveRecord
  module CacheMethod
    extend ActiveSupport::Concern

    module ClassMethods
      # To be used with a block
      def cache_method(args = {})
        @caller = caller
        caller_method_name = args.fetch(:method_name) { @caller[0][/`.*'/][1..-2] }
        expires_in = args.fetch(:expires_in) { 24.hours }
        cache_key = args.fetch(:cache_key) { "#{self.name.underscore}/methods/#{caller_method_name}" }

        Rails.cache.fetch(cache_key, expires_in: expires_in) do
          yield
        end
      end
    end

    # To be used with a block
    def cache_method(args = {})
      @caller = caller
      caller_method_name = args.fetch(:method_name) { @caller[0][/`.*'/][1..-2] }
      expires_in = args.fetch(:expires_in) { 24.hours }
      cache_key = args.fetch(:cache_key) { "#{self.class.name.underscore}-#{id}-#{updated_at.to_i}/methods/#{caller_method_name}" }

      Rails.cache.fetch(cache_key, expires_in: expires_in) do
        yield
      end
    end
  end
end
Then in an initializer:
# config/initializers/active_record.rb
require 'active_record/cache_method'
ActiveRecord::Base.send :include, ActiveRecord::CacheMethod
And then in a model:
# app/models/user.rb
class User < AR
  def self.my_slow_class_method
    cache_method do
      # some slow things here
    end
  end

  def this_is_also_slow(var)
    custom_key_depending_on_var = ...
    cache_method(cache_key: custom_key_depending_on_var, expires_in: 10.seconds) do
      # other slow things depending on var
    end
  end
end
At this point it only works with models, but can be easily generalized.
Other answers are excellent but if you want a simple hand-rolled approach you can do this. Define a method like the below one in your class...
def use_cache_if_available(method_name, &hard_way)
  @cached_retvals ||= {} # or initialize in constructor
  return @cached_retvals[method_name] if @cached_retvals.has_key?(method_name)
  @cached_retvals[method_name] = hard_way.call
end
Thereafter, for each method you want to cache, you can wrap the method body in something like this...
def some_expensive_method(arg1, arg2, arg3)
  use_cache_if_available(__method__) {
    calculate_it_the_hard_way_here
  }
end
One thing that this does better than the simplest method listed above is that it will cache a nil. It has the convenience that it doesn't require creating duplicate methods. Probably the gem approach is cleaner, though.
I'd like to suggest my own gem https://github.com/igorkasyanchuk/rails_cached_method
For example:
class A
  def A.get_listings
    ....
  end
end
Just call:
A.cached.get_listings