I have an EventsController create action that looks like this:
class EventsController < ApplicationController
  def create
    @event = Event.new(params[:event].slice(*Event.accessible_attributes))
    if @event.save
      DraftBuilder.new(event: @event).build(params[:event][:file].path)
      redirect_to @event
    else
      render :new
    end
  end
end
params[:event][:file] is a file that the user can submit through the Event#new action via file_field_tag.
DraftBuilder#build, among many other things, parses the given file and creates about 1000 records in the database (saving data across several tables).
The problem I have is that DraftBuilder#build is really slow. It is slow because I'm saving records in a loop and Active Record creates a new transaction for every save.
A simplified DraftBuilder#build might look like this:
class DraftBuilder
  def build(file)
    @data = parse(file) # @data is an array of hashes with 1000+ records
    @data.each do |pick_arguments|
      Pick.create(pick_arguments)
    end
  end
end
I found one solution to this problem: wrap the controller's create action in an ActiveRecord::Base.transaction:
class EventsController < ApplicationController
  around_filter :transactions_filter, only: [:create]

  def transactions_filter
    ActiveRecord::Base.transaction do
      yield
    end
  end
end
This solution works: it creates just one transaction and speeds up the whole process by about 60 times. But is it a good way to tackle this problem? Surely transactions weren't designed for this? What are the other options for creating records from files with more than a thousand entries?
The best solution for slow-running processes is to use background jobs such as delayed_job, resque, or sidekiq.
You have two ways:
Instead of
@data.each do |pick_arguments|
  Pick.create(pick_arguments)
end
Transactions
ActiveRecord::Base.transaction do
  @data.each do |pick_arguments|
    Pick.create(pick_arguments)
  end
end
Gem activerecord-import
data = []
@data.each do |pick_arguments|
  data << Pick.new(pick_arguments)
end
Pick.import data
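For very large files it can also help to combine the two approaches: bulk-insert in slices so no single SQL statement grows unbounded. The slicing step itself is plain Ruby (a sketch; the 500-row batch size and the stand-in data are assumptions, not from the question):

```ruby
# Pure-Ruby sketch of the batching step: slice the parsed rows into
# groups of 500 before handing each group to a bulk insert such as
# Pick.import(batch), inside one surrounding transaction.
data = (1..1200).map { |i| { number: i } } # stand-in for parsed rows

batches = data.each_slice(500).to_a
# Each batch would then be passed to Pick.import(batch).
```

activerecord-import also accepts a batch size directly, so the same idea can be expressed as an option to import rather than hand-rolled slicing.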
Related
I have 3 main controllers, each with a function called activation that takes the params from the form. Before creating the record in the database, I have to merge a hash into those params. Right now I am doing it this way:
class AccountsController < ApplicationController
  def activation
    activation_params = if valid_user?
      # This is a service; it takes the params, adds one hash,
      # and returns the same params back
      Account::ActivationParamsModifier.new(params).call
    else
      params
    end
    @activations = Activation.new(activation_params[:activations])
    if @activations.save!
      # Code Here
    end
  end

  private

  def valid_user?
    # Valid User Check
  end
end
I am trying to figure out one thing here: this same code appears in 3 controllers, and I don't think it is well factored. Any suggestions to improve it? I know it's a small piece of code, but it shows up in different controllers. I am not sure a before_action is a good fit for this.
activation_params = if valid_user?
  Account::ActivationParamsModifier.new(params, user_id).call
else
  params
end
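One common way to remove this duplication is to extract the conditional into a module shared by the three controllers. A plain-Ruby sketch of the idea (the module and method names are hypothetical, and valid_user? is assumed to be defined by each including controller):

```ruby
# Shared mix-in: each controller includes this instead of repeating
# the valid_user? conditional inline.
module BuildsActivationParams
  def build_activation_params(params)
    if valid_user?
      # Stand-in for Account::ActivationParamsModifier.new(params).call
      params.merge(activated: true)
    else
      params
    end
  end
end

# Minimal stand-in controller showing the mix-in in use.
class AccountsController
  include BuildsActivationParams

  def valid_user?
    true
  end
end
```

In Rails this would typically live in app/controllers/concerns; the shape of the code is the same.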
Situation
I have a model User:
class User < ActiveRecord::Base
  has_many :cars

  def cars_count
    cars.count
  end

  def as_json(options = {})
    super options.merge(methods: [:cars_count])
  end
end
Problem
When I need to render a collection of users to json, I end up exposed to the N+1 query problem. It is my understanding that including cars doesn't solve the problem for me.
Attempted Fix
What I would like to do is add a method to User:
class User < ActiveRecord::Base
  ...
  def self.as_json(options = {})
    cars_counts = Car.group(:user_id).count
    self.map do |user|
      user.define_singleton_method(:cars_count) do
        cars_counts[user.id]
      end
      user.as_json options
    end
  end
end
That way all cars counts would be queried in a single query.
Remaining Issue
ActiveRecord::Relation already has an as_json method and therefore doesn't pick up the one defined on the class. How can I make ActiveRecord::Relation use the as_json method from the class when it is defined? Is there a better way to do this?
Edits
1. Caching
I can cache my cars_count method:
def cars_count
  Rails.cache.fetch("#{cache_key}/cars_count") do
    cars.count
  end
end
This is nice once the cache is warm, but if a lot of users are updated at the same time, it can cause request timeouts because many cache entries have to be recomputed in a single request.
2. Dedicated method
Instead of calling my method as_json, I can call it my_dedicated_as_json_method and each time I need to render a collection of users, instead of
render json: users
write
render json: users.my_dedicated_as_json_method
However, I don't like this approach. I may forget to call this method somewhere, someone else might forget to call it, and I lose clarity in the code. Monkey patching seems a better route for these reasons.
Have you considered using a counter_cache for cars_count? It's a good fit for what you're wanting to do.
This blog article also offers up some other alternatives, e.g. if you want to manually build a hash.
If you really wanted to continue down the monkey patching route, then ensure that you are patching ActiveRecord::Relation rather than User, and override the instance method rather than creating a class method. Note that this will then affect every ActiveRecord::Relation, but you can use @klass to add a condition that only runs your logic for User:
# Just an illustrative example - don't actually monkey patch this way
# use `ActiveSupport::Concern` instead and include the extension
class ActiveRecord::Relation
  def as_json(options = nil)
    puts @klass
  end
end
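An alternative to reopening the class directly is Module#prepend, which keeps super available so you can fall back to the stock behaviour. A plain-Ruby sketch with a stand-in class (not ActiveRecord itself; the class and option names are made up):

```ruby
# Stand-in for ActiveRecord::Relation, with a default as_json.
class Relation
  def as_json(options = nil)
    "default"
  end
end

# Prepended module: its method runs before the original and can
# delegate back to it with super.
module RelationAsJsonPatch
  def as_json(options = nil)
    if options && options[:custom]
      "custom"
    else
      super
    end
  end
end

Relation.prepend(RelationAsJsonPatch)
```

The condition plays the role of the `@klass.respond_to?` check: divert only when your case applies, otherwise fall through to the original method.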
Option 1
In your user model:
def get_cars_count
  self.cars.count
end
And in your controller:
User.all.as_json(methods: [:get_cars_count])
Option 2
You can create a method which gets all the users along with their car counts, and then call the as_json method on that.
It would roughly look like:
# In the Users model:
def self.users_with_cars
  User.left_outer_joins(:cars)
      .select('users.id, users.name, COUNT(cars.id) as cars_count')
      .group('users.id, users.name')
  # OR, in the old Rails style, something like:
  # User.all(:joins => :cars, :select => "users.*, count(cars.id) as cars_count", :group => "users.id")
end
And in your controller you can call as_json:
User.users_with_cars.as_json
Here is my solution in case someone else is interested.
# config/application.rb
config.autoload_paths += %W(#{config.root}/lib)
# config/initializers/core_extensions.rb
require 'core_extensions/active_record/relation/serialization'
ActiveRecord::Relation.include CoreExtensions::ActiveRecord::Relation::Serialization
# lib/core_extensions/active_record/relation/serialization.rb
require 'active_support/concern'

module CoreExtensions
  module ActiveRecord
    module Relation
      module Serialization
        extend ActiveSupport::Concern

        included do
          old_as_json = instance_method(:as_json)
          define_method(:as_json) do |options = {}|
            if @klass.respond_to? :collection_as_json
              scoping do
                @klass.collection_as_json options
              end
            else
              old_as_json.bind(self).(options)
            end
          end
        end
      end
    end
  end
end
# app/models/user.rb
class User < ActiveRecord::Base
  ...
  def self.collection_as_json(options = {})
    cars_counts = Car.group(:user_id).count
    self.map do |user|
      user.define_singleton_method(:cars_count) do
        cars_counts[user.id]
      end
      user.as_json options
    end
  end
end
Thanks @gwcodes for pointing me at ActiveSupport::Concern.
I am trying to use Redis as a caching layer between my app and a PostgreSQL db.
Please see my routes, items_controller, and items_helper files below. I'm confused about how #fetch_items in the items_helper is supposed to get called.
Presently, I am rendering jbuilder templates from all of my controller actions. I need to retain this functionality.
routes
Rails.application.routes.draw do
  resources :users
  resources :items

  get 'users/:id/sold_items' => 'users#sold_items'
  get 'categories/:id/available_items' => 'categories#available_items'
  get 'performances/:view' => 'performances#show'
end
items_controller.rb
class ItemsController < ApplicationController
  include ItemsHelper

  # Returns full list of items
  def index
    @items = Item.all
  end

  # Returns details for a single item
  def show
    @item = Item.find(params[:id])
  end
end
items_helper
module ItemsHelper
  def fetch_items
    byebug
    items = $redis.get("items")
    if items.nil?
      items = Item.all.to_json
      $redis.set("items", items)
    end
    @items = JSON.load items
  end
end
You need to call fetch_items manually because this method is not going to be called automatically. Given the code, I suppose that you can replace
@items = Item.all
with
@items = fetch_items
to use the fetch_items method.
PS. The fetch_items method won't return an array of Item objects, only an array of hashes, so you might need to adjust other parts of the code as well.
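The hashes-not-models point is easy to see with a plain JSON round trip (the sample data here is made up):

```ruby
require 'json'

# Serialize, then load back: the result is plain hashes with
# string keys, not the original objects they were built from.
items_json = [{ "id" => 1, "name" => "widget" }].to_json
loaded = JSON.load(items_json)
```

So anything downstream that expects Item instances (associations, model methods) would need adjusting to work with hashes instead.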
I would recommend benchmarking whether using Redis in this particular situation is actually faster before shipping it to production.
I would also recommend reading the Rails guide "Caching with Rails" before starting to use Redis by itself.
For example, you could do something like:
cachekey = "items/#{Item.maximum(:updated_at)}"
@items = Rails.cache.fetch(cachekey, expires_in: 12.hours) do
  Item.all.as_json
end
The above code would keep a cache of all the items and stay current with the most recent update. You probably wouldn't want to do this if your items collection is huge, but it's worth considering.
I want to initialize some attributes in retrieved objects with values received from an external API. The after_find and after_initialize callbacks won't work for me, as that way I have to call the API for each retrieved object, which is quite slow. I want something like the following:
class Server < ActiveRecord::Base
  attr_accessor :dns_names
  ...
  after_find_collection do |servers|
    all_dns_names = ForeignLibrary.get_all_dns_entries
    servers.each do |s|
      s.dns_names = all_dns_names.select{|r| r.ip == s.ip}.map{|r| r.fqdn}
    end
  end
end
Please note that caching is not a solution, as I need to always have current data, and the data may be changed outside the application.
You'd want to have a class method that enhances each server found with your data. So, something like:
def index
  servers = Server.where(condition: params[:condition]).where(second: params[:second])
  @servers = Server.with_domain_names(servers)
end
class Server
  def self.with_domain_names(servers)
    all_dns_names = ForeignLibrary.get_all_dns_entries
    servers.each do |s|
      s.dns_names = all_dns_names.select{|r| r.ip == s.ip}.map{|r| r.fqdn}
    end
  end
end
This way, the ForeignLibrary.get_all_dns_entries only gets run once, and you can enhance your servers with that extra information.
If you wanted to do this every time you initialize a server object, I'd simply delegate rather than use after_initialize. You'd effectively store all the DNS entries at the class level and cache them for a period of time, behind the ForeignLibrary.get_all_dns_entries call. So, it would be something like:
class Server
  def dns_names
    ForeignLibrary.dns_for_server(self)
  end
end

class ForeignLibrary
  MUTEX = Mutex.new

  def self.reset
    @@all_dns_names = nil
  end

  def self.dns_for_server(server)
    all_dns_names.select{|r| r.ip == server.ip}.map{|r| r.fqdn}
  end

  def self.all_dns_names
    MUTEX.synchronize do
      @@all_dns_names ||= call_the_library_expensively
    end
  end
end
(I also used a mutex here since we are doing ||= with class variables)
to use it, you would:
class ApplicationController
  before_filter do
    ForeignLibrary.reset # ensure every page load has the absolute latest data
  end
end
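The reset/memoize pattern can be exercised in plain Ruby. One detail worth stressing: the mutex must be a single shared instance created once, since a fresh Mutex.new inside the method would synchronize nothing. A sketch with stand-in names (not the real ForeignLibrary):

```ruby
# Memoized, resettable, thread-safe cache of an expensive call.
class DnsCache
  @mutex = Mutex.new
  @value = nil
  @calls = 0

  class << self
    # Stand-in for call_the_library_expensively: counts invocations
    # so the memoization is observable.
    def expensive_fetch
      @calls += 1
      "result-#{@calls}"
    end

    # Fetch once, then serve the cached value until reset.
    def all_dns_names
      @mutex.synchronize { @value ||= expensive_fetch }
    end

    def reset
      @mutex.synchronize { @value = nil }
    end
  end
end
```

Calling all_dns_names repeatedly hits the expensive call once; after reset, the next call fetches fresh data, matching the per-request reset in the before_filter.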
I have a table with data that needs to be updated at run-time by additional data from an external service. What I'd like to do is something like this:
MyModel.some_custom_scope.some_other_scope.enhance_with_external_data.each do |object|
  puts object.some_attribute_from_external_data_source
end
Even if I can't use this exact syntax, I would like the end result to respect any scopes I may use. I've tried this:
def self.enhance_with_external_data
  external_data = get_external_data
  Enumerator.new do |yielder|
    # mimic some stuff I saw in ActiveRecord and don't quite understand:
    relation.to_a.each do |obj|
      update_obj_with_external_data(obj)
      yielder.yield(obj)
    end
  end
end
This mostly works, except it doesn't respect any previous scopes that were applied, so if I do this:
MyModel.some_custom_scope.some_other_scope.enhance_with_external_data
It gives back ALL MyModels, not just the ones scoped by some_custom_scope and some_other_scope.
Hopefully what I'm trying to do makes sense. Anyone know how to do it, or am I trying to put a square peg in a round hole?
I figured out a way to do this. Kind of ugly, but seems to work:
def self.merge_with_extra_info
  the_scope = scoped
  class << the_scope
    alias :base_to_a :to_a
    def to_a
      MyModel.enhance(base_to_a)
    end
  end
  the_scope
end

def self.enhance(items)
  items.each do |item|
    item = add_extra_item_info(item)
  end
  items
end
What this does is add a class method to my model, which for reasons unknown to me is also available on ActiveRecord::Relation instances. It overrides, just for the current scope object, the to_a method that gets called to fetch the records. That lets me add extra info to each record before returning it. So now I get all the chainability and everything, like:
MyModel.where(:some_attribute => some_value).merge_with_extra_info.limit(10).all
I'd have liked to be able to get at it as it enumerates versus after it's put into an array like here, but couldn't figure out how to get that deep into AR/Arel.
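The per-object override in that answer is a plain singleton-class trick, which can be seen without ActiveRecord at all (a stand-in sketch; the class name and the values are made up):

```ruby
# A stand-in "scope" whose to_a returns records.
class FakeScope
  def to_a
    [1, 2, 3]
  end
end

scope = FakeScope.new

# Override to_a on this one object only, keeping the original
# reachable under a new name, just as the answer does with base_to_a.
class << scope
  alias_method :base_to_a, :to_a
  def to_a
    base_to_a.map { |x| x * 10 }
  end
end
```

Other instances of the class are untouched; only this one object gets the enhanced to_a.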
I achieved something similar to this by extending the relation:
class EnhancedModel < DelegateClass(Model)
  def initialize(model, extra_data)
    super(model)
    @extra_data = extra_data
  end

  def use_extra_data
    @extra_data.whatever
  end
end

module EnhanceResults
  def to_a
    extra_data = get_data_from_external_source(...)
    super.to_a.map do |model_obj|
      EnhancedModel.new(model_obj, extra_data)
    end
  end
end
models = Model.where('condition')
models.extend(EnhanceResults)

models.each do |enhanced_model|
  enhanced_model.use_extra_data
end
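For reference, the DelegateClass wrapper behaves like this in plain Ruby (stand-in classes, no ActiveRecord):

```ruby
require 'delegate'

# Stand-in for the model class being wrapped.
class BaseModel
  def name
    "base"
  end
end

# Wrapper that forwards unknown calls (like #name) to the wrapped
# model while carrying extra data alongside it.
class WrappedModel < DelegateClass(BaseModel)
  def initialize(model, extra_data)
    super(model)
    @extra_data = extra_data
  end

  def use_extra_data
    @extra_data
  end
end

wrapped = WrappedModel.new(BaseModel.new, "dns-info")
```

The wrapped object still answers all the model's methods via delegation, so callers mostly don't notice the wrapper, while use_extra_data exposes the enhanced information.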