Short description
I need to save a field to a table. I used to do this from the controller and it worked perfectly, but now I need to set this field from the service instead. I am using attr_accessor but am not able to get it to work properly.
Long description
I wrote a service (ToolService) that uses an API to create an array of hashes. I previously saved this array to the object via the controller.
Controller:
1: class ToolsController < ApplicationController
2: def create
3: tool_hash = params.delete('tool')
4: @tool = Tool.new
5: # blah blah get params
6: t = ToolService.new(# pass params to initialize service)
7: @tool.all_data = t.run_tool_report(# pass params to get result)
8: end
9: end
Service:
class ToolService
attr_accessor :all_data
def initialize(# params)
# initializing stuff
end
def run_tool_report(# params, including array_of_tools)
@all_data = Array.new # create an array to hold all hashes of data
array_of_tools.each do |each_tool|
  # run all api queries
  @each_tool_data = # hash of query results
  @all_data << @each_tool_data # add each hash of results to array
end
return @all_data
end
end
This works as expected. However, I need to implement delayed_job because this query takes a long time, so in the controller I changed line 7 to t.delay.run_tool_report(# pass params to get result). I thought that including attr_accessor :all_data in the service would allow the service to write to the @tool.all_data field in the table, but this doesn't seem to be the case.
When I use @tool.delay.all_data = t.run_tool_report(# pass params to get result), @tool.all_data is set to the id of the delayed job, not the array of results.
So, am I using attr_accessor incorrectly? Or is there some other way to set this field in the table?
Delayed::Job comes in handy when you want to run a task asynchronously. When you write t.delay.run_tool_report, it creates an entry in the delayed_jobs table to be run in the background, and that Delayed::Job record is what ends up in @tool.all_data. If you want the actual result of run_tool_report, you need to run it without delay and optimise your queries instead. Preloading/eager loading and caching techniques might come in handy.
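To make the distinction concrete, here is a minimal sketch. The run_and_store_report method and report_params are hypothetical names, not part of the question's code, and this assumes all_data is a column (serialized, if it holds an array) on Tool:

job = t.delay.run_tool_report(report_params)
job.class #=> Delayed::Backend::ActiveRecord::Job, not your array of hashes

# One hedged alternative: pass the record into the delayed call and let the
# job itself save the report when it finishes running in the worker.
class ToolService
  def run_and_store_report(tool, report_params)
    # update is Rails 4+; use update_attributes on Rails 3
    tool.update(all_data: run_tool_report(report_params))
  end
end

t.delay.run_and_store_report(@tool, report_params)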
Related
I have a Ruby on Rails app whose controller must process a request that creates many objects. The object data is passed from the client as JSON via a POST request.
Example of my request (log from controller):
Processing by PersonsController#save_all as JSON
Parameters: {"_json"=>[{"date"=>"9/15/2014", "name"=>"John"},
{"date"=>"9/15/2014", "name"=>"Mike"}], "person"=>{}}
So I need to save these two users, but I have some issues:
How do I verify strong parameters here? Only the name and date attributes may be passed from the client.
How can I convert a String to a Date if I use Person.new(params)?
Can I somehow preprocess my JSON? For example, I want to replace name="Mike" with name="Mike User" and only then pass it to my model.
I want to enrich the params of every person by adding some default parameters; for example, I want to add status="new_created" to each person's params.
First of all, I'd name the root param something like "people", so everything lines up with the controller name and the data being sent.
Regarding strong params: the configuration depends on your Rails version. Rails <= 3.x doesn't include them, so you need to add the gem; on Rails >= 4.x they are already part of Rails.
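On Rails 3.x that means adding the official strong_parameters gem to your Gemfile:

# Gemfile (Rails <= 3.x only; built into Rails from 4.0 onwards)
gem 'strong_parameters'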
Next, in your controller you need to define a method that does the filtering of the params you need. It should look something like:
class PeopleController < ApplicationController
def some_action
# Here you can call a service that receives people_params and takes
# care of the creation.
if PeopleService.new(people_params).perform
# some logic
else
# some logic
end
end
private
def base_people_params
params.permit(people: [:name, :date])
end
# If you weren't manipulating the params, you would usually just name
# the method #people_params.
def people_params
base_people_params.merge(people: normalized_params)
end
# In case you decide to manipulate the params, create small methods that
# do each manipulation separately. This way you will still be able to
# understand the logic when returning to this code in a couple of months.
def normalized_params
return [] unless params[:people]
params[:people].each_with_object([]) do |person, result|
result << {
name: normalize_name(person[:name]),
date: normalize_date(person[:date]),
}
end
end
def normalize_date(date)
Time.parse(date)
end
def normalize_name(name)
"#{name} - User"
end
end
If you see that the code starts to get too customized, extract it into a service. That will help keep your controller thin (and healthy).
When you create one resource at a time (and not a batch like here) the code is a bit simpler: you work with hashes instead of arrays. But it's all pretty much the same, as the sketch below shows.
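For comparison, a hypothetical single-resource version of the filtering method, using the same permit list as above:

# One person per request: a hash instead of an array of hashes
def person_params
  params.require(:person).permit(:name, :date)
end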
EDIT:
If you don't need to manipulate a specific param, then just don't:
def normalized_params
return [] unless params[:people]
params[:people].each_with_object([]) do |person, result|
result << {
name: person[:name],
date: normalize_date(person[:date]),
}
end
end
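The PeopleService called in the controller isn't shown in the original answer; here is a minimal sketch of what it could look like. The class name comes from the controller snippet; the perform behaviour, the transaction, and the status default (taken from the question's wish list) are assumptions:

class PeopleService
  def initialize(people_params)
    @people_params = people_params
  end

  # Wraps the batch in a transaction so it either fully succeeds or fully
  # rolls back; "new_created" is the default status the question asked about.
  def perform
    Person.transaction do
      @people_params.fetch(:people, []).each do |attrs|
        Person.create!(attrs.merge(status: 'new_created'))
      end
    end
    true
  rescue ActiveRecord::RecordInvalid
    false
  end
end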
I have built an app that consumes a JSON API. I removed Active Record from my app because the data in the API can theoretically change and I don't want to wipe the database each time.
Right now I have a method called self.all for each class that loops through the JSON creating Ruby objects. I then call that method in various functions in order to work with the data, finding sums and percentages. This all works fine, but seems a bit slow. I was wondering if there is somewhere I should be storing my .all call rather than instantiating new objects for each method that works with the data.
...response was assigned above using HTTParty...
def self.all
  puppies = []
  if response.success?
    response['puppies'].each do |puppy|
      puppies << new(puppy['name'],
                     puppy['price'].to_money,
                     puppy['DOB'])
    end
  else
    raise response.response
  end
  puppies
end
# the methods below only accept arguments to allow testing with Factories
# puppies is passed in as Puppy.all
def self.sum(puppies)
# returns money object
sum = Money.new(0, 'USD')
puppies.each do |puppy|
sum += puppy.price
end
sum
end
def self.prices(puppies)
  puppies.map { |puppy| puppy.price }
end
def self.names(puppies)
  puppies.map { |puppy| puppy.name }
end
....many more methods that take an argument of Puppy.all in the controller....
Should I use caching? Should I bring back Active Record? Or is how I'm doing it fine? Should I store Puppy.all somewhere rather than calling the method each time?
What I guess is happening is that you are making a request with HTTParty every time you call any class method. What you can consider is creating a class variable for the response and a class variable called expires_at. Then you can do some basic caching.
@@expires_at = Time.zone.now
@@http_response = nil

def make_http_call
  renew_http_response if @@expires_at.past?
  @@http_response # return the (possibly refreshed) cached response
end

def renew_http_response
  # make the HTTParty request here
  @@http_response = # HTTParty response
  @@expires_at = 30.minutes.from_now
end

# And in your code, change response to @@http_response
# i.e. response.success? becomes @@http_response.success?
This is all in memory, and you lose everything if you restart your server. If you want more robust caching, the better thing to do would probably be to look into Rails low-level caching.
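A hedged sketch of the low-level variant. Rails.cache.fetch is real Rails API, but the method name, cache key, URL, and 30-minute window are all assumptions carried over from the snippet above:

def self.api_data
  # Cached across requests (and, with a store like memcached or Redis,
  # across restarts too); the block re-runs once the entry expires.
  Rails.cache.fetch('puppies/api_response', expires_in: 30.minutes) do
    HTTParty.get('https://example.com/puppies').parsed_response # hypothetical endpoint
  end
end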
I've set up a simple 'Settings' model so I can create some user-modifiable globals which I can use throughout my app.
This model has var_name and value columns; var_name is the primary key.
I can access these using Setting.find('my_variable_name'), which works, but results in a database query every time.
I'd like to be able to 'cache' the settings once, instead of querying the db every time.
I'm not sure of the correct way to go about this, I'm still new to Rails.
I imagine something like settings = Setting.all in my application controller to put all the settings in a variable, and then some way to pull a single value out of it, calling it with something like settings.my_variable_name...
...But I don't know how :)
Any advice would be appreciated. Thanks.
Try adding this to your Setting model:
class Setting < ActiveRecord::Base
def self.fetch(name)
fetch_all.detect { |s| s.var_name == name }
end
def self.fetch_all
@fetch_all ||= Setting.all
end
def self.reload_all!
@fetch_all = Setting.all
end
end
Create some settings,
Setting.create(var_name: 'domain', value: 'google.com')
Setting.create(var_name: 'subdomain', value: 'www')
You can then fetch a setting using Setting.fetch('domain').value #=> 'google.com'.
Setting.fetch('domain') #=> Hits the database with Setting.all
Setting.fetch('subdomain') #=> Uses the results memoized in @fetch_all
Setting.fetch('domain') #=> Uses the results memoized in @fetch_all
@fetch_all ||= Setting.all will call Setting.all the first time Setting.fetch_all is called and store the results in @fetch_all. Subsequent calls will use the results stored in @fetch_all.
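One caveat worth a quick example: the memoized list lives for the life of the process, so after changing a setting you need the reload_all! method defined above before fetch sees the new value. The update call assumes Rails 4+ (use update_attributes on Rails 3):

Setting.fetch('domain').value #=> 'google.com' (memoized)
Setting.find('domain').update(value: 'example.com')
Setting.reload_all! # re-query so the memoized list picks up the change
Setting.fetch('domain').value #=> 'example.com'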
I have a method that goes through an array to find different APIs and launches a delayed_job instance for every API found, like this:
def refresh_users_list
apis_array.each do |api|
api.myclass.new.delay.get_and_create_or_update_users
end
end
I have an after_filter on the users#index action to trigger this method. This creates many jobs, which will eventually cause too-many-connections problems on Heroku.
I'm wondering if there's a way to check for the presence of a job in the database for each of the APIs the array iterates over. That would be very helpful, because then I could trigger a particular refresh only if that API wasn't updated within a given time.
Any idea how to do this?
In config/application.rb, add the following
config.autoload_paths += Dir["#{config.root}/app/jobs/**/"]
Create a new directory at app/jobs/.
Create a file at app/jobs/api_job.rb that looks like
class ApiJob < Struct.new(:attr1, :attr2, :attr3)
attr_accessor :token
def initialize(*attrs)
  super(*attrs) # let Struct assign attr1, attr2, attr3 before building the token
  self.token = self.class.token(attr1, attr2, attr3)
end
def display_name
self.class.token(attr1, attr2, attr3)
end
#
# Class methods
#
def self.token(attr1, attr2, attr3)
[name.parameterize, attr1.id, attr2.id, attr3.id].join("/")
end
def self.find_by_token(token)
Delayed::Job.where("handler like ?", "%token: #{token}%")
end
end
Note: Replace attr1, attr2, and attr3 with however many attributes you need (if any) to pass to the ApiJob to perform the queued task. More on how to call this in a moment.
For each of your APIs that queues some get_and_create_or_update_users method, you'll create another job class. For example, if I have some Facebook API model, I might have a class at app/jobs/facebook_api_job.rb that looks like:
class FacebookApiJob < ApiJob
def perform
FacebookApi.new.get_and_create_or_update_users(attr1, attr2, attr3)
end
end
Note: In your question you did not pass any attributes to get_and_create_or_update_users; I am just showing where you would do so if the job needs attributes passed to it.
Finally, wherever your refresh_users_list is defined, define something like this job_exists? method
def job_exists?(tokens)
tokens = [tokens] if !tokens.is_a?(Array) # allows a String or Array of tokens to be passed
tokens.each do |token|
return true unless ApiJob.find_by_token(token).empty?
end
false
end
Now, within your refresh_users_list loop, you can build tokens and call job_exists? to check whether a job is already queued for that API. For example:
# Build a token per API and skip any API that already has a queued job
def refresh_users_list
  apis_array.each do |api|
    # api.job_class is a hypothetical lookup that returns the ApiJob
    # subclass for this particular api (e.g. FacebookApiJob)
    token = api.job_class.token(attr1, attr2, attr3)
    next if job_exists?(token)
    # Enqueue the custom job (rather than api.myclass.new.delay...) so the
    # token gets serialized into the handler and find_by_token can match it
    Delayed::Job.enqueue(api.job_class.new(attr1, attr2, attr3))
  end
end
Note: Again, I want to point out that you won't be able to just drop the code above in and have it work. You must tailor it to your application and the jobs you're running.
Why is this so complicated?
From my research, there's no way to "tag" or otherwise uniquely identify a queued job through what delayed_job provides. Sure, each job has a unique :id attribute. You could store the ID values for each created job in some hash somewhere,
{
"FacebookApi": [1, 4, 12],
"TwitterApi": [3, 193, 44],
# ...
}
and then check the corresponding hash key for an ID, but I find this limiting and not always sufficient for the problem. When you need to identify a specific job by multiple attributes, as above, you must create a way to find those jobs without loading every job into memory and looping over them to see if one matches your criteria.
How is this working?
The ApiJob class (built on a Struct) carries a :token attribute. The token is based on the attributes passed (attr1, attr2, attr3) and is built whenever a class extending ApiJob is instantiated.
The find_by_token class method simply searches the string (YAML) representation of each job in the delayed_jobs table for a match, based on a token built with the same token class method.
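A hedged end-to-end example, assuming attr1 through attr3 are ActiveRecord objects with ids (as the token class method implies):

job = FacebookApiJob.new(attr1, attr2, attr3)
job.token #=> something like "facebookapijob/1/4/12"

# Only enqueue when no matching job is already waiting in the queue
Delayed::Job.enqueue(job) if ApiJob.find_by_token(job.token).empty?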
Does anyone know if it's possible to create a model instance and apply the ID and any other attributes without having to load it from the database? I tried doing this, but the associations are not fetched from the database :( Any ideas?
EDIT
What I want to accomplish is simply this:
Fetch an existing record from the database.
Store as "hashed" output of the record into redis or some other memory store.
Next time when that record is fetched, fetch the cached store first and if it is not found then goto step 1.
If there is a cache hit, then load all the cached attributes into that model and make that model instance behave as if it were a model fetched from the database with a finite set of columns.
This is where I am stuck: what I've been doing is creating a Model.new object and setting each of the params manually. This works, but it treats the instantiated model object as a new record. There has got to be an intermediate subroutine in ActiveRecord that does the attribute setting.
I solved the problem by doing the following.
Create a new model class which extends the model class that I want to have cached into memory.
Set the table_name of the new class to the same one as the parent class.
Define a new initialize method that calls super and accepts either an instance of the parent class or a hash containing its properties.
Override the new_record? method to return false so that the associations work.
Here's my code:
class Session < User
self.table_name = 'users'
METHODS = [:id, :username] # all the columns that you wish to have in the memory hash
METHODS.each do |method|
attr_accessor method
end
def initialize(data)
  super({})
  if data.is_a?(User)
    # Copy the cached columns straight off an existing User record
    user = data
    data = {}
    METHODS.each do |key|
      data[key] = user.send(key)
    end
  else
    # Otherwise assume a JSON string coming back from the cache store
    data = JSON.parse(data)
  end
  # Assign each attribute through the attr_accessor writers defined above
  data.each do |key, value|
    key = key.to_s
    self.send(key + '=', value)
  end
end
def new_record?
false
end
end
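A hedged round-trip example of how this class might be used with Redis; the $redis client and the key naming are assumptions, not part of the solution above:

user = User.find(1)

# Serialize just the whitelisted columns into the cache
payload = Session::METHODS.each_with_object({}) { |m, h| h[m] = user.send(m) }
$redis.set("session:#{user.id}", payload.to_json)

# Later: rebuild a database-free, persisted-looking instance
cached = Session.new($redis.get("session:#{user.id}"))
cached.new_record? #=> false, so associations query against the users table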
The memcached gem will allow you to shove arbitrary Ruby objects into it. This should all get handled for you transparently, if you're using it.
Otherwise, take a look at ActiveRecord::Base#instantiate to see how it's done normally. You're going to have to trace through a bunch of the Rails stack, but that's what you get for attempting such hackery!
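For reference, newer Rails versions expose this as a class method. A minimal sketch; the attribute values are illustrative, and availability and behavior vary by Rails version:

# instantiate builds a record that is marked as persisted, so associations
# resolve against the database normally. It expects string keys.
user = User.instantiate('id' => 1, 'username' => 'bob')
user.new_record? #=> false
user.persisted?  #=> true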