I have built an app that consumes a JSON API. I removed Active Record from my app because the data in the API can theoretically change and I don't want to wipe the database each time.
Right now I have a method called self.all for each class that loops through the JSON creating Ruby objects. I then call that method from various other methods in order to work with the data, finding sums and percentages. This all works fine, but it seems a bit slow. I was wondering whether there is somewhere I should be storing the result of my .all call rather than instantiating new objects in every method that works with the data.
...response was assigned above using HTTParty...
def self.all
  puppies = []
  if response.success?
    response['puppies'].each do |puppy|
      puppies << new(puppy['name'],
                     puppy['price'].to_money,
                     puppy['DOB'])
    end
  else
    raise response.response
  end
  puppies
end
# the methods below only accept arguments to allow testing with Factories
# puppies is passed in as Puppy.all
def self.sum(puppies)
  # returns a Money object
  sum = Money.new(0, 'USD')
  puppies.each do |puppy|
    sum += puppy.price
  end
  sum
end

def self.prices(puppies)
  puppies.map { |puppy| puppy.price }
end

def self.names(puppies)
  puppies.map { |puppy| puppy.name }
end
....many more methods that take an argument of Puppy.all in the controller....
Should I use caching? Should I bring back Active Record? Or is how I'm doing it fine? Should I store Puppy.all somewhere rather than calling the method each time?
My guess is that you are making an HTTParty request every time you call any of these class methods. What you can consider is creating a class variable for the response and a class variable called expires_at. Then you can do some basic caching:
@@expires_at = Time.zone.now
@@http_response = nil

def self.make_http_call
  renew_http_response if @@expires_at.past?
end

def self.renew_http_response
  # make the HTTParty request here and assign it:
  @@http_response = nil # replace nil with the HTTParty response
  @@expires_at = 30.minutes.from_now
end

# And in your code, change response to @@http_response,
# i.e. response.success? to @@http_response.success?
This is all in memory, and you lose everything if you restart your server. If you want more robust caching, the better thing to do would probably be to look into Rails low-level caching.
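For example, here is a minimal sketch of low-level caching applied to the Puppy.all method above; the API URL is a placeholder and the cache key and expiry are only illustrative:
def self.all
  # Rails.cache.fetch returns the cached value if present; otherwise it runs
  # the block, stores the result, and returns it.
  Rails.cache.fetch('puppies/all', expires_in: 30.minutes) do
    response = HTTParty.get('https://example.com/api/puppies') # placeholder URL
    raise response.response unless response.success?
    response['puppies'].map do |puppy|
      new(puppy['name'], puppy['price'].to_money, puppy['DOB'])
    end
  end
end
With Rails.cache.fetch the block only runs on a cache miss, so repeated calls within the expiry window reuse the same array of objects (note that caching usually has to be enabled in development before you'll see any effect).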
In my Rails app I send data to the React front end in JSON API standard format via the jsonapi-serializer gem. This works well for data coming from our database, but when we have failing form data, which is sent in the form of Rails nested resources, this data sometimes needs to be returned with errors before the records are actually saved.
When new model data has failing validations, I need those errors to come back to my front end with the associations intact. I've been trying to run them through jsonapi-serializer as well, but records that don't have ids don't make it into the relationships key. The problem is that failing new records don't have an id, so they come back to the front end un-associated.
My only solution so far is to manually shim fake temporary ids in there; the fake id is removed on the front end before re-submission. I'm not sure if I'm missing something obvious. My solution isn't perfect and has limitations and issues.
Is there a built-in way to do this, or am I stuck with my own implementation? Is there some kind of alternate key I could use for this association?
For what it's worth, this is my implementation so far, just to clarify what I'm doing. It's far from perfect, and it's a lot of code for what seems like a problem that should have a built-in solution. It's based on iterating over the object tree being serialized and checking whether a "fake" id is needed. This still chokes on some things that can be passed via include: [].
class BaseSerializer
  module ShimIdConcerns
    extend ActiveSupport::Concern

    private

    # Very important! Controls the relationship of JSON API assocs in the front end.
    # The id that is returned to the front end needs to be smarter than simply
    # the actual id or nil. The id is used to associate jsonapi objects to their
    # main resource whether they have an id yet or not. Sometimes these objects
    # are not persisted, but have errors and need to be associated properly in the UI.
    # Works in conjunction with attribute :is_persisted
    def shim_id_on_failing_new_associations(resource, options)
      shim_id_on_failing_new_associations_recursive resource, options[:include].to_a
    end

    # dot_assocs: expects assocs in 'packages.package_services.service' format
    def shim_id_on_failing_new_associations_recursive(resource, dot_assocs)
      assocs = simplify_assocs(dot_assocs)
      assocs.each do |assoc_path|
        next if assoc_path.blank?
        segments = assoc_path.split('.')
        method = segments.shift
        next unless resource.respond_to?(method)
        assoc = resource.send(method)
        if assoc.is_a? ActiveRecord::Base
          shim_id(resource)
          shim_id_on_failing_new_associations_recursive assoc, segments.join('.')
        elsif assoc.is_a?(ActiveRecord::Associations::CollectionProxy) || assoc.is_a?(Array)
          assoc.each do |one_of_many|
            shim_id_on_failing_new_associations_recursive one_of_many, segments.join('.')
          end
        end
      end
    end

    # Gives a temp id to failing or new resources so they can be associated
    # in jsonapi/react. Ensure this id is not re-submitted, however.
    def shim_id(resource)
      resource.id = rand(1000000000000) if resource.id.nil? && resource.new_record?
    end

    # turns
    #   [ :reservation, :'reservation.client', :'reservation.client.credit_card' ]
    # into
    #   [ "reservation.client.credit_card" ]
    # to avoid duplicate processing
    def simplify_assocs(dot_assocs)
      ap dot_assocs
      all = [dot_assocs].flatten.map(&:to_s)
      simp = []
      # yes, quadratic, but there will be little data
      all.each do |needle|
        matches = 0
        all.each do |hay|
          matches += 1 if hay.index(needle) === 0
        end
        simp << needle if matches === 1
      end
      simp
    end
  end
end
I'm using the OMDB API to learn about using third-party APIs in Rails. I've set up my app so that all I have to input is the movie title, and six other attributes get populated from the OMDB API. All of the method calls that retrieve data from the API are very similar; the only things that change are one word in the method name and one word in the method body. Here is one such call:
app/services/omdb_service.rb
def get_image_by_title(title)
  response = HTTP.get("http://www.omdbapi.com/?t=#{title}&apikey=123456789").to_s
  parsed_response = JSON.parse(response)
  parsed_response['Poster']
end
The things that change are the word after get in the method name and the key in parsed_response['Poster'], depending on which attribute I'm trying to get back.
I thought I could use method_missing to prevent duplication, but I'm having no success with it. Here is my method_missing call:
app/services/omdb_service.rb
def method_missing(method, *args)
  if method.to_s.end_with?('_by_title')
    define_method(method) do |args|
      response = HTTP.get("http://www.omdbapi.com/?t=#{args[0]}&apikey=123456789").to_s
      parsed_response = JSON.parse(response)
      parsed_response['args[1]']
    end
  end
end
Can anyone see what is wrong with my method_missing call?
First of all, let me stress that this isn't necessarily a good use case for method_missing because there doesn't seem to be a way to get self-explanatory method names, parameters and such. Nevertheless, I'll try to answer your question as best as I can.
To start, you need to adapt your method naming to the things that the API gives you, to reduce the number of parameters. In the example you've given, you'd want to change the method call to get_poster_by_t, because poster is the output and t is the input variable, based on the URL and response you've shared.
Following this logic, you'd have to write method_missing like so:
def method_missing(method, *args)
  if method.to_s =~ /\Aget_([^_]+)_by_([^_]+)\z/
    response = HTTP.get("http://www.omdbapi.com/?#{$~[2]}=#{args[0]}&apikey=123456789").to_s
    parsed_response = JSON.parse(response)
    parsed_response[$~[1].capitalize]
  end
end
Then you should also incorporate Ruby's rules for implementing method_missing, namely calling super when your rule doesn't match and also overriding respond_to_missing?. This then gives you:
def method_missing(method, *args)
  if method.to_s =~ /\Aget_([^_]+)_by_([^_]+)\z/
    response = HTTP.get("http://www.omdbapi.com/?#{$~[2]}=#{args[0]}&apikey=123456789").to_s
    parsed_response = JSON.parse(response)
    parsed_response[$~[1].capitalize]
  else
    super
  end
end

def respond_to_missing?(method, *args)
  method.to_s =~ /\Aget_([^_]+)_by_([^_]+)\z/ || super
end
Also see https://makandracards.com/makandra/9821-when-overriding-method_missing-remember-to-override-respond_to_missing-as-well.
Personally, I'd not use method_missing here but instead go with an expressive method call – something like this:
def get_field_by_param(field:, param:, value:)
  response = HTTP.get("http://www.omdbapi.com/?#{param}=#{value}&apikey=123456789").to_s
  parsed_response = JSON.parse(response)
  parsed_response[field]
end
You can then do things like get_field_by_param(field: "Poster", param: :t, value: "Whatever").
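Another option, if you want to keep the short method names without method_missing, is to define the methods up front in the class body. This is just a sketch; the FIELDS map and its keys other than 'Poster' are assumptions you'd adjust to the OMDB fields you actually use:
# Sketch: generate the getters from an (assumed) field map instead of using method_missing.
FIELDS = { image: 'Poster', plot: 'Plot', year: 'Year' }.freeze
FIELDS.each do |name, key|
  define_method("get_#{name}_by_title") do |title|
    response = HTTP.get("http://www.omdbapi.com/?t=#{title}&apikey=123456789").to_s
    JSON.parse(response)[key]
  end
end
These are real methods, so respond_to? works without any extra effort, at the cost of listing the fields explicitly.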
I have a Ruby on Rails app and my controller should process a request which creates many objects. The objects' data is passed from the client as JSON via a POST request.
Example of my request (log from controller):
Processing by PersonsController#save_all as JSON
Parameters: {"_json"=>[{"date"=>"9/15/2014", "name"=>"John"},
{"date"=>"9/15/2014", "name"=>"Mike"}], "person"=>{}}
So I need to save these two users, but I have some issues:
How do I verify strong parameters here? Only the Name and Date attributes can be passed from the client.
How can I convert the String to a Date if I use Person.new(params)?
Can I somehow preprocess my JSON? For example, I want to replace name="Mike" with name="Mike User" and only then pass it to my model.
I want to enrich the params of every person by adding some default parameters; for example, I want to add status="new_created" to the person params.
First of all, I'd name the root param something like "people", so the structure ties together the controller name and the data being sent.
Regarding strong params: the setup depends on your Rails version. Rails <= 3.x doesn't include strong parameters, so you need to add the gem; on Rails >= 4.x it's already part of the framework.
Next, in your controller you need to define a method that will do the filtering of the params you need. It should look something like this:
class PeopleController < ApplicationController
  def some_action
    # Here you can call a service that receives people_params and takes
    # care of the creation.
    if PeopleService.new(people_params).perform
      # some logic
    else
      # some logic
    end
  end

  private

  def base_people_params
    params.permit(people: [:name, :date])
  end

  # Usually, if you don't want to manipulate the params, you'd just call
  # this method #people_params
  def people_params
    base_people_params.merge(people: normalized_params)
  end

  # In case you decide to manipulate the params, create small methods that
  # do that separately. This way you will be able to understand this logic
  # when returning to this code in a couple of months.
  def normalized_params
    return [] unless params[:people]
    params[:people].each_with_object([]) do |person, result|
      result << {
        name: normalize_name(person[:name]),
        date: normalize_date(person[:date]),
      }
    end
  end

  def normalize_date(date)
    Time.parse(date)
  end

  def normalize_name(name)
    "#{name} - User"
  end
end
If you see that the code starts to get too customized, extract it into a service. It will help keep your controller thin (and healthy).
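PeopleService above isn't a Rails built-in; it's just a plain Ruby object you'd write yourself. A minimal sketch, where the class name, the perform method, and the status: 'new_created' default (from your fourth point) are all assumptions to adjust:
# app/services/people_service.rb
# Hypothetical service object: builds one Person per entry and reports overall success.
class PeopleService
  def initialize(people_params)
    @people_params = people_params[:people] || []
  end

  def perform
    @people_params.all? do |attrs|
      # add any defaults here, e.g. the status from the question
      Person.create(attrs.merge(status: 'new_created')).persisted?
    end
  end
end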
When you create one record at a time (and not a batch like here), the code is a bit simpler; you work with hashes instead of arrays, but it's all pretty much the same.
EDIT:
If you don't need to manipulate a specific param, then just don't:
def normalized_params
  return [] unless params[:people]
  params[:people].each_with_object([]) do |person, result|
    result << {
      name: person[:name],
      date: normalize_date(person[:date]),
    }
  end
end
In my Rails app, I'm trying to take my working API calls and have them handled by background workers.
I have the following in app/jobs/api_request_job.rb:
class ApiRequestJob
  def self.perform(params)
    Query.new(params).start
  end
end
The Query class is where the HTTParty requests are executed (there are lots of methods for different query types with the same basic format as the parks method):
require 'ostruct'

class Query
  include FourSquare

  attr_reader :results,
              :first_address,
              :second_address,
              :queries,
              :radius

  def initialize(params)
    @results = OpenStruct.new
    @queries = params["query"]
    @first_address = params["first_address"]
    @second_address = params["second_address"]
    @radius = params["radius"].to_f
  end

  def start
    queries.keys.each do |query|
      results[query] = self.send(query)
    end
    results
  end

  def parks
    category_id = "4bf58dd8d48988d163941735"
    first_address_results = FourSquare.send_request(@first_address, radius_to_meters, category_id)["response"]["venues"]
    second_address_results = FourSquare.send_request(@second_address, radius_to_meters, category_id)["response"]["venues"]
    response = [first_address_results, second_address_results]
  end

  # ... more query methods with the same basic format ...
end
And, finally, the controller. Before trying to farm this action out to background workers, this line was working fine: @results = Query.new(params).start
class ComparisonsController < ApplicationController
  attr_reader :first_address, :second_address

  def new
  end

  def show
    @first_address = Address.new(params["first_address"])
    @second_address = Address.new(params["second_address"])
    if @first_address.invalid?
      flash[:notice] = @first_address.errors.full_messages
      redirect_to :back
    elsif Query.new(params).queries.nil?
      flash[:notice] = "You must choose at least one criteria for your comparison."
      redirect_to comparisons_new_path(request.params)
    else
      @queries = params["query"].keys
      @results = Resque.enqueue(ApiRequestJob, params) # <-- this is where I'm stuck
    end
  end
end
I'm running Redis, have Resque installed, and am running the task / starting the workers. The current value being returned for @results is true instead of the hash of results I need to get back. Is there a way to have the results of the Resque job persist and return data instead of true? What am I missing about how to get background workers to return the same kind of data my regular API calls were returning?
Many thanks in advance!
The true you are receiving means the job was enqueued successfully. The worker will pick it up and run it in the background asynchronously, which means not at the same time as the thread that enqueued it. So there's no way to retrieve the job's return value from the enqueue call.
If you need the value from that process, you have to run it from the controller without the worker. Also, you wouldn't gain anything from just pushing the work off to another process, because the web process would have to wait for the response before it could continue anyway.
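That said, if you do want the job's output to be available later, a common pattern is to have the job write its result somewhere the web process can read (the database or Rails.cache) and poll for it. A rough sketch, where the request_id key scheme is an assumption rather than something in your current code:
# Sketch: the job stores its result under a key the controller can poll later.
class ApiRequestJob
  @queue = :api_requests

  def self.perform(params)
    result = Query.new(params).start
    Rails.cache.write("api_request/#{params['request_id']}", result.to_h, expires_in: 1.hour)
  end
end
# Later, in a polling controller action (illustrative):
#   result = Rails.cache.read("api_request/#{params[:request_id]}")
#   render json: (result || { status: 'pending' })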
If you need that returned value right away and are doing this for performance reasons, then you could look into other forms of concurrency, like having another thread do the request and only grabbing the result when you need it in the view, like:
class AsyncValue
  def initialize(&block)
    @thr = Thread.new(&block)
  end

  def value
    # Thread#value joins the thread and returns the block's return value
    @thr.value
  end
end
In the controller:
@results = AsyncValue.new { Query.new(params).start }
and in the view:
<%= @results.value.each .... %>
but you'd still have to work out error handling, which can get pretty complicated, though it is doable.
Personally, I'd just make the request in place, but you know your domain better than me.
I have a callback on my ActiveRecord model as shown below:
before_save :sync_to_external_apis
def sync_to_external_apis
  [user, assoc_user].each do |cuser|
    if cuser.google_refresh
      display_user = other_user(cuser.id)
      api = Google.new(:user => cuser)
      contact = api.sync_user(display_user)
    end
  end
end
I would like to write an rspec test which tests that calling save! on an instance of this model causes sync_user to be called on a new Google instance when google_refresh is true. How could I do this?
it "should sync to external apis on save!" do
model = Model.new
model.expects(:sync_to_external_apis)
model.save!
end
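Since you asked about RSpec specifically: the same idea with rspec-mocks looks roughly like this, going one step further by stubbing Google.new so you can check the sync_user call. The doubles are illustrative and assume a valid model whose user and assoc_user both have google_refresh set to true:
it "syncs users to Google on save!" do
  api = double('Google')                          # stand-in for the real Google API wrapper
  allow(Google).to receive(:new).and_return(api)  # every Google.new returns the double
  expect(api).to receive(:sync_user).twice        # once per user in the callback
  model.save!                                     # `model` assumed to be set up elsewhere in the spec
end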
As an aside, requesting unreliable resources like the internet during the request-response cycle is a bad idea. I would suggest creating a background job instead.
The usual method for testing is to ensure the results are as expected. Since you're using an API in this case, that may complicate things. You may find that using Mocha to create mock objects you can send API calls to lets you substitute the Google class with something that works just as well for testing purposes.
A simpler, yet clunkier approach is to have a "test mode" switch:
def sync_to_external_apis
  [ user, assoc_user ].each do |cuser|
    if (Rails.env.test?)
      @synced_users ||= [ ]
      @synced_users << cuser
    else
      # ...
    end
  end
end

def did_sync_user?(cuser)
  @synced_users and @synced_users.include?(cuser)
end
This is a straightforward approach, but it will not validate that your API calls are being made correctly.
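For illustration, a test against that switch might look like this (the model setup is assumed to exist elsewhere, e.g. in your fixtures or setup block):
def test_syncs_both_users_on_save
  model.save!   # `model` assumed to be built in setup with user and assoc_user
  assert model.did_sync_user?(model.user)
  assert model.did_sync_user?(model.assoc_user)
end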
Mocha is the way to go. I'm not familiar with RSpec, but this is how you would do it in Test::Unit:
def test_google_api_gets_called_for_user_and_assoc_user
  user = mock('User')             # define a mock object and label it 'User'
  assoc_user = mock('AssocUser')  # define a mock object and label it 'AssocUser'
  # instantiate the model you're testing with the mock objects
  model = Model.new(user, assoc_user)
  # stub out the other_user method. It will return cuser1 when the mock user is
  # passed in and cuser2 when the mock assoc_user is passed in
  cuser1 = mock('Cuser1')
  cuser2 = mock('Cuser2')
  model.expects(:other_user).with(user).returns(cuser1)
  model.expects(:other_user).with(assoc_user).returns(cuser2)
  # set the expectations on the Google API
  api1 = mock('GoogleApiUser1') # define a mock object and label it 'GoogleApiUser1'
  api2 = mock('GoogleApiUser2') # define a mock object and label it 'GoogleApiUser2'
  # calling new on Google with each mock user returns a mock Google api object
  Google.expects(:new).with(:user => cuser1).returns(api1)
  api1.expects(:sync_user).with(cuser1)
  Google.expects(:new).with(:user => cuser2).returns(api2)
  api2.expects(:sync_user).with(cuser2)
  # now execute the code which should satisfy all the expectations above
  model.save!
end
The above may seem complicated, but it's not once you get the hang of it. You're testing that when you call save, your model does what it is supposed to do, but you don't have the hassle, or time expense of really talking to APIs, instantiating database records, etc.