How to flatten data between Jbuilder and Normalizr - normalization

I'm working on a small Vue project with a Rails API and am having trouble figuring out the best way to normalize my data. I'm using Jbuilder on the backend and want to flatten/normalize the data that comes through so that it's easier to access on the frontend, but I fear I'm just not quite "getting" how to make Normalizr work correctly.
Currently, the Jbuilder view pulls together several things related to a user that I eventually want stored in Vuex.
get_current_user.json.jbuilder
json.user @user
# All of the user's lists and associated categories and items
json.lists @user.lists do |list|
  json.(list, :id, :name, :active, :user_id)
  json.categories list.categories do |category|
    json.(category, :id, :name, :user_id, :created_at, :updated_at, :list_id)
    json.items category.items do |item|
      json.(item, :id, :name, :website, :price, :information, :notes, :category_id, :quantity)
    end
  end
end
# User's total hiking miles
json.total_m @user.m_by_month
# User's upcoming trips properties
json.upcoming_trips @user.upcoming_trips do |trip|
  json.(trip, :id, :name, :start_date, :end_date, :data_id, :list_id)
  # User's todo items
  json.todos trip.todo_items.sort_by { |t| t[:position] } do |todo|
    json.(todo, :id, :title, :completed, :position, :upcoming_trip_id, :user_id)
  end
  # User's shopping list items
  json.shopping_list_items trip.shopping_list_items.sort_by { |s| s[:position] } do |item|
    json.(item, :id, :title, :completed, :position, :quantity, :upcoming_trip_id, :user_id)
  end
  # User's trip details
  json.trip_details trip.trip_details.order(updated_at: :desc) do |detail|
    json.(detail, :id, :trip_type, :label, :url, :upcoming_trip_id)
  end
end
So then on the front end, before normalizing, the user object is essentially:
- user (Object)
- packs (Array)
- categories (Array)
- items (Array)
- total_m (Array)
- upcoming_trips (Array)
- shopping_list_items (Array)
- todos (Array)
- trip_details (Array)
- user (Object)
schema.js
import { schema } from 'normalizr';
// Define user schema
export const userSchema = new schema.Entity('user');
export const itemSchema = new schema.Entity('items', {
  user: userSchema,
});
export const listSchema = new schema.Entity('lists', {
  user: userSchema,
  items: [itemSchema],
});
// Define category schema
export const categorySchema = new schema.Entity('categories', {
  user: userSchema,
  packs: [listSchema],
});
// Define upcoming trip schema
export const upcomingTripSchema = new schema.Entity('upcomingTrips', {
  user: userSchema,
  categories: [categorySchema],
});
After calling const normalizedUser = normalize(user, userSchema), I'll receive something like:
{
- entities (Object)
- user (Object)
- undefined (Object)
- lists (Array)
- total_m (Array)
- upcoming_trips (Array)
- user (Object)
- result: undefined
}
The good thing is that since I have control over the front- and back-end, I can modify it however I want. Does anyone have any suggestions? I would greatly appreciate the help!

Thank you for that interesting question. If this were me, I would use the power of the back-end ActiveRecord queries and JBuilder to produce the JSON out of the API exactly as I want to consume it, which makes normalizr as a secondary filter unnecessary.
One thing that concerns me is the duplication of schema needed to make normalizr work: you have the schema in Rails and then another schema in JS; changing one necessitates changing the other, and keeping them aligned creates integration problems.
As such, this isn't quite an answer but rather a question and a suggestion: what is normalizr doing that can't be done with ARel and JBuilder? Given the Rails schema and a sample of the target JSON, someone could suggest the ARel/JBuilder code to produce it.
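For what it's worth, here is a minimal sketch of a jbuilder view that emits pre-flattened entities keyed by id, roughly mirroring normalizr's entities shape. The attribute lists and association names are assumptions lifted from the question, so treat it as a starting point rather than a drop-in replacement:
# get_current_user.json.jbuilder -- a sketch, assuming the associations shown in the question
json.user do
  json.(@user, :id)
end
json.lists do
  @user.lists.each do |list|
    json.set! list.id do
      json.(list, :id, :name, :active, :user_id)
      json.category_ids list.categories.map(&:id)
    end
  end
end
json.categories do
  @user.lists.flat_map(&:categories).each do |category|
    json.set! category.id do
      json.(category, :id, :name, :list_id)
      json.item_ids category.items.map(&:id)
    end
  end
end
With that shape coming straight from the API, the Vuex store can index directly into state.lists[id] and so on, and the normalizr pass (with its duplicated schema) goes away.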

Related

activeadmin and dynamic store accessors fails on new resource

I want to generate forms for a resource that has a postgres jsonb column :data, and I want the schema for these forms to be stored in a table in the database. After a lot of research I am 90% there but my method fails in ActiveAdmin forms upon create (not update). Can anyone explain this?
Sorry for the long code snippets. This is a fairly elaborate setup, but I think it should be of some interest, since if it works one could build arbitrary new schemas dynamically without hard-coding.
I am following along with this previous discussion, using Rails 6, ActiveAdmin 2.6.1, and Ruby 2.6.5.
I want to store JSON Schemas in a table, SampleActionSchema, that belongs to SampleAction (using the json-schema gem for validation):
class SampleActionSchema < ApplicationRecord
  validates :category, uniqueness: { case_sensitive: false }, allow_nil: false, allow_blank: true
  validate :schema_is_json_schema

  private

  def schema_is_json_schema
    metaschema = JSON::Validator.validator_for_name("draft4").metaschema
    unless JSON::Validator.validate(metaschema, schema)
      errors.add :schema, 'not a compliant json schema'
    end
  end
end
class SampleAction < ActiveRecord::Base
  belongs_to :sample

  validate :is_sample_action
  validates :name, uniqueness: { case_sensitive: false }

  after_initialize :add_field_accessors
  before_create :add_field_accessors
  before_update :add_field_accessors

  def add_store_accessor(field_name)
    singleton_class.class_eval { store_accessor :data, field_name.to_sym }
  end

  def add_field_accessors
    num_fields = schema_properties.try(:keys).try(:count) || 0
    schema_properties.keys.each { |field_name| add_store_accessor field_name } if num_fields > 0
  end

  def schema_properties
    schema_arr = SampleActionSchema.where(category: category)
    if schema_arr.size > 0
      sc = schema_arr[0]
      if !sc.schema.empty?
        sc.schema["properties"]
      else
        []
      end
    else
      []
    end
  end

  private

  def is_sample_action
    sa = SampleActionSchema.where(category: category)
    errors.add :category, 'not a known sample action' unless (sa.size > 0)
    errors.add :base, 'incorrect json format' unless (sa.size > 0) && JSON::Validator.validate(sa[0].schema, data)
  end
end
This all works correctly. For example, for a simple schema with category: "cleave", where :data looks like data: { quality: "good" }, I can create a resource as follows in the Rails console:
sa = SampleAction.new(sample_id: 6, name: "test0", data: {}, category: "cleave")
=> #<SampleAction id: nil, name: "test0", category: "cleave", data: {}, created_at: nil, updated_at: nil, sample_id: 6>
sa.quality = "good"  # => "good"
sa.save              # => true
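For reference, here is a hypothetical example of what a "cleave" schema row might contain; the exact JSON Schema content is an assumption, since the code above only relies on schema["properties"] being a hash of field definitions:
# Hypothetical "cleave" schema row -- only the "properties" hash is actually
# read by SampleAction#schema_properties and by permit_params below.
SampleActionSchema.create!(
  category: "cleave",
  schema: {
    "$schema"    => "http://json-schema.org/draft-04/schema#",
    "type"       => "object",
    "properties" => { "quality" => { "type" => "string" } }
  }
)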
To make this system work in AA forms, I call the normal path (new or edit)_admix_sample_action_form with params: {category: "cleave"} and then I generate permit_params dynamically:
ActiveAdmin.register SampleAction, namespace: :admix do
  permit_params do
    prms = [:name, :category, :data, :sample_id, :created_at, :updated_at]
    # the first case is creating a new record (gets the parameter from admix/sample_actions/new?category="xxx")
    # the second case is updating an existing record
    # falls back to blank (no extra parameters)
    categ = @_params[:category] || (@_params[:sample_action][:category] if @_params[:sample_action]) || nil
    cat = SampleActionSchema.where(category: categ)
    if cat.size > 0 && !cat[0].schema.empty?
      cat[0].schema["properties"].each do |key, value|
        prms += [key.to_sym]
      end
    end
    prms
  end
  form do |f|
    f.semantic_errors
    new = f.object.new_record?
    cat = params[:category] || f.object.category
    f.object.category = cat if cat && new
    f.object.add_field_accessors if new
    sas = SampleActionSchema.where(category: cat)
    is_schema = (sas.size > 0) && !sas[0].schema.empty?
    if session[:active_sample]
      f.object.sample_id = session[:active_sample]
    end
    f.inputs "Sample Action" do
      f.input :sample_id
      f.input :name
      f.input :category
      if !is_schema
        f.input :data, as: :jsonb
      else
        f.object.schema_properties.each do |key, value|
          f.input key.to_sym, as: :string
        end
      end
    end
    f.actions
  end
end
Everything works fine if I am editing an existing resource (as created in the console above). The form is displayed and all the dynamic fields are updated upon submit. But when creating a new resource where e.g. :data is of the form data: {quality: "good"} I get
ActiveModel::UnknownAttributeError in Admix::SampleActionsController#create
unknown attribute 'quality' for SampleAction.
I have tried both adding the accessors in the form and overriding the new action to add the accessors after initialization (neither should be needed, because the ActiveRecord callback appears to do the job at the right time):
def new
  build_resource
  resource.add_field_accessors
  new!
end
Somehow when the resource is created in the AA controller, it seems impossible to get the accessors stored even though it works fine in the console. Does anyone have a strategy to initialize the resource correctly?
SOLUTION:
I traced what AA was doing to figure out the minimum number of commands needed. It was necessary to add code to build_new_resource to ensure that any new resource AA built had the correct :category field and, once that was done, to call the method that dynamically adds the store_accessor keys to the newly built instance.
Now users can create their own original schemas and records that use them, without any further programming! I hope others find this useful, I certainly will.
There are a couple of ugly parts to this solution. One is that adding the parameter to the ActiveAdmin new route call is not expected by AA, but it still works; I guess this parameter could be passed some other way, but quick and dirty does the job. The other is that I had to have the form set a session variable recording which schema was used, so that the post-form-submission build knows about it, since pressing the "Create Move" button clears the params from the URL.
The operations are as follows: for a model called Move with a field :data that should be dynamically serialized into fields according to the JSON schema tables, both
admin/moves/new?category="cleave" and admin/moves/:id/edit find the "cleave" schema from the schema table and correctly create and populate a form with the serialized parameters. And direct writes to the db
m = Move.new(category: "cleave")
m.update(name: "t2", quality: "fine")  # => true
work as expected. The schema table is defined as:
require "json-schema"
class SampleActionSchema < ApplicationRecord
validates :category, uniqueness: { case_sensitive: false }, allow_nil: false, allow_blank: true
validate :schema_is_json_schema
def self.schema_keys(categ)
sas=SampleActionSchema.find_by(category: categ)
schema_keys= sas.nil? ? [] : sas[:schema]["properties"].keys.map{|k| k.to_sym}
end
private
def schema_is_json_schema
metaschema = JSON::Validator.validator_for_name("draft4").metaschema
unless JSON::Validator.validate(metaschema, schema)
errors.add :schema, 'not a compliant json schema'
end
end
end
The Move table that employs this schema is:
class Move < ApplicationRecord
  after_initialize :add_field_accessors

  def add_field_accessors
    if category != ""
      keys = SampleActionSchema.schema_keys(category)
      keys.each { |k| singleton_class.class_eval { store_accessor :data, k } }
    end
  end
end
Finally, the working controller:
ActiveAdmin.register Move do
  permit_params do
    # choice 1 is for new records, choice 2 is for editing existing ones
    categ = @_params[:category] || (@_params[:move][:category] if @_params[:move]) || ""
    keys = SampleActionSchema.schema_keys(categ)
    prms = [:name, :data] + keys
  end

  form do |f|
    new = f.object.new_record?
    f.object.category = params[:category] if new
    if new
      session[:current_category] = params[:category]
      f.object.add_field_accessors
    else
      session[:current_category] = ""
    end
    keys = SampleActionSchema.schema_keys(f.object.category)
    f.inputs do
      f.input :name
      f.input :category
      keys.each { |k| f.input k }
    end
    f.actions
  end

  controller do
    def build_new_resource
      r = super
      r.assign_attributes(category: session[:current_category])
      r.add_field_accessors
      r
    end
  end
end

Programmatically get all database column types

I am building a Rails gem for which I might need to know the currently available column types. So say for Postgres, I am looking for something like: ActiveRecord::Base.available_column_types. I looked through the source with no success so far.
I can't find an ActiveRecord method to get what you want. But I can show you two ways you can achieve this:
With either option, you need to create an initializer and monkey-patch ActiveRecord, for example /config/initializers/active_record_extensions.rb. Then, the options:
OPTION 1: get data types based on your models
class ActiveRecord::Base
  def self.available_column_types
    types = []
    ActiveRecord::Base.subclasses.collect { |type| type.name }.each do |model_name|
      types += model_name.constantize.columns.map(&:type)
    end
    types.uniq
  end
end
Then you can do rails console on your terminal and write:
irb(main):001:0> User.available_column_types
=> [:integer, :string, :text, :datetime, :boolean, :date, :hstore]
irb(main):002:0> ActiveRecord::Base.available_column_types
=> [:integer, :string, :text, :datetime, :boolean, :date, :hstore]
irb(main):003:0>
OPTION 2: get all possible data types based on your db adapter
class ActiveRecord::Base
  def self.available_column_types
    if defined?(ActiveRecord::ConnectionAdapters::PostgreSQLAdapter) &&
       ActiveRecord::Base.connection.instance_of?(ActiveRecord::ConnectionAdapters::PostgreSQLAdapter)
      types = ActiveRecord::Base.connection.execute("select * from pg_type;")
      # there is a lot of info on the pg_type table; you can get whatever you need
      return types.inject([]) { |result, record| result << record["typname"] }
    end

    if defined?(ActiveRecord::ConnectionAdapters::MysqlAdapter) &&
       ActiveRecord::Base.connection.instance_of?(ActiveRecord::ConnectionAdapters::MysqlAdapter)
      # I don't know, it's just an example. You can add all the adapters you want.
      return
    end

    # maybe raise an exception with a NO ADAPTER! message
  end
end
Once again, on your console, you can do ActiveRecord::Base.available_column_types to see the result.
Note: you need to adapt this in order to make it work with your gem.
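Another option, depending on what the gem actually needs: ask the connection adapter for its native type map. This isn't part of the answer above, but native_database_types is a public method on the adapters, so something like the following should work (the exact keys vary by adapter and Rails version):
# Abstract column types the current adapter knows how to create,
# e.g. [:primary_key, :string, :text, :integer, :float, :decimal, :datetime, ...]
ActiveRecord::Base.connection.native_database_types.keys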

Rails: use existing model validation rules against a collection instead of the database table

Rails 4, Mongoid instead of ActiveRecord (but this shouldn't change anything for the sake of the question).
Let's say I have a MyModel domain class with some validation rules:
class MyModel
  include Mongoid::Document

  field :text, type: String
  field :type, type: String

  belongs_to :parent

  validates :text, presence: true
  validates :type, inclusion: %w(A B C)
  validates_uniqueness_of :text, scope: :parent # important validation rule for the purpose of the question
end
where Parent is another domain class:
class Parent
  include Mongoid::Document

  field :name, type: String

  has_many :my_models
end
Also I have the related tables in the database populated with some valid data.
Now, I want to import some data from a CSV file, which can conflict with the existing data in the database. The easy thing to do is to create an instance of MyModel for every row in the CSV, verify whether it's valid, and then save it to the database (or discard it).
Something like this:
csv_rows.each do |data| # simplified
  my_model = MyModel.new(data) # data is the hash with the values taken from the CSV row
  if my_model.valid?
    my_model.save validate: false
  else
    # do something useful, but not interesting for the question's purpose
    # just know that I need to separate validation from saving
  end
end
Now, this works pretty smoothly for a limited amount of data. But when the CSV contains hundreds of thousands of rows, this gets quite slow, because (worst case) there's a write operation for every row.
What I'd like to do, is to store the list of valid items and save them all at the end of the file parsing process. So, nothing complicated:
valids = []
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid? # THE INTERESTING LINE: this "if" checks only against the database; what happens if it conflicts with some other my_models not saved yet?
    valids << my_model
  else
    # ...
  end
end

if valids.size > 0
  # bulk insert of all data
end
That would be perfect, if I could be sure that the data in the CSV does not contain duplicated rows or data that goes against the validation rules of MyModel.
My question is: how can I check each row against the database AND the valids array, without having to repeat the validation rules defined into MyModel (avoiding to have them duplicated)?
Is there a different (more efficient) approach I'm not considering?
What you can do is validate the models, save the attributes in a hash, push them to the valids array, then do a bulk insert of the values using MongoDB's insert:
valids = []
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid?
    valids << my_model.attributes
  end
end
MyModel.collection.insert(valids, continue_on_error: true)
This won't, however, prevent NEW duplicates within the CSV itself... for that you could do something like the following, using a hash and a compound key:
valids = {}
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid?
    valids["#{my_model.text}_#{my_model.parent_id}"] = my_model.as_document
  end
end
Then either of the following will work, DB Agnostic:
MyModel.create(valids.values)
Or MongoDB'ish:
MyModel.collection.insert(valids.values, continue_on_error: true)
OR EVEN BETTER
Ensure you have a unique index on the collection:
class MyModel
  ...
  index({ text: 1, parent: 1 }, { unique: true, dropDups: true })
  ...
end
Then Just do the following:
MyModel.collection.insert(csv_rows, continue_on_error: true)
http://api.mongodb.org/ruby/current/Mongo/Collection.html#insert-instance_method
http://mongoid.org/en/mongoid/docs/indexing.html
TIP: if you anticipate thousands of rows, I recommend doing this in batches of 500 or so.
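A rough sketch of that batching idea, reusing the validation loop and the insert call from above (500 is just the suggested batch size):
valids = []
csv_rows.each do |data|
  my_model = MyModel.new(data)
  valids << my_model.attributes if my_model.valid?
end
# write in batches of ~500 so a single huge insert isn't sent in one go
valids.each_slice(500) do |batch|
  MyModel.collection.insert(batch, continue_on_error: true)
end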

Getting rails3-autocomplete-jquery gem to work nicely with Simple_Form with multiple inputs

So I am trying to implement multiple autocomplete using this gem and simple_form and am getting an error.
I tried this:
<%= f.input_field :neighborhood_id, collection: Neighborhood.order(:name), :url => autocomplete_neighborhood_name_searches_path, :as => :autocomplete, 'data-delimiter' => ',', :multiple => true, :class => "span8" %>
This is the error I get:
undefined method `to_i' for ["Alley Park, Madison"]:Array
In my params, it is sending this in neighborhood_id:
"search"=>{"neighborhood_id"=>["Alley Park, Madison"],
So it isn't even using the IDs for those values.
Does anyone have any ideas?
Edit 1:
In response to @jvnill's question, I am not explicitly doing anything with params[:search] in the controller. A search creates a new record and searches listings.
In my Searches Controller, create action, I am simply doing this:
@search = Search.create!(params[:search])
Then my search.rb (i.e. search model) has this:
def listings
  @listings ||= find_listings
end

private

def find_listings
  key = "%#{keywords}%"
  listings = Listing.order(:headline)
  listings = listings.includes(:neighborhood).where("listings.headline like ? or neighborhoods.name like ?", key, key) if keywords.present?
  listings = listings.where(neighborhood_id: neighborhood_id) if neighborhood_id.present?
  # truncated for brevity
  listings
end
First of all, this would be easier if the form returned the ids instead of the names of the neighborhoods. I haven't used the gem yet, so I'm not familiar with how it works. The readme says it will return ids, but I don't know why you're only getting names. I'm sure once you figure out how to return the ids, you'll be able to change the code below to suit that.
You need to create a join table between a neighborhood and a search. Let's call that search_neighborhoods.
rails g model search_neighborhood neighborhood_id:integer search_id:integer
# don't forget to add indexes in the migration (see the sketch below)
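For reference, the generated migration with those indexes added might look roughly like this (a sketch; adjust to your Rails version's migration syntax):
class CreateSearchNeighborhoods < ActiveRecord::Migration
  def change
    create_table :search_neighborhoods do |t|
      t.integer :neighborhood_id
      t.integer :search_id
      t.timestamps
    end
    # the indexes mentioned above
    add_index :search_neighborhoods, :neighborhood_id
    add_index :search_neighborhoods, :search_id
  end
end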
After that, you'd want to set up your models.
# search.rb
has_many :search_neighborhoods
has_many :neighborhoods, through: :search_neighborhoods
# search_neighborhood.rb
belongs_to :search
belongs_to :neighborhood
# neighborhood.rb
has_many :search_neighborhoods
has_many :searches, through: :search_neighborhoods
Now that we've set up the associations, we need to set up the setters and the attributes.
# search.rb
attr_accessible :neighborhood_names

# this will return a list of neighborhood names, which is useful for prepopulating
def neighborhood_names
  neighborhoods.map(&:name).join(',')
end

# we will use this to find the ids of the neighborhoods given their names
# this will be called when you call create!
def neighborhood_names=(names)
  names.split(',').each do |name|
    next if name.blank?
    if neighborhood = Neighborhood.find_by_name(name)
      search_neighborhoods.build neighborhood_id: neighborhood.id
    end
  end
end
# view
# you need to change your autocomplete to use the getter method
<%= f.input :neighborhood_names, url: autocomplete_neighborhood_name_searches_path, as: :autocomplete, input_html: { data: { delimiter: ',' }, multiple: true, class: "span8" } %>
Last but not least, update find_listings:
def find_listings
  key = "%#{keywords}%"
  listings = Listing.order(:headline).includes(:neighborhood)
  if keywords.present?
    listings = listings.where("listings.headline LIKE :key OR neighborhoods.name LIKE :key", key: key)
  end
  if neighborhoods.exists?
    listings = listings.where(neighborhood_id: neighborhood_ids)
  end
  listings
end
And that's it :)
UPDATE: using f.input_field
# view
<%= f.input_field :neighborhood_names, url: autocomplete_neighborhood_name_searches_path, as: :autocomplete, data: { delimiter: ',' }, multiple: true, class: "span8" %>
# model
# we need to use names[0] because the field returns an array with a single element
# containing the string of comma-separated neighborhoods
def neighborhood_names=(names)
  names[0].split(',').each do |name|
    next if name.blank?
    if neighborhood = Neighborhood.find_by_name(name)
      search_neighborhoods.build neighborhood_id: neighborhood.id
    end
  end
end
Your problem is how you're collecting values from the Neighborhood model.
Neighborhood.order(:name)
will return the neighborhood records ordered by name; you need to also collect the id, while displaying just the names.
Use collect and pass a block. I believe this might work for you:
Neighborhood.all.collect { |n| [n.name, n.id] }
Declare a scope on the Neighborhood class to order it by name if you'd like to get that functionality back, as that behavior belongs in the model anyhow.
Edit:
To add a scope/class method to the Neighborhood model, you'd typically do something like this:
scope :desc, order("name DESC")
Then you can write something like:
Neighborhood.desc.all
which will return an array, thus allowing .collect, but there are other ways to get those name and id attributes recognized by the select option.
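If you go that route with a plain select (rather than the autocomplete input), a sketch of feeding those [name, id] pairs into the simple_form input might look like this; the name is shown to the user and the id is what gets submitted:
<%= f.input :neighborhood_id, collection: Neighborhood.order(:name).collect { |n| [n.name, n.id] }, include_blank: true %>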

Sorting objects by field availability

I have a place object that has the following parameters: phone, category, street, zip, website.
I also have an array of place objects: [place1, place2, place3, place4, place5].
What's the best way to sort the array of places based on parameter availability? I.e., if place1 has the most available parameters (the fewest nil parameters), it should be sorted first, and so on.
Edit: These objects are not ActiveRecord objects
I'd let each Place object know how complete it was:
class Place
  attr_accessor :phone, :category, :street, :website, :zip

  def completeness
    attributes.count { |_, value| value.present? }
  end
end
Then it is easy to sort your place objects by completeness:
places.sort_by(&:completeness)
Edit: Non-ActiveRecord solution:
I had assumed this was an ActiveRecord model because of the Ruby on Rails tag. Since this is a non-ActiveRecord model, you can use instance_variables instead of attributes. (By the way, congratulations for knowing that domain models in Rails don't have to inherit from ActiveRecord)
class Place
  attr_accessor :phone, :category, :street, :website, :zip

  def completeness
    instance_variables.count { |v| instance_variable_get(v).present? }
  end
end
Edit 2: Weighted attributes
You have a comment about calculating a weighted score. In this case, or when you want to choose specific attributes, you can put the following in your model:
ATTR_WEIGHTS = { phone: 1, category: 1, street: 2, website: 1, zip: 2 }

def completeness
  ATTR_WEIGHTS.select { |k, _v| instance_variable_get("@#{k}").present? }.sum(&:last)
end
Note that the sum(&:last) is equivalent to sum{|k,v| v} which in turn is a railsism for reduce(0){|sum, (k,v)| sum += v}.
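A quick illustration of that equivalence in plain Ruby (Enumerable#sum accepts these block forms on Ruby 2.4+; ActiveSupport provides sum on older versions):
weights = { phone: 1, street: 2 }
weights.sum(&:last)                           # => 3 (each pair is [key, value]; :last picks the value)
weights.sum { |_k, v| v }                     # => 3
weights.reduce(0) { |sum, (_k, v)| sum + v }  # => 3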
I'm sure there's a better way to do it, but this is a start:
Ruby fat one-liner:
values = { phone: 5, category: 3, street: 5, website: 3, zip: 5 } # edit these values to weight the attributes
array = [place1, place2, place3, place4, place5]
sorted_array = array.sort_by { |b| b.attributes.select { |k, v| values.keys.include?(k.to_sym) && v.present? }.inject(0) { |sum, n| sum + values[n[0]] } }.reverse
So we're basically creating a sub-hash of the attributes of your ActiveRecord object by only picking the key-value pairs that are in the values hash and only if they have a present? value.
Then, on this sub-hash, we're invoking inject, which will sum the weighted values we've put in the values hash. Finally, we reverse everything so you have the highest score first.
To make it clean, I suggest you implement an instance method on your model that computes the score of each object, like mark suggested.
If you have a class Place:
class Place
  attr_accessor :phone, :category, :street, :website, :zip
end
and you create an instance place1:
place1 = Place.new
place1.instance_variables       # => []
place1.instance_variables.size  # => 0
place1.phone = '555-1212'       # => "555-1212"
place1.instance_variables       # => [:@phone]
place1.instance_variables.size  # => 1
And create the next instance:
place2 = Place.new
place2.phone = '555-1212'
place2.zip = '00000'
place2.instance_variables       # => [:@phone, :@zip]
place2.instance_variables.size  # => 2
You can sort by an ascending number of instance variables that have been set:
[place1, place2].sort_by { |p| p.instance_variables.size }
# => [#<Place:0x007fa8a32b51a8 @phone="555-1212">, #<Place:0x007fa8a31f5380 @phone="555-1212", @zip="00000">]
Or sort in descending order:
[place1, place2].sort_by { |p| p.instance_variables.size }.reverse
# => [#<Place:0x007fa8a31f5380 @phone="555-1212", @zip="00000">, #<Place:0x007fa8a32b51a8 @phone="555-1212">]
This uses basic Ruby objects, Rails is not needed, and it asks the object instances themselves what is set, so you don't have to maintain any external lists of attributes.
Note: this breaks if you set an instance variable to something, then set it back to nil.
This fixes it:
[place1, place2].sort_by { |p|
  p.instance_variables.reject { |v|
    p.instance_variable_get(v).nil?
  }.size
}.reverse
and this shortens it by using Enumerable's count with a block:
[place1, place2].sort_by { |p|
  p.instance_variables.count { |v|
    !p.instance_variable_get(v).nil?
  }
}.reverse
