Why does GraphQL-Ruby not understand dynamically generated schema definitions? - ruby-on-rails

In a Rails app, I have the following part of the schema defined:
# frozen_string_literal: true

module Types
  module KVInfo
    def self.kv_value_scalar(typename, raw_type: String, typeid:)
      clazz = Class.new(BaseObject) do
        graphql_name "KVEntry#{typename}Value"
        field :value, raw_type, null: false
      end
      clazz.define_singleton_method(:typeid) { typeid }
      clazz.define_singleton_method(:typename) { typename }
      clazz
    end

    # typeids taken from enum in (.../kv_info.ts)
    KVScalars = [
      kv_value_scalar('String', typeid: 0),
      kv_value_scalar('Markdown', typeid: 1),
      kv_value_scalar(
        'Date',
        raw_type: GraphQL::Types::ISO8601DateTime,
        typeid: 2
      ),
      kv_value_scalar('Country', typeid: 3),
      kv_value_scalar('Address', typeid: 5)
    ].freeze

    KVScalars.each { |t| KVInfo.const_set(t.graphql_name, t) }

    class KVScalarValue < BaseUnion
      possible_types(*KVScalars)

      def self.resolve_type(obj, _ctx)
        KVScalars.find { |t| t.typeid == obj['type'] }
      end
    end

    def self.kv_value_array(subtype)
      clazz = Class.new(BaseObject) do
        graphql_name "KVEntryArray#{subtype.typename}"
        field :value, [subtype], null: false
      end
      clazz.define_singleton_method(:sub_typeid) { subtype.typeid }
      clazz
    end

    KVArrays = KVScalars.map { |s| kv_value_array(s) }
    KVArrays.each { |t| KVInfo.const_set(t.graphql_name, t) }

    class KVArrayValue < BaseUnion
      possible_types(*KVArrays)

      def self.resolve_type(obj, _ctx)
        KVArrays.find { |t| t.sub_typeid == obj['subtype'] }
      end
    end

    class KVValue < BaseUnion
      # PP HERE
      possible_types(KVArrayValue, KVScalarValue)

      def self.resolve_type(obj, _ctx)
        # typeid 4 denotes an array
        obj['type'] == 4 ? KVArrayValue : KVScalarValue
      end
    end

    class KVEntry < BaseObject
      field :name, String, null: false
      field :value, KVValue, null: false
    end
  end
end
While running a Rake task that dumps the whole schema to a file to be consumed by the frontend, I see that the type generated from the KVEntry class has only the name field.
If I put all possible types directly in the KVValue class, like this:
pp(*KVScalars, *KVArrays)
possible_types(*KVScalars, *KVArrays)
it works and generates types correctly.
But note the pp line above: it does not work without this line (???).
Also, if I keep it as is (with nested unions), it does not work regardless of the number and position of pp calls. When stepping through with the debugger, all classes are loaded correctly, including the generated ones, but the schema still lacks the required types.
So the question is: why are the types not processed, and how can pp possibly affect this process?
P.S. The data format is fixed by the frontend and is not subject to change.

The problem was the nested unions: GraphQL does not support them, since a union's members must be object types, not other unions. As for the flat union not working at first, I still have no idea why, but it fixed itself after the N-th restart.
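For reference, a flattened KVValue along those lines, assembled from the question's own code (the typeid convention and the 'type'/'subtype' keys are taken from there; this is a sketch, not necessarily the exact final version):

class KVValue < BaseUnion
  possible_types(*KVScalars, *KVArrays)

  def self.resolve_type(obj, _ctx)
    if obj['type'] == 4 # typeid for array
      KVArrays.find { |t| t.sub_typeid == obj['subtype'] }
    else
      KVScalars.find { |t| t.typeid == obj['type'] }
    end
  end
end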

Related

How to transform nested parameters in Rails API for PATCH requests

I'm having problems trying to implement a PATCH endpoint for a Rails API which deals with complex request objects that are structurally different from the ActiveRecord model.
As an example let's say I have the following request object:
{
  "details": {
    "color": {
      "id": 1
    }
  },
  "name": "Hello, world!"
  ...
}
However, on my model I expect a flat color_id attribute:
class CreateModel < ActiveRecord::Migration[7.0]
  def change
    create_table :model do |t|
      t.string :name, null: false
      t.integer :color_id, null: false
    end
  end
end
Therefore I need to transform the request params. For this I've found one approach which works pretty well in the case of PUT requests, but not at all for PATCH:
ActionController::Parameters.new({
  color_id: params.dig(:details, :color, :id),
  name: params.dig(:name)
})
If I issue a PUT request, this solution works great, since PUT expects the whole object as the payload. PATCH, on the other hand, causes issues when only a subset of the properties is passed: everything else is set to nil because of how dig works.
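For example, with a PATCH payload that only carries name, the missing keys come back as nil (a minimal sketch of the problem):

patch_params = ActionController::Parameters.new(name: "New name")

ActionController::Parameters.new({
  color_id: patch_params.dig(:details, :color, :id), # => nil, the key is absent from the PATCH body
  name: patch_params.dig(:name)                      # => "New name"
})
# Updating the record with this hash would overwrite color_id with nil.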
Assuming I have no control over the request format, how can I transform the request params in the backend so that omitted keys will not result in nil values? Of course I could imperatively handle each property line by line, checking whether the key is present in the original params and then setting it in the new one, but is there a more elegant approach?
I've found a generic solution using mapping logic with a lookup table. For the example above:
{
  "details": {
    "color": {
      "id": 1
    }
  },
  "name": "Hello, world!"
  ...
}
I would have the following mapping variable:
MAPPING = {
  [:details, :color, :id] => [:color_id]
}
Then I'm able to transform the params using this recursive algorithm:
def handle(params, keys)
  output = Hash.new
  params.each do |k, v|
    sym_keys = (keys + [k]).map(&:to_sym)
    target_keys = MAPPING[sym_keys]
    if v.is_a? ActionController::Parameters
      # recurse into the nested parameters, carrying the key path along
      output = output.deep_merge! handle(v, keys + [k])
    else
      if target_keys.nil?
        value = sym_keys.reverse.reduce(v) { |acc, key| Hash[key, acc] }
      else
        value = target_keys.reverse.reduce(v) { |acc, key| Hash[key, acc] }
      end
      output = output.deep_merge! value
    end
  end
  output
end

def transform(params)
  handle(params, [])
end
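A hypothetical call with the example payload (assuming params is an ActionController::Parameters instance):

params = ActionController::Parameters.new(
  details: { color: { id: 1 } },
  name: "Hello, world!"
)
transform(params)
# => {:color_id=>1, :name=>"Hello, world!"}

# With a partial PATCH payload, omitted keys simply stay omitted:
transform(ActionController::Parameters.new(name: "Hello, world!"))
# => {:name=>"Hello, world!"}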

Rails to_json(methods: [...]) for different ActiveRecords

In Rails, I have an object called values that could be one of 20 kinds of ActiveRecord models. Only one of them has a method (may be the wrong term, Rails newbie) that adds a customized field to the returned JSON object, where the method name is the field name and the method's return value is the field value. For example:
class XXXController < ApplicationController
  ..
  if a
    values = A
  elsif b
    values = B
  elsif c
    values = C
  ..
  end
  render :json => values.to_json(:methods => :type_needed)
and you will see a response like:
{
  ..
  "type_needed": true,
  ..
}
I only have type_needed defined in A, where it returns true in some cases. For the others like B, C, D... (19 in total), I want type_needed to be returned as false. Is there a way I can do that in one place instead of adding a type_needed method to the remaining 19?
I will do it as follows:
json = values.to_json(:methods => :type_needed)
# => "[{\"id\":1,\"name\":\"Aaa\"},{\"id\":2,\"name\":\"Bbb\"}]" # Representational value only
ary = JSON.parse(json)
# => [{"id"=>1, "name"=>"Aaa"}, {"id"=>2, "name"=>"Bbb"}]
ary.map! { |hash| hash["type_needed"] = false unless hash.key?("type_needed"); hash }
# => [{"id"=>1, "name"=>"Aaa", "type_needed"=>false}, {"id"=>2, "name"=>"Bbb", "type_needed"=>false}]
ary.to_json
# => "[{\"id\":1,\"name\":\"Aaa\",\"type_needed\":false},{\"id\":2,\"name\":\"Bbb\",\"type_needed\":false}]"
If I am understanding your question correctly, you want to define the type_needed method once and have it included in all 20 of your models. If yes, then you can define a concern and include it in all of them.
app/models/concerns/my_model_concern.rb
module MyModelConcern
  extend ActiveSupport::Concern

  def type_needed?
    self.respond_to?(:some_method)
  end
end
app/models/a.rb
class A < ApplicationRecord
  include MyModelConcern

  def some_method
  end
end
app/models/b.rb
class B < ApplicationRecord
  include MyModelConcern
end
app/models/c.rb
class C < ApplicationRecord
  include MyModelConcern
end
With the above
a = A.new
a.type_needed?
=> true
b = B.new
b.type_needed?
=> false
c = C.new
c.type_needed?
=> false
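With the concern in place, the controller from the question could expose the flag through to_json by pointing :methods at the new name (note the trailing ?; a sketch only):

render :json => values.to_json(:methods => :type_needed?)
# each serialized record now contains a "type_needed?" key that is true or false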
See if this helps.

Instance Variables in a Rails Model

I have this variable opinions that I want to store as an instance variable in my model... am I right in assuming I will need to add a column for it, or else be re-calculating it constantly?
My other question is: what is the syntax to store into a column-backed attribute instead of just a local variable?
Thanks for the help, code below:
# == Schema Information
#
# Table name: simulations
#
# id :integer not null, primary key
# x_size :integer
# y_size :integer
# verdict :string
# arrangement :string
# user_id :integer
#
class Simulation < ActiveRecord::Base
  belongs_to :user
  serialize :arrangement, Array

  validates :user_id, presence: true
  validates :x_size, :y_size, presence: true, :numericality => {:only_integer => true}
  validates_numericality_of :x_size, :y_size, :greater_than => 0

  def self.keys
    [:soft, :hard, :none]
  end

  def generate_arrangement
    @opinions = Hash[ Simulation.keys.map { |key| [key, 0] } ]
    @arrangement = Array.new(y_size) { Array.new(x_size) }
    @arrangement.each_with_index do |row, y_index|
      row.each_with_index do |current, x_index|
        rand_opinion = Simulation.keys[rand(0..2)]
        @arrangement[y_index][x_index] = rand_opinion
        @opinions[rand_opinion] += 1
      end
    end
  end

  def verdict
    if @opinions[:hard] > @opinions[:soft]
      :hard
    elsif @opinions[:soft] > @opinions[:hard]
      :soft
    else
      :push
    end
  end

  def state
    @arrangement
  end

  def next
    new_arrangement = Array.new(@arrangement.size) { |array| array = Array.new(@arrangement.first.size) }
    @opinions = Hash[ Simulation.keys.map { |key| [key, 0] } ]
    @seating_arrangement.each_with_index do |array, y_index|
      array.each_with_index do |opinion, x_index|
        new_arrangement[y_index][x_index] = update_opinion_for x_index, y_index
        @opinions[new_arrangement[y_index][x_index]] += 1
      end
    end
    @arrangement = new_arrangement
  end

  private

  def in_array_range?(x, y)
    ((x >= 0) and (y >= 0) and (x < @arrangement[0].size) and (y < @arrangement.size))
  end

  def update_opinion_for(x, y)
    local_opinions = Hash[ Simulation.keys.map { |key| [key, 0] } ]
    for y_pos in (y-1)..(y+1)
      for x_pos in (x-1)..(x+1)
        if in_array_range? x_pos, y_pos and not(x == x_pos and y == y_pos)
          local_opinions[@arrangement[y_pos][x_pos]] += 1
        end
      end
    end
    opinion = @arrangement[y][x]
    opinionated_neighbours_count = local_opinions[:hard] + local_opinions[:soft]
    if (opinion != :none) and (opinionated_neighbours_count < 2 or opinionated_neighbours_count > 3)
      opinion = :none
    elsif opinion == :none and opinionated_neighbours_count == 3
      if local_opinions[:hard] > local_opinions[:soft]
        opinion = :hard
      elsif local_opinions[:soft] > local_opinions[:hard]
        opinion = :soft
      end
    end
    opinion
  end
end
ActiveRecord analyzes the database tables and creates setter and getter methods with metaprogramming.
So you would create a database column with a migration:
rails g migration AddOpinionsToSimulations opinions:jsonb
Note that not all databases support storing a hash or a similar key/value data type in a column. Postgres does (for example via json/jsonb). If you need to use another database such as MySQL you should consider using a relation instead (storing the data in another table).
Then when you access simulation.opinions it will automatically get the database column value (if the record is persisted).
Since ActiveRecord creates a setter and getter you can access your property from within the Model as:
class Simulation < ActiveRecord::Base
  # ...
  def an_example_method
    self.opinions # getter method
    # since self is the implied receiver you can simply do
    opinions
    # the setter, however, needs an explicit receiver,
    # otherwise Ruby creates a local variable instead
    self.opinions = {foo: "bar"}
  end
end
The same applies when using the plain ruby attr_accessor, attr_reader and attr_writer macros.
When you assign to an attribute backed by a database column ActiveRecord marks the attribute as dirty and will include it when you save the record.
ActiveRecord has a few methods to directly update attributes: update, update_attributes and update_attribute. There are differences in the call signature and how they handle callbacks.
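A minimal sketch of how that looks in practice, assuming the opinions column added by the migration above and an existing record:

simulation = Simulation.find(1)   # hypothetical record
simulation.opinions = { soft: 1, hard: 2, none: 0 }
simulation.changed?               # => true, the attribute is now marked dirty
simulation.save                   # persists the change

# or in one step, running validations and callbacks:
simulation.update(opinions: { soft: 0, hard: 0, none: 0 })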
You can add a method like:
def opinions
  @opinions ||= Hash[ Simulation.keys.map { |key| [key, 0] } ]
end
This will cache the computation in the instance variable @opinions.
I would also add methods like:
def arrangement
  @arrangement ||= Array.new(y_size) { Array.new(x_size) }
end

def rand_opinion
  Simulation.keys[rand(0..2)]
end
and then replace the instance variables with these methods:
def generate_arrangement
  arrangement.each_with_index do |row, y_index|
    row.each_with_index do |current, x_index|
      # call rand_opinion once so the placed and counted opinion match
      opinion = rand_opinion
      arrangement[y_index][x_index] = opinion
      opinions[opinion] += 1
    end
  end
end
Now your opinions and your arrangement will be cached and the code looks better, and you didn't have to add a new column to your table. You now just have to replace the @opinions variable with your opinions method.
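For example, verdict from the original code would then read (a sketch using the memoized opinions reader):

def verdict
  if opinions[:hard] > opinions[:soft]
    :hard
  elsif opinions[:soft] > opinions[:hard]
    :soft
  else
    :push
  end
end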

How to compare if a record exists with a json data type field?

I want to check if a record already exists in the database, but I have one json data type field and I need to compare it too.
When I try to check using exists?, I get the following error:
SELECT 1 AS one FROM "arrangements"
WHERE "arrangements"."deleted_at" IS NULL AND "arrangements"."account_id" = 1
AND "arrangements"."receiver_id" = 19 AND "config"."hardware" = '---
category: mobile
serial: ''00000013''
vehicle:
' AND "arrangements"."recorded" = 't' LIMIT 1
PG::UndefinedTable: ERROR: missing FROM-clause entry for table "config"
LINE 1: ...id" = 1 AND "arrangements"."receiver_id" = 19 AND "config"."...
^
The code I am using to check if it exists:
@arrangement = Arrangement.new({account_id: receiver.account.id, receiver_id: receiver.id, config: params[:config], recorded: true})
if Arrangement.exists?(account_id: @arrangement.account_id, receiver_id: @arrangement.receiver_id, config: @arrangement.config, recorded: @arrangement.recorded)
  puts 'true'
end
I already tried:
if Arrangement.exists?(@arrangement)
  puts 'true'
end
But it always returns false.
Table:
create_table :arrangements do |t|
  t.references :account, index: true
  t.references :receiver, index: true
  t.json :config, null: false
  t.boolean :recorded, default: false
  t.datetime :deleted_at, index: true
  t.integer :created_by
  t.timestamps
end
You cannot compare json columns directly. Try comparing specific JSON values instead:
where("arrangements.config->>'category' = ?", params[:config][:category])
Look in the PostgreSQL docs for other JSON functions and operators.
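In the context of the question's exists? check, that could look like this (a sketch that matches on a single JSON value only):

Arrangement
  .where(account_id: @arrangement.account_id,
         receiver_id: @arrangement.receiver_id,
         recorded: @arrangement.recorded)
  .where("config->>'category' = ?", params[:config][:category])
  .exists?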
The following converts both the field (in case it is plain json) and the parameter (which will be a JSON string) to jsonb, and then compares everything they contain.
def existing_config?(config)
  Arrangement.where("config::jsonb = ?::jsonb", config.to_json).any?
end
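A hypothetical call site (note that existing_config? only compares the config column; the account_id, receiver_id and recorded conditions from the question would still need to be chained onto the where if they are part of the uniqueness rule):

config_hash = { category: 'mobile', serial: '00000013', vehicle: nil } # example payload from the question
puts 'true' if existing_config?(config_hash)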

Rails: ActiveRecord interdependent attributes setters

In ActiveRecord, attribute setters seem to be called in the order of the params hash.
Therefore, in the following sample, par_prio will be empty in the par1 setter.
class MyModel < ActiveRecord::Base
  def par1=(value)
    Rails.logger.info("second param: #{self.par_prio}")
    super(value)
  end
end
MyModel.new({ :par1 => 'bla', :par_prio => 'bouh' })
Is there any way to simply define an order on attributes in the model?
NOTE: I have a solution, but it is not generic; it works by overriding the initialize method on MyModel:
def initialize(attributes = {}, options = {})
  if attributes[:par_prio]
    value = attributes.delete(:par_prio)
    attributes = { :par_prio => value }.merge(attributes)
  end
  super(attributes, options)
end
Moreover, it does not work if par_prio is another model that has a has_many relation and is used to build MyModel:
class ParPrio < ActiveRecord::Base
  has_many :my_models
end

par_prio = ParPrio.create
par_prio.my_models.build(:par1 => 'blah')
The par_prio param will not be available in the initialize override.
Override assign_attributes on the specific model where you need the assignments to happen in a specific order:
attr_accessor :first_attr  # Attr that needs to be assigned first
attr_accessor :second_attr # Attr that needs to be assigned second

def assign_attributes(new_attributes, options = {})
  sorted_new_attributes = new_attributes.with_indifferent_access
  if sorted_new_attributes.has_key?(:second_attr)
    first_attr_val = sorted_new_attributes.delete :first_attr
    raise ArgumentError.new('YourModel#assign_attributes :: second_attr assigned without first_attr') unless first_attr_val.present?
    new_attributes = Hash[:first_attr, first_attr_val].merge(sorted_new_attributes)
  end
  super(new_attributes, options)
end
