Elixir not altering table in database migration

I have a table called Leads. It stores items such as name, quote_amount, lead_source_id, and specifications.
My current update method:
def update(conn, %{"id" => _id, "data" => data} = params, user) do
  # data has the structure data["attributes"][attribute_name],
  # e.g. data["attributes"]["specifications"] for the specifications value
  lead =
    Lead.leads(user)
    |> Lead.with_id(params)
    |> Repo.one!()

  attrs = JaSerializer.Params.to_attributes(data)
  attrs = conversion_of_company_id_param_if_necessary(attrs, Map.get(attrs, "company_id"), lead)
  changeset = Lead.changeset(lead, attrs, user)

  case Repo.update(changeset) do
    {:ok, lead} ->
      lead = Repo.get_by(Lead, id: lead.id) |> Repo.preload([:zipcode])
      render(conn, "show.json-api", data: lead)

    {:error, changeset} ->
      conn
      |> put_status(:unprocessable_entity)
      |> render(PortalApi.ChangesetView, "error.json", changeset: changeset)
  end
end
schema "leads" do
field(:first_name, :string)
field(:last_name, :string)
field(:email, :string)
field(:next_step, :string)
field(:activity_date, :date)
field(:phone, :string)
field(:quote_number, :string)
field(:quote_amount, :integer)
field(:po_number, :string)
field(:po_amount, :integer)
field(:closed_reason, :string)
field(:specifications,:string)
field(:zip, :string, virtual: true)
field(:local_dealer, :boolean, default: false)
field(:claimed, :boolean, default: false)
field(:state, :string, default: "new")
field(:message, :string)
field(:uuid, :string)
field(:reference_number, :string)
field(:business_name, :string)
field(:project_reference, :string)
field(:quotable, :boolean, default: false)
field(:lead_url, :string)
field(:lead_details, :string)
field(:details, :map)
field(:printed, :boolean, default: false)
belongs_to(:internal_rep, PortalApi.User)
belongs_to(:external_rep, PortalApi.User)
belongs_to(:company, PortalApi.Company)
belongs_to(:proposed_company, PortalApi.Company)
belongs_to(:lead_source, PortalApi.LeadSource)
belongs_to(:zipcode, PortalApi.Zipcode, on_replace: :nilify)
belongs_to(:quote_request, PortalApi.QuoteRequest)
has_many(:attachments, PortalApi.Attachment)
has_many(:activity_logs, PortalApi.ActivityLog)
has_many(:notes, PortalApi.Note)
timestamps()
end
Migration that adds the specifications column:
def change
  alter table(:leads) do
    add :specifications, :string
  end
end
This update seems to update everything normally except specifications.
I've tried checking everything from the model to the migrations and controllers, but that column just doesn't get stored. I can assure you that the Postgres table is fine and all required columns exist.
What could be the issue? How can I solve this?
Thank you!

Why does GraphQL-Ruby not understand dynamically generated schema definitions?

In a Rails app, I have the following part of the schema defined:
# frozen_string_literal: true

module Types
  module KVInfo
    def self.kv_value_scalar(typename, raw_type: String, typeid:)
      clazz = Class.new(BaseObject) do
        graphql_name "KVEntry#{typename}Value"
        field :value, raw_type, null: false
      end
      clazz.define_singleton_method(:typeid) { typeid }
      clazz.define_singleton_method(:typename) { typename }
      clazz
    end

    # typeids taken from enum in (.../kv_info.ts)
    KVScalars = [
      kv_value_scalar('String', typeid: 0),
      kv_value_scalar('Markdown', typeid: 1),
      kv_value_scalar(
        'Date',
        raw_type: GraphQL::Types::ISO8601DateTime,
        typeid: 2
      ),
      kv_value_scalar('Country', typeid: 3),
      kv_value_scalar('Address', typeid: 5)
    ].freeze

    KVScalars.each { |t| KVInfo.const_set(t.graphql_name, t) }

    class KVScalarValue < BaseUnion
      possible_types(*KVScalars)

      def self.resolve_type(obj, _ctx)
        KVScalars.select { |t| t.typeid == obj['type'] }.first
      end
    end

    def self.kv_value_array(subtype)
      clazz = Class.new(BaseObject) do
        graphql_name "KVEntryArray#{subtype.typename}"
        field :value, [subtype], null: false
      end
      clazz.define_singleton_method(:sub_typeid) { subtype.typeid }
      clazz
    end

    KVArrays = KVScalars.map { |s| kv_value_array(s) }
    KVArrays.each { |t| KVInfo.const_set(t.graphql_name, t) }

    class KVArrayValue < BaseUnion
      possible_types(*KVArrays)

      def self.resolve_type(obj, _ctx)
        KVArrays.select { |t| t.sub_typeid == obj['subtype'] }.first
      end
    end

    class KVValue < BaseUnion
      # PP HERE
      possible_types(KVArrayValue, KVScalarValue)

      def self.resolve_type(obj, _ctx)
        obj['type'] == 4 ? # typeid for array
          KVArrayValue :
          KVScalarValue
      end
    end

    class KVEntry < BaseObject
      field :name, String, null: false
      field :value, KVValue, null: false
    end
  end
end
When I run a Rake task that dumps the whole schema to a file for the frontend to consume, the type generated for the KVEntry class has only the name field.
If I put all possible types in the KVValue class like this:
pp(*KVScalars, *KVArrays)
possible_types(*KVScalars, *KVArrays)
it works and generates the types correctly.
But note the pp line above: it does not work without that line (???).
Also, if I keep it as is (with nested unions), it does not work regardless of the number and positions of pp calls. When stepping through with the debugger, all classes are loaded correctly, including the generated ones, but the schema still lacks the required types.
So the question is: why are the types not processed, and how can pp affect this process in any way?
P.S. The data format is fixed by the frontend and is in no way subject to change.
The problem was the nested unions: GraphQL does not support them. As for the flat union not working, I still have no idea why it behaved that way, but it fixed itself after the N-th restart.
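For reference, a minimal sketch of the flattened KVValue union described above, folding the resolve logic of the two nested unions into a single resolve_type (assuming the same KVScalars and KVArrays constants):

class KVValue < BaseUnion
  # List every scalar and array object type directly; graphql-ruby
  # unions cannot contain other unions as possible types.
  possible_types(*KVScalars, *KVArrays)

  def self.resolve_type(obj, _ctx)
    if obj['type'] == 4 # typeid for array
      KVArrays.find { |t| t.sub_typeid == obj['subtype'] }
    else
      KVScalars.find { |t| t.typeid == obj['type'] }
    end
  end
end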

Issues with writing scope for expired & closed cases - Rails 4

I am trying to write a scope for the model Event. I want to display all events that have expired or have been closed.
Schema:
create_table "events", force: :cascade do |t|
t.string "title"
t.text "description"
t.date "date"
t.boolean "close"
end
event.rb
scope :expired_or_closed_events, -> {where(['close = ? OR close IS ?', true] || ['date < ?', Date.current])}
I tried the above scope, but I get the error below:
2.3.0 :014 > events.expired_or_closed_events
ActiveRecord::PreparedStatementInvalid: wrong number of bind variables (1 for 2) in: close = ? OR close IS ?
Could someone kindly advise me how to write this scope correctly?
Your scope should be:
scope :expired_or_closed, -> { where("close = true OR date < ?", DateTime.now) }
Or using Arel
scope :expired_or_closed, -> { where(arel_table[:close].eq(true).or(arel_table[:date].lt(DateTime.now))) }
Note that I use expired_or_closed, not expired_or_closed_events: since we are defining this scope in the Event model, including `events` in the name is redundant.
Use this:
scope :expired_or_closed_events, -> { where("close = ? OR date < ?", true, Date.current) }
I think your condition should be close = ?; otherwise the condition makes no sense and always evaluates to true.
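For completeness, a quick console check of what the first answer's scope sends to the database (hypothetical output):

Event.expired_or_closed.to_sql
# => SELECT "events".* FROM "events" WHERE (close = true OR date < '...')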

Create partial index for postgresql jsonb field

I am trying to add a partial index to a PostgreSQL jsonb column. The jsonb column is named email_provider. I have tried adding a partial index on the column as shown below, but it throws different errors, such as PG::UndefinedColumn: ERROR: column "email_povider" does not exist, and at other times it raises PG::AmbiguousFunction: ERROR: operator is not unique: unknown -> unknown.
The partial index in the Rails 5 migration looks like this:
add_index :accounts, :name, name: "index_accounts_on_email_provider_kind", using: :gin, where: "('email_provider' -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com') AND ('email_povider' -> 'amazon ses' ->> 'smtp_host' = 'www.awses.com')"
The json for the email_provider column looks like this:
{
  "email_provider": {
    "sparkpost": {
      "smtp_host": "www.sparkpost.com",
      "smtp_port": ""
    },
    "aws ses": {
      "smtp_host": "www.amazon/ses.com ",
      "smtp_port": " ",
      "username": " ",
      "password": " "
    }
  }
}
The table looks like this:
class CreateAccounts < ActiveRecord::Migration[5.0]
  def change
    create_table :accounts do |t|
      t.string :name, null: false
      t.jsonb :email_provider, null: false
      t.jsonb :social_account, default: '[]', null: false
      t.timestamps
    end

    add_index :accounts, :name, name: "index_accounts_on_email_provider_kind", using: :gin, where: "('email_provider' -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com') AND ('email_povider' -> 'amazon ses' ->> 'smtp_host' = 'www.awses.com')"
    add_index :accounts, :name, name: "index_accounts_on_social_account_type", using: :gin, where: " 'social_account' #> [{'type': 'facebook'}] AND 'social_account' #> [{'type': 'twitter'}]"
  end
end
Update
Based on a slight adjustment to the accepted answer below, the code I am using to create a btree (not gin) index in Rails ActiveRecord is shown below. It creates a btree index because we are indexing the name column, which is of string datatype and not jsonb, as described here:
add_index :accounts, :name, name: "index_accounts_on_name_email_provider", where: "email_provider -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com' AND email_provider -> 'amazon ses' ->> 'smtp_host' = 'www.awses.com' "
add_index :accounts, :name, name: "index_accounts_on_name_social_account", where: " social_account #> '[{\"type\": \"facebook\"}]'::jsonb AND social_account #> '[{\"type\": \"twitter\"}]'::jsonb"
I think you need something like this:
add_index :accounts, :email_provider, name: "index_accounts_on_email_provider_kind", using: :gin, where: "(email_provider -> 'email_provider' -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com') AND (email_provider -> 'email_provider' -> 'amazon ses' ->> 'smtp_host' = 'www.awses.com')"
I've changed the second argument of add_index to :email_provider because that's the column name. Also, for the where clause, I changed
'email_provider' -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com'
to
email_provider -> 'email_provider' -> 'sparkpost' ->> 'smtp_host' = 'www.sparkpost.com'
because the -> operator expects its left argument to be a json(b) value, but you provided a string. So, e.g., email_provider -> 'email_provider' extracts the value corresponding to email_provider from the column called email_provider. Note that this last line can be written more compactly as:
email_provider #>> '{email_provider,sparkpost,smtp_host}' = 'www.sparkpost.com'
by using #>>, which extracts a "path" from a json(b) object. So the add_index statement can be written as:
add_index :accounts, :email_provider, name: "index_accounts_on_email_provider_kind", using: :gin, where: "(email_provider #>> '{email_provider,sparkpost,smtp_host}' = 'www.sparkpost.com') AND (email_provider #>> '{email_provider,amazon ses,smtp_host}' = 'www.awses.com')"
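As a quick illustration, both spellings express the same predicate in an ActiveRecord query (a hypothetical Account model call, not part of the migration):

Account.where("email_provider -> 'email_provider' -> 'sparkpost' ->> 'smtp_host' = ?", "www.sparkpost.com")
Account.where("email_provider #>> '{email_provider,sparkpost,smtp_host}' = ?", "www.sparkpost.com")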
For your second index, you should try something like:
where: " social_account -> 'social_account' #> '[{\"type\": \"facebook\"}]'::jsonb AND social_account -> 'social_account' #> '[{\"type\": \"twitter\"}]'::jsonb"
In this case, I did the same thing with the column as in the first case. I also changed the right argument of @>, which has to be a jsonb value. So you must define it as a JSON string, which requires double quotes for strings (also notice that we have to escape them for Ruby), and then type cast that string to jsonb to get the desired type.
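And a small usage sketch of the same containment test as a query predicate (again a hypothetical Account model; the ?::jsonb cast supplies the jsonb type for the bound JSON string):

Account.where("social_account @> ?::jsonb", [{ type: "facebook" }].to_json)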

How to model this complex validation for uniqueness on combined fields

A link has two components: componenta_id and componentb_id. To this end, in the Link model file I have:
belongs_to :componenta, class_name: "Component"
belongs_to :componentb, class_name: "Component"
validates :componenta_id, presence: true
validates :componentb_id, presence: true
validates :componenta_id, uniqueness: { scope: :componentb_id }
validates :componentb_id, uniqueness: { scope: :componenta_id }
And in the migration file:
create_table :links do |t|
t.integer :componenta_id, null: false
t.integer :componentb_id, null: false
...
end
add_index :links, :componenta_id
add_index :links, :componentb_id
add_index :links, [:componenta_id, :componentb_id], unique: true
Question: This all works. Now I want the combination of componenta and componentb to be unique irrespective of their order, i.e. irrespective of which component is componenta and which is componentb (after all, that's the same link: a link between the same two components). So the two records below should not both be allowed, since they represent the same link and thus are not unique:
componenta_id = 1 ; componentb_id = 2
componenta_id = 2 ; componentb_id = 1
How can I create this uniqueness validation? I have model validation working (see below) but wonder whether and how I should also add validation at the migration/db level...?
Model validation
I have model validation working with the code below:
before_save :order_links
validates :componenta_id, uniqueness: { scope: :componentb_id }

private

def order_links
  if componenta_id > componentb_id
    compb = componentb_id
    compa = componenta_id
    self.componenta_id = compb
    self.componentb_id = compa
  end
end
The following test confirms the above works:
test "combination of two links should be unique" do
  assert @link1.valid?
  assert @link2.valid?
  @link1.componenta_id = 3 # @link2 already has combination 3-4
  @link1.componentb_id = 4
  assert_not @link1.valid?
  @link1.componenta_id = 4
  @link1.componentb_id = 3
  assert_raises ActiveRecord::RecordNotUnique do
    @link1.save
  end
end
Migration/db validation:
As an extra level of security, is there also a way to incorporate validation for this at the db level? Otherwise it is still possible to write both of the following records to the database: componenta_id = 1 ; componentb_id = 2 as well as componenta_id = 2 ; componentb_id = 1.
Perhaps it is possible to control the creation of the links with:
def create_unique_link(comp_1, comp_2)
  # order the pair so the smaller id is always componenta
  if comp_1.id > comp_2.id
    first_component = comp_2
    second_component = comp_1
  else
    first_component = comp_1
    second_component = comp_2
  end
  Link.find_or_create_by(componenta_id: first_component.id, componentb_id: second_component.id)
end
If you need the validation, then you can write a custom validation:
def ensure_uniqueness_of_link
  # normalize the pair before looking it up
  first_id, second_id = [componenta_id, componentb_id].sort
  if Link.where(componenta_id: first_id, componentb_id: second_id).first
    errors.add(:link, 'Links should be unique')
  end
end
validates :componenta_id, uniqueness: { scope: :componentb_id }
validates :componentb_id, uniqueness: { scope: :componenta_id }
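For the migration/db-level part of the question, one standard PostgreSQL technique (a sketch, not taken from the answers above; the migration class name is hypothetical) is a unique expression index over the ordered pair, so the database rejects both componenta_id = 1; componentb_id = 2 and componenta_id = 2; componentb_id = 1 once either is stored:

class AddUniquePairIndexToLinks < ActiveRecord::Migration
  def up
    # LEAST/GREATEST normalize the pair, so the index is order-independent
    execute <<-SQL
      CREATE UNIQUE INDEX index_links_on_component_pair
      ON links (LEAST(componenta_id, componentb_id), GREATEST(componenta_id, componentb_id));
    SQL
  end

  def down
    execute "DROP INDEX index_links_on_component_pair;"
  end
end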

How to check if a record exists when comparing a json data type field?

I want to check if a record already exists in the database, but I have one json data type field and I need to compare it too.
When I try checking with exists?, I get the following error:
SELECT 1 AS one FROM "arrangements"
WHERE "arrangements"."deleted_at" IS NULL AND "arrangements"."account_id" = 1
AND "arrangements"."receiver_id" = 19 AND "config"."hardware" = '---
category: mobile
serial: ''00000013''
vehicle:
' AND "arrangements"."recorded" = 't' LIMIT 1
PG::UndefinedTable: ERROR: missing FROM-clause entry for table "config"
LINE 1: ...id" = 1 AND "arrangements"."receiver_id" = 19 AND "config"."...
^
Code that I am using to check if it exists:
@arrangement = Arrangement.new({account_id: receiver.account.id, receiver_id: receiver.id, config: params[:config], recorded: true})

if Arrangement.exists?(account_id: @arrangement.account_id, receiver_id: @arrangement.receiver_id, config: @arrangement.config, recorded: @arrangement.recorded)
  puts 'true'
end
I already tried:
if Arrangement.exists?(@arrangement)
  puts 'true'
end
But it always returns false.
Table:
create_table :arrangements do |t|
  t.references :account, index: true
  t.references :receiver, index: true
  t.json :config, null: false
  t.boolean :recorded, default: false
  t.datetime :deleted_at, index: true
  t.integer :created_by
  t.timestamps
end
You cannot compare json values directly. Try comparing specific json values instead:
where("arrangements.config->>'category' = ?", params[:config][:category])
Look in the PostgreSQL docs for the other JSON functions and operators.
This will convert both the field (in case it is plain json) and the parameter (which will be a JSON string) to jsonb, and then compare everything they contain.
def existing_config?(config)
  Arrangement.where("config::jsonb = ?::jsonb", config.to_json).any?
end
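A sketch extending the same idea to the full existence check from the question, wrapped in a hypothetical arrangement_exists? helper (ordinary columns via the hash form, the json column compared through the jsonb cast):

def arrangement_exists?(arrangement)
  Arrangement
    .where(account_id: arrangement.account_id,
           receiver_id: arrangement.receiver_id,
           recorded: arrangement.recorded)
    .where("config::jsonb = ?::jsonb", arrangement.config.to_json)
    .exists?
end

arrangement_exists?(@arrangement) # => true if an identical record is already stored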
