schema_plus with Rails migration: make combined columns unique

I'm using the schema_plus gem in Rails, and I want to prevent duplicate data in my database when the combination of two columns is the same.
Shouldn't this work?
t.string :name, null: false, index: { with: :deleted_at, unique: true }
:deleted_at is definitely a column in my migration, but this still allows me to insert the same name twice on the MySQL side.
Here is what MySQL reports for the index:
Index: index_plans_on_name_and_deleted_at
Type: BTREE
Unique: Yes
Columns: name, deleted_at
EDIT:
It turned out I needed a non-NULL default value for deleted_at. MySQL unique indexes treat NULL values as distinct from one another, so rows where deleted_at is NULL never trigger the uniqueness check. For example:
t.datetime :deleted_at, default: '2000-01-01'
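Putting the question and the edit together, here is a minimal migration sketch. The table name (plans) is inferred from the index name shown above, and the index: { with: ... } option is schema_plus syntax taken from the question; this is an illustrative sketch, not tested against schema_plus itself:

```ruby
class CreatePlans < ActiveRecord::Migration
  def change
    create_table :plans do |t|
      # A non-NULL default keeps MySQL's unique index effective: NULLs
      # compare as distinct, so soft-deleted rows with a NULL deleted_at
      # would otherwise never collide with each other.
      t.datetime :deleted_at, null: false, default: '2000-01-01'
      # schema_plus shorthand: build a composite unique index on
      # (name, deleted_at) while declaring the column.
      t.string :name, null: false, index: { with: :deleted_at, unique: true }
      t.timestamps
    end
  end
end
```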

Not sure about the gem, but you can directly achieve this in ActiveRecord using:
validates :name, uniqueness: { scope: :deleted_at }
assuming you want the combination of name and deleted_at to be unique. (A model-level validation alone is subject to race conditions, so a database unique index is still worth keeping as well.)

Related

GraphQL gem with Rails, can't seem to find correct type definition?

I've defined a UserType as such:
class Types::UserType < GraphQL::Schema::Object
  field :id, ID, null: false
  field :username, String, null: false
  field :full_name, String, null: true
end
Each of these fields exists on the Rails model, and before upgrading the GraphQL gem to 1.8 I was able to use full_name in queries just fine.
When I run the query:
query {
  users {
    username
    id
    full_name
  }
}
I get: "message": "Field 'full_name' doesn't exist on type 'User'",
If I remove full_name, I get the data I expect. In what way am I approaching this incorrectly? For reference, my QueryType is defined as:
class Types::QueryType < GraphQL::Schema::Object
  # Add root-level fields here.
  # They will be entry points for queries on your schema.
  field :users, [UserType, null: true], null: false do
    argument :id, Integer, required: false
  end

  def users(**args)
    if args[:id]
      User.where(id: args[:id])
    else
      User.all
    end
  end
end
I believe the issue is that full_name should be fullName in your query. As of 1.8.x, field names in the schema are automatically camelized.
Field and argument names should be underscored as a convention. They will be converted to camelCase in the underlying GraphQL type and be camelCase in the schema itself.
-- http://graphql-ruby.org/type_definitions/objects.html
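If you would rather keep writing full_name in queries, graphql-ruby's class-based API also accepts a camelize: false option on the field definition to opt out of the automatic conversion. A sketch based on the UserType above (untested here):

```ruby
class Types::UserType < GraphQL::Schema::Object
  field :id, ID, null: false
  field :username, String, null: false
  # camelize: false exposes this field as "full_name" in the schema
  # instead of the auto-generated "fullName".
  field :full_name, String, null: true, camelize: false
end
```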

Rails Migration - Constraints not working

I'm new to Ruby on Rails (switched from Laravel) and don't understand how migration constraints work.
In my migration file I have:
t.string :username, null: false, limit: 20
t.index :username, unique: true
But when I try to create a user with a username longer than the 20-character limit (or with no value at all), it succeeds. Only the unique constraint works, warning me when I try to create a second user with the same username.
I use sqlite for development. When I look into development.sqlite3 file, everything seems OK:
"username" varchar(20) NOT NULL
If someone could help me, it would be much appreciated :)
Thanks
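No answer is recorded here, but one fact is directly relevant: SQLite does not enforce declared varchar lengths at all; the (20) in varchar(20) is accepted in the schema and then ignored. Rails apps therefore typically enforce the limit at the model layer. A hedged sketch (model name from the question):

```ruby
class User < ActiveRecord::Base
  # SQLite ignores the varchar(20) length in the schema, so enforce
  # both presence and the 20-character limit in the model instead.
  validates :username, presence: true, length: { maximum: 20 }
end
```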

Rails Migration changing column to use Postgres arrays

I am trying to change a column in my database so that it can use the Postgres array data type.
Currently the table column is of type string.
I am using the following migration to convert it:
def change
change_column :table, :dummy_column, :text, array: true, default: []
end
But I get the following error:
bundle exec rake db:migrate
rake aborted!
An error has occurred, this and all later migrations canceled:
PG::Error: ERROR: column "dummy_column" cannot be cast automatically to type character varying[]
HINT: Specify a USING expression to perform the conversion.
: ALTER TABLE "table" ALTER COLUMN "dummy_column" TYPE character varying(255)
Tasks: TOP => db:migrate
PostgreSQL doesn't know how to automatically convert a column of varchar into an array of varchar. It doesn't know what you might intend, because it has no way to know what format you think the current values are in.
So you need to tell it; that's what the USING clause is for.
ActiveRecord doesn't seem to explicitly support the USING clause (not surprising, as it barely supports even the most basic database features). You can specify your own SQL text for the migration, though.
Assuming your strings are comma separated and may not themselves contain commas, for example:
def change
  change_column :table, :dummy_column, "varchar[] USING (string_to_array(dummy_column, ','))"
end
(I don't use Rails myself and haven't tested this, but it's consistent with the syntax used in examples elsewhere).
Using Rails 4.2 on postgresql 9.4 I was looking to do this and preserve my pre-existing string data as the first element in one element arrays.
It turns out that postgresql cannot coerce a string into a text array without a USING expression to tell it how.
After much fiddling with delicate postgres syntax, I found a good middle way with active record:
def change
  change_column :users, :event_location, :text, array: true, default: [], using: "(string_to_array(event_location, ','))"
end
The only raw PostgreSQL there is the string_to_array() function call; note from its documentation that you have to supply a delimiter.
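For intuition, PostgreSQL's string_to_array/array_to_string pair behaves much like Ruby's String#split and Array#join for simple comma-separated values, which makes the USING expressions in these migrations easy to reason about. A plain-Ruby sketch of the round trip:

```ruby
# Ruby analogues of the two PostgreSQL functions used in the
# USING clauses above (simple values only, no escaping rules).
def string_to_array(value, delimiter)
  value.split(delimiter)
end

def array_to_string(array, delimiter)
  array.join(delimiter)
end

puts string_to_array('a,b,c', ',').inspect  # ["a", "b", "c"]
puts array_to_string(%w[a b c], ',')        # a,b,c
```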
Using Rails 4.2 on PostgreSQL 9.4, here is a version with both an up and a down, based on lrrthomas's response.
Note: your starting column should have a default of nil.
class ChangeEmailAndNumberColumnForContact < ActiveRecord::Migration
  def up
    change_column :contacts, :mobile_number, :text, array: true, default: [], using: "(string_to_array(mobile_number, ','))"
    change_column :contacts, :email, :text, array: true, default: [], using: "(string_to_array(email, ','))"
  end

  def down
    change_column :contacts, :mobile_number, :text, array: false, default: nil, using: "(array_to_string(mobile_number, ','))"
    change_column :contacts, :email, :text, array: false, default: nil, using: "(array_to_string(email, ','))"
  end
end
def change
  change_column :table, :dummy_column, :string, array: true, default: '{}'
end
Notice: the column is specified as data type :string with array: true; to default the column to an empty array ([]), you use default: '{}'.
It can be done like below (note that the default should be an empty Ruby array, not a Ruby hash literal):
change_column :table, :column, :string, array: true, default: [], using: "(string_to_array(column, ','))"
add_column :table, :dummy_column, :string, array: true
change_column_default :table, :dummy_column, []
This fixed it for me.

Rails :uniqueness validation not finding previous records

I'm having a bizarre glitch where Rails does not validate the uniqueness of an attribute on a model, despite the attribute being saved perfectly and the validation being written correctly.
I added a validation to ensure the uniqueness of a value on one of my Rails models, Spark, with this code:
validates :content_hash, :presence => true, :uniqueness => true
The content_hash is an attribute created from the model's other attributes in a method called using a before_validation callback. Using the Rails console, I've confirmed that this hash is actually being created before the validation, so that is not the issue.
When I call in the Rails console spark.valid? on a spark for which I know a collision exists on its content_hash, the console tells me that it has run this query:
Spark Exists (0.2ms) SELECT 1 AS one FROM "sparks" WHERE "sparks"."content_hash" = '443524b1c8e14d627a3fadfbdca50118c6dd7a7f' LIMIT 1
And the method returns that the object is valid. The validator seems to be working fine and running the correct query to check the uniqueness of content_hash; the problem must instead be on the database end (I'm using sqlite3). I know this because I checked the database myself for a collision using this query:
SELECT "sparks".* FROM "sparks" WHERE "sparks"."content_hash" = '443524b1c8e14d627a3fadfbdca50118c6dd7a7f'
Bizarrely, this query returns nothing from the database, despite the fact that I can see with my own eyes that other records with this content_hash exist on the table.
For some reason, this is an issue that exists exclusively with the content_hash attribute of the sparks table, because when I run similar queries for the other attributes of the table, the output is correct.
The content_hash column is no different from the others which work as expected, as seen in this relevant part of my schema.rb file:
create_table "sparks", :force => true do |t|
  t.string "spark_type"
  t.string "content_type"
  t.text "content"
  t.text "content_hash"
  t.datetime "created_at", :null => false
  t.datetime "updated_at", :null => false
end
Any help on this problem would be much appreciated; I'm about ready to tear my hair out over this thing.
Okay, I managed to fix the problem. I think it was an sqlite3 issue, because everything worked perfectly once I changed the type of content_hash from a text column to a string column. Weird.
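The fix described above can be expressed as a migration sketch (table and column names taken from the schema shown; the class name is hypothetical):

```ruby
class ChangeContentHashToString < ActiveRecord::Migration
  def up
    # Switching the column from :text to :string worked around the
    # SQLite comparison glitch described above.
    change_column :sparks, :content_hash, :string
  end

  def down
    change_column :sparks, :content_hash, :text
  end
end
```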

Rails 3.1: Problem saving record with not-null boolean mysql column (false saves as NULL)

Using Rails 3.1rc5 and devise 1.4.2. I have the following column on the users table
add_column :users, :has_dummy_password, :boolean, :default => false, :null => false
Without the :null => false clause if I do the following with an existing user record...
user.has_dummy_password = false
user.save
... then in MySQL the column has a value of NULL.
With the :null => false clause I (not surprisingly) get the following mysql error:
Column 'has_dummy_password' cannot be null
I can get around this by doing
user.has_dummy_password = 0
user.save
because "under the hood" booleans are implemented as TINYINT in MySQL. That seems a bit unfortunate, though.
Is it possible to actually set boolean column values with true/false in Rails 3.1 instead of 1/0?
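One common workaround (my sketch, not from the answers here) is to normalize the attribute so nil never reaches the database. In plain Ruby, the double-bang operator coerces any value to a strict true/false:

```ruby
# Coerce any value to a strict boolean with !!, as one might do in a
# Rails callback, e.g.:
#   before_save { self.has_dummy_password = !!has_dummy_password }
def to_strict_boolean(value)
  !!value
end

puts to_strict_boolean(nil).inspect    # false
puts to_strict_boolean(false).inspect  # false
puts to_strict_boolean(1).inspect      # true
```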
