In my Heroku postgres db, I had a column of type "string" with a limit of "50" characters.
I just made a migration that changed the limit to 80 characters.
class ChangeTagLineLimit < ActiveRecord::Migration
  def up
    change_column :blocks, :tag_line, :string, :limit => 80
  end

  def down
    change_column :blocks, :tag_line, :string, :limit => 50
  end
end
However, when I try to save a record, I get this error:
PG::StringDataRightTruncation: ERROR: value too long for type character varying(50)
It sounds like Postgres hasn't changed the size of the varchar column. How do I fix this?
That migration should work. The error is probably related to another table that didn't get migrated yet.
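If the app is deployed to Heroku, it's also worth confirming the migration actually ran against the Heroku database (and not just your local one) before digging further. A quick check, assuming the table is blocks as in the migration:

heroku run rake db:migrate   # apply any pending migrations on the Heroku database
heroku pg:psql               # then inspect the column with \d blocks; it should report character varying(80)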
Related
I'm trying to change the gender column to an integer in my Rails app. Supposedly, when using PostgreSQL, you have to write it in a slightly different way, i.e.
change_column :users, :gender, :integer, using: 'gender::integer'
or
change_column :users, :gender, 'integer USING CAST(gender AS integer)'
However, in my case, neither of the above works, and I get the error below.
PG::DatatypeMismatch: ERROR: default for column "gender" cannot be cast automatically to type integer
: ALTER TABLE "users" ALTER COLUMN "gender" TYPE integer USING gender::integer
Please tell me why this doesn't work, or point out anything odd I've done, such as a typo.
My environment
Ruby: 2.5.0
Rails: 5.1.6
Postgres: 11.1
Based on the error message you're seeing, Postgres does not know how to keep the default value you've set while changing the database type to integer. I'd suggest:
Dropping the default
Changing the datatype on the column
Adding the default in the new type
You may be able to do steps 2 and 3 in the same call. You'll want to use something like the below in the up migration:
def up
  execute "ALTER TABLE users ALTER gender DROP DEFAULT;"
  change_column :users, :gender, :integer, using: 'gender::integer', default: 0
end
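For completeness, a matching down might look like the sketch below. The :string type and the commented-out default value are assumptions; adjust them to whatever the column actually was before the change:

def down
  change_column :users, :gender, :string, using: 'gender::varchar'
  # If the column previously had a default, restore it here, e.g.:
  # change_column_default :users, :gender, 'unknown'
end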
I have this migration where I convert a column from an integer to an array of strings.
class ChangeWdayFromIntegerToStringInResourceWeekDayStart < ActiveRecord::Migration[4.2]
  def up
    change_column :resource_week_day_starts, :wday, :string, default: []
    add_column :resource_week_day_starts, :number_days, :integer, default: 7
  end

  def down
    change_column :resource_week_day_starts, :wday, :string, default: nil
    change_column :resource_week_day_starts, :wday, 'integer USING CAST(wday AS integer)'
    remove_column :resource_week_day_starts, :number_days
  end
end
This migration worked fine when we were on Rails 3, but we have migrated to Rails 5 and are now trying to set up a new server. When running the migration on Rails 5 we get this error message:
PG::DatatypeMismatch: ERROR: column "wday" cannot be cast automatically to type character varying[]
HINT: You might need to specify "USING wday::character varying[]".
: ALTER TABLE "resource_week_day_starts" ALTER COLUMN "wday" TYPE character varying[]
/home/ruby/src/mapsbooking/db/migrate/20170307000000_change_wday_from_integer_to_string_in_resource_week_day_start.rb:3:in `up'
I have tried many ways to fix this, but nothing works.
Can somebody help me? Thanks.
You have three problems:
As max says in the comments, you need to include array: true in the options so that you get an array column.
You need an SQL expression to convert a single integer to an array of strings so that you can include a suitable USING clause in the ALTER TABLE.
change_column wants to change the type and the default separately.
(1) is easy: add array: true to the change_column options.
(2) is a little harder but a couple options come to mind. You could use the element-to-array concatenation operator and a type cast:
wday::varchar || array[]::varchar[]
:: is a type cast, || is the concatenation operator, and array[] is an empty array. Or, if that's too much punctuation, you could use the array_append function to do the same thing:
array_append(array[]::varchar[], wday::varchar)
(3) can be dealt with by dropping the old default with a change_column_default call before the change_column.
Putting them together:
change_column_default :resource_week_day_starts, :wday, nil
change_column :resource_week_day_starts,
              :wday,
              :string,
              array: true,
              default: [],
              using: 'array_append(array[]::varchar[], wday::varchar)'
This could leave you with array[null] values in wday if you currently have nulls in wday. You can clean those up afterwards if necessary.
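If you do need that cleanup, one possible approach (a sketch, untested against your data) is a follow-up statement in the same migration that resets any array whose first element is NULL back to an empty array:

execute <<-SQL
  UPDATE resource_week_day_starts
  SET wday = '{}'
  WHERE wday[1] IS NULL
SQL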
I have to create a migration to do a db level validation. The migration:
class DataBaseLevelValidation < ActiveRecord::Migration
  def change
    add_index :benefits_business_changes, [:benefit_id, :business_change_id], :unique => true
  end
end
The problem I have is that when I try to run rake db:migrate I get this error:
Index name 'index_benefits_business_changes_on_benefit_id_and_business_change_id' on table 'benefits_business_changes' is too long; the limit is 62 characters
/Users/mariocardoso/.rvm/gems/ruby-2.1.2/gems/activerecord-4.1.5/lib/active_record/connection_adapters/abstract/schema_statements.rb:797:in `add_index_options'
But if I change the name to a shorter version I get this:
SQLite3::SQLException: no such table: main.benefits_businessc: CREATE UNIQUE INDEX "index_benefits_businessc_on_benefit_id_and_business_change_id" ON "benefits_businessc"
How can I overcome this problem?
The only way I see is to change the 'business_change' model to a shorter name (model, views, migrations, ... everything).
Is there any way to run this migration without hitting the error caused by the long name?
You can do
add_index :benefits_business_changes, [:benefit_id, :business_change_id], :unique => true, :name => "a_shorter_name"
A common choice is to use just the first few letters of each column.
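Putting that into your migration, it might look like the sketch below; the shorter name here is just an example, and anything descriptive under 62 characters works:

class DataBaseLevelValidation < ActiveRecord::Migration
  def change
    add_index :benefits_business_changes,
              [:benefit_id, :business_change_id],
              :unique => true,
              :name => "index_benefits_bc_on_benefit_and_change"
  end
end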
I am trying to change a column in my database so that it can use the Postgres array data type.
Currently the table column is of type string.
I am using the following migration to convert it:
def change
  change_column :table, :dummy_column, :text, array: true, default: []
end
But I get the following error:
bundle exec rake db:migrate
rake aborted!
An error has occurred, this and all later migrations canceled:
PG::Error: ERROR: column "dummy_column" cannot be cast automatically to type character varying[]
HINT: Specify a USING expression to perform the conversion.
: ALTER TABLE "table" ALTER COLUMN "dummy_column" TYPE character varying(255)
Tasks: TOP => db:migrate
PostgreSQL doesn't know how to automatically convert a column of varchar into an array of varchar. It doesn't know what you might intend, because it has no way to know what format you think the current values are in.
So you need to tell it; that's what the USING clause is for.
ActiveRecord doesn't seem to explicitly support the USING clause (not surprising, as it barely supports even the most basic database features). You can specify your own SQL text for the migration, though.
Assuming your strings are comma separated and may not themselves contain commas, for example:
def change
  change_column :table, :dummy_column, "varchar[] USING (string_to_array(dummy_column, ','))"
end
(I don't use Rails myself and haven't tested this, but it's consistent with the syntax used in examples elsewhere).
Using Rails 4.2 on postgresql 9.4 I was looking to do this and preserve my pre-existing string data as the first element in one element arrays.
It turns out that postgresql cannot coerce a string into a text array without a USING expression to tell it how.
After much fiddling with delicate postgres syntax, I found a good middle way with active record:
def change
  change_column :users, :event_location, :text, array: true, default: [], using: "(string_to_array(event_location, ','))"
end
The only direct PostgreSQL there is the string_to_array() function call. See the PostgreSQL docs on string_to_array; note that you have to supply a delimiter.
Using Rails 4.2 on PostgreSQL 9.4, with a down and an up, based on lrrthomas's response.
Note: your starting column should have a default of nil
class ChangeEmailAndNumberColumnForContact < ActiveRecord::Migration
  def up
    change_column :contacts, :mobile_number, :text, array: true, default: [], using: "(string_to_array(mobile_number, ','))"
    change_column :contacts, :email, :text, array: true, default: [], using: "(string_to_array(email, ','))"
  end

  def down
    change_column :contacts, :mobile_number, :text, array: false, default: nil, using: "(array_to_string(mobile_number, ','))"
    change_column :contacts, :email, :text, array: false, default: nil, using: "(array_to_string(email, ','))"
  end
end
def change
  change_column :table, :dummy_column, :string, array: true, default: '{}'
end
Notice:
the column is specified as data type :string with array: true; to default the column to an empty array ([]), you use default: '{}'
It can be done like below:
change_column :table, :column, :string, array: true, default: [], using: "(string_to_array(column, ','))"
add_column :table, :dummy_column, :string, array: true
change_column_default :table, :dummy_column, []
This fixed it for me.
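Wrapped in a migration, that two-step approach might look like this sketch (the class name and Rails version are illustrative, and :table/:dummy_column are the placeholders from above):

class AddDummyColumnToTable < ActiveRecord::Migration[5.0]
  def change
    add_column :table, :dummy_column, :string, array: true
    change_column_default :table, :dummy_column, []
  end
end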
I have a PostgreSQL database for a Rails application.
I want to store the Facebook user ID, so I thought I could use an integer, but it's not big enough, so I chose float.
However, now Rails adds .0 to the end of my user IDs.
What datatype can I use so this doesn't happen for Facebook user IDs, which are very long, for example: 100002496803785?
You can use :limit => 8 on your integer column to get a bigint. For example:
class Pancakes < ActiveRecord::Migration
  def change
    create_table :pancakes do |t|
      t.integer :c, :limit => 8
    end
  end
end
And then, from psql:
=> \d pancakes
                           Table "public.pancakes"
 Column |  Type   |                       Modifiers
--------+---------+--------------------------------------------------------
 id     | integer | not null default nextval('pancakes_id_seq'::regclass)
 c      | bigint  | not null
Indexes:
    "pancakes_pkey" PRIMARY KEY, btree (id)
You could also use a string for the Facebook ID. You're not doing any arithmetic on the IDs, so they're really just opaque bags of bits that happen to look like large integers; strings will sort and compare just fine, so they might be the best option. There would be some storage and access overhead due to the increased size of a string over the integer, but it probably wouldn't be enough to make any noticeable difference.
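If you go the string route, the column definition is simple. A sketch, reusing the hypothetical pancakes table from above:

create_table :pancakes do |t|
  t.string :c, :null => false   # Facebook ID stored as an opaque string
end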
Never use a double for something that needs to be exact. You'd probably be fine (except for the trailing .0 of course) in this case because you'd have 52 bits of mantissa and that means that the double would act like a 52 bit integer until your values got large enough to require the exponent. Even so, using double for this would be an awful idea and an abuse of the type system.
I don't use PostgreSQL, but in MySQL I use BIGINT.
According to the PostgreSQL data types documentation, BIGINT works for PostgreSQL as well.
mu is too short has a great answer; I only want to add that if you want to use the ID as a foreign key between tables, then you should stick to the BIGINT solution he describes, not a string. This is what I use, essentially:
Example:
create_table(:photos) do |t|
  t.integer :fb_uid, :limit => 8                                  # Facebook ID of the photo record
  t.integer :facebook_profile_uid, :limit => 8, :null => false    # foreign key to user
  # ...
end

create_table(:users) do |t|
  t.integer :fb_uid, :limit => 8, :null => false                  # Facebook ID of the user record
  t.integer :photos_count, :default => 0
  # ...
end

class User < ActiveRecord::Base
  has_many :photos, foreign_key: :facebook_profile_uid, primary_key: :fb_uid
  # ...
end

class Photo < ActiveRecord::Base
  belongs_to :facebook_profile, foreign_key: :facebook_profile_uid, primary_key: :fb_uid, counter_cache: true
end
I ran into this problem while using the Google UID, which is also quite large.
I found this answer to be the most useful:
Getting error indicating number is "out of range for ActiveRecord::Type::Integer with limit 4" when attempting to save large(ish) integer value
Generate a migration to change your table column.
Edit the generated migration to add limit: 8.
Run db:migrate to apply the change to the database.
Restart the Rails server.
This will allow you to change the limit of your table column.
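As a sketch, such a migration could look like the following (the table, column, and class names here are illustrative, not from the question):

class ChangeUidToBigintOnUsers < ActiveRecord::Migration[5.0]
  def change
    change_column :users, :uid, :integer, limit: 8   # limit: 8 gives a bigint column in PostgreSQL
  end
end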