Rails rename_column migration issue - ruby-on-rails

I am trying to run a migration on an existing database to change the column name on a table. When I run the migration, I get an error stating that Blob/Text fields cannot have a default value. The column in question is a text column, with a non-null attribute, but no default value.
The migration that Rails attempts is:
ALTER TABLE xxxxx CHANGE abcd ABCD text DEFAULT '' NOT NULL
Now, I haven't asked the migration to change the column type, I have only asked it to rename the column, so why is the migration trying to do anything to the column type?
I have Googled the issue, and haven't come up with an explanation or workaround.
Any help appreciated.
Vikram

There does seem to be a longstanding unresolved ticket on this issue, as described here:
rails bug report
Rails' default behavior is to make columns nullable (default NULL), since this prevents false positives on presence checks, etc., when translating blank strings back into Ruby. Any chance you can work around this by redefining your text column to allow NULL values in the MySQL console?
EDIT
You can do this in your migration file; it's not the Rails way, but it's a lot nicer than sending an email to everyone asking them to change their local copies:
MyModel.connection.execute "ALTER TABLE xxxxx CHANGE abcd ABCD text DEFAULT NULL"
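If you'd rather keep it self-contained, a minimal sketch of a migration that does the rename with raw SQL instead of rename_column (the class name is made up, the table/column names are the placeholders from the question, and older Rails versions want def self.up / def self.down) might look like:
class RenameAbcdToUppercaseAbcd < ActiveRecord::Migration
  def up
    # Rename via raw SQL so Rails doesn't tack a DEFAULT '' onto the text column
    execute "ALTER TABLE xxxxx CHANGE abcd ABCD text NOT NULL"
  end

  def down
    execute "ALTER TABLE xxxxx CHANGE ABCD abcd text NOT NULL"
  end
end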

Related

In Rails: How is existing data affected by data type change?

I'm working on a Rails app and would like to change the datatype for an existing column. It's currently a DateTime type, and I want to change it to a Date type. I found a way to do this here, but in this case, the person was not worried about preexisting data.
Right now, I plan to generate a migration...
rails g migration change_my_column_in_my_table
...and make the following change:
class ChangeMyColumnInMyTable < ActiveRecord::Migration
  def change
    change_column :my_table, :my_column, :date
  end
end
My question is: will the existing data be converted to a Date type, or will I need to create a rake task to convert the values for all of my existing DateTime values?
I found a similar question in which the conversion was from Boolean to String, and it seemed like the change would be automatic for existing data. I just want to be sure before I jump into making this change.
I'm using Rails version 4.2.0 and MySQL version 5.6.27 Homebrew. Any advice on this issue would be greatly appreciated!
change_column is going to be translated into an ALTER TABLE statement, at which point how the casting is handled is up to your database, so the full answer depends on which database you're using. However, it's a pretty safe bet that your database can convert between datetime and date without trouble.
That said, there's no better way to know than to test it!
Using the change_column method, data conversion will be handled by the specific database adapter you are using. For example, with the mysql adapter, change_column issues ALTER TABLE tbl_name MODIFY COLUMN col_name DATE, and the conversion from DATETIME to DATE drops the time portion.
Furthermore, MySQL's DATETIME-to-DATE conversion takes fractional seconds into account and rounds the time part: '1999-12-31 23:59:59.499' becomes '1999-12-31', whereas '1999-12-31 23:59:59.500' becomes '2000-01-01'. Either way, change_column is not reversible inside the change method.
http://guides.rubyonrails.org/active_record_migrations.html#using-the-change-method
If you plan on reversing this migration, use the reversible method instead.
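For example, a rough sketch using reversible (table and column names taken from the question; the down direction restores the column type, but the time-of-day dropped on the way up cannot be recovered):
class ChangeMyColumnInMyTable < ActiveRecord::Migration
  def change
    reversible do |dir|
      # Down restores the column type only; the truncated times are gone for good
      dir.up   { change_column :my_table, :my_column, :date }
      dir.down { change_column :my_table, :my_column, :datetime }
    end
  end
end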

Rollback migration that changes column type from string to text where db is postgresql in rails 3.2

I have understood the solution for changing the column type from string to text while using postgresql and rails 3.2 provided here. I have also implemented it. But when I rollback this migration, it fails with "PG::StringDataRightTruncation: ERROR: value too long" error. How should we tackle this problem?
You have new values that are too long for the old type. PostgreSQL would have to throw away data to change back to varchar(255) if the values are longer than 255 chars. It refuses to do so because it won't cause data loss without being told very specifically to do so.
If you don't mind truncating these long values, permanently and unrecoverably discarding data, you can use the USING clause of ALTER COLUMN ... TYPE. This is the same approach used when converting string columns to integer.
ALTER TABLE mytable
ALTER COLUMN mycolumn
TYPE varchar(255)
USING (substring(mycolumn from 1 for 255));
I don't think there is any way to express this symmetrically in a Rails migration; you will need separate clauses for the up- and down- migrations, with the up-migration being a simple:
ALTER TABLE mytable
ALTER COLUMN mycolumn
TYPE text;
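Spelled out as a migration with separate up and down methods, that might look roughly like this (table/column names are the placeholders from the SQL above; the down direction permanently truncates anything past 255 characters):
class ChangeMycolumnToText < ActiveRecord::Migration
  def up
    execute "ALTER TABLE mytable ALTER COLUMN mycolumn TYPE text"
  end

  def down
    # Irreversibly discards anything beyond 255 characters
    execute <<-SQL
      ALTER TABLE mytable
        ALTER COLUMN mycolumn
        TYPE varchar(255)
        USING (substring(mycolumn from 1 for 255))
    SQL
  end
end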
Frankly though, I think it's a terrible idea to do this in a migration. The migration should fail. This action should require administrator intervention to UPDATE the columns that have data that is too long, then run the migration.

ActiveRecord changes my binary column to a string column, for some reason, on migration

Using Ruby on Rails 3 and ActiveRecord 3.2.18. I have a binary column in my database. It holds binary stuff, and actually I have a bunch of records in production with that column filled. So in my db/schema.rb file I had...
...
t.binary "tin"
...
Now, after running a migration that touches this table but doesn't change that column, my schema says...
...
t.string "tin"
...
Well... I know that a string might be binary, and binary might be a string, depending on how it's stored in the database, and maybe these equate to the same column type in the end, but why is this happening and what can I do to fix it? Is it safe to deploy this change to production or will it hose my binary columns?
When you run a rake command such as rake db:migrate, Rails recreates the schema.rb file from the schema in your own personal database. So it sounds like your database has the tin field set up as a varchar field. If your migrations set up your database this way and your production server has the same database, then I wouldn't count on the production server to do the right thing. So you may need to look into how to really set a binary field.
On the other hand, if your database is set up properly and it's just the schema file that's not, it may be because the Ruby schema dumper can't express every database-specific column type. In these cases, you can switch the schema format to SQL so the dump goes to db/structure.sql instead of db/schema.rb. Check this Stack Overflow post for more on that.
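If you go that route, the switch is a one-liner in config/application.rb (MyApp is a placeholder for your application's module):
# config/application.rb
module MyApp
  class Application < Rails::Application
    # Dump the schema as database-specific SQL so types like binary survive intact
    config.active_record.schema_format = :sql
  end
end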

Can't insert row in table with computed column?

I've got a Rails 3 application running on SQL Server against a legacy database with some computed columns. When I try to save a new record, I get the error: The column "ContractPendingDays" cannot be modified because it is either a computed column or is the result of a UNION operator.
I'm not updating this column; it just seems to be one of the columns ActiveRecord tries to write to when adding a new record to the db.
Is there some way to stop this behavior? I even tried changing schema.rb, but that didn't help (and that's suboptimal anyway, since I'd have to do it every time I change the db).
Is there some way to mark a column as not updatable so activerecord doesn't try to write to it?
Found answer here:
Prevent NULL for active record INSERT?
(Use attributes.delete in a before_create filter)
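Roughly, applying that idea here might look like the sketch below (the model name is hypothetical and the column name comes from the question; in Rails 3.x @attributes is a plain hash, so removing the key should keep the column out of the generated INSERT):
class Contract < ActiveRecord::Base
  before_create :drop_computed_columns

  private

  # ContractPendingDays is computed by SQL Server, so never send it on INSERT
  def drop_computed_columns
    @attributes.delete("ContractPendingDays")
  end
end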

rails migration. modify starting point for auto_increment

I have a table already created. I am looking for a Rails migration where I can modify the starting point of the auto_increment number for the id column of my table. Let's say I want it to start from 1000.
I googled a bit and came across this:
it says:
:options "string" pass raw options to your underlying database, e.g. auto_increment = 10000. Note that passing options will cause you to lose the default ENGINE=InnoDB statement.
Can this be used for what I want? And how would the migration look, since I am modifying an existing table and not creating a new one...
You can use the raw execute method:
execute("ALTER TABLE your_table_name AUTO_INCREMENT = 10000")
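Wrapped in a migration, that could look roughly like this (class and table names are placeholders; there is nothing meaningful to do on rollback):
class SetAutoIncrementStartOnMyTable < ActiveRecord::Migration
  def up
    # Make new rows start their ids at 1000, as asked in the question
    execute "ALTER TABLE your_table_name AUTO_INCREMENT = 1000"
  end

  def down
    # No sensible way to undo; leave the counter where it is
  end
end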
