How can I load data into database from fixture files with Cyrillic data?
I've tried, but the data ends up in the database as ??? symbols. My fixture file is saved in UTF-8 encoding.
Are you using MySQL? You may need to change the collation and/or character sets for the tables in your database.
There is a whole section in the MySQL Manual on this topic, which I recommend reading if your project's internationalisation scope is wide, but essentially applying this SQL to each table will help:
ALTER TABLE tablename CHARACTER SET utf8 COLLATE utf8_general_ci;
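If you prefer to keep this alongside your schema, here is a minimal migration sketch, assuming an older Rails where ActiveRecord::Migration takes no version suffix; the class name and table names are placeholders, not from the question. Note that CONVERT TO, unlike a bare CHARACTER SET clause, also re-encodes existing columns rather than only changing the table default:

class ConvertTablesToUtf8 < ActiveRecord::Migration
  def up
    # Placeholder table names: replace with the tables your fixtures load into.
    %w[articles comments].each do |table|
      execute "ALTER TABLE #{table} CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci"
    end
  end

  def down
    # Left as a no-op; revert to your previous character set explicitly if needed.
  end
end

It is also worth checking that the adapter entry in config/database.yml sets encoding: utf8, so the connection itself talks to MySQL in UTF-8 before the fixtures are loaded.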
Related
How can I change my Entity Framework 6 collation to support Arabic language?
I googled it a lot but I cannot find any good solution
It is not an EF issue. If a default collation is configured in your SQL database system, that is what will be used, so collation is a database-system configuration concern. You probably have to change the default collation on your database server.
Another option is to add the following after the create-table statement in your migration:
Sql('alter table <some_table> convert to character set utf8 collate utf8_unicode_ci');
One of my recent Rails apps requires importing data from Excel. I've been following along with Ryan's RailsCasts #396, Importing CSV and Excel, and it worked out pretty okay, but with one limitation: I have to create the database schema first. I'm wondering how I can make it more adaptable, so that it could pick up any Excel file with any headers or number of columns and create the database schema on the fly, based on the imported data. Is that even possible? Thanks :)
Relational databases work well when you know the structure of the data; they are not suitable for migrations triggered by user data. If you don't know the structure, you can always use a generic schema like the one below (a migration sketch follows the list):
MyTable
col1: string
col2: string
col3: string
coln: string
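A hedged migration sketch for such a generic table; the imported_rows name and the column count are illustrative assumptions, not from the question:

class CreateImportedRows < ActiveRecord::Migration
  def change
    create_table :imported_rows do |t|
      # One generic string column per expected spreadsheet column.
      (1..10).each { |i| t.string "col#{i}" }
      t.timestamps
    end
  end
end

You would then keep the spreadsheet's header row somewhere (another table or a serialized column) so you can map col1..coln back to the original headers.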
Another approach you may try is a non-relational (schemaless) database like MongoDB, which is compatible with Ruby on Rails.
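For the schemaless route, a minimal sketch assuming the Mongoid gem; the model name and row values are hypothetical:

class ImportedRow
  include Mongoid::Document
  include Mongoid::Attributes::Dynamic  # dynamic fields are opt-in from Mongoid 4 onwards
end

# Each spreadsheet row becomes a document carrying whatever keys that row had.
ImportedRow.create!("Name" => "Alice", "Score" => "42")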
I have understood the solution for changing a column type from string to text while using PostgreSQL and Rails 3.2 provided here, and I have implemented it. But when I roll back this migration, it fails with a "PG::StringDataRightTruncation: ERROR: value too long" error. How should we tackle this problem?
You have new values that are too long for the old type. PostgreSQL would have to throw away data to change the column back to varchar(255) if any values are longer than 255 characters, and it refuses to cause data loss unless you tell it to very explicitly.
If you don't mind truncating these long values, permanently and unrecoverably discarding data, you can use the USING clause of ALTER COLUMN ... TYPE. This is the same approach used when converting string columns to integer.
ALTER TABLE mytable
ALTER COLUMN mycolumn
TYPE varchar(255)
USING (substring(mycolumn from 1 for 255));
I don't think there is any way to express this symmetrically in a Rails migration; you will need separate clauses for the up- and down-migrations, with the up-migration being simply:
ALTER TABLE mytable
ALTER COLUMN mycolumn
TYPE text;
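In a Rails migration that would look roughly like the following sketch; the table and column names are the same placeholders as in the SQL above, and the class name is hypothetical:

class ChangeMycolumnToText < ActiveRecord::Migration
  def up
    execute "ALTER TABLE mytable ALTER COLUMN mycolumn TYPE text"
  end

  def down
    # Irreversibly truncates any value longer than 255 characters.
    execute <<-SQL
      ALTER TABLE mytable
        ALTER COLUMN mycolumn TYPE varchar(255)
        USING (substring(mycolumn from 1 for 255))
    SQL
  end
end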
Frankly though, I think it's a terrible idea to do this in a migration. The migration should fail. This action should require administrator intervention to UPDATE the columns that have data that is too long, then run the migration.
I'm trying to seed a Rails application with some SQL statements in the seeds.rb file. There are 12 values supplied, but the table has 15 columns. The extra three columns are the automatically generated id, created_at and updated_at columns that Rails includes by default. If I run a custom SQL statement in seeds.rb in the following manner....
connection = ActiveRecord::Base.connection()
query = "random sql"
connection.execute(query)
Rails doesn't fill in those columns for me the way it would if I did
Employee.create!(name: "Joe")
Is there any way to indicate to Rails that I need the id and timestamp columns filled with values when I run a SQL statement in seeds.rb?
No, because Rails has no way of knowing whether your "random sql" even creates any records for it to fill in ids/timestamps.
When you call connection.execute you are on your own; you have forsaken your ORM and given in to the temptation of SQL.
If you can do it using ActiveRecord, then do so! If not, well, that is why Rails lets you drop down to SQL (but think again. Can you really not write it in Ruby?).
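For instance, a hedged seeds.rb sketch; the employees table name and NOW() (valid in MySQL and PostgreSQL) are illustrative assumptions:

# Preferred: ActiveRecord fills in id, created_at and updated_at for you.
Employee.create!(name: "Joe")

# Raw SQL alternative: supply the timestamps yourself; the id still comes
# from the table's sequence / auto-increment.
ActiveRecord::Base.connection.execute(
  "INSERT INTO employees (name, created_at, updated_at) VALUES ('Joe', NOW(), NOW())"
)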
Using RoR 2.3.8.
I have two models. It's strange that when I type text and save it in Model A, it shows the exact text, but when I do the same in Model B, it shows ???. Most likely one supports UTF-8 and the other doesn't. The thing is, I don't remember configuring either one that way. What can I do to fix this?
Using Mac OS 10.6.7, Chrome
MySQL and other database engines set the encoding used for text on several levels: server, database, table and column. Generally the defaults are applied from the top down, from server to database, database to table and so forth, but they can be customized at any given point as required. Sometimes this happens inadvertently and can cause issues.
One way to know what encoding is currently active is to use a client like Sequel Pro which will expose this information to you, or to investigate using the mysql command line tool:
SHOW CREATE DATABASE example;
You get a response that contains something like this:
CREATE DATABASE `example` /*!40100 DEFAULT CHARACTER SET utf8 COLLATE utf8_unicode_ci */
In this case it's a UTF-8 database with utf8_unicode_ci collation. The encoding defines how the data is stored and the collation defines how sorting and case conversion are handled.
Investigating further you can examine the table and columns:
SHOW CREATE TABLE example;
This gives a lot more detail, something like this:
CREATE TABLE `example` (
`id` int(11) NOT NULL AUTO_INCREMENT,
`email` varchar(255) COLLATE utf8_unicode_ci NOT NULL DEFAULT '',
PRIMARY KEY (`id`),
UNIQUE KEY `index_example_on_email` (`email`)
) ENGINE=InnoDB AUTO_INCREMENT=29 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci
In this case the table itself is defaulting to UTF-8 and the email column is likewise flagged. You may have a column that's different.
If you need to alter the encoding or collation you can use the ALTER TABLE statement to effect this.
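If you would rather keep that change in your schema history, here is a minimal sketch in the Rails 2.3-era migration style; the example table name comes from the output above, while the class name is hypothetical:

class ConvertExampleToUtf8 < ActiveRecord::Migration
  def self.up
    # CONVERT TO changes the table default and re-encodes existing columns.
    execute "ALTER TABLE example CONVERT TO CHARACTER SET utf8 COLLATE utf8_unicode_ci"
  end

  def self.down
    # Revert to your previous character set explicitly if you need to.
  end
end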