Schema.rb doesn't understand SQLite virtual tables - ruby-on-rails

I am using SQLite's FTS4 full-text search functionality. The FTS table is created via a raw-SQL migration using CREATE VIRTUAL TABLE fts_foo USING fts4();. When this is executed, SQLite actually creates several tables (fts_foo, fts_foo_content, fts_foo_docsize, fts_foo_segdir, fts_foo_segments, fts_foo_stat) as well as an index on the fts_foo_segdir columns.
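For reference, a minimal sketch of the kind of migration described above (the class name and the down method are my own assumptions, not taken from the question):

class CreateFtsFoo < ActiveRecord::Migration
  def up
    execute "CREATE VIRTUAL TABLE fts_foo USING fts4();"
  end

  def down
    execute "DROP TABLE fts_foo;"
  end
end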
However, schema.rb does not understand these tables' column types and outputs the following:
# Could not dump table "fts_foo" because of following StandardError
#   Unknown type '' for column 'content'

# Could not dump table "fts_foo_content" because of following StandardError
#   Unknown type '' for column 'c0content'

create_table "fts_foo_docsize", :primary_key => "docid", :force => true do |t|
  t.binary "size"
end

create_table "fts_foo_segdir", :primary_key => "level", :force => true do |t|
  t.integer "idx"
  t.integer "start_block"
  t.integer "leaves_end_block"
  t.integer "end_block"
  t.binary  "root"
end

add_index "fts_foo_segdir", ["level", "idx"], :name => "sqlite_autoindex_fts_foo_segdir_1", :unique => true

create_table "fts_foo_segments", :primary_key => "blockid", :force => true do |t|
  t.binary "block"
end

create_table "fts_foo_stat", :force => true do |t|
  t.binary "value"
end
I don't think any of these tables should appear in schema.rb. It should simply create the single virtual table and let SQLite build the supporting tables. Is there any way I can do this? If not, what kind of workarounds would facilitate it?
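One workaround often suggested for raw-SQL features like this (my suggestion, not something stated in the question) is to have Rails dump the schema as SQL instead of Ruby, so the CREATE VIRTUAL TABLE statement is preserved verbatim in db/structure.sql:

# config/application.rb -- a sketch, assuming you can live without schema.rb:
# the SQL-format dump keeps raw-SQL constructs such as virtual tables intact
config.active_record.schema_format = :sql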

Related

Ruby on Rails ActiveRecord Database Migration Failure

I have a preexisting SQL Server database, 'MyDatabase', populated with data. Within this database I have two schemas, 'dbo' and 'Master'.
dbo is the default schema and contains tables:
OWNER
LOCATION
Master schema contains tables:
BANK
ZONE
Tables OWNER, LOCATION, BANK, and ZONE each contain several attributes.
I have initialized a Rails server and verified that the appropriate gems are installed (activerecord, tiny_tds, activerecord-sqlserver-adapter), and that database.yml provides the correct information for a connection to be established. I am able to connect to the database, and I am able to add and remove tables.
The unusual thing to me is that when I run rake db:migrate, only attributes from the dbo schema are automatically written to the schema.rb file of my Rails app:
ActiveRecord::Schema.define(:version => 20131014210258) do

  create_table "BANK", :id => false, :force => true do |t|
  end

  create_table "LOCATION", :id => false, :force => true do |t|
    t.string  "VarA", :limit => 50
    t.string  "VarB", :limit => 50
    t.decimal "VarC", :precision => 28, :scale => 0
    t.integer "VarD"
    t.string  "VarE", :limit => 500
  end

  create_table "OWNER", :id => false, :force => true do |t|
    t.string "VarF", :limit => 50
    t.string "VarG", :limit => 50
    t.string "VarH", :limit => 50
    t.string "VarI", :limit => 50
    t.string "VarJ", :limit => 50
  end

  create_table "ZONE", :id => false, :force => true do |t|
  end

end
Why are the attributes not automatically populated for tables from my Master schema? I have significantly reduced the scope of my database for this question; in actuality there are dozens of tables with dozens of attributes apiece, so doing the work manually is really not an option.
Is there a way to tell ActiveRecord which schema (or schemas) it should search by default when generating attributes?
Help! & Thank you in advance!
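One thing that may help on the model side (a hedged sketch: the sqlserver adapter accepts schema-qualified table names, but the model class here is my own example) is to point a model explicitly at the Master schema:

class Bank < ActiveRecord::Base
  # schema-qualified table name so the adapter looks in Master rather than the default dbo
  self.table_name = 'Master.BANK'
end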

How to reverse an extend migration

I recently ran this migration while installing the fuzzily gem:
class AddTrigramsModel < ActiveRecord::Migration
  extend Fuzzily::Migration
end
From looking at my schema.rb file, it looks like the effect of this migration was:
create_table "trigrams", :force => true do |t|
t.string "trigram", :limit => 3
t.integer "score", :limit => 2
t.integer "owner_id"
t.string "owner_type"
t.string "fuzzy_field"
end
add_index "trigrams", ["owner_id", "owner_type", "fuzzy_field", "trigram", "score"], :name => "index_for_match"
add_index "trigrams", ["owner_id", "owner_type"], :name => "index_by_owner"
I'm not sure whether the easiest way is simply to drop the trigrams table, or whether there is a more appropriate method. Am I right in assuming the indexes will be deleted when the table is dropped?
Just run rake db:rollback. Fuzzily has support for rollbacks, although all it does is drop the trigrams table (which also removes its indexes) :)
# lib/fuzzily/migration.rb:33
def down
  drop_table trigrams_table_name
end
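If you would rather record the removal as its own migration instead of rolling back, a sketch under that assumption (the class name is mine; the columns and indexes are copied from the schema.rb excerpt above):

class DropTrigrams < ActiveRecord::Migration
  def up
    drop_table :trigrams   # dropping the table also drops its indexes
  end

  def down
    # recreate the table as it appears in schema.rb above
    create_table :trigrams do |t|
      t.string  :trigram, :limit => 3
      t.integer :score,   :limit => 2
      t.integer :owner_id
      t.string  :owner_type
      t.string  :fuzzy_field
    end
    add_index :trigrams, [:owner_id, :owner_type, :fuzzy_field, :trigram, :score], :name => "index_for_match"
    add_index :trigrams, [:owner_id, :owner_type], :name => "index_by_owner"
  end
end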

Why do created_at/updated_at get created when migration does not have timestamps?

I'm using Rails 3.2 with the following migration, and created_at/updated_at both get generated. I was under the impression that adding t.timestamps was what caused those columns to be generated.
class CreateContactsCountries < ActiveRecord::Migration
  def change
    create_table :contacts_countries do |t|
      t.string :name, :official_name, :null => false
      t.string :alpha_2_code, :null => false, :limit => 2
      t.string :alpha_3_code, :null => false, :limit => 3
    end
    add_index :contacts_countries, :alpha_2_code
  end
end
Please drop the table and check again, because:
By default, the generated migration will include t.timestamps (which creates the updated_at and created_at columns that are automatically populated by Active Record).
Ref this
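For illustration, this is roughly what the model generator would have produced for the same table (a sketch; the column list is trimmed), and running such a migration is what creates the two columns:

class CreateContactsCountries < ActiveRecord::Migration
  def change
    create_table :contacts_countries do |t|
      t.string :name

      t.timestamps   # this line adds created_at and updated_at
    end
  end
end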

Index name is too long - Rails 3

I am trying to run this migration:
class RemoveClientFromSalesteam < ActiveRecord::Migration
  def up
    change_table :sales_teams do |t|
      t.remove :client_id
    end
  end
end
This is the error I am getting:
rake db:migrate
-- change_table(:sales_teams)
rake aborted!
An error has occurred, this and all later migrations canceled:
Index name 'temp_index_altered_sales_teams_on_client_priority_and_personal_priority' on table 'altered_sales_teams' is too long; the limit is 64 characters
Tasks: TOP => db:migrate
(See full trace by running task with --trace)
This is what my schema.rb looks like:
create_table "sales_teams", :force => true do |t|
t.string "name"
t.integer "firm_id"
t.boolean "client_priority"
t.boolean "personal_priority"
t.datetime "created_at", :null => false
t.datetime "updated_at", :null => false
t.integer "client_id"
end
add_index "sales_teams", ["client_id"], :name => "index_sales_teams_on_client_id"
add_index "sales_teams", ["client_priority", "personal_priority"], :name => "index_sales_teams_on_client_priority_and_personal_priority"
add_index "sales_teams", ["name", "firm_id"], :name => "index_sales_teams_on_name_and_firm_id"
Thoughts?
Thanks.
Drop the index, remove your column, and then re-add the index:
def up
  remove_index :sales_teams, :column => [ :client_priority, :personal_priority ]
  remove_column :sales_teams, :client_id
  add_index :sales_teams, [ :client_priority, :personal_priority ]
end
I'm guessing that you're using SQLite. Most databases support real ALTER TABLE operations for removing columns, but SQLite forces you to copy the table (and indexes), drop the table, and copy everything back; the Rails SQLite driver takes care of this behind the scenes but, apparently, doesn't know about the identifier length limit.
You can also specify your own index names by using the :name option to add_index and remove_index if necessary.
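For instance, a sketch of the :name variant (the short replacement name here is just an example):

def up
  remove_index  :sales_teams, :name => "index_sales_teams_on_client_priority_and_personal_priority"
  remove_column :sales_teams, :client_id
  add_index     :sales_teams, [ :client_priority, :personal_priority ],
                :name => "idx_sales_teams_priorities"   # short enough to stay under the 64-character limit
end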

Rails: can I use polymorphic references with non-integer primary keys?

I have a database that uses UUIDs as primary keys, like this:
create_table "my_table", :id => false, :force => true do |t|
t.string "id", :limit => 36
end
However, when I try to use :references for foreign keys to that table, it generates integer columns for the ID. Can :references be instructed to deal with a non-integer ID? My migration for the referring table is like this:
create_table "child_table" :id => false, :force => true do |t|
t.string "id", :limit => 36
t.references :my_table
end
I know that I could just manually create :my_table_id and :my_table_type columns, but I'm wondering whether :references can be made to do its magic under these circumstances so that I don't have to handle the id+type explicitly throughout my code.
A :type option has been available on references since Rails 4.2:
t.references :car, type: :uuid, index: true
For example:
def change
  enable_extension 'uuid-ossp'

  create_table :cars, id: :uuid do |t|
    t.integer :seats
    # And other car-specific things
  end

  create_table :wheels do |t|
    t.references :car, type: :uuid, index: true
    t.integer :radius
    # And other wheel-specific things
  end
end
source: https://github.com/rails/rails/pull/16231
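Since the question is specifically about polymorphic associations, the same option should combine with polymorphic: true (my assumption; the example in the linked PR only covers the plain case):

create_table :pictures, id: :uuid do |t|
  # creates a uuid imageable_id column plus a string imageable_type column
  t.references :imageable, polymorphic: true, type: :uuid, index: true
end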
Nope, references only creates integer columns as of this writing.
I'm sure you could override the references method to do what you want. But IMO you'd be better off specifying your UUID columns and type columns explicitly. That way the code is clear about what is going on behind the scenes.
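If you go the explicit route, a sketch might look like this (column names follow the asker's my_table example; the 36-character limit matches the UUID columns above):

create_table "child_table", :id => false, :force => true do |t|
  t.string "id",            :limit => 36
  t.string "my_table_id",   :limit => 36   # UUID of the referenced row
  t.string "my_table_type"                 # class name for the polymorphic association
end
add_index "child_table", ["my_table_id", "my_table_type"]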
