I have a preexisting SQL Server database, 'MyDatabase', populated with data. Within this database I have two schemas, 'dbo' and 'Master'.
dbo is the default schema and contains tables:
OWNER
LOCATION
Master schema contains tables:
BANK
ZONE
Tables OWNER, LOCATION, BANK, and ZONE contain several attributes apiece.
I have initialized a RoR server and have verified that the appropriate gems (activerecord, tiny_tds, activerecord-sqlserver-adapter) are installed, and that database.yml provides the correct information for establishing a connection. I *am* able to connect to the database, and I am able to add and remove tables.
The unusual thing to me is that when I run rake db:migrate, only attributes from tables in the dbo schema are automatically populated in my RoR server's schema.rb file:
ActiveRecord::Schema.define(:version => 20131014210258) do
  create_table "BANK", :id => false, :force => true do |t|
  end

  create_table "LOCATION", :id => false, :force => true do |t|
    t.string  "VarA", :limit => 50
    t.string  "VarB", :limit => 50
    t.decimal "VarC", :precision => 28, :scale => 0
    t.integer "VarD"
    t.string  "VarE", :limit => 500
  end

  create_table "OWNER", :id => false, :force => true do |t|
    t.string "VarF", :limit => 50
    t.string "VarG", :limit => 50
    t.string "VarH", :limit => 50
    t.string "VarI", :limit => 50
    t.string "VarJ", :limit => 50
  end

  create_table "ZONE", :id => false, :force => true do |t|
  end
end
Why are the attributes not automatically populated for tables in my Master schema? I have significantly reduced the scope of my database for this question; in actuality there are dozens of tables with dozens of attributes each, so doing the work manually is really not an option.
Is there a way to specify the schema(s) that ActiveRecord will search by default when generating attributes?
Help! & Thank you in advance!
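One workaround worth noting (a sketch, assuming Rails 3.x and that the activerecord-sqlserver-adapter accepts schema-qualified table names; Bank and Zone are hypothetical model names): point each model at its schema-qualified table explicitly, so ActiveRecord can reflect on columns outside the default dbo schema. This fixes model-level reflection; the schema dumper itself may still only walk the default schema.

```ruby
# Sketch: map models to tables in the non-default 'Master' schema by
# qualifying the table name with the schema. On Rails < 3.2 use
# set_table_name instead of self.table_name=.
class Bank < ActiveRecord::Base
  self.table_name = 'Master.BANK'
end

class Zone < ActiveRecord::Base
  self.table_name = 'Master.ZONE'
end
```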
I would like to know whether there are any performance tradeoffs between tables that have a primary key and those that don't.
I have a schema with two tables.
Table Without ID
create_table "site_page_views", :id => false, :force => true do |t|
  t.integer "site_id"
  t.integer "page_id"
  t.integer "visit_count", :default => 0, :null => false
  t.date    "start_date"
  t.date    "end_date"
end
add_index "site_page_views", ["end_date"], :name => "index_site_page_views_on_end_date"
add_index "site_page_views", ["site_id"], :name => "index_site_page_views_on_site_id"
add_index "site_page_views", ["start_date", "end_date"], :name => "index_site_page_views_on_start_date_and_end_date"
add_index "site_page_views", ["start_date"], :name => "index_site_page_views_on_start_date"
Table With ID
create_table "content_views", :force => true do |t|
  t.integer "site_id"
  t.integer "page_id"
  t.integer "visit_count", :default => 0, :null => false
  t.string  "type"
  t.date    "start_date"
  t.date    "end_date"
end
add_index "content_views", ["page_id"], :name => "index_content_views_on_page_id"
add_index "content_views", ["site_id"], :name => "index_content_views_on_site_id"
add_index "content_views", ["start_date", "end_date"], :name => "index_content_views_on_start_date_and_end_date"
add_index "content_views", ["type"], :name => "index_content_views_on_type"
The second table represents STI (Single Table Inheritance).
Both tables contain similar data (this is just a curiosity test). When I query for records between date ranges, I get the following benchmark results:
puts 'No primary key:'
Benchmark.bm do |b|
  b.report do
    SitePageView.where(site_id: 123,
                       start_date: start_date,
                       end_date: end_date)
                .includes(:page)
                .order('visit_count DESC')
                .limit(100).all
  end
end
=> No primary key: 176 ms
puts 'With primary key:'
Benchmark.bm do |b|
  b.report do
    StiPageViews.where(site_id: 123,
                       start_date: start_date,
                       end_date: end_date)
                .includes(:page)
                .order('visit_count DESC')
                .limit(100).all
  end
end
=> With primary key: 101 ms
What would be the reason for the slowness of the table without a primary key?
A primary key is indexed automatically, so you don't need to index it manually. But your query doesn't use the primary-key column, so there shouldn't be any difference in performance. It would be better to check the query execution time in Postgres as well.
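As an aside on the measuring itself, Benchmark.realtime returns the elapsed wall-clock seconds directly, which is simpler than extracting .real from a Benchmark.bm report as in the snippets above. A minimal sketch (the ActiveRecord query is replaced by a stand-in computation, since no database is assumed here):

```ruby
require 'benchmark'

# Benchmark.realtime returns elapsed wall-clock time in seconds as a Float.
elapsed = Benchmark.realtime do
  # stand-in for the ActiveRecord query being timed
  (1..100_000).reduce(0) { |sum, i| sum + i }
end

puts "Elapsed: #{(elapsed * 1000).round(2)} ms"
```

Timing both queries this way, several times each to warm caches, gives a fairer comparison than a single run.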
I'm currently working on an admin tool for an existing database and encountered a strange problem when scaffolding a particular table.
Here is the schema of the table, obtained via rake db:schema:dump:
create_table "locality", :force => true do |t|
  t.integer "version", :limit => 8, :null => false
  t.string  "truth_id", :null => false
  t.string  "truth_record_id", :null => false
  t.binary  "deleted", :limit => 1, :null => false
  t.string  "internal_code", :null => false
  t.string  "iso_code"
  t.string  "materialized_path", :null => false
  t.string  "invariant_name", :null => false
  t.binary  "published", :limit => 1, :null => false
  t.float   "geo_point_latitude", :default => 0.0
  t.float   "geo_point_longitude", :default => 0.0
  t.string  "class", :null => false
  t.integer "hut_count", :default => 0
  t.integer "hotel_count", :default => 0
  t.string  "destination_url"
end
add_index "locality", ["truth_record_id"], :name => "truth_record_id", :unique => true
I used the schema_to_scaffold gem to create my scaffold from the dumped schema:
rails g scaffold locality version:integer truth_id:string truth_record_id:string
deleted:binary internal_code:string iso_code:string materialized_path:string
invariant_name:string published:binary geo_point_latitude:float
geo_point_longitude:float class:string hut_count:integer hotel_count:integer
destination_url:string
This workflow worked for a lot of other tables, but when accessing /localities or calling Locality.all in the Rails console, all I get is:
irb(main):001:0> Locality.all
Locality Load (2.1ms) SELECT `locality`.* FROM `locality`
NoMethodError: undefined method `attribute_method_matcher' for "Country":String
Where does "Country":String come from?
At first I thought the model name 'locality' was somehow reserved by Rails for i18n purposes, but the same problem happens when naming the model 'Bla'.
I'm using Rails 3.2.13 and a MySQL database.
I believe your class column is the problem. How would you access that column, given that class is already a method on every Ruby object?
I think this is what causes the mess. The class column's value for your loaded locality is "Country", right?
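The "Country":String in the error is almost certainly that column's value. A plain-Ruby sketch (a hypothetical Record class, not ActiveRecord itself) shows what happens when an attribute reader named class is generated: it shadows Object#class, so any internal code calling record.class receives the column's value instead of the class object.

```ruby
# Hypothetical illustration of a naive attribute-method generator that
# defines a reader for every column, including one named "class".
class Record
  def initialize(attrs)
    @attrs = attrs
  end

  # What attribute-method generation would effectively do for a column
  # named "class" -- this shadows Object#class:
  define_method(:class) { @attrs['class'] }
end

r = Record.new('class' => 'Country')
puts r.class  # prints "Country" (a String), not Record
```

ActiveRecord internals call .class on the model instance, get the string "Country" back, then call methods like attribute_method_matcher on it, producing exactly the NoMethodError above.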
So the problem was the column named class, which Ruby obviously hates.
Solution posted in this StackOverflow question: Legacy table with column named "class" in Rails
or more specifically in this blog post (Accessed 25.03.2013):
http://kconrails.com/2011/01/28/legacy-database-table-column-names-in-ruby-on-rails-3/
I'm using Rails 3.2 with the following migration, and created_at/updated_at both get generated. I was under the impression that adding t.timestamps was what caused those columns to be generated.
class CreateContactsCountries < ActiveRecord::Migration
  def change
    create_table :contacts_countries do |t|
      t.string :name, :official_name, :null => false
      t.string :alpha_2_code, :null => false, :limit => 2
      t.string :alpha_3_code, :null => false, :limit => 3
    end
    add_index :contacts_countries, :alpha_2_code
  end
end
Please drop the table and run the migration again, because:
By default, the generated migration will include t.timestamps (which creates the updated_at and created_at columns that are automatically populated by Active Record).
Your table was most likely created earlier from a generated migration that still contained t.timestamps, which is why the columns exist even though the migration shown no longer includes it.
Ref this
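For context, this is roughly what a stock Rails 3.2 model generator emits (a sketch, not the questioner's actual earlier migration); the t.timestamps call is what produces created_at and updated_at, so if those columns exist, the table was built from a migration that contained it:

```ruby
class CreateContactsCountries < ActiveRecord::Migration
  def change
    create_table :contacts_countries do |t|
      t.string :name

      t.timestamps  # adds created_at and updated_at columns
    end
  end
end
```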
I am using SQLite's FTS4 full-text-search functionality. The FTS table is created via a raw-SQL migration using CREATE VIRTUAL TABLE fts_foo USING fts4();. When this is executed, SQLite actually creates several tables (fts_foo, fts_foo_content, fts_foo_docsize, fts_foo_segdir, fts_foo_segments, fts_foo_stat), as well as an index on the fts_foo_segdir columns.
However, schema.rb does not understand these columns and outputs the following:
# Could not dump table "fts_foo" because of following StandardError
# Unknown type '' for column 'content'
# Could not dump table "fts_foo_content" because of following StandardError
# Unknown type '' for column 'c0content'
create_table "fts_foo_docsize", :primary_key => "docid", :force => true do |t|
  t.binary "size"
end

create_table "fts_foo_segdir", :primary_key => "level", :force => true do |t|
  t.integer "idx"
  t.integer "start_block"
  t.integer "leaves_end_block"
  t.integer "end_block"
  t.binary  "root"
end

add_index "fts_foo_segdir", ["level", "idx"], :name => "sqlite_autoindex_fts_foo_segdir_1", :unique => true

create_table "fts_foo_segments", :primary_key => "blockid", :force => true do |t|
  t.binary "block"
end

create_table "fts_foo_stat", :force => true do |t|
  t.binary "value"
end
I don't think any of these tables should appear in schema.rb; it should simply create the single virtual table and let SQLite build the supporting tables. Is there any way I can do this? If not, what kinds of work-arounds would facilitate it?
(I am using Rails 2.2.2, but it should be very similar for 2.3.5 or 3.0.)
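One standard workaround (a Rails configuration option, not something stated in this thread) is to switch the schema format from Ruby to SQL, so the raw CREATE VIRTUAL TABLE statement is dumped verbatim into a structure file instead of being reverse-engineered into schema.rb:

```ruby
# config/environment.rb (Rails 2.x) -- dump the schema as raw SQL so
# SQLite-specific constructs such as virtual tables survive intact:
config.active_record.schema_format = :sql
```

With this setting, rake db:structure:dump produces a SQL file containing the original CREATE VIRTUAL TABLE statement, and SQLite itself recreates the fts_foo_* shadow tables when it is loaded.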
The following line works:
User.create!(:email => 'ha')
But then I generated a migration that added identifier to the users table, restarted the Rails console, and used:
User.create!(:email => 'bar', :identifier => 'foo')
The user is created and the email field is set to bar (as seen in MySQL), but identifier is not set to foo... is there a reason why?
db/schema.rb:
create_table "users", :force => true do |t|
  t.string   "login"
  t.string   "email"
  t.string   "crypted_password", :limit => 40
  t.string   "salt", :limit => 40
  t.datetime "created_at"
  [...]
  t.string   "short_bio"
  t.string   "identifier"
end
Try adding attr_accessible to the User model (note that once you use attr_accessible, every attribute you want to mass-assign must be listed, so include :email and any others as well):
class User < ActiveRecord::Base
  attr_accessible :email, :identifier
end
If you do not want to add attr_accessible for identifier (because, say, a user should not be allowed to set their own identifier), then you need to first save the user and then set the identifier separately:
User.create!(:email => "a@a.com")
u = User.find_by_email("a@a.com")
u.identifier = "foo"
u.save!
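The silent-drop behavior behind the original symptom can be sketched in plain Ruby (a hypothetical MassAssigner class, not the actual attr_accessible implementation): attributes missing from the whitelist are discarded without raising any error, which is exactly why identifier never reached the database while email did.

```ruby
# Hypothetical sketch of whitelist-based mass assignment.
class MassAssigner
  def initialize(*whitelist)
    @whitelist = whitelist.map(&:to_s)
  end

  # Returns only whitelisted attributes; the rest are dropped silently.
  def filter(params)
    params.select { |key, _| @whitelist.include?(key.to_s) }
  end
end

assigner = MassAssigner.new(:email)
attrs = assigner.filter('email' => 'bar', 'identifier' => 'foo')
puts attrs.inspect  # prints {"email"=>"bar"} -- identifier was dropped
```

In development mode, Rails 3.2 logs a "Can't mass-assign protected attributes" warning when this happens, which is the first place to look for this symptom.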