I've been digging around to see how I could make all my new and subsequent Model ids have a limit of 8 bytes. Existing answers show how to do it when adding a new table column; I want every new Model I create to automatically get an id with a limit of 8 bytes. Is that possible?
When creating a new model, I get:
ActiveModel::RangeError: 36565651767 is out of range for ActiveModel::Type::Integer with limit 4
Where to change this limit from 4 to 8?
This is a possible duplicate, but following that approach raises errors like:
you can't redefine the primary key column 'id'. To define a custom primary key, pass { id: false } to create_table.
Which means your migration should look like this:
class MyModels < ActiveRecord::Migration[5.0]
  def change
    create_table :my_models, id: false do |t|
      t.column :id, :integer, limit: 8 # 8-byte integer (bigint in PostgreSQL)
      ...
    end
  end
end
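For what it's worth, Rails 5.1 and later create bigint primary keys by default, so every new model gets an 8-byte id automatically. On Rails 5.0 with PostgreSQL, a per-table override is also possible; here is a minimal sketch (the widgets table is just an example, and it assumes the pg adapter accepts :bigserial as the id type):

class CreateWidgets < ActiveRecord::Migration[5.0]
  def change
    # id: :bigserial asks PostgreSQL for an auto-incrementing 8-byte primary key
    create_table :widgets, id: :bigserial do |t|
      t.string :name
      t.timestamps
    end
  end
end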
I have a class called Order that has fields total and commission_amount, which are both decimals. In the seeds.rb file I assign both fields 100.00, as is (no quotes). When I enter rails console and type Order.all I can see that the order created by the seeds file has a commission amount of 100.00 but the total is listed as 0.0.
I've tried using rails console to assign different numbers to it; both total and commission_amount are decimal attributes with precision: 10, scale: 2. I've tried using a BigDecimal constructor to assign the values in the seeds file, and I don't have any kind of validations in place on the total attribute. Pretty stumped on what should be a trivial issue. Thanks for your help!
EDIT: here are the relevant samples of code
# the migration
class CreateOrders < ActiveRecord::Migration
  def change
    create_table :orders do |t|
      # ...
      t.decimal :total, precision: 10, scale: 2
      t.decimal :commission_amount, precision: 10, scale: 2
    end
  end
end
# the seeds file
# Order SEEDING
# --------------
create_records([
  {
    # ...
    total: 100.00,
    commission_amount: 100.00
  }
], Order)
Strange value assignment errors in ActiveRecord are often caused by name collisions. ActiveRecord defines setter and getter methods for all fields in the database table of a model, and Ruby allows you to silently override those in your class definition.
It is always good advice to make sure you haven't used a reserved word for a column name (classic candidates are 'type', 'object', 'class'), haven't named a column the same as an associated object, and haven't defined a method with the same name as a column in the database.
In your case, check if you have defined a method called total in your model definition.
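As a hypothetical illustration of that kind of collision (not taken from the asker's code), a stray method named after a column shadows the reader ActiveRecord generates:

class Order < ActiveRecord::Base
  # This method shadows the attribute reader generated for the `total` column,
  # so order.total returns 0.0 no matter what is stored in the database.
  def total
    0.0
  end
end

Removing or renaming such a method restores the normal column accessor.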
My app uses a PostgreSQL database. I've got a migration that looks like this:
class CreateTagAssignments < ActiveRecord::Migration
  def change
    create_table :tag_assignments do |t|
      t.integer :tag_id
      t.integer :quote_id
      t.integer :user_id
      t.timestamps
    end
    add_index :tag_assignments, :tag_id
    add_index :tag_assignments, :quote_id
  end
end
Records will be quite frequently searched by these two columns so I want to have a separate index for each. But now I'd like to enforce uniqueness of the pair (tag_id, quote_id) on the database level. I tried add_index :tag_assignments, [:tag_id, :quote_id], unique: true but I got the error:
PG::Error: ERROR: could not create unique index "index_tag_assignments_on_tag_id_and_quote_id"
DETAIL: Key (tag_id, quote_id)=(10, 1) is duplicated.
: CREATE UNIQUE INDEX "index_tag_assignments_on_tag_id_and_quote_id" ON "tag_assignments" ("tag_id", "quote_id")
So multiple indexes apparently do the job of a multi-column index? If so, then I could add the constraint with ALTER TABLE ... ADD CONSTRAINT, but how can I do it in ActiveRecord?
edit: manually performing ALTER TABLE ... ADD CONSTRAINT produces the same error.
As Erwin points out, the "Key (tag_id, quote_id)=(10, 1) is duplicated" constraint violation error message tells you that your unique constraint is already violated by your existing data. I infer from what's visible of your model that different users can each introduce a common association between a tag and a quote, so you see duplicates when you try to constrain uniqueness for just the quote_id,tag_id pair. Compound indexes are still useful for index access on leading keys (though slightly less efficiently than a single column index since the compound index will have lower key-density). You could probably get the speed you require along with the appropriate unique constraint with two indexes, a single column index on one of the ids and a compound index on all three ids with the other id as its leading field. If mapping from tag to quote was a more frequent access path than mapping from quote to tag, I would try this:
add_index :tag_assignments, :tag_id
add_index :tag_assignments, [:quote_id,:tag_id,:user_id], unique: true
If you're using Pg >= 9.2, you can take advantage of 9.2's index-only scans (which use the visibility map) on covering indexes. In this case there may be benefit to making the first index above contain all three ids, with tag_id and quote_id leading:
add_index :tag_assignments, [:tag_id, :quote_id, :user_id]
It's unclear how user_id constrains your queries, so you may find that you want indexes with its position promoted earlier as well.
So multiple indexes apparently do the job of a multi-column index?
This conclusion is untrue, and it does not follow from what you describe. The error message indicates the opposite: a multicolumn index, or a UNIQUE constraint on multiple columns (which is implemented with a multicolumn index, too), provides functionality that you cannot get from multiple single-column indexes.
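To address the "how can I do it in ActiveRecord" part of the question: the ALTER TABLE ... ADD CONSTRAINT route can live in a migration via execute. A minimal sketch (the migration and constraint names are made up, and it will keep failing until the duplicate (tag_id, quote_id) rows are cleaned up):

class AddTagQuoteUniqueConstraint < ActiveRecord::Migration
  def up
    execute <<-SQL
      ALTER TABLE tag_assignments
        ADD CONSTRAINT tag_assignments_tag_id_quote_id_key UNIQUE (tag_id, quote_id);
    SQL
  end

  def down
    execute "ALTER TABLE tag_assignments DROP CONSTRAINT tag_assignments_tag_id_quote_id_key;"
  end
end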
I have a model with 2 fields => :name and :age
I need to write a migration that adds a column :position, which needs to auto increment and start at 0 (zero).
I tried it this way:
class AddPosition < ActiveRecord::Migration
  def up
    add_column :clientes, :position, :integer, :default => 0, :null => false
    execute "ALTER TABLE clientes ADD PRIMARY KEY (position);"
  end
end
But it doesn't work, because the column doesn't auto increment. If I try to use primary key as the type:
class AddPosition < ActiveRecord::Migration
  def up
    add_column :clientes, :position, :primary_key, :default => 0, :null => false
  end
end
rake db:migrate won't run, because the table can't have multiple primary keys.
Anyone could explain a way to have zeros and autoincrement on Primary Key w/ Rails 3.2?
Here's how you can set up an auto increment column in PostgreSQL:
# in migration:
def up
  execute <<-SQL
    CREATE SEQUENCE clients_position_seq START WITH 0 MINVALUE 0;
    ALTER TABLE clients ADD COLUMN position INTEGER NOT NULL DEFAULT nextval('clients_position_seq');
  SQL
end
But unfortunately it may not be what you need. The above would work if you inserted values into the clients table with SQL like this: INSERT INTO clients(name, age) VALUES('Joe', 21), but Rails doesn't work that way.
The first problem is that Rails expects the primary key to be called id. And while you can override this convention, it would cause more problems than it would solve. If you want to bind position to the primary key value, a better option is to add a virtual attribute to your model, a method like this:
def position
  id.nil? ? nil : id - 1
end
But let's suppose you already have a conventional primary key id and want to add a position field so that you can reorder clients after they have been created, without touching their ids (which is always a good idea). Here comes the second problem: Rails won't recognize and respect DEFAULT nextval('clients_position_seq'), i.e. it won't pull values from the PG-backed sequence and will try to put NULL in position by default.
I'd suggest looking at the acts_as_list gem as a better option. It would make the DB sequence manipulations unnecessary. Unfortunately it uses 1 as the initial value for position, but that can be cured by setting a custom name for the list position field and defining a method like the one I showed above; see the sketch below.
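A rough sketch of that combination, assuming the acts_as_list gem and a hypothetical one-based column named list_position:

class Cliente < ActiveRecord::Base
  # acts_as_list keeps its own one-based ordering in the list_position column
  acts_as_list column: :list_position

  # expose a zero-based position without touching the gem's bookkeeping
  def position
    list_position.nil? ? nil : list_position - 1
  end
end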
I have a PostgreSQL database for a Rails application.
I want to store the Facebook user id, so I thought I could use an integer, but it's not big enough, so I chose float.
However, now Rails adds .0 to the end of my user ids.
What datatype can I use so this does not happen? Facebook user ids are very long, for example: 100002496803785
You can use :limit => 8 on your integer column to get a bigint. For example:
class Pancakes < ActiveRecord::Migration
  def change
    create_table :pancakes do |t|
      t.integer :c, :limit => 8
    end
  end
end
And then, from psql:
=> \d pancakes
                        Table "public.pancakes"
 Column |  Type   |                       Modifiers
--------+---------+--------------------------------------------------------
 id     | integer | not null default nextval('pancakes_id_seq'::regclass)
 c      | bigint  | not null
Indexes:
    "pancakes_pkey" PRIMARY KEY, btree (id)
And there's your eight byte bigint column.
You could also use a string for the Facebook ID. You're not doing any arithmetic on the IDs, so they're really just opaque bags of bits that happen to look like large integers. Strings will sort and compare just fine, so they might be the best option. There would be some storage and access overhead due to the increased size of a string over the integer, but it probably wouldn't be enough to make any noticeable difference.
Never use a double for something that needs to be exact. You'd probably be fine (except for the trailing .0 of course) in this case because you'd have 52 bits of mantissa and that means that the double would act like a 52 bit integer until your values got large enough to require the exponent. Even so, using double for this would be an awful idea and an abuse of the type system.
I don't use PostgreSQL, but in MySQL I use BIGINT.
According to the PostgreSQL data types documentation, BIGINT is the right choice for PostgreSQL as well.
mu is too short has a great answer. I only want to add that if you want to use the ID as a foreign key between tables, then you should stick to the BIGINT solution he describes rather than a string. This is what I use, essentially:
Example:
create_table(:photos) do |t|
  t.integer :fb_uid, :limit => 8 # Facebook ID of the photo record
  t.integer :facebook_profile_uid, :limit => 8, :null => false # foreign key to user
  # ...
end

create_table(:users) do |t|
  t.integer :fb_uid, :limit => 8, :null => false # Facebook ID of the user record
  t.integer :photos_count, :default => 0
  # ...
end

class User < ActiveRecord::Base
  has_many :photos, foreign_key: :facebook_profile_uid, primary_key: :fb_uid
  # ...
end

class Photo < ActiveRecord::Base
  # class_name is assumed here so :facebook_profile resolves to the User model shown above
  belongs_to :facebook_profile, class_name: 'User',
             foreign_key: :facebook_profile_uid, primary_key: :fb_uid, counter_cache: true
end
Ran into this problem while using the Google uid, which is also quite large.
I found this answer to be the most useful:
Getting error indicating number is "out of range for ActiveRecord::Type::Integer with limit 4" when attempting to save large(ish) integer value
Run a migration to change your table column.
Edit the generated migration to add limit: 8 to the column.
Run db:migrate to apply it to the database.
Restart the rails server.
This will allow you to change the limit of your table column.
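For example, a migration along these lines (the users table and uid column are only illustrative) widens an existing integer column to 8 bytes:

class ChangeUidLimitOnUsers < ActiveRecord::Migration
  def up
    change_column :users, :uid, :integer, limit: 8 # bigint in PostgreSQL/MySQL
  end

  def down
    change_column :users, :uid, :integer, limit: 4 # back to a 4-byte integer
  end
end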
I want to have a "Customer" model with a normal primary key and another column to store a custom "Customer Number". In addition, I want the db to handle default Customer Numbers. I think defining a sequence is the best way to do that. I use PostgreSQL. Have a look at my migration:
class CreateAccountsCustomers < ActiveRecord::Migration
  def up
    say "Creating sequence for customer number starting at 1002"
    execute 'CREATE SEQUENCE customer_no_seq START 1002;'
    create_table :accounts_customers do |t|
      t.string :type
      t.integer :customer_no, :unique => true
      t.integer :salutation, :limit => 1
      t.string :cp_name_1
      t.string :cp_name_2
      t.string :cp_name_3
      t.string :cp_name_4
      t.string :name_first, :limit => 55
      t.string :name_last, :limit => 55
      t.timestamps
    end
    say "Adding NEXTVAL('customer_no_seq') to column customer_no"
    execute "ALTER TABLE accounts_customers ALTER COLUMN customer_no SET DEFAULT NEXTVAL('customer_no_seq');"
  end

  def down
    drop_table :accounts_customers
    execute 'DROP SEQUENCE IF EXISTS customer_no_seq;'
  end
end
If you know a better "rails-like" approach to adding sequences, it would be awesome to let me know.
Now, if I do something like
cust = Accounts::Customer.new
cust.save
the field customer_no is not pre-filled with the next value of the sequence (it should be 1002).
Do you know a good way to integrate sequences? Or is there a good plugin?
Cheers to all answers!
I have no suggestions for a more 'rails way' of handling custom sequences, but I can tell you why the customer_no field appears not to be populated after a save.
When ActiveRecord saves a new record, the SQL statement will only return the ID of the new record, not all of its fields. You can see where this happens in the current Rails source here: https://github.com/rails/rails/blob/cf013a62686b5156336d57d57cb12e9e17b5d462/activerecord/lib/active_record/persistence.rb#L313
In order to see the value you will need to reload the object...
cust = Accounts::Customer.new
cust.save
cust.reload
If you always want to do this, consider adding an after_create hook in to your model class...
class Accounts::Customer < ActiveRecord::Base
  after_create :reload
end
I believe that roboles' answer is not correct.
I tried to implement this in my application (exactly the same environment: RoR + PostgreSQL), and I found that when save is issued in RoR with the object having empty attributes, it performs an INSERT on the database that explicitly sets all VALUES to NULL. The problem is the way PostgreSQL handles NULLs: in this case, the new row will be created but with all values empty, i.e. the DEFAULT will be ignored. If save only included in the INSERT statement the attributes that were actually filled in on the RoR side, this would work fine.
In other words, and focusing only on the type and customer_no attribute mentioned above, this is the way PostgreSQL behaves:
SITUATION 1:
INSERT INTO accounts_customers (type, customer_no) VALUES (NULL, NULL);
(this is how Rails' save works)
Result: a new row with empty type and empty customer_no
SITUATION 2:
INSERT INTO accounts_customers (type) VALUES (NULL);
Result: a new row with empty type and customer_no filled with the sequence's NEXTVAL
I have a thread going on about this, check it out at:
Ruby on Rails+PostgreSQL: usage of custom sequences
I faced a similar problem, but I also put :null => false on the field, hoping that it would be auto-populated with nextval.
Well, in my case AR was still trying to insert NULL if no attribute was supplied in the request, and this resulted in an exception for not-null constraint violation.
Here's my workaround. I just deleted this attribute key from @attributes and @changed_attributes, and in this case Postgres correctly puts in the expected sequence nextval.
I've put this in the model:
before_save do
  if @attributes["customer_no"].nil? || @attributes["customer_no"].to_i == 0
    @attributes.delete("customer_no")
    @changed_attributes.delete("customer_no")
  end
end
Rails 3.2 / Postgres 9.1
If you're using PostgreSQL, check out the gem I wrote, pg_sequencer:
https://github.com/code42/pg_sequencer
It provides a DSL for creating, dropping and altering sequences in ActiveRecord migrations.