Rails Very Large Table - ruby-on-rails

Rails likes to use autoincrement 32bit ints as primary keys on tables. What do people do when they get close to the limits of 32bit int # of rows in a table?

You could change the key to a bigint? That is an 8-byte (64-bit) integer, which gives you up to about 9 quintillion values instead of 4 billion. There isn't a native migration method for this, though; you'd have to do something like:
execute("ALTER TABLE massive_table CHANGE id id BIGINT")
EDIT: Apparently specifying a limit on the field, as Alex suggested, does allow for bigints in both PostgreSQL and MySQL.

You could use 8-byte id fields. Rails doesn't provide types to create long integer or double precision columns; however, it can be done using the :limit parameter:
create_table :my_table do |t|
t.integer :long_int_column, :limit => 8
t.float :double_column, :limit => 53
end
8 and 53 are magic numbers. This works for PostgreSQL and MySQL databases, but I haven't tried any others.
If you're altering a table, then you can write
change_column :my_table, :my_col, :integer, :limit => 8
The alternative to 8-byte id fields is to handle id rollover in some way. That would depend on the specifics of your data and application.
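As a sanity check on the headroom involved, the signed ranges implied by those byte sizes can be computed in plain Ruby (a minimal sketch, independent of Rails):

```ruby
# Maximum signed values for the column sizes discussed above:
# 4 bytes is a standard INTEGER, 8 bytes is a BIGINT.
int4_max = 2**(4 * 8 - 1) - 1
int8_max = 2**(8 * 8 - 1) - 1

puts int4_max  # => 2147483647 (the ~4 billion ceiling)
puts int8_max  # => 9223372036854775807 (~9.2 quintillion)
```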


Ruby on Rails ignoring integer limit

I need to index a table of users using an externally sourced id, which is a 64-bit integer. Rails is perfectly capable of storing such a number, except, it seems, when it's the primary key. I have the following migration:
class CreateUsers < ActiveRecord::Migration
def change
create_table :users, :id => false do |t|
t.integer :id, limit: 8
t.string :name
t.timestamps null: false
end
end
end
The migration works fine, no errors reported, but when I attempt to seed it with a 64-bit integer, I'm told off by this:
RangeError: 76561198054432981 is out of range for ActiveRecord::Type::Integer with limit 4
Obviously Rails is ignoring the limit field, so long as it's the primary key/the :id field? How should I go about dealing with this?
For what it's worth I'm using sqlite3 (default), but to my knowledge, sqlite is perfectly capable of storing 64-bit integers.
Here's the table_info from sqlite:
0|id|integer(8)|0||0
1|name|varchar|0||0
2|created_at|datetime|1||0
3|updated_at|datetime|1||0
The limit value you gave is correct; it corresponds to the BIGINT type.
Make sure your migration was actually applied; open your database in some CLI or GUI tool and verify the column type.
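The value itself is easy to check in plain Ruby (Ruby integers are arbitrary precision, so the constraint is purely on the column side); a quick sketch:

```ruby
id = 76_561_198_054_432_981

puts id.bit_length    # => 57, so it needs more than 4 bytes of storage
puts id <= 2**31 - 1  # => false: overflows a limit-4 (INTEGER) column
puts id <= 2**63 - 1  # => true: fits in a limit-8 (BIGINT) column
```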
Addition:
Changing a column's length or datatype in a migration will invalidate the column as a primary key. Instead, creating an initializer that overrides the adapter's default primary key datatype should provide the behavior you're looking for:
# config/initializers/change_primary_key_datatype.rb
require 'active_record/connection_adapters/postgresql_adapter'
ActiveRecord::ConnectionAdapters::PostgreSQLAdapter::NATIVE_DATABASE_TYPES[:primary_key] = "bigserial primary key"
This is what we would do for a PostgreSQL database; it works because the PostgreSQL adapter looks up the primary key's column definition in its NATIVE_DATABASE_TYPES hash. The SQLite adapter defines its own NATIVE_DATABASE_TYPES with the primary key hard-coded as an auto-incrementing INTEGER, so you would need to override that constant instead.

Integer out of range in PostgreSQL database

I'm trying to save a number representing the length of a file (4825733517). The column is set to type integer. I don't have any validations or restrictions set.
RangeError: 4825733517 is out of range for ActiveRecord::ConnectionAdapters::PostgreSQL::OID::Integer with limit 4
Should I be using some other column type for this value? (on rails 4.2.4)
For columns of type integer, the :limit value is the maximum column length in bytes (documentation).
With 4 byte length, the largest signed integer you can store is 2,147,483,647, way smaller than your value of 4,825,733,517. You can increase the byte limit, for example to 8 bytes to be a long integer (a bigint PostgreSQL type), this will allow you to store signed values up to 9,223,372,036,854,775,807.
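The arithmetic is easy to confirm in plain Ruby:

```ruby
value = 4_825_733_517
int4_max = 2**31 - 1  # 4-byte signed maximum: 2,147,483,647
int8_max = 2**63 - 1  # 8-byte signed (bigint) maximum

puts value > int4_max   # => true: overflows a plain integer column
puts value <= int8_max  # => true: fits once the column is limit: 8
```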
You can do this with a migration. Generate it with something like rails generate migration change_integer_limit_in_your_table, then use the following code:
class ChangeIntegerLimitInYourTable < ActiveRecord::Migration
def change
change_column :your_table, :your_column, :integer, limit: 8
end
end
According to the PostgreSQL documentation, an integer has a range from -2147483648 to +2147483647, so your number is too big for this type.
Update your column and use the parameter limit to indicate that you want to have a bigint.
change_column :table, :column, :integer, limit: 8
You should change the length of the column in your database with a migration:
change_column :my_table, :my_column, :integer, limit: 8
It will allow you to store bigger integers.

Rails 3 - dumping PostgreSQL database into schema.rb has incorrect precision for numeric types

I have an existing postgresql database that I want to use in a new rails app, so I first want to dump the existing schema into schema.rb using rake db:schema:dump. However, when I do this, the schema.rb has a strange precision value for the numeric columns.
create_table "order", :id => false, :force => true do |t|
....
t.decimal "Quantity", :precision => 131089, :scale => 0
....
In my PostgreSQL db, the numeric type column does not have an explicit precision or scale set.
Is there a reason why precision is showing such a huge value?
I've also tried changing and removing the precision modifier in schema.rb, but everytime I do a migration, it regenerates the schema.rb file with these huge values. I've looked at the ActiveRecord table definition, but that wasn't very helpful.
I suspect that the dump is picking up the maximum precision of a numeric value in PostgreSQL. See http://www.postgresql.org/docs/9.2/static/datatype-numeric.html.

What datatype to use for Facebook user id in Rails and PostgreSQL

I have a PostgreSQL database for a Rails application.
I want to store the Facebook user id, so I thought I could use integer, but it's not big enough, so I chose float.
However, now Rails adds .0 to the end of my user IDs.
What datatype can I use so this does not happen for Facebook user ids which are very long example: 100002496803785
You can use :limit => 8 on your integer column to get a bigint. For example:
class Pancakes < ActiveRecord::Migration
def change
create_table :pancakes do |t|
t.integer :c, :limit => 8
end
end
end
And then, from psql:
=> \d pancakes
Table "public.pancakes"
Column | Type | Modifiers
--------+---------+-------------------------------------------------------
id | integer | not null default nextval('pancakes_id_seq'::regclass)
c | bigint | not null
Indexes:
"pancakes_pkey" PRIMARY KEY, btree (id)
And there's your eight byte bigint column.
You could also use a string for the Facebook ID. You're not doing any arithmetic on the IDs, so they're really just opaque bags of bits that happen to look like large integers; strings compare for equality just fine, so they might be the best option. There would be some storage and access overhead due to the increased size of a string over the integer, but it probably wouldn't be enough to make any noticeable difference.
Never use a double for something that needs to be exact. You'd probably be fine (except for the trailing .0 of course) in this case because you'd have 52 bits of mantissa and that means that the double would act like a 52 bit integer until your values got large enough to require the exponent. Even so, using double for this would be an awful idea and an abuse of the type system.
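The 52-bit-mantissa point can be demonstrated in plain Ruby: the example ID from the question survives a round trip through Float exactly, but picks up the unwanted .0, while values past 53 bits silently lose precision. A quick sketch:

```ruby
fb_id = 100_002_496_803_785

puts fb_id.to_f               # => 100002496803785.0 — the unwanted trailing .0
puts fb_id.to_f.to_i == fb_id # => true: still exact, since fb_id < 2**53

big = 2**53 + 1
puts big.to_f.to_i == big     # => false: precision silently lost past 53 bits
```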
I don't use PostgreSQL, but in MySQL I use BIGINT.
According to the PostgreSQL data types documentation, it's BIGINT for PostgreSQL as well.
mu is too short has a great answer, I only want to add that if you want to use the ID as a foreign key between tables then you should stick to the BIGINT solution he describes, not use a string. This is what I use, essentially:
Example:
create_table(:photos) do |t|
t.integer :fb_uid, :limit => 8 # Facebook ID of the photo record
t.integer :facebook_profile_uid, :limit => 8, :null => false # foreign key to user
# ...
end
create_table(:users) do |t|
t.integer :fb_uid, :limit => 8, :null => false # Facebook ID of the user record
t.integer :photos_count, :default => 0
# ...
end
class User < ActiveRecord::Base
has_many :photos, foreign_key: :facebook_profile_uid, primary_key: :fb_uid
# ...
end
class Photo < ActiveRecord::Base
belongs_to :facebook_profile, foreign_key: :facebook_profile_uid, primary_key: :fb_uid, :counter_cache => true
end
Ran into this problem while using the Google uid which also is quite large.
I found this answer to be the most useful:
Getting error indicating number is "out of range for ActiveRecord::Type::Integer with limit 4" when attempting to save large(ish) integer value
Generate a migration to change your table column.
Edit the generated migration to add limit: 8 to the column change.
Run rake db:migrate to apply it to the database.
Restart the Rails server.
This will allow you to change the limit of your table column.

Integer out of range on Postgres DB

Simple rails app using Postgres DB, getting 'integer out of range' error when trying to insert 2176968859. Should be an easy fix to the migrations, but I'm not sure. Right now I've got...
create_table :targets do |t|
t.integer :tid
...
end
Here's the magic incantation in your migration when you declare the column:
create_table :example do |t|
t.integer :field, :limit => 8
end
The :limit => 8 is the magic in this case, as Postgres only gives you signed 4-byte integers when you just say integer. This uses 8-byte signed integers instead.
What's the question? You are overflowing. Use a bigint if you need numbers that big.
http://www.postgresql.org/docs/8.3/interactive/datatype-numeric.html
In Rails 4. In your migration file, you could define the column as:
t.column :foobar, :bigint
As noted in previous answers, limit: 8 will also achieve the same thing.
PostgreSQL integers are signed, there is no unsigned datatype - I bet that's your problem.
If you need larger values, use bigint. If bigint also isn't enough, use numeric - but use bigint rather than numeric unless you need the larger size or decimals, since it's much faster.
Note the range of allowed values for the integer type in http://www.postgresql.org/docs/8.3/interactive/datatype-numeric.html. I think you are going to have to use a bigint, decimal, or double precision.
