Integer out of range on Postgres DB - ruby-on-rails

Simple rails app using Postgres DB, getting 'integer out of range' error when trying to insert 2176968859. Should be an easy fix to the migrations, but I'm not sure. Right now I've got...
create_table :targets do |t|
  t.integer :tid
  ...
end

Here's the magic incantation in your migration when you declare the column:
create_table :example do |t|
  t.integer :field, :limit => 8
end
The :limit => 8 is the magic here: PostgreSQL only gives you a signed 4-byte integer when you declare a plain integer, whereas :limit => 8 produces a signed 8-byte bigint.
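Since the targets table from the question already exists, a minimal sketch of a follow-up migration (the class name is hypothetical; on Rails 5+ add the version suffix, e.g. ActiveRecord::Migration[5.2]):
class ChangeTidToBigintOnTargets < ActiveRecord::Migration
  def change
    # Widen the existing 4-byte integer column to an 8-byte bigint
    change_column :targets, :tid, :integer, limit: 8
  end
end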

What's the question? You are overflowing. Use a bigint if you need numbers that big.
http://www.postgresql.org/docs/8.3/interactive/datatype-numeric.html

In Rails 4, you can define the column in your migration file as:
t.column :foobar, :bigint
As noted in previous answers, limit: 8 will also achieve the same thing.
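The same type also works when adding a column to an existing table; a minimal sketch with hypothetical table and column names:
class AddFoobarToExamples < ActiveRecord::Migration
  def change
    # :bigint maps to PostgreSQL's 8-byte signed integer type
    add_column :examples, :foobar, :bigint
  end
end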

PostgreSQL integers are signed, and there is no unsigned datatype; I bet that's your problem.
If you need larger values, use bigint. If bigint also isn't enough, use numeric, but prefer bigint over numeric unless you actually need the larger range or decimal places, since bigint is much faster.

Note the range of allowed values for the integer type in http://www.postgresql.org/docs/8.3/interactive/datatype-numeric.html. I think you are going to have to use a bigint, decimal, or double precision.

Related

Getting "value "3000002000" is out of range for type integer"

I'm using Rails 4.2.3 with a PostgreSQL database. I want a column in my database to store a number of milliseconds (note: NOT a timestamp, but a duration in milliseconds). So I created my column like so:
time_in_ms | bigint
However, when I go to store a value in Rails, I get the error below:
ActiveRecord::StatementInvalid (PG::NumericValueOutOfRange: ERROR: value "3000002000" is out of range for type integer
: INSERT INTO "my_object_times" ("time_in_ms", "my_object_id", "created_at", "updated_at") VALUES ($1, $2, $3, $4) RETURNING "id"):
app/controllers/my_objects_controller.rb:31:in `update'
It would seem the number "3000002000" is smaller than the maximum value for the column (which I'm reading is 9223372036854775807), so I'm wondering what else is going wrong and how I can fix it.
Edit: To provide additional information, in my db/schema.rb file, the column in question is described thusly ...
create_table "my_object_times", force: :cascade do |t|
...
t.integer "time_in_ms", limit: 8
Edit 2: Here is the output of the CREATE TABLE statement in psql:
CREATE TABLE my_object_times (
    id integer NOT NULL,
    first_name character varying,
    last_name character varying,
    time_in_ms bigint,
    created_at timestamp without time zone NOT NULL,
    updated_at timestamp without time zone NOT NULL,
    name character varying,
    age integer,
    city character varying,
    state_id integer,
    country_id integer,
    overall_rank integer,
    age_group_rank integer,
    gender_rank integer
);
This has happened to me before: when I initially create a bigint field in the db, for some reason the model thinks it is a plain integer instead, even when the schema and migration file specify it as a bigint.
For example: I had this migration file
class CreateSecureUserTokens < ActiveRecord::Migration
  def change
    create_table :secure_user_tokens do |t|
      t.integer :sso_id, null: false, length: 8
      t.string :token, null: false
      t.timestamps null: false
    end
  end
end
Note that it includes length: 8, which I intended to make the integer a bigint. That turned out to be the problem: length is not a recognized column option, so Rails silently ignores it (the correct key is limit), and after running the migration I had a plain 4-byte integer and the same issue as you. Eventually I created another migration to fix it, and that worked. Here's the migration I used:
class ModifySecureTokensForLargerSsoIdSizes < ActiveRecord::Migration
  def change
    change_column :secure_user_tokens, :sso_id, :integer, limit: 8
  end
end
So if we changed that to fit your needs, it would be:
class ObjectTimesBigInt < ActiveRecord::Migration
  def change
    change_column :my_object_times, :time_in_ms, :integer, limit: 8
  end
end
Hope that helps!
-Charlie
I guess the table my_object_times might not have been created from the schema.rb file, or it might be overwritten in another migration file, because an integer column with limit: 8 in a migration is itself a bigint. You should cross-check the table definition in pgAdmin. If the column is not bigint, run the following migration:
class ChangeTimeInMsToBigint < ActiveRecord::Migration
  def change
    execute <<-SQL
      ALTER TABLE my_object_times
      ALTER COLUMN time_in_ms TYPE bigint USING time_in_ms::bigint
    SQL
  end
end
Edit: I just re-read this, and my original answer actually does not make sense in your case. I do believe you need to look outside that column for an answer, and confirm everything you think you know about its state manually. Any additional detail would help us find a correct answer.
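One quick manual check from a Rails console (a sketch; the model name MyObjectTime is an assumption based on the table name):
# Clear any cached schema first, in case it was loaded before the migration
MyObjectTime.reset_column_information
MyObjectTime.columns_hash["time_in_ms"].sql_type # expect "bigint"
MyObjectTime.columns_hash["time_in_ms"].limit    # expect 8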
Set breakpoints to step through the request and see if you can spot the integer
create_table "my_object_times", force: :cascade do |t|
...
t.integer "time_in_ms", limit: 8
t.integer
- this looks like your culprit to me.
...
Well, I tried, my last thought is that it has to be related to some kind of Rails request middleware, but I'm ignorant of what the specifics might be. Something in the request path thinks that column is an integer. I didn't understand how Rails migration datatypes worked until now, so I learned something. (And I went fishing all day, so I'll count this day a win.) Good luck!
For anyone on Rails 5/6 using the paper_trail gem, check what the polymorphic foreign_key id field item_id is set as in versions. I had it set as an integer, and got this bug.
Changing versions.item_id to a bigint fixed the error.
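A sketch of that fix (versions and item_id are paper_trail's default table and column names; the migration class name is hypothetical):
class ChangeVersionsItemIdToBigint < ActiveRecord::Migration[5.2]
  def change
    # item_id holds the id of the versioned record, so it must be
    # at least as wide as the widest id it can reference
    change_column :versions, :item_id, :bigint
  end
end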
bigint is 64-bit, while the default Rails integer column is 32-bit.
3000002000 is greater than 2^31 - 1 = 2147483647, the largest signed 32-bit integer. That's why converting it into a 32-bit integer fails with NumericValueOutOfRange.
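The arithmetic is easy to verify in irb:
2**31 - 1                  # => 2147483647, the largest signed 32-bit integer
3_000_002_000 > 2**31 - 1  # => true, hence the overflow
2**63 - 1                  # => 9223372036854775807, the bigint maximum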

Ruby on Rails ignoring integer limit

I need to index a table of users using an externally sourced id, which is a 64-bit integer. Rails is perfectly capable of storing such a number, unless it's the primary key it seems. I have the following migration:
class CreateUsers < ActiveRecord::Migration
  def change
    create_table :users, :id => false do |t|
      t.integer :id, limit: 8
      t.string :name
      t.timestamps null: false
    end
  end
end
The migration works fine, no errors reported, but when I attempt to seed it with a 64-bit integer, I'm told off by this:
RangeError: 76561198054432981 is out of range for ActiveRecord::Type::Integer with limit 4
Obviously Rails is ignoring the limit field, so long as it's the primary key/the :id field? How should I go about dealing with this?
For what it's worth I'm using sqlite3 (default), but to my knowledge, sqlite is perfectly capable of storing 64-bit integers.
Here's the table_info from sqlite:
0|id|integer(8)|0||0
1|name|varchar|0||0
2|created_at|datetime|1||0
3|updated_at|datetime|1||0
The limit value you gave is correct; it corresponds to the BIGINT type.
Make sure your migration is applied; open your database in some CLI or GUI tool and verify the column type.
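For instance, a quick way to see what the adapter reports, from a Rails console (a minimal sketch; assumes the users table from the migration above):
# Inspect the column metadata ActiveRecord sees for users.id
col = ActiveRecord::Base.connection.columns("users").find { |c| c.name == "id" }
col.sql_type # the raw type from the database, e.g. "integer(8)"
col.limit    # the limit ActiveRecord derived from it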
Addition:
Changing a column's length or datatype in a migration will invalidate the column as a primary key. Instead, create an initializer that overrides the app's default primary-key datatype; that should provide the behavior you're looking for:
# config/initializers/change_primary_key_datatype.rb
require 'active_record/connection_adapters/postgresql_adapter'
ActiveRecord::ConnectionAdapters::PostgreSQLAdapter::NATIVE_DATABASE_TYPES[:primary_key] = "bigserial primary key"
This is what we would do for a PG database. It is possible because the PostgreSQL adapter looks up its primary-key type in NATIVE_DATABASE_TYPES, which the initializer above overrides. The SQLite adapter's code base, by contrast, defines the primary key as INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL (which in SQLite is already a 64-bit rowid alias).
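With that initializer loaded, an ordinary create_table picks up the wider key automatically on PostgreSQL; a minimal sketch (table name hypothetical):
class CreateAccounts < ActiveRecord::Migration
  def change
    # No :id option needed; the overridden :primary_key type
    # makes the id column a bigserial on PostgreSQL
    create_table :accounts do |t|
      t.string :name
      t.timestamps null: false
    end
  end
end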

Integer out of range in PostgreSQL database

I'm trying to save a number representing the length of a file (4825733517). The column is set to type integer. I don't have any validations or restrictions set.
RangeError: 4825733517 is out of range for ActiveRecord::ConnectionAdapters::PostgreSQL::OID::Integer with limit 4
Should I be using some other column type for this value? (on rails 4.2.4)
For columns of type integer, the :limit value is the maximum column length in bytes (documentation).
With a 4-byte length, the largest signed integer you can store is 2,147,483,647, far smaller than your value of 4,825,733,517. You can increase the byte limit, for example to 8 bytes (PostgreSQL's bigint type), which allows you to store signed values up to 9,223,372,036,854,775,807.
You can do this with a migration. Generate it with something like rails generate migration change_integer_limit_in_your_table, then use the following code:
class ChangeIntegerLimitInYourTable < ActiveRecord::Migration
  def change
    change_column :your_table, :your_column, :integer, limit: 8
  end
end
According to the PostgreSQL documentation, an integer has a range from -2147483648 to +2147483647, so your number is too big for this type.
Update your column with the limit parameter to indicate that you want a bigint:
change_column :table, :column, :integer, limit: 8
You should change the column's type in your database with a migration:
change_column :my_table, :my_column, :integer, limit: 8
It will allow you to store bigger integers.

What datatype to use for Facebook user id in Rails and PostgreSQL

I have a PostgreSQL database for a Rails application.
I want to store the Facebook user id, so I thought I could use integer, but it's not big enough, so I chose float.
However, now Rails adds .0 to the end of my user ids.
What datatype can I use so this does not happen? Facebook user ids are very long, for example: 100002496803785.
You can use :limit => 8 on your integer column to get a bigint. For example:
class Pancakes < ActiveRecord::Migration
  def change
    create_table :pancakes do |t|
      t.integer :c, :limit => 8
    end
  end
end
And then, from psql:
=> \d pancakes
Table "public.pancakes"
Column | Type | Modifiers
--------+---------+-------------------------------------------------------
id | integer | not null default nextval('pancakes_id_seq'::regclass)
c | bigint | not null
Indexes:
"pancakes_pkey" PRIMARY KEY, btree (id)
And there's your eight byte bigint column.
You could also use a string for the Facebook ID. You're not doing any arithmetic on the IDs, so they're really just opaque bags of bits that happen to look like large integers; strings will sort and compare just fine, so they might be the best option. There would be some storage and access overhead due to the increased size of a string over the integer, but it probably wouldn't be enough to make any noticeable difference.
Never use a double for something that needs to be exact. You'd probably be fine in this case (except for the trailing .0, of course) because you'd have 52 bits of mantissa, which means the double would act like a 52-bit integer until your values got large enough to require the exponent. Even so, using double for this would be an awful idea and an abuse of the type system.
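If you go the string route described above, a minimal sketch (the fb_uid column name is an assumption):
create_table :users do |t|
  t.string :fb_uid, :null => false
end
# A unique index keeps lookups by Facebook ID fast
add_index :users, :fb_uid, :unique => true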
I don't use PostgreSQL, but in MySQL I use BIGINT.
According to the PostgreSQL data types documentation, BIGINT is the right choice for PostgreSQL as well.
mu is too short has a great answer. I only want to add that if you want to use the ID as a foreign key between tables, then you should stick with the BIGINT solution he describes rather than a string. This is what I use, essentially:
Example:
create_table(:photos) do |t|
  t.integer :fb_uid, :limit => 8 # Facebook ID of the photo record
  t.integer :facebook_profile_uid, :limit => 8, :null => false # foreign key to user
  # ...
end
create_table(:users) do |t|
  t.integer :fb_uid, :limit => 8, :null => false # Facebook ID of the user record
  t.integer :photos_count, :default => 0 # counter cache column
  # ...
end
class User < ActiveRecord::Base
  has_many :photos, foreign_key: :facebook_profile_uid, primary_key: :fb_uid
  # ...
end
class Photo < ActiveRecord::Base
  belongs_to :facebook_profile, class_name: "User", foreign_key: :facebook_profile_uid, primary_key: :fb_uid, counter_cache: true
end
Ran into this problem while using the Google uid, which is also quite large.
I found this answer to be the most useful:
Getting error indicating number is "out of range for ActiveRecord::Type::Integer with limit 4" when attempting to save large(ish) integer value
1. Generate a migration to change your table column.
2. Edit the generated migration: add limit: 8.
3. Run db:migrate to apply it to the database.
4. Restart the Rails server.
This will allow you to change the limit of your table column.
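Putting those steps together, a minimal sketch of such a migration (the table and column names here are placeholders, not from the question):
# rails generate migration change_uid_limit_on_users
class ChangeUidLimitOnUsers < ActiveRecord::Migration
  def change
    # limit: 8 widens the 4-byte integer column to an 8-byte bigint
    change_column :users, :uid, :integer, limit: 8
  end
end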

Rails Very Large Table

Rails likes to use auto-incrementing 32-bit ints as primary keys on tables. What do people do when they get close to the 32-bit limit on the number of rows in a table?
You could change the key to a bigint? That is an 8-byte (64-bit) integer; it gives you up to about 9.2 quintillion values instead of roughly 2 billion signed (or 4 billion unsigned). There isn't a native migration for this, though; you'd have to do something like:
execute("ALTER TABLE massive_table CHANGE id id BIGINT")
EDIT: Apparently specifying a limit on the field, as Alex suggested, does allow for bigints in both PostgreSQL and MySQL.
You could use 8-byte id fields. Rails doesn't provide types to create long integer or double precision columns; however, it can be done using the :limit parameter:
create_table :my_table do |t|
  t.integer :long_int_column, :limit => 8
  t.float :double_column, :limit => 53
end
8 and 53 are magic numbers. This works for PostgreSQL and MySQL databases, but I haven't tried any others.
If you're altering a table, then you can write
change_column :my_table, :my_col, :integer, :limit => 8
The alternative to 8-byte id fields is to handle id rollover in some way. That would depend on the specifics of your data and application.
