Forgive me if I'm asking a silly question, but I've run into some trouble trying to update a time column to datetime.
Here's what the relevant part of my schema.rb looks like:
create_table "shop_hours", id: :serial, force: :cascade do |t|
  t.time "from_hours"
  t.time "to_hours"
  t.string "day"
  t.integer "repair_shop_id"
  t.boolean "is_shop_open"
  t.integer "chain_id"
  t.integer "regions", default: [], array: true
  t.index ["repair_shop_id"], name: "index_shop_hours_on_repair_shop_id"
end
Here's an example of a random ShopHour object:
[67] pry(main)> ShopHour.find(439)
#<ShopHour:0x00007ff05462d3a0
 id: 439,
 from_hours: Sat, 01 Jan 2000 15:00:00 UTC +00:00,
 to_hours: Sat, 01 Jan 2000 00:00:00 UTC +00:00,
 day: "Friday",
 repair_shop_id: 468,
 is_shop_open: true,
 chain_id: nil,
 regions: []>
Ultimately, I want to migrate the from_hours and to_hours attributes on all of my ShopHour records so that they're of type datetime.
I'd also like to update the date on each from_hours and to_hours to be current.
I tried this migration, but ran into an error:
class ChangeShopHoursToDateTime < ActiveRecord::Migration[6.0]
  def change
    change_column :shop_hours, :from_hours, 'timestamp USING CAST(from_hours AS timestamp)'
    change_column :shop_hours, :to_hours, 'timestamp USING CAST(to_hours AS timestamp)'
  end
end
Here's the error I'm encountering:
== 20201021083719 ChangeShopHoursToDateTime: migrating ========================
-- change_column(:shop_hours, :from_hours, "timestamp USING CAST(from_hours AS timestamp)")
rails aborted!
StandardError: An error has occurred, this and all later migrations canceled:
PG::CannotCoerce: ERROR: cannot cast type time without time zone to timestamp without time zone
LINE 1: ...s" ALTER COLUMN "from_hours" TYPE timestamp USING CAST(from_...
Please let me know if I can provide any more information. Thanks in advance!
You can't automatically cast a time column to a timestamp, since a time has no date component. Postgres correctly prevents you from doing this, as the result would be ambiguous: which date should it really cast 12:45 to?
0 BC?
the beginning of epoch time?
today's date?
Ruby doesn't have a class to represent a time without a date component. The major difference between its two time classes is that Time is a simple wrapper written in C around a UNIX timestamp, while DateTime is better at representing historical dates. The fact that Rails casts a time database column to a Time anchored at 2000-01-01 is a strange yet pragmatic solution to the problem, chosen instead of creating something like a TimeWithoutDate class.
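This is easy to see with the standard library alone (no Rails involved): even parsing a bare clock time yields a full Time with a date silently attached, because there is simply no date-free time class to hand back.

```ruby
require "time"  # adds Time.parse; also loads the date library

# No date is given, yet the result is a full Time carrying today's date.
t = Time.parse("15:00")
t.class    # Time
t.hour     # 15
t.to_date  # today's date, silently attached
```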
If you want to migrate a database column from time to timestamp / timestamptz, you need to tell the database which date you expect the time to fall on:
class AddDatetimeColumnsToHours < ActiveRecord::Migration[6.0]
  def up
    add_column :shop_hours, :opens_at, :datetime
    add_column :shop_hours, :closes_at, :datetime
    ShopHour.update_all(
      "closes_at = (timestamp '2000-01-01') + to_hours, opens_at = (timestamp '2000-01-01') + from_hours"
    )
  end

  def down
    remove_column :shop_hours, :opens_at
    remove_column :shop_hours, :closes_at
  end
end
This adds two new columns, and you should really consider just dropping the existing columns and going with this naming scheme: methods that start with to_ are by convention casting methods in Ruby (for example to_s, to_a, to_h), so to_hours is a really bad column name.
I'm trying to change a time attribute using the date entered in a date attribute. So if the time is "2000-01-01 10:00:00 UTC" and the date is "2021-10-10", I want the output to be "2021-10-10 10:00:00 UTC".
I almost have it working; however, when I assign the updated value back to the original object, the change doesn't stick. For instance, in the code below, event_time contains the proper time I want, but after assigning it to @event.time, printing @event.time shows the change did not take place.
def create
  @event = Event.new(event_params)
  event_date = @event.date
  event_time = @event.time.change(:year => event_date.year, :month => event_date.month, :day => event_date.day)
  puts event_time # prints 2021-10-22 06:06:00 UTC
  @event.time = event_time
  puts @event.time # prints 2000-01-01 06:06:00 UTC
  if @event.save
    redirect_to(events_path)
  else
    render('new')
  end
end
Any suggestions? I'm new to Ruby so I'm probably missing something obvious here
Here's my schema
create_table "events", force: :cascade do |t|
  t.date "date"
  t.string "description"
  t.boolean "isMandatory"
  t.datetime "created_at", precision: 6, null: false
  t.datetime "updated_at", precision: 6, null: false
  t.string "name"
  t.time "time"
  t.string "location"
end
You can refer to the SO answer here
The problem is that there is no time-of-day class in Ruby or Rails. All the time classes are dates or timestamps (i.e. date plus time of day).
Inside the database it will be a time (without timezone) column and it will behave properly inside the database. However, once the time gets into Ruby, ActiveRecord will add a date component because there is no plain time-of-day class available, it just happens to use 2000-01-01 as the date.
Everything will be fine inside the database but you'll have to exercise a little bit of caution to ignore the date component when you're outside the database in Rails.
Use the datetime column type to hold a date and time. Only use time in a migration if you don't need the date (you only want to store the time-of-day part).
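A standard-library sketch of that "ignore the date component" advice, reusing the numbers from the question above (purely illustrative): treat the stored value as a clock time only, and graft it onto whichever date you actually want.

```ruby
require "date"

# What a t.time column comes back as: the real time-of-day on the
# placeholder date 2000-01-01.
stored = Time.utc(2000, 1, 1, 10, 0, 0)

# The date you actually want attached to that time of day.
date = Date.new(2021, 10, 10)

# Keep only the clock part of `stored` and combine it with `date`.
combined = Time.utc(date.year, date.month, date.day,
                    stored.hour, stored.min, stored.sec)
combined.to_s  # => "2021-10-10 10:00:00 UTC"
```

Note that writing such a combined value back into a time column will discard the date again on the next round-trip, which is exactly the behavior seen in the create action above.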
I'm using Rails 4.2.3 with a PostgreSQL database. I want a column in my database to store a number of milliseconds (note: NOT a timestamp, but rather a duration in milliseconds). So I created my column like so:
time_in_ms | bigint
However, when I go to store a value in Rails, I get the below error
ActiveRecord::StatementInvalid (PG::NumericValueOutOfRange: ERROR: value "3000002000" is out of range for type integer
: INSERT INTO "my_object_times" ("time_in_ms", "my_object_id", "created_at", "updated_at") VALUES ($1, $2, $3, $4) RETURNING "id"):
app/controllers/my_objects_controller.rb:31:in `update'
It would seem the number, “3000002000” is smaller than the maximum value for the column (which I’m reading is “9223372036854775807”), so I’m wondering what else is going wrong and how I can fix it.
Edit: To provide additional information, in my db/schema.rb file, the column in question is described thusly ...
create_table "my_object_times", force: :cascade do |t|
  ...
  t.integer "time_in_ms", limit: 8
Edit 2: Here is the output of create table in PSQL
CREATE TABLE my_object_times (
    id integer NOT NULL,
    first_name character varying,
    last_name character varying,
    time_in_ms bigint,
    created_at timestamp without time zone NOT NULL,
    updated_at timestamp without time zone NOT NULL,
    name character varying,
    age integer,
    city character varying,
    state_id integer,
    country_id integer,
    overall_rank integer,
    age_group_rank integer,
    gender_rank integer
);
I've had this happen to me before: when I initially tried to create a bigint field in the db, for some reason the model thought it was an integer instead, even though the schema and migration file specified it as a bigint.
For example: I had this migration file
class CreateSecureUserTokens < ActiveRecord::Migration
  def change
    create_table :secure_user_tokens do |t|
      t.integer :sso_id, null: false, length: 8
      t.string :token, null: false
      t.timestamps null: false
    end
  end
end
Note the length: 8 option, which was meant to make the integer a bigint; length is not actually a recognized column option (the correct one is limit), so it was silently ignored, and after I ran the migration I was having the same issue as you. Eventually I just created another migration to fix the issue, and it worked. Here's the migration I used:
class ModifySecureTokensForLargerSsoIdSizes < ActiveRecord::Migration
  def change
    change_column :secure_user_tokens, :sso_id, :integer, limit: 8
  end
end
So if we changed that to fit your needs, it would be:
class ObjectTimesBigInt < ActiveRecord::Migration
  def change
    change_column :my_object_times, :time_in_ms, :integer, limit: 8
  end
end
Hope that helps!
-Charlie
I guess the table my_object_times might not have been created from the schema.rb file, or it might be overwritten in another migration file, because an integer column with limit: 8 in a migration is itself a bigint. You should cross-check the table definition in pgAdmin. If the column is not bigint, run the following migration:
class ChangeTimeInMsToBigint < ActiveRecord::Migration
  def change
    execute <<-SQL
      ALTER TABLE my_object_times
      ALTER COLUMN time_in_ms TYPE bigint USING time_in_ms::bigint
    SQL
  end
end
Edit: I just re-read this, and my original answer doesn't actually make sense in your case. I do believe you need to look outside that column for an answer, and confirm every bit of what you think you know about its state manually. Any additional detail would help us find a correct answer.
Set breakpoints to step through the request and see if you can spot the integer.
create_table "my_object_times", force: :cascade do |t|
  ...
  t.integer "time_in_ms", limit: 8
The t.integer here looks like your culprit to me.
Well, I tried; my last thought is that it has to be related to some kind of Rails request middleware, but I'm ignorant of what the specifics might be. Something in the request path thinks that column is an integer. I didn't understand how Rails migration datatypes worked until now, so I learned something. (And I went fishing all day, so I'll count this day a win.) Good luck!
For anyone on Rails 5/6 using the paper_trail gem: check what the polymorphic foreign-key field item_id is set as in the versions table. I had it set as an integer and hit this bug.
Changing versions.item_id to a bigint fixed the error.
bigint is a 64-bit type, while a plain Rails integer column is 32-bit.
3000002000 is greater than 2^31 - 1, the maximum signed 32-bit value. That's why converting it into a 32-bit integer fails with NumericValueOutOfRange.
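The ranges involved can be checked in plain Ruby; the 9223372036854775807 figure quoted in the question is exactly 2^63 - 1:

```ruby
value    = 3_000_002_000  # the failing value from the error message
int4_max = 2**31 - 1      # 2_147_483_647: max signed 4-byte value (Postgres integer)
int8_max = 2**63 - 1      # 9_223_372_036_854_775_807: max signed 8-byte value (Postgres bigint)

value > int4_max   # => true: overflows a plain integer column
value <= int8_max  # => true: fits comfortably in bigint
```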
I have the following table in my DB:
class CreateGoogleRecords < ActiveRecord::Migration
  def change
    create_table :google_records do |t|
      t.string :user_id
      t.string :date
      t.text :stats
      t.string :account_name
      t.integer :total_conversions
      t.decimal :total_cost
      t.timestamps
    end
  end
end
I'm looking to create a table inside a view that groups records together by month (I can't use created_at because sometimes the records are scraped in bulk from an API).
There is a lot of legacy code involved so rather than convert the column to datetime I was hoping I could convert the date string to a datetime object when performing the query.
I've tried writing a scope like:
scope :stats_for_reports, ->(start_date, end_date, user_ids) {
  select('user_id, sum(total_cost) as total_cost, sum(total_conversions) as total_conversions')
    .where('date >= ? and date <= ?', start_date, end_date)
    .where(user_id: user_ids)
    .group(DateTime.parse(:date).month.to_s)
}
but I receive a TypeError: can't convert Symbol into String error.
In the console I've been trying things like:
GoogleRecord.where(date: date_start..date_end).group{ |m| DateTime.parse(m.date).month }
or
GoogleRecord.where(date: date_start..date_end).group(:date).to_date
Am I on the right track with any of these?
Have you considered using the ActiveRecord callbacks before_save, after_save, or after_initialize? You may be able to create a DateWrapper (very similar to the EncryptionWrapper below) and convert the string to a date transparently to the rest of the code.
http://api.rubyonrails.org/classes/ActiveRecord/Callbacks.html
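If converting the column is off the table and the result set is reasonably small, the month grouping can also happen in Ruby after the query, which is roughly where the console attempt with group { |m| ... } was heading. A sketch with plain hashes standing in for GoogleRecord rows (the data here is illustrative):

```ruby
require "date"

# Hypothetical rows standing in for GoogleRecord results; the date
# column is a string, as in the schema above.
records = [
  { user_id: "a", date: "2014-01-15", total_cost: 10 },
  { user_id: "a", date: "2014-01-20", total_cost: 5 },
  { user_id: "b", date: "2014-02-03", total_cost: 7 },
]

# Parse each string once and bucket the rows by year-month.
by_month = records.group_by { |r| Date.parse(r[:date]).strftime("%Y-%m") }
totals   = by_month.transform_values { |rows| rows.sum { |r| r[:total_cost] } }
totals  # => {"2014-01"=>15, "2014-02"=>7}
```

To keep the aggregation in the database instead, Postgres can cast the string inside the query, e.g. grouping on a SQL expression like date_trunc('month', date::timestamp).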
I get an error while trying to compare datetimes with Rails and PostgreSQL.
Now, I created the table using this simple schema:
def change
  create_table :events do |t|
    t.string :name
    t.datetime :start_time
  end
end
I try then to get events in the future using this scope:
scope :active_events, ->(id) { includes(:event).where(:foursquare_id => id).where('events.start_time => now()::timestamp') }
That translates into PostgreSQL as:
SELECT COUNT(DISTINCT "venues"."id")
FROM "venues"
LEFT OUTER JOIN "events" ON "events"."venue_id" = "venues"."id"
WHERE "venues"."foursquare_id" = '4ada1e5ff964a5209f1e21e3'
AND (events.start_time => now()::timestamp)
This is the error that then shows up:
PG::UndefinedFunction: ERROR: operator does not exist: timestamp without time zone => timestamp without time zone
LINE 1: ...'4ada1e5ff964a5209f1e21e3' AND (events.start_time => now()::...
and I'm really not sure I understand it. Since they are of the same data type, I'd assume they could be compared, but that doesn't seem to be the case.
What am I missing here?
The operator you're looking for is >=; => is not a comparison operator (it's usually seen with hstore). The condition should read .where('events.start_time >= now()::timestamp').
I have a Rails application that defines a migration containing a decimal with precision 8 and scale 2. The database I have set up is PostgreSQL 9.1.
class CreateMyModels < ActiveRecord::Migration
  def change
    create_table :my_models do |t|
      t.decimal :multiplier, precison: 8, scale: 2
      t.timestamps
    end
  end
end
When I run rake db:migrate, the migration happens successfully, but I noticed an error when I was trying to run a MyModel.find_or_create_by_multiplier. If I ran the following command twice, the object would get created twice:
MyModel.find_or_create_by_multiplier(multiplier: 0.07)
I am assuming this should create the object during the first call and then find the object during the second call. Unfortunately, this does not seem to be happening with the multiplier set to 0.07.
This DOES work as expected for every other number I have thrown at the above command. The following commands work as expected (creating the object during the first call and then finding the object during the second call).
MyModel.find_or_create_by_multiplier(multiplier: 1.0)
MyModel.find_or_create_by_multiplier(multiplier: 0.05)
MyModel.find_or_create_by_multiplier(multiplier: 0.071)
When I view the PostgreSQL description of the my_models table, I notice that the table does not have a precision or scale restriction on the numeric column.
Column | Type | Modifiers
-------------+-----------------------------+-------------------------------------------------------
id | integer | not null default nextval('my_models_id_seq'::regclass)
multiplier | numeric |
created_at | timestamp without time zone | not null
updated_at | timestamp without time zone | not null
My db/schema.rb file also does not state the precision and scale:
ActiveRecord::Schema.define(:version => 20121206202800) do
  ...
  create_table "my_models", :force => true do |t|
    t.decimal "multiplier"
    t.datetime "created_at", :null => false
    t.datetime "updated_at", :null => false
  end
  ...
So my first question is, why do I not see precision and scale pushed down to PostgreSQL when I migrate? Documentation states that it should be supported.
My second question is, why is 0.07 not correctly comparing using the MyModel.find_or_create_by_multiplier(multiplier: 0.07) command? (If I need to open another question for this, I will).
This is embarrassing...
I had precision misspelled.
Changing the migration to:
t.decimal :multiplier, precision: 8, scale: 2
fixed everything.
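On the side question of why 0.07 misbehaved before the fix, one plausible explanation (an assumption, not something the question confirms): with no scale declared there is no rounding step, and a binary Float like 0.07 is not exactly 0.07, so equality lookups can be fragile; once scale: 2 is in effect, values are rounded to two decimal places before they are compared. The underlying float fact is easy to check with the standard library:

```ruby
require "bigdecimal"

# numeric(8,2) can store 0.07 exactly; a binary Float cannot.
exact  = BigDecimal("0.07")
binary = "%.20f" % 0.07  # print the Float literal with full precision

exact.to_s("F")  # => "0.07"
binary           # slightly more than 0.07: the binary double is inexact
```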
PostgreSQL 9.1 will let you declare a column in any of these ways.
column_name decimal
column_name numeric
column_name decimal(8, 2)
column_name numeric(8, 2)
If you look at that column using, say, pgAdminIII, it will show you exactly how it was created. If you (or Rails) created the column as numeric, it will say "numeric". If you (or Rails) created the column as decimal(8, 2), it will say "decimal(8, 2)".
So it looks to me like Rails is not passing precision and scale to PostgreSQL; instead, it's simply telling PostgreSQL to create that column with type numeric. The Rails docs suggest it should not be doing that.
The example syntax in that link is different from yours:
td.column(:bill_gates_money, :decimal, :precision => 15, :scale => 2)
I was using :numeric at first. Although ActiveRecord changed it to :decimal for me, both :precision and :scale were ignored.
# 20200101000000_add_my_col_to_my_table.rb
add_column :my_table, :my_col, :numeric, precision: 3, scale: 2, comment: 'Foobar'
# schema.rb
t.decimal "my_col", comment: 'Foobar'
Simply changing it to :decimal in the migration file fixed it for me:
# 20200101000000_add_my_col_to_my_table.rb
add_column :my_table, :my_col, :decimal, precision: 3, scale: 2, comment: 'Foobar'
# schema.rb
t.decimal "my_col", precision: 3, scale: 2, comment: 'Foobar'