I have a table that needs to store epoch timestamps.
class CreateKlines < ActiveRecord::Migration[5.1]
  def change
    create_table :klines do |t|
      t.string :symbol
      t.timestamp :open_time
      t.timestamp :close_time

      t.timestamps
    end
  end
end
However, when I store a value, it comes back as nil (both open_time and close_time).
EX:
Kline.create(open_time: Time.now.to_i)
(0.3ms) BEGIN
SQL (1.5ms) INSERT INTO `klines` (`created_at`, `updated_at`) VALUES ('2020-05-14 01:45:22', '2020-05-14 01:45:22')
(3.8ms) COMMIT
Notice that the value of open_time is gone; the result is
#<Kline:0x00007f9f68c87788
id: 600,
symbol: nil,
open_time: nil,
close_time: nil,
created_at: Thu, 14 May 2020 01:45:22 UTC +00:00,
updated_at: Thu, 14 May 2020 01:45:22 UTC +00:00>
Env:
Rails 5.1.4,
Ruby 2.6.5,
MySQL 5.7
Rails has two "special" timestamp columns, created_at and updated_at, which Rails stores and updates automatically (t.timestamps generates those columns). Any other timestamp column is not special, meaning you need to set it yourself.
So for example,
def open
  update(open_time: Time.current)
end
A method like this does what you want, and you can extend it to do other things besides updating that one column.
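Usage would then look something like this (a quick sketch; the record id is just illustrative):
kline = Kline.find(600)
kline.open        # writes the current time into open_time and saves the record
kline.open_time   # => the Time value that was just persisted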
I realized my question wasn't really about Rails: I shouldn't use a timestamp column to store epoch time in the first place (see "What is the data type for unix_timestamp (MySQL)?").
Instead, I should use an integer column such as int(11) to store it.
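A minimal migration sketch of that alternative (the migration class name and the 8-byte limit are my own assumptions):
class ChangeKlineTimesToInteger < ActiveRecord::Migration[5.1]
  def up
    # Store epoch values as plain integers; limit: 8 gives a BIGINT,
    # which also leaves room for millisecond-precision epochs.
    change_column :klines, :open_time, :integer, limit: 8
    change_column :klines, :close_time, :integer, limit: 8
  end

  def down
    change_column :klines, :open_time, :timestamp
    change_column :klines, :close_time, :timestamp
  end
end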
Back to my original question: if I want to store a timestamp, I should pass a Time object in Rails.
EX:
Kline.create(open_time: Time.now)
(1.0ms) BEGIN
SQL (1.4ms) INSERT INTO `klines` (`open_time`, `created_at`, `updated_at`) VALUES ('2020-05-14 01:49:15', '2020-05-14 01:49:15', '2020-05-14 01:49:15')
(1.2ms) COMMIT
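For completeness, converting between the two representations is straightforward (a quick sketch; the epoch value is illustrative):
kline = Kline.last
kline.open_time.to_i      # Time -> epoch seconds
Time.at(1589419755).utc   # epoch seconds -> Time, normalized to UTC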
Related
I'm having a weird issue where Rails is not saving a date correctly. I've tried formatting the date a number of ways, including as an actual Date object as well as various string formats, but it wants to swap the month and day regardless. When this results in an invalid date (i.e. 28/08/2019 read as mm/dd/yyyy), it fails to include the parameter altogether when saving.
If the date is valid when swapped (i.e. 2019-09-03 ==> 2019-03-09), it saves the incorrect (swapped) date.
params[:shift_date] = Date.strptime(shift_params[:shift_date], '%m/%d/%Y').to_date
@shift.save(params)
Results in...
<ActionController::Parameters {"shift_date"=>Tue, 03 Sep 2019, "start_time"=>"00:00:00", "end_time"=>"00:00:00"} permitted: true>
(0.1ms) BEGIN
↳ app/controllers/api/shifts_controller.rb:25
Shift Create (0.4ms) INSERT INTO "shifts" ("shift_date", "start_time", "end_time") VALUES ($1, $2, $3) RETURNING "id" [["shift_date", "2019-03-09"], ["start_time", "00:00:00"], ["end_time", "00:00:00"]]
With a date like '08/29/2019', again correctly interpreted into a Date object, the value is omitted when saved, since 2019-29-08 is not a valid date. I've also tried converting to a string (2019-08-29, 2019-08-29 00:00:00, etc.), and it always gets it wrong no matter what I do. I'm very surprised that Rails would change a correctly set and valid Date object at all...
Note this only happens in the controller. Via the console everything works as expected...
2.6.3 :076 > s.shift_date = Date.strptime('08/28/2019', '%m/%d/%Y')
=> Wed, 28 Aug 2019
2.6.3 :077 > s.save
(0.2ms) BEGIN
Shift Update (0.4ms) UPDATE "shifts" SET "shift_date" = $1, "updated_at" = $2 WHERE "shifts"."id" = $3 [["shift_date", "2019-08-28"], ["updated_at", "2019-08-28 20:48:31.944877"], ["id", 54]]
(0.3ms) COMMIT
=> true
The test app:
$ rails new ra1
$ cd ra1
$ ./bin/rails g model m1 dt1:datetime
$ ./bin/rake db:migrate
Then I add config.time_zone = 'Europe/Kiev' to config/application.rb and run the console:
irb(main):001:0> M1.create dt1: Time.now
(0.1ms) begin transaction
SQL (0.3ms) INSERT INTO "m1s" ("dt1", "created_at", "updated_at") VALUES (?, ?, ?) [["dt1", "2015-03-30 11:11:43.346991"], ["created_at", "2015-03-30 11:11:43.360987"], ["updated_at", "2015-03-30 11:11:43.360987"]]
(33.0ms) commit transaction
=> #<M1 id: 3, dt1: "2015-03-30 11:11:43", created_at: "2015-03-30 11:11:43", updated_at: "2015-03-30 11:11:43">
irb(main):002:0> Time.now
=> 2015-03-30 14:12:27 +0300
irb(main):003:0> Rails.configuration.time_zone
=> "Europe/Kiev"
What am I doing wrong?
Values in the database are always stored in UTC, no matter the time_zone.
The time zone configuration affects only the Ruby environment. Data is fetched from the database and the dates are converted into the selected time zone. The same applies to new time instances, as you noticed with Time.now.
The main reason times are normalized in the database is that it makes it easy to convert the same value to multiple time zones, or to change the application time zone later in the project without having to reconvert all the stored dates. Storing UTC in the database is good practice.
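A quick sketch of the round trip (the values are illustrative and assume time-zone-aware attributes, which is the default):
Time.zone = 'Europe/Kiev'      # what config.time_zone sets up for the app
m = M1.create(dt1: Time.now)   # local time, e.g. 2015-03-30 14:11:43 +0300
# The INSERT writes "2015-03-30 11:11:43" (the UTC equivalent), as the SQL log shows.
M1.last.dt1                    # => Mon, 30 Mar 2015 14:11:43 EEST +03:00 (converted back on read)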
I am saving a time into the database and getting a very different value back.
2.1.0 :047 > Schedule.create(last_reminded_at: Time.now)
(0.9ms) BEGIN
SQL (1.1ms) INSERT INTO "schedules" ("last_reminded_at")
VALUES ($1) RETURNING "id" [["last_reminded_at", "2014-12-13 22:14:16.022267"]]
(8.3ms) COMMIT
=> #<Schedule id: 8, ... last_reminded_at: "2014-12-13 23:14:16">
The Schedule is created with the correct time, but when I get it back from the db, the date part is always 2000-01-01.
2.1.0 :048 > Schedule.last
Schedule Load (0.9ms) SELECT "schedules".* FROM "schedules" ORDER BY id DESC LIMIT 1
=> #<Schedule id: 8, ... last_reminded_at: "2000-01-01 22:14:16">
Looking at the query, I suspect the insert statement is somehow incorrect?
Maybe I need to configure ActiveRecord timezone somehow?
This is the only time-related config I have, in config/application.rb:
Time.zone = 'UTC'
The schema is:
create_table "schedules", force: true do |t|
....
t.time "last_reminded_at"
end
I am running Rails 4.1.8, Postgresql 9.2.6 and Ruby 2.1.2p95.
Your schema defines last_reminded_at as a time column, which stores only the time of day; ActiveRecord represents such values on the placeholder date 2000-01-01, which is exactly what you're seeing. You want a datetime, since you care about the date too.
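A migration along these lines would fix it (a sketch; the class name is my own):
class ChangeLastRemindedAtToDatetime < ActiveRecord::Migration
  def up
    change_column :schedules, :last_reminded_at, :datetime
  end

  def down
    change_column :schedules, :last_reminded_at, :time
  end
end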
So, I have my database, but whenever I call Model.create(:thing => "Hi") it just "does it". When I look at the record, all the attributes are nil (minus the id and the timestamps, which are managed by Active Record). Is it the way I create them, or is it my model? I am using Rails 4.0.1 and its corresponding Active Record version. So, what is it? What could cause this?
My logs:
irb(main):003:0> Email.create(:user => User.find(3), :email => "HIDDEN@HIDDEN.HIDDEN", :key => Email.gen("gemist", "HIDDEN@HIDDEN.HIDDEN"))
User Load (0.3ms) SELECT "users".* FROM "users" WHERE "users"."id" = ? LIMIT 1 [["id", 3]]
WARNING: Can't mass-assign protected attributes for Email: user, email, key
(0.1ms) begin transaction
SQL (2.9ms) INSERT INTO "emails" ("created_at", "updated_at") VALUES (?, ?) [["created_at", Wed, 01 Jan 2014 21:22:24 UTC +00:00], ["updated_at", Wed, 01 Jan 2014 21:22:24 UTC +00:00]]
(1.0ms) commit transaction
=> #<Email id: 3, email: nil, User_id: nil, key: nil, confirmed: nil, created_at: "2014-01-01 21:22:24", updated_at: "2014-01-01 21:22:24">
And my model, in case you're wondering about the gen function... or anything else:
class Email < ActiveRecord::Base
  belongs_to :User

  def self.gen(user, email)
    # Make confirmation keys on demand.
    # Salting is used for randomizing and uniquifying, in case we have already
    # sent keys to the same email (we don't want the same key more than once!).
    # Combining user and email makes sure the key is unique even when two users
    # want to confirm the same email.
    salt = BCrypt::Engine.generate_salt
    combo = user + email
    return BCrypt::Engine.hash_secret(combo, salt)
  end
end
I can offer a migration or schema if needed.
Now I know under "How to edit" I need to respect the original author, but I have no self respect.
Your question contains the answer already:
WARNING: Can't mass-assign protected attributes for Email: user, email, key
and:
INSERT INTO "emails" ("created_at", "updated_at") VALUES (?, ?) [["created_at", Wed, 01 Jan 2014 21:22:24 UTC +00:00], ["updated_at", Wed, 01 Jan 2014 21:22:24 UTC +00:00]]
You can't set user, email, and key during a mass-assignment (which create() is doing), so it's ignoring those. The only fields being set, as you can see from the INSERT log, are the timestamp fields. So you end up with a fairly empty record in the database.
You can set those fields individually on a model instance, or you can flag them as mass-assignable by putting this in your model:
attr_accessible :user, :email, :key
You may very well want to leave them protected, though. Here's an article on mass-assignment protection. If you're processing form data, you probably want to leave the protection in place for certain fields. If your create() is already using trusted data, you can make them accessible.
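For the first option, assigning the attributes individually on an instance bypasses mass-assignment protection (a sketch; it assumes the conventional belongs_to :user association and an illustrative email address):
email = Email.new
email.user  = User.find(3)
email.email = "someone@example.com"
email.key   = Email.gen("gemist", "someone@example.com")
email.save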
I have an ActiveRecord object that has a before_create hook that generates a SHA hash and stores the result in an attribute of the object:
before_create :store_api_id

attr_reader :api_id

def store_api_id
  self.api_id = generate_api_id
end

private

def generate_api_id
  Digest::SHA1.hexdigest([Time.now.nsec, rand].join).encode('UTF-8')
end
This works, in that the api_id attribute is created and stored in the database as text (forced by the call to .encode('UTF-8'); otherwise SQLite would try to store the result as binary data).
However, the following specs are failing:
it "should be available as ad.api_id" do
#ad.save!
#ad.api_id.should_not be_nil
end
it "should match the directly accessed attribute" do
#ad.save!
#ad.api_id.should == #ad.attributes['api_id']
end
I can get the correct hash by using ad.api_id_before_type_cast and ad.attributes['api_id'], but not when using ad.api_id.
Ad.find_by_api_id('string api id') also works as expected, but calling .api_id on the returned object still returns nil.
I have double-checked the type in the database as follows:
sqlite> select typeof(api_id) from ads;
text
Here is an example rails console session on a fresh Rails 3.2.2 / ruby 1.9.3-p125 app:
Loading development environment (Rails 3.2.2)
irb(main):001:0> ex = Example.new
=> #<Example id: nil, api_id: nil, created_at: nil, updated_at: nil>
irb(main):002:0> ex.save!
(0.1ms) begin transaction
SQL (7.3ms) INSERT INTO "examples" ("api_id", "created_at", "updated_at")
VALUES (?, ?, ?) [["api_id", "fc83bb94420cf8fb689b9b33195318778d771c4e"],
["created_at", Fri, 23 Mar 2012 10:17:24 UTC +00:00],
["updated_at", Fri, 23 Mar 2012 10:17:24 UTC +00:00]]
(1.0ms) commit transaction
=> true
irb(main):003:0> ex.api_id
=> nil
irb(main):004:0> ex.api_id_before_type_cast
=> "fc83bb94420cf8fb689b9b33195318778d771c4e"
irb(main):005:0> ex.attributes['api_id']
=> "fc83bb94420cf8fb689b9b33195318778d771c4e"
irb(main):006:0>
As I wrote above, using attr_readonly instead of attr_reader to protect the attribute fixed this issue for me, and in this case it is actually closer to what I want.
As the API docs note:
Attributes listed as readonly will be used to create a new record but update operations will ignore these fields.
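In other words, the model ends up looking something like this (a sketch; the Ad class name comes from the specs, and attr_readonly is the substitution for attr_reader):
class Ad < ActiveRecord::Base
  # attr_readonly lets the column be written on create but ignored on update,
  # and, unlike attr_reader, it does not shadow ActiveRecord's generated api_id reader.
  attr_readonly :api_id

  before_create :store_api_id

  private

  def generate_api_id
    Digest::SHA1.hexdigest([Time.now.nsec, rand].join).encode('UTF-8')
  end

  def store_api_id
    self.api_id = generate_api_id
  end
end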