Rails Time inconsistencies with RSpec

I'm working with Time in Rails and using the following code to set up the start date and end date of a project:
start_date ||= Time.now
end_date = start_date + goal_months.months
I then clone the object and I'm writing rspec tests to confirm that the attributes match in the copy. The end dates match:
original[end_date]: 2011-08-24 18:24:53 UTC
clone[end_date]: 2011-08-24 18:24:53 UTC
but the spec gives me an error on the start dates:
expected: Wed Aug 24 18:24:53 UTC 2011,
got: Wed, 24 Aug 2011 18:24:53 UTC +00:00 (using ==)
It's clear the dates are the same, just formatted differently. How is it that they end up getting stored differently in the database, and how do I get them to match? I've tried it with DateTime as well with the same results.
Correction: The end dates don't match either. They print out the same, but rspec errors out on them as well. When I print out the start date and end date, the values come out in different formats:
start date: 2010-08-24T19:00:24+00:00
end date: 2011-08-24 19:00:24 UTC

This usually happens because RSpec is comparing objects of different classes: a Time and a DateTime, for instance. Comparable times can also differ slightly, by a few milliseconds.
In the second case, the correct approach is stubbing and mocking; see also the Timecop gem.
In the first case, one possible solution is to compare Unix timestamps:
actual_time.to_i.should == expected_time.to_i
I use a simple matcher for such cases:
# ./spec/spec_helper.rb
#
# Dir[File.dirname(__FILE__) + "/support/**/*.rb"].each {|f| require f}
#
#
# Usage:
#
# its(:updated_at) { should be_the_same_time_as updated_at }
#
#
# Will pass or fail with message like:
#
# Failure/Error: its(:updated_at) { should be_the_same_time_as 2.days.ago }
# expected Tue, 07 Jun 2011 16:14:09 +0300 to be the same time as Mon, 06 Jun 2011 13:14:09 UTC +00:00
RSpec::Matchers.define :be_the_same_time_as do |expected|
  match do |actual|
    expected.to_i == actual.to_i
  end
end

You should mock Time's now method to make sure it always matches the date in the spec. You never know when a delay of a few milliseconds will make the spec fail. This approach also guarantees that the time in the real code and the time in the spec are the same.
If you're using the default rspec mock lib, try to do something like:
t = Time.parse("01/01/2010 10:00")
Time.should_receive(:now).and_return(t)
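If you'd rather not depend on a mocking library at all, the same idea can be sketched in plain Ruby by temporarily swapping out Time.now. This is roughly what Timecop does, greatly simplified; with_frozen_time is a hypothetical helper, not part of any library, and the public Module API calls assume Ruby 2.5+:

```ruby
# A minimal, illustrative freeze-time helper. Timecop does this far more
# robustly (covering Time.new, Date.today, DateTime.now, nesting, etc.).
def with_frozen_time(frozen)
  klass = Time.singleton_class
  klass.alias_method(:__real_now, :now)  # keep the original implementation
  klass.define_method(:now) { frozen }   # every Time.now call returns `frozen`
  yield
ensure
  klass.alias_method(:now, :__real_now)  # always restore, even on error
  klass.remove_method(:__real_now)
end

t = Time.utc(2010, 1, 1, 10, 0)
with_frozen_time(t) do
  puts Time.now  # 2010-01-01 10:00:00 UTC, no matter when this runs
end
```

Inside the block both the code under test and the spec see the exact same instant, so equality comparisons can't drift.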

I totally agree with the previous answers about stubbing Time.now. That said, there is one other thing going on here: when you compare datetimes from a database, you lose some of the fractional seconds that a Ruby DateTime object can carry. The best way to compare dates like that in RSpec is:
database_start_date.should be_within(1).of(start_date)

One gotcha is that Ruby Time objects have nanosecond precision, but most databases have at most microsecond precision. The best way to get around this is to stub Time.now (or use timecop) with a round number. Read the post I wrote about this: http://blog.solanolabs.com/rails-time-comparisons-devil-details-etc/
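The "round number" advice can be sketched in plain Ruby (Time#round requires Ruby 2.4+): a frozen time whose sub-second part is already zero survives truncation by a microsecond-precision (or second-precision) column unchanged.

```ruby
t = Time.now.round  # with no argument, rounds to whole seconds
puts t.nsec         # 0 — nothing left for the database to truncate
# Freezing on this value (e.g. Timecop.freeze(t)) then compares cleanly
# against values that made a round trip through the database.
```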

Depending on your specs, you might be able to use Rails-native travel helpers:
# in spec_helper.rb
config.include ActiveSupport::Testing::TimeHelpers
start_date ||= Time.current.change(usec: 0)
end_date = start_date + goal_months.months
travel_to start_date do
  # Clone here
end
expect(clone.start_date).to eq(start_date)
Without Time.current.change(usec: 0) it's likely to complain about a difference in time zones or in microseconds, since the helper resets the sub-second part of the passed value internally (Timecop has a similar issue, so reset usec with it too).

My initial guess would be that the value of Time.now is formatted differently from your database value.

Are you sure that you are using == and not eql or be? be compares object identity, and eql requires the same class as well as the same value, rather than just comparing values.
From the output it looks like the expected value is a Time, while the value being tested is a DateTime. This could also be an issue, though I'd hesitate to guess how to fix it given the almost pathological nature of Ruby's date and time libraries...

One solution I like is to just add the following to spec_helper:
class Time
  def ==(time)
    self.to_i == time.to_i
  end
end
That way it's entirely transparent even in nested objects.
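To illustrate the "nested objects" point: with the patch above in place, sub-second differences disappear even inside collections, since Array#== and Hash#== delegate to the elements' ==. A quick stand-alone check:

```ruby
# Redefine Time#== to compare whole seconds, as in the snippet above:
class Time
  def ==(time)
    to_i == time.to_i
  end
end

a = Time.at(100.1)
b = Time.at(100.9)
puts a == b                 # true — sub-second difference ignored
puts [a] == [b]             # true — Array#== calls the patched Time#==
puts({ t: a } == { t: b })  # true for hashes too
```

Keep in mind this changes Time equality globally, including for library code, so it belongs in spec_helper only.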

Adding .to_datetime to both variables will coerce the datetime values to be equivalent and respect timezones and Daylight Saving Time. For just date comparisons, use .to_date.
An example spec with two variables:
actual_time.to_datetime.should == expected_time.to_datetime
A better spec with clarity:
actual_time.to_datetime.should eq 1.month.from_now.to_datetime
Using .to_i introduces ambiguity about its meaning in the specs.
+1 for using the Timecop gem in specs. Just make sure to test Daylight Saving Time in your specs if your app is affected by DST.
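A quick stdlib check of that coercion, with values chosen to mirror the question (requires 'date', which provides Time#to_datetime):

```ruby
require "date"  # provides DateTime and Time#to_datetime

t  = Time.utc(2011, 8, 24, 18, 24, 53)
dt = DateTime.new(2011, 8, 24, 18, 24, 53)  # offset +00:00 by default

# Coercing both sides to DateTime makes the comparison class-consistent:
puts t.to_datetime == dt              # true: same instant, same class
puts t.to_datetime == dt.to_datetime  # also true; to_datetime is idempotent
```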

Our current solution is to have a freeze_time method that handles the rounding:
def freeze_time(time = Time.zone.now)
  # round time to get rid of nanosecond discrepancies between ruby time and
  # postgres time
  time = time.round
  Timecop.freeze(time) { yield(time) }
end
And then you can use it like:
freeze_time do
  perform_work
  expect(message.reload.sent_date).to eq(Time.now)
end


Ruby Time.zone.now and accounting for database precision

I have a test in a Rails 3 test suite that makes a number of assertions comparing timestamps; it passes on my local machine but fails in our CI pipeline. The test stores a timestamp in a Postgres timestamp field with a precision of 6 and compares the stored value to the original timestamp, very similar to the following example:
tmp_time = Time.zone.now
u = User.find(1)
u.updated_at = tmp_time
u.save!
u.reload
assert_equal u.updated_at.to_i, tmp_time.to_i # passes...
assert_equal u.updated_at, tmp_time # fails...
assert_equal u.updated_at.to_f, tmp_time.to_f # fails...
I believe the problem relates to Ruby's time representation being of a higher precision than the stored value.
What is the best way of compensating for the slight difference in values due to precision, outside of being less precise in comparisons? We have considered overriding Time.zone.now, but believe that will lead to downstream problems.
Thanks in advance.
The issue is probably not with precision in the database, but rather that a small time passes between when you define tmp_time and when you save.
You can see that the .to_f representation of Time changes instantly:
irb(main):011:0> 2.times.map { Time.now.to_f }
=> [1551755487.5737898, 1551755487.573792]
This difference is usually not visible when you use .to_i because it truncates to whole seconds.
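That truncation is easy to verify; note that to_i floors rather than rounds, so two times less than a second apart can still land on different integers when they straddle a second boundary:

```ruby
t = Time.at(1551755487.9)
puts t.to_i          # 1551755487 — fractional part truncated, not rounded
puts (t + 0.2).to_i  # 1551755488 — crossing the boundary changes the integer
```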
You can use Timecop, as another answer mentions, to get around this:
irb(main):013:0> Timecop.freeze { 2.times.map { Time.now.to_f } }
=> [1551755580.12368, 1551755580.12368]
When you call .save!, an actual write to the database occurs. Timestamps are written by the database, which updates the actual data stored in updated_at; it is not written by the ActiveRecord Ruby object (unless you do so explicitly with u.update_attribute(:updated_at, tmp_time), which defeats the point of timestamps in most cases).
So the time in memory at the moment you instantiate the Ruby Time object won't match the time recorded by the database, which will be some nanoseconds later. Comparing with .to_i is not very precise, and while .to_f is normally "close enough", exact equality of times is nearly impossible. This can be illustrated with a multi-threaded example:
@times = []
def test_time
  t1 = Thread.new { @times << Time.now }
  t2 = Thread.new { @times << Time.now }
  t1.join; t2.join
end
test_time
puts @times.map(&:to_s)
# they may appear the same depending on the default `.to_s` format
# 10/29/2021 2:20PM
# 10/29/2021 2:20PM
@times.map(&:to_i)
# => [1635531636, 1635531636] # same problem
@times.map(&:to_f)
# => [1635531636.989422, 1635531636.989532] # often enough precision, but...
# under the hood @times[0] == @times[1] will use the most precise nsec value
@times.map(&:nsec)
# => [989422129, 989531969]
See also the docs on Time#nsec

Rails 5 and PostgreSQL 'Interval' data type

Does Rails really not properly support PostgreSQL's interval data type?
I had to use this Stack Overflow answer from 2013 to create an interval column, and now it looks like I'll need to use this piece of code from 2013 to get ActiveRecord to treat the interval as something other than a string.
Is that how it is? Am I better off just using an integer data type to represent the number of minutes instead?
From Rails 5.1, you can use the Postgres interval data type, so you can do things like this in a migration:
add_column :your_table, :new_column, :interval, default: "2 weeks"
Although ActiveRecord only treats interval as a string, if you set IntervalStyle to iso_8601 in your PostgreSQL database, it will display the interval in ISO 8601 style: 2 weeks => P14D
execute "ALTER DATABASE your_database SET IntervalStyle = 'iso_8601'"
You can then directly parse the column to an ActiveSupport::Duration.
In your model.rb:
def new_column
  ActiveSupport::Duration.parse self[:new_column]
end
More information on ISO 8601 intervals can be found at https://en.wikipedia.org/wiki/ISO_8601#Time_intervals
I had a similar issue and went with defining a reader method for the particular column on the ActiveRecord model, like this:
class DivingResults < ActiveRecord::Base
  # This overrides the same method for the db column, generated by Rails
  def underwater_duration
    interval_from_db = super
    time_parts = interval_from_db.split(':')
    if time_parts.size > 1 # Handle formats like 17:04:41.478432
      units = %i(hours minutes seconds)
      in_seconds = time_parts
        .map.with_index { |t, i| t.to_i.public_send(units[i]) }
        .reduce(&:+) # Turn each part into seconds and then sum
      ActiveSupport::Duration.build in_seconds
    else # Handle formats in seconds
      ActiveSupport::Duration.build(interval_from_db.to_i)
    end
  end
end
This allows using an ActiveSupport::Duration instance elsewhere. Hopefully Rails will start handling the PostgreSQL interval data type automatically in the near future.
A more complete and integrated solution is available in Rails 6.1
The current answers suggest overriding readers and writers in the models. I took the ALTER DATABASE suggestion and built a gem for ISO 8601 intervals, ar_interval.
It provides a simple ActiveRecord::Type that deals with the serialization and casting of ISO 8601 strings for you!
The tests include examples of how to use it.
If there is interest, the additional formats Sam Soffes demonstrates could be included in the tests.
Similar to Madis' solution, this one handles fractions of a second and ISO8601 durations:
def duration
  return nil unless (value = super)

  # Handle ISO 8601 durations
  return ActiveSupport::Duration.parse(value) if value.start_with?('P')

  time_parts = value.split(':')
  if time_parts.size > 1
    # Handle formats like 17:04:41.478432
    units = %i[hours minutes seconds]
    in_seconds = time_parts.map.with_index { |t, i| t.to_f.public_send(units[i]) }.reduce(&:+)
    ActiveSupport::Duration.build in_seconds
  else
    # Handle formats in seconds (Duration.build expects a numeric value)
    ActiveSupport::Duration.build(value.to_f)
  end
end

def duration=(value)
  unless value.is_a?(String)
    value = ActiveSupport::Duration.build(value).iso8601
  end
  self[:duration] = value
end
This assumes you set up your database as Leo mentions in his answer. No idea why they sometimes come back from Postgres in the PT42S format and sometimes in the 00:00:42.000 format :/
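The hh:mm:ss branch of these readers can be exercised without Rails. Here's a pure-Ruby sketch of the same conversion, returning a plain Float of seconds instead of an ActiveSupport::Duration (interval_to_seconds is an illustrative name, not part of any library):

```ruby
# Convert a Postgres-style interval string such as "17:04:41.478432",
# or a bare seconds value such as "42", into a Float number of seconds:
def interval_to_seconds(value)
  parts = value.split(":")
  return Float(value) if parts.size == 1  # bare seconds format
  hours, minutes, seconds = parts.map { |p| Float(p) }
  hours * 3600 + minutes * 60 + seconds
end

puts interval_to_seconds("17:04:41.478432")  # ≈ 61481.478432
puts interval_to_seconds("42")               # 42.0
```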

Querying in MongoDb

I have a model named ClassSession with :scheduled_at as one of the fields. I need to extract the ClassSessions whose :scheduled_at is later than a specific date.
P.S.: scheduled_at stores the date in UTC.
Try something like ClassSessions.where(:scheduled_at.gte => Time.now.utc), where Time.now.utc returns the current time in UTC.
gte, lte, lt, etc. are the comparisons you're looking for here. Please refer to the Mongoid docs for further info.
Rails by default combines all the conditions separated by commas in where with AND, so for ranges use that logic.
Note: where returns a Mongoid::Criteria object and you might have to call .results on it to return the result set.
Note 2: you might have to call .to_s on Time.now.utc before passing it to the MongoDB where clause, but I am not sure; try both, and if it works without to_s, don't call it.
For Time you can have a look at the Ruby Time docs.

Rails/Ruby: TimeWithZone comparison inexplicably failing for equivalent values

I am having a terrible time (no pun intended) with DateTime comparison in my current project, specifically comparing two instances of ActiveSupport::TimeWithZone. The issue is that both my TimeWithZone instances have the same value, but all comparisons indicate they are different.
Pausing during execution for debugging (using RubyMine), I can see the following information:
timestamp = {ActiveSupport::TimeWithZone} 2014-08-01 10:33:36 UTC
started_at = {ActiveSupport::TimeWithZone} 2014-08-01 10:33:36 UTC
timestamp.inspect = "Fri, 01 Aug 2014 10:33:36 UTC +00:00"
started_at.inspect = "Fri, 01 Aug 2014 10:33:36 UTC +00:00"
Yet a comparison indicates the values are not equal:
timestamp <=> started_at = -1
The closest answer I found in searching (Comparison between two ActiveSupport::TimeWithZone objects fails) indicates the same issue here, and I tried the solutions that were applicable without any success (tried db:test:prepare and I don't run Spring).
Moreover, even if I try converting to explicit types, they still are not equivalent when comparing.
to_time:
timestamp.to_time = {Time} 2014-08-01 03:33:36 -0700
started_at.to_time = {Time} 2014-08-01 03:33:36 -0700
timestamp.to_time <=> started_at.to_time = -1
to_datetime:
timestamp.to_datetime = {Time} 2014-08-01 03:33:36 -0700
started_at.to_datetime = {Time} 2014-08-01 03:33:36 -0700
timestamp.to_datetime <=> started_at.to_datetime = -1
The only "solution" I've found thus far is to convert both values using to_i, then compare, but that's extremely awkward to code everywhere I wish to do comparisons (and moreover, seems like it should be unnecessary):
timestamp.to_i = 1406889216
started_at.to_i = 1406889216
timestamp.to_i <=> started_at.to_i = 0
Any advice would be very much appreciated!
Solved
As indicated by Jon Skeet above, the comparison was failing because of hidden millisecond differences in the times:
timestamp.strftime('%Y-%m-%d %H:%M:%S.%L') = "2014-08-02 10:23:17.000"
started_at.strftime('%Y-%m-%d %H:%M:%S.%L') = "2014-08-02 10:23:17.679"
This discovery led me down a strange path to finally discover what was ultimately causing the issue. It was a combination of this issue occurring only during testing and from using MySQL as my database.
The issue was showing only in testing because, within the test where this cropped up, I'm running some tests against a couple of associated models that contain the above fields. One model's instance must be saved to the database during the test -- the model that houses the timestamp value. The other model, however, performs the processing and thus self-references the instance of itself that was created in the test code.
This led to the second culprit, which is the fact I'm using MySQL as the database, which when storing datetime values, does not store millisecond information (unlike, say, PostgreSQL).
Invariably, what this means is that the timestamp variable that was being read after its ActiveRecord was retrieved from the MySQL database was effectively being rounded and shaved of the millisecond data, while the started_at variable was simply retained in memory during testing and thus the original milliseconds were still present.
My own (sub-par) solution is to essentially force both models (rather than just one) in my test to retrieve themselves from the database.
TLDR; If at all possible, use PostgreSQL if you can!
This seems to happen when you're comparing a time generated in Ruby with a time loaded from the database.
For example:
time = Time.zone.now
Record.create!(mark: time)
record = Record.last
In this case record.mark == time will fail because Ruby keeps time down to nanoseconds, while different databases have different precision.
In the case of the Postgres DateTime type, it'll be microseconds.
You can see this when you check: record.mark.usec == time.usec, while record.mark.nsec != time.nsec.
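The mismatch can be reproduced in plain Ruby by truncating a Time to microseconds, using Time#floor (Ruby 2.7+) as a stand-in for what the database column does:

```ruby
# A time with full, exact nanosecond precision, built via Rational:
time   = Time.at(Rational(123_456_789, 1_000_000_000))  # 0.123456789 s
stored = time.floor(6)  # keep 6 digits: microsecond precision, like Postgres

puts time == stored            # false — the nanoseconds differ
puts time.usec == stored.usec  # true  — equal down to microseconds
puts time.nsec, stored.nsec    # 123456789 vs 123456000
```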

Rails date equality not working in where clause

Say I have ModelA and ModelB. When I save an instance of ModelA to the db it also creates/saves an instance of ModelB. In the db I end up with UTC for created_at that show up exactly the same. eg:
puts ModelA.first.created_at # Wed, 31 Aug 2011 22:49:28 UTC +00:00
puts ModelB.first.created_at # Wed, 31 Aug 2011 22:49:28 UTC +00:00
So I'd expect a query like the following to return matching records (but it doesn't)
# model_b_instance is an instance of ModelB
ModelA.where(created_at: model_b_instance.created_at) # returns []
But something like this, using to_s(:db) does work
ModelA.all.each do |m|
  if m.created_at.to_s(:db) == model_b_instance.created_at.to_s(:db)
    ... # found matches here
  end
end
Can someone explain what I am doing wrong here? I want to be able to write queries like ModelA.where(created_at: ... ) but I'm currently stuck having to iterate and match against to_s(:db).
Two things to try:
see what the created_at values are in the db -- maybe there's something off, like the time zone (maybe Rails is inferring a time zone for one but not the other)
look in the Rails log at the actual SQL query generated by your ModelA.where invocation -- maybe it's doing something unexpected.
