Rails date equality not working in where clause - ruby-on-rails

Say I have ModelA and ModelB. When I save an instance of ModelA to the db, it also creates and saves an instance of ModelB. In the db I end up with UTC created_at values that show up exactly the same, e.g.:
puts ModelA.first.created_at # Wed, 31 Aug 2011 22:49:28 UTC +00:00
puts ModelB.first.created_at # Wed, 31 Aug 2011 22:49:28 UTC +00:00
So I'd expect a query like the following to return matching records (but it doesn't):
# model_b_instance is an instance of ModelB
ModelA.where(created_at: model_b_instance.created_at) # returns []
But something like this, using to_s(:db), does work:
ModelA.all.each do |m|
  if m.created_at.to_s(:db) == model_b_instance.created_at.to_s(:db)
    ... # found matches here
  end
end
Can someone explain what I am doing wrong here? I want to be able to write queries like ModelA.where(created_at: ... ) but I'm currently stuck having to iterate and match against to_s(:db).

Two things to try:
Check what the created_at values actually are in the database -- maybe something is off, like the time zone (maybe Rails is inferring a time zone for one model but not the other); see the sketch below.
Look in the Rails log at the actual SQL query generated by your ModelA.where invocation -- maybe it is doing something unexpected with the timestamp.
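If the database check turns up fractional seconds on one of the columns, straight equality in where will miss by a few milliseconds even though both values print the same. A minimal sketch of a workaround, assuming sub-second precision is the mismatch (ts is just a local variable introduced here):
ts = model_b_instance.created_at.change(usec: 0)
# match anything within the same second
ModelA.where("created_at >= ? AND created_at < ?", ts, ts + 1.second)
# or compare against the db-formatted string instead
ModelA.where("created_at = ?", model_b_instance.created_at.to_s(:db))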

Related

Querying in MongoDB

I have a model named ClassSessions with :scheduled_at as one of the fields. I need to extract the ClassSessions whose :scheduled_at is later than a specific date.
P.S.: scheduled_at stores the date in UTC.
Try something like ClassSessions.where(:scheduled_at.gte => Time.now.utc), where Time.now.utc returns the current time in UTC.
gte, lte, lt, etc. are the comparison operators you're looking for here. Please refer to the Mongoid docs for further info.
By default, the conditions separated by commas inside where are combined with AND, so for a range you can use that logic.
Note: where returns a Mongoid::Criteria object, and you may have to call .to_a (or .entries) on it to materialize the result set.
Note 2: you might have to call .to_s on Time.now.utc before passing it to the MongoDB where, but I am not sure; try both, and if it works without to_s, don't call it.
For Time, have a look at the Ruby Time docs.
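Putting the pieces together, a minimal sketch, assuming a Mongoid model for the class sessions (singularized to ClassSession here, per the usual Mongoid naming convention) with a Time-typed scheduled_at field; the field definition below is an assumption based on the question:
class ClassSession
  include Mongoid::Document
  field :scheduled_at, type: Time
end

# sessions scheduled later than a specific date
cutoff = Time.utc(2014, 2, 9)
upcoming = ClassSession.where(:scheduled_at.gte => cutoff).to_a
upcoming.each { |session| puts session.scheduled_at }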

Is it possible to combine an ActiveRecord query with a raw PostgreSQL query

Is it possible to combine an ActiveRecord query with a raw PostgreSQL query?
Czce.where("time > ? ", "2014-02-09".to_datetime).raw_query
def self.raw_query
  raw_q = 'SELECT cast(ticktime AS timestamp(1)) AS ticktime,
           max(bid_price) AS price, max(bid_volume) AS volume
           FROM czces
           GROUP BY 1
           ORDER BY 1
           LIMIT 1000;'
  ActiveRecord::Base.connection.select_all(raw_q)
end
If I do this with find_by_sql, the result from the database is missing many columns.
And the condition .where("ticktime > ? ", "2014-02-09".to_datetime) still does not work.
Here's the query expression: Czce.where("ticktime > ? ", "2014-02-09".to_datetime).find_by_sql(raw_q)
[998] #<Czce:0x007fc881443080> {
:id => nil,
:ticktime => Fri, 07 Feb 2014 01:16:41 UTC +00:00
},
[999] #<Czce:0x007fc881442d38> {
:id => nil,
:ticktime => Fri, 07 Feb 2014 01:16:42 UTC +00:00
}
But the expected result should contain price and volume, e.g.:
[4] pry(main)> result[0]
{
"ticktime" => "2014-02-28 07:00:00",
"price" => "7042",
"volume" => "2"
}
[5] pry(main)> result[1]
{
"ticktime" => "2014-02-28 06:59:59",
"price" => "18755",
"volume" => "525"
}
In short, no.
Raw select_all queries are sent to the server as pure SQL, and the records come back as raw data.
From the Rails Guide for select_all (emphasis mine):
select_all will retrieve objects from the database using custom SQL
just like find_by_sql but will not instantiate them. Instead, you will
get an array of hashes where each hash indicates a record.
You could iterate over the resulting records and do something with those, perhaps store them in your own class and then use that information in subsequent calls via ActiveRecord, but you can't actually directly chain the two. If you're going to drop down into raw SQL (and certainly there are myriad reasons you may want to do this), you might as well grab everything else you would need in that same context at the same time, in the same raw query.
There's also find_by_sql, which will return an array of instantiated ActiveRecord objects.
From the guide:
The find_by_sql method will return an array of objects even if the
underlying query returns just a single record.
And:
find_by_sql provides you with a simple way of making custom calls to
the database and retrieving instantiated objects.
However, find_by_sql gives you actual instantiated objects. While that is perhaps easier in many respects, since the results are mapped to instances of the model rather than plain hashes, chaining, say, where onto that array is not the same as a chained call on the base model class, as would normally be done.
I would recommend doing everything you can in the SQL itself, all server side, and then any further touch-up filtering you want to do can be done client-side in Rails by iterating over the records that are returned.
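For instance, here is a rough sketch of pushing the date condition into the raw query itself, under the schema implied by the question (a czces table with ticktime, bid_price and bid_volume columns); the ticks_since method name is made up:
class Czce < ActiveRecord::Base
  def self.ticks_since(cutoff)
    sql = "SELECT cast(ticktime AS timestamp(1)) AS ticktime,
                  max(bid_price)  AS price,
                  max(bid_volume) AS volume
           FROM czces
           WHERE ticktime > #{connection.quote(cutoff)}
           GROUP BY 1
           ORDER BY 1
           LIMIT 1000"
    # returns an array of hashes keyed by ticktime / price / volume
    connection.select_all(sql)
  end
end

# Czce.ticks_since("2014-02-09".to_datetime)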
Just try not to use the base connection itself; try expanding the standard Rails query form, like:
Czce.select("cast(ticktime as timestamp(1)) AS ticktime, max(bid_price) AS price, max(bid_volume) AS volume")
    .group("1").order("1").limit(1000)
But if you explain the condition, i.e. what you really want to get from the query, we can try to write a proper SQL statement.
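For example, if the goal is the same date cutoff as in the question, the condition can be chained straight onto that relation (a sketch using the question's column names):
Czce.where("ticktime > ?", "2014-02-09".to_datetime)
    .select("cast(ticktime as timestamp(1)) AS ticktime, max(bid_price) AS price, max(bid_volume) AS volume")
    .group("1").order("1").limit(1000)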

Rails not correctly converting to UTC timezone when doing date compare

All,
I have a query that looks for conversations since a certain date, to allow the mobile app to see whether there are any updated conversations since the last time it checked. What I am finding is that Rails is not converting my "since_date" to UTC before making the query, so I am getting incorrect results. (This is coming right from my functional tests.)
The generated query is
SELECT "conversations".* FROM "conversations" INNER JOIN "conversation_joins" ON "conversations"."id" = "conversation_joins"."conversation_id" WHERE "conversation_joins"."user_id" = 980190962 AND (last_message_at > '2014-10-05 11:46:22 -0500')
The result is
[#<Conversation id: 980190962, last_message_id: 298486374, user_ids: nil, last_message_at: "2014-10-05 15:48:22", message_count: 2, created_at: "2014-10-05 16:48:22", updated_at: "2014-10-05 16:48:22", key: "298486374,980190962">]
Notice the returned conversation has a UTC last_message_at of "2014-10-05 15:48:22", while the query is looking for "2014-10-05 11:46:22 -0500", which is "2014-10-05 16:46:22" UTC, meaning it should not have been returned. To confirm this, I did the same thing looking X hours ahead, and the conversation was NOT returned once I hit 6 hours ahead, which matches my time zone speculation.
Do I need to explicitly convert the date/time to UTC in my controller before creating the query, or should Rails be doing this for me since the time zone is included already? My Rails version is 3.2.13.
The code that queries has to turn the dates to UTC before sending them to the where call. ActiveRecord does not transform query parameters.
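A minimal sketch of that conversion, assuming the date arrives as a string parameter from the mobile client (params[:since_date], current_user, and the conversation_joins association are made-up names inferred from the generated SQL above):
# parse the client-supplied value and normalize it to UTC before querying
since = Time.zone.parse(params[:since_date]).utc

conversations = Conversation
  .joins(:conversation_joins)
  .where(conversation_joins: { user_id: current_user.id })
  .where("last_message_at > ?", since)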

Rails/Ruby: TimeWithZone comparison inexplicably failing for equivalent values

I am having a terrible time (no pun intended) with DateTime comparison in my current project, specifically comparing two instances of ActiveSupport::TimeWithZone. The issue is that both my TimeWithZone instances have the same value, but all comparisons indicate they are different.
Pausing during execution for debugging (using RubyMine), I can see the following information:
timestamp = {ActiveSupport::TimeWithZone} 2014-08-01 10:33:36 UTC
started_at = {ActiveSupport::TimeWithZone} 2014-08-01 10:33:36 UTC
timestamp.inspect = "Fri, 01 Aug 2014 10:33:36 UTC +00:00"
started_at.inspect = "Fri, 01 Aug 2014 10:33:36 UTC +00:00"
Yet a comparison indicates the values are not equal:
timestamp <=> started_at = -1
The closest answer I found in searching (Comparison between two ActiveSupport::TimeWithZone objects fails) indicates the same issue here, and I tried the solutions that were applicable without any success (tried db:test:prepare and I don't run Spring).
Moreover, even if I try converting to explicit types, they still are not equivalent when comparing.
to_time:
timestamp.to_time = {Time} 2014-08-01 03:33:36 -0700
started_at.to_time = {Time} 2014-08-01 03:33:36 -0700
timestamp.to_time <=> started_at.to_time = -1
to_datetime:
timestamp.to_datetime = {Time} 2014-08-01 03:33:36 -0700
started_at.to_datetime = {Time} 2014-08-01 03:33:36 -0700
timestamp.to_datetime <=> started_at.to_datetime = -1
The only "solution" I've found thus far is to convert both values using to_i, then compare, but that's extremely awkward to code everywhere I wish to do comparisons (and moreover, seems like it should be unnecessary):
timestamp.to_i = 1406889216
started_at.to_i = 1406889216
timestamp.to_i <=> started_at.to_i = 0
Any advice would be very much appreciated!
Solved
As indicated by Jon Skeet above, the comparison was failing because of hidden millisecond differences in the times:
timestamp.strftime('%Y-%m-%d %H:%M:%S.%L') = "2014-08-02 10:23:17.000"
started_at.strftime('%Y-%m-%d %H:%M:%S.%L') = "2014-08-02 10:23:17.679"
This discovery led me down a strange path to finally discover what was ultimately causing the issue. It was a combination of the issue occurring only during testing and of using MySQL as my database.
The issue was showing up only in testing because, within the test where this cropped up, I'm running assertions against a couple of associated models that contain the above fields. One model's instance must be saved to the database during the test -- the model that houses the timestamp value. The other model, however, was performing the processing and thus referenced the in-memory instance of itself created in the test code.
This led to the second culprit: I'm using MySQL as the database, which does not store millisecond information for datetime values (unlike, say, PostgreSQL).
What this means is that the timestamp value, read back after its ActiveRecord object was retrieved from the MySQL database, had effectively been rounded and shaved of its millisecond data, while the started_at value was simply retained in memory during the test and thus still carried its original milliseconds.
My own (sub-par) solution is to essentially force both models (rather than just one) in my test to retrieve themselves from the database.
TL;DR: if at all possible, use PostgreSQL!
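A rough sketch of that workaround (the record and attribute names below are invented for illustration): reload both records so both timestamps round-trip through MySQL and lose their milliseconds the same way.
# in the test, after the processing step has run
processor_record.reload   # started_at now comes back from MySQL, truncated to seconds
event_record.reload       # same for timestamp
expect(event_record.timestamp).to eq(processor_record.started_at)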
This seems to happen when you're comparing a time generated in Ruby with a time loaded from the database.
For example:
time = Time.zone.now
Record.create!(mark: time)
record = Record.last
In this case record.mark == time will fail, because Ruby keeps time down to nanoseconds, while different databases have different precision.
In the case of the Postgres datetime type, that's microseconds.
You can see this if you check: record.mark.sec == time.sec, but record.mark.nsec != time.nsec.
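One way around it, under the same assumptions as the snippet above: drop the sub-second part before saving, so the in-memory value and the round-tripped value agree.
time = Time.zone.now.change(usec: 0)  # truncate to whole seconds up front
Record.create!(mark: time)
Record.last.mark == time              # now true regardless of the database's precision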

Rails Time inconsistencies with rspec

I'm working with Time in Rails and using the following code to set up the start date and end date of a project:
start_date ||= Time.now
end_date = start_date + goal_months.months
I then clone the object, and I'm writing RSpec tests to confirm that the attributes match in the copy. The end dates match:
original[end_date]: 2011-08-24 18:24:53 UTC
clone[end_date]: 2011-08-24 18:24:53 UTC
but the spec gives me an error on the start dates:
expected: Wed Aug 24 18:24:53 UTC 2011,
got: Wed, 24 Aug 2011 18:24:53 UTC +00:00 (using ==)
It's clear the dates are the same, just formatted differently. How is it that they end up getting stored differently in the database, and how do I get them to match? I've tried it with DateTime as well with the same results.
Correction: the end dates don't match either. They print out the same, but RSpec errors out on them as well. When I print out the start date and end date, the values come out in different formats:
start date: 2010-08-24T19:00:24+00:00
end date: 2011-08-24 19:00:24 UTC
This usually happens because RSpec tries to compare different kinds of objects: Time and DateTime, for instance. Also, the times being compared can differ by a few milliseconds.
In the second case, the correct way is to use stubbing and mocks. Also see the Timecop gem.
In the first case, a possible solution is to compare timestamps:
actual_time.to_i.should == expected_time.to_i
I use a simple matcher for such cases:
# ./spec/spec_helper.rb
#
# Dir[File.dirname(__FILE__) + "/support/**/*.rb"].each {|f| require f}
#
#
# Usage:
#
# its(:updated_at) { should be_the_same_time_as updated_at }
#
#
# Will pass or fail with message like:
#
# Failure/Error: its(:updated_at) { should be_the_same_time_as 2.days.ago }
# expected Tue, 07 Jun 2011 16:14:09 +0300 to be the same time as Mon, 06 Jun 2011 13:14:09 UTC +00:00
RSpec::Matchers.define :be_the_same_time_as do |expected|
  match do |actual|
    expected.to_i == actual.to_i
  end
end
You should mock Time.now to make sure it always matches the date in the spec. You never know when a delay will make the spec fail by a few milliseconds. This approach also makes sure that the time in the real code and in the spec are the same.
If you're using the default RSpec mocking library, try something like:
t = Time.parse("01/01/2010 10:00")
Time.should_receive(:now).and_return(t)
I totally agree with the previous answers about stubbing Time.now. That said, there is one other thing going on here. When you compare datetimes from a database, you lose some of the fractional time that can be in a Ruby DateTime object. The best way to compare dates that way in RSpec is:
database_start_date.should be_within(1).of(start_date)
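The same check in the expect syntax, being explicit about the unit (be_within accepts a duration here):
expect(database_start_date).to be_within(1.second).of(start_date)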
One gotcha is that Ruby Time objects have nanosecond precision, but most databases have at most microsecond precision. The best way to get around this is to stub Time.now (or use timecop) with a round number. Read the post I wrote about this: http://blog.solanolabs.com/rails-time-comparisons-devil-details-etc/
Depending on your specs, you might be able to use Rails-native travel helpers:
# in spec_helper.rb
config.include ActiveSupport::Testing::TimeHelpers
start_date ||= Time.current.change(usec: 0)
end_date = start_date + goal_months.months
travel_to start_date do
  # Clone here
end
expect(clone.start_date).to eq(start_date)
Without Time.current.change(usec: 0), it's likely to complain about a difference between time zones, or between the microseconds, since the helper will reset the passed value internally (Timecop has a similar issue, so reset usec with it too).
My initial guess would be that the value of Time.now is formatted differently from your database value.
Are you sure that you are using == and not eql or be? Those two matchers check type equality and object identity, respectively, rather than just comparing values.
From the output it looks like the expected value is a Time, while the value being tested is a DateTime. This could also be an issue, though I'd hesitate to guess how to fix it given the almost pathological nature of Ruby's date and time libraries ...
One solution I like is to just add the following to spec_helper:
class Time
  def ==(time)
    self.to_i == time.to_i
  end
end
That way it's entirely transparent even in nested objects.
Adding .to_datetime to both variables will coerce the datetime values to be equivalent and respect timezones and Daylight Saving Time. For just date comparisons, use .to_date.
An example spec with two variables:
actual_time.to_datetime.should == expected_time.to_datetime
A better spec with clarity:
actual_time.to_datetime.should eq 1.month.from_now.to_datetime
.to_i produces ambiguity regarding its meaning in the specs.
+1 for using the Timecop gem in specs. Just make sure to test Daylight Saving Time in your specs if your app is affected by DST.
Our current solution is to have a freeze_time method that handles the rounding:
def freeze_time(time = Time.zone.now)
  # round time to get rid of nanosecond discrepancies between ruby time and
  # postgres time
  time = time.round
  Timecop.freeze(time) { yield(time) }
end
And then you can use it like:
freeze_time do
  perform_work
  expect(message.reload.sent_date).to eq(Time.now)
end
