Dynamic time-based finder for ActiveRecord - ruby-on-rails

I have a slightly complex time arithmetic problem.
I have a reminder system where the user can set a "how many x before the event" duration. For example, if I set '5 minutes', I need to get a reminder 5 minutes before the event's scheduled time.
In my reminder system, I have a cron job which runs every minute and sends reminder mails. So far so good. I want to find all calendar events which are eligible for a reminder (calendar entries whose scheduled time is between 5.minutes.from_now and 6.minutes.from_now).
I am trying to write the following where clause:
conds = "'when' >= '#{eval("#{cal.remind_before.to_s}.#{cal.remind_before_what.downcase}.from_now").to_s(:db)}' AND 'when' < '#{eval("#{cal.remind_before.to_s}.#{cal.remind_before_what.downcase}.from_now + 1.minutes").to_s(:db)}'"
#mail_calendar_for_reminder= Calendar.find(:all, :conditions=> conds)
Here cal.remind_before = '5' and cal.remind_before_what.downcase = 'minutes',
so the evals work out to 5.minutes.from_now and 6.minutes.from_now.
The resulting SQL statement is :
SELECT "calendars".* FROM "calendars" WHERE ('when' >= '2011-01-11 14:44:54' AND 'when' < '2011-01-11 14:45:54')
This SQL is syntactically and logically correct, because it covers the range from 5.minutes.from_now to 6.minutes.from_now, but it is not selecting the eligible records. I suspect two things:
1. The SQL above is doing string comparisons rather than time comparisons.
2. The database entry for the calendar's scheduled time has the following format:
2011-01-11 14:45:09.000000 -- the zeros at the end might be messing up the date comparisons.
I tried almost all sorts of date range arithmetic but could not get the eligible records in this query.

Depending on your server and its load, a one-minute window for cron might be a little optimistic.
What happens if you login to the dbms server and execute that SQL statement? Any rows returned? Any error messages?
You can try an explicit type cast. So
'when' >= CAST('2011-01-11 14:44:54' AS DATETIME) ...
Your dbms might require a different syntax for type casting and conversion. Search your docs.
Are your column names case sensitive? Is the column 'when' or 'When'? (Or wHen?)
This query returns your test event. Note the double quotes around the column name.
SELECT "calendars".*
FROM "calendars"
WHERE ("when" >= '2011-01-10 15:56'
AND "when" < '2011-01-10 15:57')

Related

Rails: How can I use .to_sql on a .select() request

I have an ActiveRecord request:
Post.all.select { |p| Date.today < p.created_at.weeks_since(2) }
And I want to be able to see what SQL request this produces using .to_sql
The error I get is: NoMethodError: undefined method 'to_sql'
TIA!
ISSUE
There are 2 types of select when it comes to ActiveRecord objects, from the Docs
select with a Block.
First: takes a block so it can be used just like Array#select.
This will build an array of objects from the database for the scope, converting them into an array and iterating through them using Array#select.
This is what you are using right now. This implementation will load every post, instantiate a Post object for each row, and then iterate over them with Array#select to filter the results into an Array. This is highly inefficient, cannot be chained with other AR query methods (e.g. where, order, etc.), and will cause very long lags at scale. (This is also what is causing your error, because Array does not have a to_sql method.)
select with a list of columns (or a String if you prefer)
Second: Modifies the SELECT statement for the query so that only certain fields are retrieved...
This version is not what you need here, as you do not wish to limit the columns returned by the query.
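For illustration, the two forms look roughly like this (the attribute names are just examples):
# Block form: executes the query, loads every Post, then filters in Ruby and returns an Array
Post.all.select { |p| p.created_at > 2.weeks.ago }
# Column-list form: only changes the SELECT clause and still returns a Relation
Post.select(:id, :created_at)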
Suggested Resolution:
Instead what you are looking for is a WHERE clause to filter the records at the database level before returning them to the ORM.
Your current filter is (X < Y + 2)
Date.today < p.created_at.weeks_since(2)
which means Today's Date is less than Created At plus 2 Weeks.
We can rearrange this criterion to make it easier to query, switching it to: Today's Date minus 2 weeks is less than Created At (X - 2 < Y).
Date.today.weeks_ago(2) < p.created_at
This is equivalent to p.created_at > Date.today.weeks_ago(2) which we can convert to a where clause using standard ActiveRecord query methods:
Post.where(created_at: Date.today.weeks_ago(2)...)
This will result in SQL like:
SELECT
posts.*
FROM
posts
WHERE
posts.created_at >= '2022-10-28'
Notes:
created_at is a timestamp, so it might be better to use Time.now rather than Date.today.
Additional concerns may be involved from a time zone perspective since you will be performing date/time specific comparisons.
You need to call to_sql on a relation. select with a block executes the query and gives you the result, and the result (an Array) does not have a to_sql method.
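For example, with the where-based rewrite above (the exact output depends on the current date):
# to_sql works on the relation, before the query is run
Post.where(created_at: Date.today.weeks_ago(2)...).to_sql
# => SELECT "posts".* FROM "posts" WHERE "posts"."created_at" >= '2022-10-28'
# the block form runs the query first, so the result is an Array with no to_sql
Post.all.select { |p| Date.today < p.created_at.weeks_since(2) }.to_sql
# => NoMethodError: undefined method 'to_sql'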
There are similar questions which you can look at as they offer some alternatives.

Where comparison on join table with buffer value on rails

How can I use a joined table's column value in an arithmetic operation inside the where condition in Rails?
User and Order are the two schemas; Order belongs to User via a foreign key.
My goal is to find whether an Order was created/placed within 5 minutes of User creation (to understand users who sign up just to place an order).
I tried the following queries:
Order.where('country': 'US').joins(:user).where('orders.created_at <= :u_date', {u_date: 'users.created_at' + 5.minutes })
With this query we get the error "no implicit conversion of Time into String", so 'users.created_at' is not evaluating to a date.
Hence I tried converting the string to a DateTime object, which failed too:
Order.joins(:user).where('orders.created_at < ?', 'users.created_at'+ 5.minutes)
How can I do the comparison inside the Where query?
Right now I am plucking the data and comparing it in Ruby; it'd be great to make it work inside the where (or any relevant query) itself.
You're invoking + on a string, passing a Time object as the argument, which is not an out-of-the-box operation, at least in Rails.
If the time to add is not dynamic, you could try:
where("orders.created_at <= users.created_at + INTERVAL '5 minutes'")
which makes your DBMS add the proper interval to users.created_at (in this case I'm assuming PostgreSQL).
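If the buffer does need to be dynamic, one sketch (still assuming PostgreSQL; buffer_minutes is just an illustrative variable) is to multiply a one-minute interval by a bound value:
buffer_minutes = 5
Order.where(country: 'US')
     .joins(:user)
     .where("orders.created_at <= users.created_at + (INTERVAL '1 minute' * ?)", buffer_minutes)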

Selecting greatest date range count in a rails array

I have a database with a bunch of deviceapi entries that have a start_date and end_date (datetime in the schema). Typically these entries are no more than 20 seconds long (end_date - start_date). I have the following setup:
data = Deviceapi.all.where("start_date > ?", DateTime.now - 2.weeks)
I need to get the hour within data that had the highest number of Deviceapi entries. To make it a bit clearer, this was my latest try on it (code is approximated, don't mind typos):
runningtotal = 0
(2.weeks / 1.hour).to_i.times do |interval|
  current = data.select { |d| d.start_date > (start_date + (1.hour * (interval - 1))) }
                .select { |d| d.end_date < (start_date + (1.hour * interval)) }
                .count
  runningtotal = current if current > runningtotal
end
The problem: this code works just fine. So did about a dozen other incarnations of it, using .where, .select, SQL queries, etc. But it is too slow. Waaaaay too slow. Because it has to loop through every hour within 2 weeks. Then this method might need to be called itself dozens of times.
There has to be a faster way to do this, maybe a sort? I'm stumped, and I've been searching for hours with no luck. Any ideas?
To get adequate performance, you'll want to do everything in a single query, which will mean avoiding ActiveRecord functionality and doing a raw query (e.g. via ActiveRecord::Base.connection.execute).
I have no way to test it, since I have neither your data nor schema, but I think something along these lines will do what you are looking for:
select y.starting_hour, y.num_entries as max_entries
from
(
    select x.starting_hour, count(*) as num_entries
    from
    (
        select date_trunc('hour', start_date) as starting_hour
        from deviceapi as d
    ) as x
    group by x.starting_hour
) as y
order by y.num_entries desc
limit 1;
The logic of this is as follows, from the inner-most query out:
"Bucket" each starting time to the hour
From the resulting table of buckets, get the total number of entries in each bucket
Order those buckets by their entry counts and take the top one, which gives you the starting_hour with the most entries.
If there happen to be more than one bucket with the same number of entries, you could determine a consistent way to pick one -- say the min(starting_hour) or similar (since that would stay the same even as data gets added, assuming you are not deleting items).
If you wanted to limit the initial time slice -- I see 2 weeks referenced in your post -- you could do that in the inner-most query with a where clause bracketing the date range.
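To run something like this from Rails, a sketch along these lines should work; the plural table name deviceapis and the two-week filter are assumptions based on the question, so adjust them to your schema:
sql = <<-SQL
  select date_trunc('hour', start_date) as starting_hour, count(*) as num_entries
  from deviceapis
  where start_date > now() - interval '2 weeks'
  group by starting_hour
  order by num_entries desc
  limit 1
SQL
busiest_hour = ActiveRecord::Base.connection.select_one(sql)
# e.g. {"starting_hour" => ..., "num_entries" => ...}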

AWS SimpleDB where clause 'and' operator behaving unexpectedly

The following SimpleDB query returns 51 results:
select * from logger where time > '2011-07-29 17:45:10.540284+00:00'
This query returns 20534 results:
select * from logger where time < '2011-07-29 17:50:08.615626'
These two queries both return 0 results!!?:
select * from logger where time between '2011-07-29 17:45:10.540284+00:00' and '2011-07-29 17:50:08.615626'
select * from logger where time > '2011-07-29 17:45:10.540284+00:00' and time < '2011-07-29 17:50:08.615626'
What am I missing here?
But are any of your 51 results returned from the first query actually within the time span you are searching? If they are all later than 17:50:08.615626 then your queries are performing as expected.
I am also suspicious of the fact that you are being inconsistent in how you are representing the time. You should really be using ISO 8601 timestamps if you want consistent lexicographic matching of times with SDB.
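For instance, writing every timestamp in one canonical UTC ISO 8601 form before storing it (a small Ruby sketch) keeps string order and chronological order in sync:
require 'time'
# SimpleDB compares attribute values as strings, so use one fixed-width, UTC format
Time.now.utc.iso8601(6)  # => "2011-07-29T17:45:10.540284Z"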
The other option is that the queries are taking longer than the query timeout to run; are you checking for errors?
Finally - perhaps SDB is having a bad day and the query is just a bit slow - in those circumstances you can find you get 0 results but DO get a next token - and the actual results follow in the next batch.
Does any of that help?

SQLite /Rails datetime query problems

I'm building a Rails application which creates orders from a schedule. The schedule has a time in hh:mm format, and ticks for each day of the week. A method periodically checks the schedules and creates any orders they require.
First, I build up the time for this week's order in a Ruby DateTime object, then check whether an order already exists, and create it if not, e.g.:
order = Order.where( :delivery_datetime => del_datetime )
unless order.any?
Order.create( :status => 'Estimated', :delivery_datetime => del_datetime )
end
That works as expected on my machine, but when other people picked it up from the repository, it would recreate the orders every time. I investigated the SQL it was using, and the problem seemed to be that it was creating a where clause slightly different from the insert statement:
SELECT COUNT(*) FROM orders WHERE delivery_datetime = '2011-06-30 18:00:00.000000'
INSERT INTO orders (delivery_datetime) VALUES ('2011-06-30 18:00:00.000000000')
So the difference is the three extra zeros in the fractional-second field. I understand SQLite doesn't have real date types, so these are different just because the strings are different. The problem I am having now is that I can't seem to force the format of the inserted string. E.g. even if I do the following:
Order.create( :status => 'Estimated',
:delivery_datetime => del_datetime.strftime( '%Y-%m-%d %H:%M' ) )
the insert statement still uses a 'standard' format - but with 6 zeroes on my instance, and 9 on another.
No answers! Not seen that on Stack Overflow before. Right now, I have 'fixed' it by declaring the database field as a string, and then using strftime to ensure the date format remains the same. This works, but doesn't seem ideal.
After reading all the stuff in the SQLite docs about datatypes being a 'misfeature', I'm thinking of dropping it for Postgres or similar. I want datetimes to be datetimes, and for it not to randomly decide that 18:00:00.000000 and 18:00:00.000000000 are different times...
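If you'd rather keep the column as a datetime, one workaround (a sketch, assuming ActiveSupport is available for change and range-style where conditions) is to compare against a one-minute window instead of an exact string match, so trailing fractional-second zeros no longer matter:
# zero the seconds, then look for any order delivered within that minute
window_start = del_datetime.change(sec: 0)
window = window_start...(window_start + 1.minute)
unless Order.where(:delivery_datetime => window).exists?
  Order.create( :status => 'Estimated', :delivery_datetime => del_datetime )
end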
