Dynamic number of days to date - ruby-on-rails

I'm trying to get a query to work that returns all the properties whose payments expire in X number of days, where X is defined by the property's community. What I have right now is the following:
Property.joins(:community).joins(:payments).where("payments.expiration_date = current_date + interval communities.sms_defaulting_days + ' days'")
This is not working, as it does not recognize communities (I believe it's a parsing issue). The error I get is:
PG::SyntaxError: ERROR: syntax error at or near "communities"
which makes sense to me.
What I'm trying to achieve is that the last part of the query should look like this:
payments.expiration_date = current_date + interval '2 days'
And I'd get the 2 from "community.sms_defaulting_days"
Another way to think about it is "expiration_date = 2.days.from_now", but I still have the same problem, as I do not know how to make it work dynamically.

Try writing the query as:
Property.joins(:community)
.joins(:payments)
.where("payments.expiration_date = current_date + (communities.sms_defaulting_days || ' days')::interval")

Related

Selecting greatest date range count in a rails array

I have a database with a bunch of deviceapi entries that have a start_date and an end_date (datetime in the schema). Typically these entries are no more than 20 seconds long (end_date - start_date). I have the following setup:
data = Deviceapi.all.where("start_date > ?", DateTime.now - 2.weeks)
I need to get the hour within data that had the highest number of Deviceapi entries. To make it a bit clearer, this was my latest try on it (code is approximated, don't mind typos):
runningtotal = 0
window_start = DateTime.now - 2.weeks
(2.weeks / 1.hour).to_i.times do |interval|
  current = data.select { |d| d.start_date > (window_start + (1.hour * (interval - 1))) }
                .select { |d| d.end_date < (window_start + (1.hour * interval)) }
                .count
  runningtotal = current if current > runningtotal
end
The problem: this code works just fine. So did about a dozen other incarnations of it, using .where, .select, SQL queries, etc. But it is too slow. Waaaaay too slow, because it has to loop through every hour within the 2 weeks. And this method itself might need to be called dozens of times.
There has to be a faster way to do this, maybe a sort? I'm stumped, and I've been searching for hours with no luck. Any ideas?
To get adequate performance, you'll want to do everything in a single query, which will mean avoiding ActiveRecord functionality and doing a raw query (e.g. via ActiveRecord::Base.connection.execute).
I have no way to test it, since I have neither your data nor schema, but I think something along these lines will do what you are looking for:
select y.starting_hour, y.num_entries as max_entries
from
(
    select x.starting_hour,
           count(*) as num_entries,
           max(count(*)) over () as max_num_entries
    from
    (
        select date_trunc('hour', start_date) as starting_hour
        from deviceapi
    ) as x
    group by x.starting_hour
) as y
where y.num_entries = y.max_num_entries;
The logic of this is as follows, from the inner-most query out:
"Bucket" each starting time to the hour
From the resulting table of buckets, get the total number of entries in each bucket
Get the maximum number of entries across all buckets (the max(count(*)) over () window) and use that number to match back to get the starting_hour itself.
If more than one bucket happens to have the same number of entries, you could pick one in a consistent way -- say the min(starting_hour) or similar (since that would stay the same even as data gets added, assuming you are not deleting items).
If you wanted to limit the initial time slice -- I see 2 weeks referenced in your post -- you could do that in the inner-most query with a where clause bracketing the date range.
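For instance, a minimal sketch of running such a query from Rails (using the deviceapi table and start_date column named above, with the two-week bracket folded in; this simplified variant just orders by the count and takes the top row):
# Sketch: run the raw SQL via the connection and read back the busiest hour.
sql = <<-SQL
  select date_trunc('hour', start_date) as starting_hour,
         count(*) as num_entries
  from deviceapi
  where start_date > now() - interval '2 weeks'
  group by 1
  order by num_entries desc, starting_hour
  limit 1
SQL

busiest = ActiveRecord::Base.connection.select_all(sql).first
# e.g. { "starting_hour" => "2011-07-29 17:00:00", "num_entries" => "42" }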

Return every nth row from database using ActiveRecord in rails

Ruby 1.9.2 / Rails 3.1 / deployed onto Heroku --> PostgreSQL
Hi. Once the number of rows relating to an object goes over a certain amount, I wish to pull back every nth row instead. The rows are used (in part) to display data for graphing, so once the number of rows returned goes above, say, 20, it's good to return every second one, and so forth.
This question seemed to point in the right direction:
ActiveRecord Find - Skipping Records or Getting Every Nth Record
Doing a mod on row number makes sense, but using basically:
@widgetstats = self.widgetstats.find(:all, :conditions => 'MOD(ROW_NUMBER(), 3) = 0')
doesn't work, it returns an error:
PGError: ERROR: window function call requires an OVER clause
And any attempt to solve that by, for example, basing my OVER clause syntax on things I see in the answer to this question:
Row numbering in PostgreSQL
ends in syntax errors and I can't get a result.
Am I missing a more obvious way of efficiently returning every nth row, or, if I'm on the right track, are there any pointers on the way to go? Obviously returning all the data and filtering it in Rails afterwards is possible, but terribly inefficient.
Thank you!
I think you are looking for a query like this one:
SELECT * FROM (SELECT widgetstats.*, row_number() OVER () AS rownum FROM widgetstats ORDER BY id) stats WHERE mod(rownum,3) = 0
This is difficult to build using ActiveRecord, so you might be forced to do something like:
@widgetstats = self.widgetstats.find_by_sql(
%{
SELECT * FROM
(
SELECT widgetstats.*, row_number() OVER () AS rownum FROM widgetstats ORDER BY id
) AS stats
WHERE mod(rownum,3) = 0
}
)
You'll obviously want to change the ordering used and add any WHERE clauses or other modifications to suit your needs.
Were I to solve this, I would either write the SQL myself, like the SQL that you linked to, which you can do with
my_model.connection.execute('...')
or just get the id numbers and find by id:
ids = (1..30).step(2).to_a
my_model.where(:id => ids)
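If the stride itself should grow with the row count (the question mentions switching to every second row once roughly 20 rows come back), a rough sketch along those lines, assuming a Widgetstat model backed by the widgetstats table:
# Rough sketch: aim for ~20 graph points by scaling the sampling stride
# with the total row count, then take every nth row via row_number().
target_points = 20
total  = Widgetstat.count
stride = [(total.to_f / target_points).ceil, 1].max

@widgetstats = Widgetstat.find_by_sql([
  %{
    SELECT * FROM (
      SELECT widgetstats.*, row_number() OVER (ORDER BY id) AS rownum
      FROM widgetstats
    ) AS stats
    WHERE mod(stats.rownum, ?) = 0
  },
  stride
])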

AWS SimpleDB where clause 'and' operator behaving unexpectedly

The following simpledb query returns 51 results:
select * from logger where time > '2011-07-29 17:45:10.540284+00:00'
This query returns 20534 results:
select * from logger where time < '2011-07-29 17:50:08.615626'
These two queries, however, both return 0 results:
select * from logger where time between '2011-07-29 17:45:10.540284+00:00' and '2011-07-29 17:50:08.615626'
select * from logger where time > '2011-07-29 17:45:10.540284+00:00' and time < '2011-07-29 17:50:08.615626'
What am I missing here?
But are any of your 51 results returned from the first query actually within the time span you are searching? If they are all later than 17:50:08.615626 then your queries are performing as expected.
I am also suspicious of the inconsistent way you are representing the time: one bound carries a timezone offset and the other does not. You should really be using ISO 8601 timestamps in a single consistent format if you want lexicographic matching of times to work with SDB.
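For example, a small Ruby sketch (since SDB compares attribute values as plain strings, storing and querying in one consistent UTC ISO 8601 format keeps lexicographic order in line with chronological order):
require 'time'

# Store and query times in one consistent UTC ISO 8601 format so that
# SimpleDB's string comparison matches chronological order.
stored_time = Time.now.utc.iso8601(6)  # e.g. "2011-07-29T17:45:10.540284Z"

from  = Time.utc(2011, 7, 29, 17, 45, 10).iso8601(6)
to    = Time.utc(2011, 7, 29, 17, 50,  8).iso8601(6)
query = "select * from logger where time between '#{from}' and '#{to}'"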
The other possibility is that the queries are taking longer than the query timeout to run -- are you checking for errors?
Finally, perhaps SDB is having a bad day and the query is just a bit slow. In those circumstances you can find you get 0 results but DO get a next token, and the actual results follow in the next batch.
Does any of that help?

Concatenating two fields in a collect

Rails 2.3.5
I'm not having any luck searching for an answer on this. I know I could just write out a manual sql statement with a concat in it, but I thought I'd ask:
To load a select, I'm running a query of shift records. I'm trying to make the value in the select be the shift date followed by a space and then the shift name. I can't figure out the syntax for concatenating two fields in a collect. The Ruby docs make it look like plus signs and double quotes should work in a collect, but everything I try gets an "expected numeric" error from Rails.
#shift_list = [a find query].collect{|s| [s.shift_date + " " + s.shift_name, s.id]}
Thanks for any help - much appreciated.
Hard to say without knowing what s is going to be or what types s.shift_date and s.shift_name are, but maybe you're looking for this:
collect{|s| ["#{s.shift_date} #{s.shift_name}", s.id]}
That is pretty much the same as:
collect{|s| [s.shift_date.to_s + ' ' + s.shift_name.to_s, s.id]}
but less noisy.
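If the pairs are destined for a select, a variation along these lines (where shifts stands in for your find query, and assuming shift_date is a Date) makes the date format explicit rather than relying on Date#to_s:
# Sketch: format the date explicitly, then hand the [label, id] pairs
# to options_for_select in the view.
@shift_list = shifts.collect do |s|
  ["#{s.shift_date.strftime('%Y-%m-%d')} #{s.shift_name}", s.id]
end
# In the view: select_tag :shift_id, options_for_select(@shift_list)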

TFS Team Query: get all changed work items since a given time

Apparently it is impossible to provide the Changed Date field with a timestamp (format '2009-12-14 10:00:00') when defining a new Team Query. I get the error: "The query failed. You cannot supply a time with the date when running a query using date precision.".
Is there a workaround for this? I just want a list of work items which are changed since the last 'x' minutes.
The solution is to write your own WIQL query: http://teamfoundation.blogspot.com/2008/01/specifying-date-and-time-in-wiql.html.
You need to enter the date in the same format as it is displayed by VSTS: dd-MMM-yy (e.g. 01-Jan-16). In order to filter your items in TFS by a specific date, stick to that format.
Try adding the query parameter timePrecision:true. This worked for me.
I ran into the same problem while trying to query for the latest updates and worked around it by doing the following:
// defined elsewhere
private DateTime lastUpdated;

// Query coarsely by date in WIQL, then filter down to the exact time in memory
string consult = "select * from WorkItem where [Created Date] > '" +
                 lastUpdated.ToString("MM/dd/yy") +
                 "' AND [Work Item Type] = 'Test Case'";
IEnumerable<ITestCase> tcc = testManagementTeamProject.TestCases.Query(consult)
                                 .Where(tp => tp.DateCreated > lastUpdated);
I did something very similar for retrieving test results.
The last parameter of this Query constructor lets you define the precision:
dayPrecision -- when TRUE, indicates that a DateTime should resolve to an entire day. Often, it is TRUE to avoid being more precise about a specific time.
