I've got a Rails find_by_sql call that works fine locally, both in the console and when I run the statement directly against PostgreSQL, but the same statement causes an ActiveRecord::StatementInvalid error when I deploy it to Heroku.
I'm running PostgreSQL 9.0.3 locally and using a shared database on Heroku's Cedar stack.
The error I'm getting is:
PG::Error: ERROR: syntax error at or near "WITH normal_items" LINE 1: WITH normal_items AS (SELECT normal_items_month, count(id)... ^ : WITH normal_items AS (SELECT normal_items_month, count(id) as normal_items_total FROM (SELECT date_trunc('month',created_at) as normal_items_month, id from items WHERE items.a_foreign_key_id IS NULL) z group by normal_items_month), special_items AS (SELECT special_items_month, count(id) as special_items_total FROM (SELECT date_trunc('month',created_at) as special_items_month, id from items WHERE items.a_foreign_key_id IS NOT NULL) x group by special_items_month ) SELECT to_char(month, 'fmMon') as month, coalesce(normal_items_total, 0) as normal_items_total, coalesce(special_items_total, 0) as special_items_total FROM (select generate_series(min(normal_items_month), max(normal_items_month), '1 month'::interval) as month FROM normal_items) m LEFT OUTER JOIN normal_items ON normal_items_month = month LEFT OUTER JOIN special_items ON special_items_month = month
For readability the statement is:
WITH normal_items AS (
    SELECT normal_items_month, count(id) AS normal_items_total
    FROM (SELECT date_trunc('month', created_at) AS normal_items_month, id
          FROM items
          WHERE items.a_foreign_key_id IS NULL) z
    GROUP BY normal_items_month
),
special_items AS (
    SELECT special_items_month, count(id) AS special_items_total
    FROM (SELECT date_trunc('month', created_at) AS special_items_month, id
          FROM items
          WHERE items.a_foreign_key_id IS NOT NULL) x
    GROUP BY special_items_month
)
SELECT
    to_char(month, 'fmMon') AS month,
    coalesce(normal_items_total, 0) AS normal_items_total,
    coalesce(special_items_total, 0) AS special_items_total
FROM (SELECT generate_series(min(normal_items_month), max(normal_items_month), '1 month'::interval) AS month
      FROM normal_items) m
LEFT OUTER JOIN normal_items ON normal_items_month = month
LEFT OUTER JOIN special_items ON special_items_month = month
This just provides me with some stats to use with Google Charts; the output is:
Jun 178 0
Jul 0 0
Aug 72 0
Sep 189 0
Oct 24 0
Nov 6 0
Dec 578 0
Jan 0 0
Feb 0 0
Mar 89 0
Apr 607 0
May 281 0
Jun 510 0
Jul 190 0
Aug 0 0
Sep 501 0
Oct 409 0
Nov 704 0
Heroku's shared plan runs PostgreSQL 8.3, which doesn't support the WITH keyword (it was introduced in PostgreSQL 8.4).
If you upgrade to Heroku's dedicated database package, you'll be able to use PostgreSQL 9.1.
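If staying on the shared 8.3 database for now, a possible workaround (an untested sketch; it assumes the rest of the statement is 8.3-compatible, e.g. that generate_series() over timestamps is available there) is to inline each CTE as a derived table so the query no longer needs WITH:
-- Sketch only: the same statistics query with each CTE inlined as a
-- derived table, removing the dependency on WITH (added in PostgreSQL 8.4).
SELECT
    to_char(month, 'fmMon') AS month,
    coalesce(normal_items_total, 0) AS normal_items_total,
    coalesce(special_items_total, 0) AS special_items_total
FROM (
    SELECT generate_series(min(date_trunc('month', created_at)),
                           max(date_trunc('month', created_at)),
                           '1 month'::interval) AS month
    FROM items
    WHERE a_foreign_key_id IS NULL
) m
LEFT OUTER JOIN (
    SELECT date_trunc('month', created_at) AS normal_items_month,
           count(id) AS normal_items_total
    FROM items
    WHERE a_foreign_key_id IS NULL
    GROUP BY date_trunc('month', created_at)
) normal_items ON normal_items_month = month
LEFT OUTER JOIN (
    SELECT date_trunc('month', created_at) AS special_items_month,
           count(id) AS special_items_total
    FROM items
    WHERE a_foreign_key_id IS NOT NULL
    GROUP BY date_trunc('month', created_at)
) special_items ON special_items_month = month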
The default Heroku shared DB is Postgres 8.3 - you can use 9.1 in the public beta of the Shared DB plan - more details at https://postgres.heroku.com/blog/past/2012/4/26/heroku_postgres_development_plan/.
For production you can make use of the newly announced Crane plan at $50 per month https://postgres.heroku.com/blog/past/2012/4/26/heroku_postgres_development_plan/
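To confirm which server version the app is actually connected to, one option is to open a psql session against the Heroku database (for example via heroku pg:psql) and check:
-- Shows the PostgreSQL server version the application is actually using
SELECT version();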
I have a query that consists of two queries. The first query looks at future events and which user is in each event, from one table. The second query looks at historical events and creates stats for the user in each event based on all the previous events that user was in, from a second table.
I am having trouble joining the two queries. The goal of the join is to attach, to each row of the first query, the most recent stats for that user from the second query. The stat can't be newer than the date the future event occurred on. If the user hasn't been in any previous event, the join should just return null for the query 2 columns.
Below are an example table for the future query, a table for the historic query, and the ideal output of the join.
Future:
user     date       event code   event info
User 1   1/26/2023  5596         info_5596
User 2   1/26/2023  5586         info_5586
User 3   1/26/2023  5582         info_5582
User 1   1/20/2023  5492         info_5492
User 1   1/2/2023   5341         info_5341
User 2   1/2/2023   5333         info_5333
Historical:
user     date       stat 1   stat2   event code   event info
User 1   1/25/2023  10       52      4352         info_4352
User 2   1/25/2023  11       22      4332         info_4332
User 2   1/12/2023  2        45      4298         info_4298
User 3   1/12/2023  8        88      4111         info_4111
User 1   1/12/2023  7        67      4050         info_4050
User 3   1/2/2023   3        91      4000         info_4000
User 1   1/1/2023   6        15      3558         info_3558
Output of the JOIN:
user     date future   stat 1   stat2   event code future   event info future
User 1   1/26/2023     10       52      5596                info_5596
User 2   1/26/2023     11       22      5586                info_5586
User 3   1/26/2023     8        88      5582                info_5582
User 1   1/20/2023     7        67      5492                info_5492
User 1   1/2/2023      3        91      5341                info_5341
User 2   1/2/2023      null     null    5333                info_5333
I tried using a subquery in the join, but BigQuery says that it is unsupported. Below is my code attempt. I have also tried using MAX(), but BigQuery did not accept that in the join either.
Another option I am considering is to join the two datasets before calculating the query 2 stats and then filter. I already have a large query written for both, though, so I would prefer not to start over.
Select Distinct
A.*,
B.Stat1, B.Stat2
from future as A
Left Join historic as B
ON (
A.Date = (Select MAX(recent_historic.date) FROM historic as recent_historic WHERE recent_historic.user = A.user)
AND
A.user = B.user
)
ORDER BY A.date
I have been stuck on this problem for a while at work. The data shown has been changed significantly, since I only need the general idea of how to approach the problem and cannot share the actual schema of the tables.
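One way to express this kind of "most recent stat as of the future event" lookup in BigQuery without a subquery in the JOIN is sketched below. It is untested, and the table and column names (future, historic, user, date, stat1, stat2, event_code, event_info) are taken from the attempt and mock data above, so they will need adjusting to the real schema. The idea: join each future row to every historical row for the same user that is not newer than the future date, then keep only the newest match per future row with ROW_NUMBER().
SELECT user, future_date, stat1, stat2, event_code, event_info
FROM (
  SELECT
    f.user,
    f.date AS future_date,
    h.stat1,
    h.stat2,
    f.event_code,
    f.event_info,
    -- the newest qualifying historical row per future row gets rn = 1;
    -- if no historical row qualifies, the single NULL-extended row gets rn = 1
    ROW_NUMBER() OVER (
      PARTITION BY f.user, f.date, f.event_code
      ORDER BY h.date DESC
    ) AS rn
  FROM future AS f
  LEFT JOIN historic AS h
    ON h.user = f.user
   AND h.date <= f.date   -- the stat must not be newer than the future event
) AS ranked
WHERE rn = 1
ORDER BY future_date DESC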
I have a table Users and another table Membership, and each user has a one-to-many relationship with memberships through the user_membership table. A mock-up of the tables is shown below:
users:
id    name    email
1     John    john#gmail.com
2     James   james#gmail.com
...   ...     ...

user_memberships:
id    user_id   membership_id
1     2         1
2     1         2
3     1         3
4     1         4
5     1         5
...   ...       ...

memberships:
id    created_at
1     31st Dec 2021
2     1st Jan 2022
3     2nd Jan 2022
4     3rd Jan 2022
5     4th Jan 2022
...   ...
I have some rather complex querying that returns an ActiveRecord::Relation, i.e.:
users = User.select(....)
I then need to chain the above query with another query that gives each user their latest membership created_at date, i.e.:
<User, id: 1, name: John, email: john#gmail.com, latest_membership_created_at: 4th Jan 2022>
<User, id: 2, name: James, email: james#gmail.com, latest_membership_created_at: 31st Dec 2021>
My approach:
users = users.joins(user_memberships: :membership).merge(User.all).group(:id).select('membership.*, MAX(membership.created_at) AS membership_created_at_raw')
I get an error:
Query 1 ERROR: ERROR: column "users.id" must appear in the GROUP BY clause or be used in an aggregate function...
Qn 1: Is there any way I can fix this?
On another related note, is it also possible to do a join of the results of 2 queries? I am thinking perhaps I can group the user_membership table by user_id and join with the membership table. Something like:
users_created_at = User.all.joins(user_memberships: :membership).group(:id).select('users.id, MAX(memberships.created_at) AS membership_created_at_raw')
Qn 2: Can we then somehow do an inner join between users and users_created_at using Rails?
Thank you!
You can do it like this:
User.select(
'DISTINCT ON (users.id) users.id, memberships.*'
).joins(
user_memberships: :membership
).order('users.id, memberships.created_at DESC')
Or as a raw SQL query:
SELECT DISTINCT ON (users.id) users.id, memberships.*
FROM users
LEFT JOIN user_memberships ON user_memberships.user_id = users.id
LEFT JOIN memberships ON memberships.id = user_memberships.membership_id
ORDER BY users.id, memberships.created_at DESC
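If all that is needed is the latest created_at per user, rather than every column of the latest membership, a hedged alternative (sketch only, using the schema described above) is to aggregate. This also avoids the original "must appear in the GROUP BY clause" error, because every selected column is either grouped or aggregated:
-- One row per user with the most recent membership date
SELECT users.id,
       MAX(memberships.created_at) AS latest_membership_created_at
FROM users
LEFT JOIN user_memberships ON user_memberships.user_id = users.id
LEFT JOIN memberships ON memberships.id = user_memberships.membership_id
GROUP BY users.id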
ksql> CREATE TABLE HOPPING_TABLE AS SELECT ID, WINDOWSTART() AS WINSTART, COUNT(*) AS CNT FROM MY_STREAM WINDOW HOPPING (SIZE 30 DAYS, ADVANCE BY 1 DAYS) GROUP BY ID;
ksql> SELECT ID, WINSTART, CNT FROM HOPPING_TABLE;
id winstart cnt
-------------------------------------------
874163197805291909 1547164800000 23
874163197805291909 1547424000000 11
874163197805291909 1547510400000 26
874163197805291909 1547683200000 12
660071199310134801 1545868800000 6
660071199310134801 1546560000000 7
660071199310134801 1547251200000 3
Now I only care about the cnt of the window with the max winstart value for each grouped ID. How can I achieve that with KSQL?
Considering the example above (two grouped IDs), I would like to generate a table from the HOPPING_TABLE above with the following records:
id winstart cnt
-------------------------------------------
874163197805291909 1547683200000 12
660071199310134801 1547251200000 3
I have some queries that are generated dynamically by Active Record, and for performance reasons I need to merge them together and send them to MSSQL in one go.
I tried the following and it works great in PostgreSQL, but I can't get it to work in MSSQL.
(SELECT [panels].* FROM [panels] WHERE [panels].[environment_id] = 14 AND [panels].[agglo_code_id] = 23 AND [panels].[advert_area_id] = 161 AND [panels].[product_id] = 25 AND (NOT EXISTS(SELECT 1 FROM campaign_search_panels WHERE campaign_search_panels.panel_id = panels.panel_id AND campaign_search_panels.campaign_id = 65)) AND (NOT EXISTS(SELECT 1 FROM "AIDAAU_Avails" WHERE "AIDAAU_Avails"."PanelID" = panels.panel_uid AND "AIDAAU_Avails"."TillDate" >= '08-21-2017' AND "AIDAAU_Avails"."FromDate" <= '09-03-2017')) ORDER BY [panels].[random_order] ASC OFFSET 0 ROWS FETCH NEXT 3 ROWS ONLY)
UNION ALL
(SELECT [panels].* FROM [panels] WHERE [panels].[environment_id] = 14 AND [panels].[agglo_code_id] = 23 AND [panels].[advert_area_id] = 136 AND [panels].[product_id] = 25 AND (NOT EXISTS(SELECT 1 FROM campaign_search_panels WHERE campaign_search_panels.panel_id = panels.panel_id AND campaign_search_panels.campaign_id = 65)) AND (NOT EXISTS(SELECT 1 FROM "AIDAAU_Avails" WHERE "AIDAAU_Avails"."PanelID" = panels.panel_uid AND "AIDAAU_Avails"."TillDate" >= '08-21-2017' AND "AIDAAU_Avails"."FromDate" <= '09-03-2017')) ORDER BY [panels].[random_order] ASC OFFSET 0 ROWS FETCH NEXT 2 ROWS ONLY)
Now I think there are two issues that I can spot already. If I remove the brackets surrounding each query I get closer, but it still complains about the ORDER BY. I have a feeling you can only order the final result, but I don't have much control over how each individual SQL query is put together, only how I combine them. I would ideally like to keep the ability to both order and apply the limit clause for each subquery. Is there an easy way of putting these together so they will work in MSSQL and not just Postgres?
Thanks for your help!
In a UNION query you can only have one ORDER BY clause and it must go at the end:
SELECT * from <table1>
UNION ALL
SELECT * from <table2>
ORDER BY <col1>
You must remove that ORDER BY from your top query and it should work correctly.
If you want to order union results, you would have to throw the results into a CTE. Something like this:
with cte_name as
(
select col_1, col_2, etc
from table
union all
select col_1, col_2, etc
from table_2
)
select col_1, col_2
from cte_name
order by col_1
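If the per-branch ORDER BY exists only to drive the OFFSET ... FETCH limits, another option (an untested sketch based on the query above, with the WHERE clauses abbreviated) is to wrap each branch in a derived table; SQL Server accepts ORDER BY inside a derived table when OFFSET/FETCH is present:
SELECT * FROM (
    SELECT [panels].* FROM [panels]
    WHERE [panels].[advert_area_id] = 161 -- ...plus the other conditions from the first branch
    ORDER BY [panels].[random_order] ASC
    OFFSET 0 ROWS FETCH NEXT 3 ROWS ONLY
) AS q1
UNION ALL
SELECT * FROM (
    SELECT [panels].* FROM [panels]
    WHERE [panels].[advert_area_id] = 136 -- ...plus the other conditions from the second branch
    ORDER BY [panels].[random_order] ASC
    OFFSET 0 ROWS FETCH NEXT 2 ROWS ONLY
) AS q2
As noted above, the overall order of the combined rows is still not guaranteed without a final ORDER BY on the UNION itself.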
I have a simple problem which gave me a headache.
I need to sort integers in a database table shown in a TDBGrid (it's an ABS Database table from Component Ace) in the following order:
0
1
11
111
121
2
21
211
22
221
and so on
which means every number starting with 1 should appear under 1:
1
11
111
5
55
Can anyone help me?
Thanks
This should work to get stuff in the right order:
Convert the original number to a string;
Right-pad with zeroes until you have a string of 3 characters wide;
(optional) Convert back to integer.
Then sorting should always work the way you want. Probably it's best to let the database do that for you. In MySQL you'd do something like this:
select RPAD(orderid,3,'0') as TheOrder
from MyTable
order by 1
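For illustration, here is what the padding produces for the sample values (a sketch using MySQL's RPAD, with hypothetical literal values standing in for the real column), so that plain string comparison yields the requested order:
-- Illustration only: the padded strings compare as
--   1   -> '100'
--   11  -> '110'
--   111 -> '111'
--   121 -> '121'
--   2   -> '200'
--   21  -> '210'
SELECT RPAD(n, 3, '0') AS padded
FROM (SELECT 1 AS n UNION ALL SELECT 11 UNION ALL SELECT 111
      UNION ALL SELECT 121 UNION ALL SELECT 2 UNION ALL SELECT 21) t
ORDER BY padded;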
I just ran this in SQL Server Management Studio - note I mixed up the rows in the input so they were not in sorted order:
create table #temp( ID Char(3));
insert into #temp (ID)
select '111' union
select '221' union
select '0' union
select '21' union
select '1' union
select '11' union
select '211' union
select '121' union
select '2' union
select '22';
select * from #temp order by ID;
I got the following output:
ID
----
0
1
11
111
121
2
21
211
22
221
(10 row(s) affected)
If you're getting different results, you're doing something wrong. However, it's hard to say what because you didn't post anything about how you're retrieving the data from the database.
Edit: Some clarification by the poster indicates that the display is in a TDBGrid attached to a table using Component Ace ABS Database. If that indeed is the case, then the answer is to create an index on the indicated column, and then set the table's IndexName property to use that index.
-- Casting the integer column to a string makes ORDER BY compare the
-- values character by character, which gives the requested order.
select cast(INT_FIELD as varchar(9)) as I
from TABxxx
order by 1