I am testing an Ext JS application (client side) against a Play Framework application (server side).
I am using a grid in Ext JS with pagination.
The pagination part requires sending URL query parameters to my Play! server. That is no big deal, but how do I process these parameters in the SQL statement?
Example:
First request:
http://myDomain:9000/GetUsers?_dc=123456789&page=1&start=0&limit=25
Second request:
http://myDomain:9000/GetUsers?_dc=123456789&page=2&start=25&limit=25
My thoughts:
Normally in SQL you can cap the number of results with TOP:
SELECT TOP 25 * FROM USERS
But how do I translate the second request into a SQL query?
Thank you for taking time to help me out!
EDIT: I am developing on SQL Server 2008, but I want this working on SQL Server 2005 or higher and Oracle 9 or higher :-)
Since you're using the Play! framework, what you should do is have a proper model, with entities representing your SQL tables. Then retrieving a range of results is built in:
// fetch at most 25 users, starting at offset 25
List<User> users = User.all().from(25).fetch(25);
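Under the hood this amounts to an offset/limit query. As a rough sketch, in LIMIT/OFFSET syntax (SQL Server and Oracle spell this differently; see the ROW_NUMBER() approach below):
-- approximately what from(25).fetch(25) asks the database for,
-- i.e. the second page of 25 users
SELECT * FROM users LIMIT 25 OFFSET 25;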
You should also look at the pagination module. I haven't tested it, but it looks like exactly what you want.
You could try something like:
WITH Query_1 AS (
    SELECT
        Field1, Field2, etc,
        ROW_NUMBER() OVER (ORDER BY Field1, Field2, etc) AS RowID
    FROM Table
    WHERE x = y
)
SELECT * FROM Query_1
WHERE RowID > #start
AND RowID <= #start + #limit
(Note that ROW_NUMBER() is 1-based, so this returns rows #start+1 through #start+#limit, i.e. 25 rows for start=0, limit=25.)
Of course ROW_NUMBER() didn't exist back in SQL Server 2000, but it's available from SQL Server 2005 onward, and Oracle supports ROW_NUMBER() as well, so this approach should cover both platforms you mention.
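For example, plugging the second request's parameters (start=25, limit=25) into that query gives:
WITH Query_1 AS (
    SELECT
        Field1, Field2, etc,
        ROW_NUMBER() OVER (ORDER BY Field1, Field2, etc) AS RowID
    FROM Table
    WHERE x = y
)
SELECT * FROM Query_1
WHERE RowID > 25
AND RowID <= 50  -- rows 26..50, i.e. page 2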
We are currently running a number of hand-crafted and optimized OData queries against Exact Online using Python, across several thousand divisions. However, I want to migrate them to Invantive SQL for ease of maintenance.
But some of the optimizations, such as an explicit orderby in the OData query, are not forwarded to Exact Online by Invantive SQL; it just retrieves all data, or the top x, and then does the orderby itself.
Especially for maximum-value determination that can be a lot slower.
Simple sample on a small table:
https://start.exactonline.nl/api/v1/<<division>>/financial/Journals?$select=BankAccountIBAN,BankAccountDescription&$orderby=BankAccountIBAN desc&$top=5
Is there an alternative way to optimize the actual OData queries executed by Invantive SQL?
You can either use the Data Replicator or send the hand-crafted OData query through a native platform request, such as:
insert into NativePlatformScalarRequests
( url
, orig_system_group
)
select replace('https://start.exactonline.nl/api/v1/{division}/financial/Journals?$select=BankAccountIBAN,BankAccountDescription&$orderby=BankAccountIBAN desc&$top=5', '{division}', code)
, 'MYSTUFF-' || code
from systempartitions#datadictionary
limit 100 /* First 100 divisions. */
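This queues one native platform request per division. As a sketch (using only the columns already shown above), you can check how many results have come back before building the table:
select count(*)
from NativePlatformScalarRequests
where orig_system_group like 'MYSTUFF-%'
and result is not null
Once the results are in, combine them into an in-memory table: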
create or replace table exact_online_download_journal_top5#inmemorystorage
as
select jte.*
from ( select npt.result
from NativePlatformScalarRequests npt
where npt.orig_system_group like 'MYSTUFF-%'
and npt.result is not null
) npt
join jsontable
( null
passing npt.result
columns BankAccountDescription varchar2 path 'd[0].BankAccountDescription'
, BankAccountIBAN varchar2 path 'd[0].BankAccountIBAN'
) jte
From here on you can use the in-memory table, for example:
select * from exact_online_download_journal_top5#inmemorystorage
But of course you can also 'insert into sqlserver'.
I'm just beginning with Ruby on Rails and have a question regarding a somewhat more complex query. So far I've written simple queries while following the Rails guides, and that worked really well.
Right now I'm trying to get some ids from the database, and I would use those ids to fetch the real objects and do something with them. Getting those is a bit more complex than a simple Object.find call.
Here is what my query looks like:
select * from quotas q, requests r
where q.id=r.quota_id
and q.status=3
and r.text is not null
and q.id in
(
select A.id from (
select max(id) as id, name
from quotas
group by name) A
)
order by q.created_at desc
limit 1000;
This gives me 1000 ids when I execute the query from a SQL manager. My plan was to obtain the list of ids first and then find the objects by id.
Is there a way to get these objects directly with this query, avoiding the separate id lookup? I googled that you can execute a query like this:
ActiveRecord::Base.connection.execute(query);
Assuming Quota has_many :requests,
Quota.includes(:requests).
where(status: 3).
where('requests.text is not null').
where("quotas.id in (#{subquery_string_here})").
order('quotas.created_at desc').limit(1000)
I'm by no means an expert but most basic SQL functionality is baked into ActiveRecord. You might also want to look at the #group and #pluck methods for ways to eliminate the ugly string subquery.
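For instance, the derived table wrapped around the grouped subquery is redundant; the IN clause can consume the grouped query directly, as in this sketch:
-- the same id set as the question's subquery, minus the extra derived-table layer
select q.* from quotas q
where q.id in (
  select max(id)
  from quotas
  group by name
)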
Calling #to_sql on a relationship object will show you the SQL command it is equivalent to, and may help with your debugging.
I would use find_by_sql for this. I wouldn't swear that this is exactly right, but as I recall you can pretty much plonk an SQL statement into a find_by_sql and the resulting columns will be returned as attributes of an array of objects of the class you call it on:
status = 3
Quota.find_by_sql(['
  select *
  from quotas q, requests r
  where q.id = r.quota_id
  and q.status = ?
  and r.text is not null
  and q.id in
  (
    select A.id from (
      select max(id) as id, name
      from quotas
      group by name) A
  )
  order by q.created_at desc
  limit 1000', status])
If you come to Rails as someone used to writing raw SQL, you're probably better off using this syntax than stringing together a bunch of ActiveRecord methods - the result is the same, so it's just a matter of what you find more readable.
Btw, you shouldn't use string interpolation (i.e. the #{variable} syntax) inside an SQL query. Use the ? placeholder syntax instead (see my example) to avoid the potential for SQL injection.
I have an MVC web site that presents a paged list of data records from a SQL Server database. The UI allows the user to filter the returned data on a number of different criteria, e.g. email address. Here is a snippet of code:
Stopwatch stopwatch = new Stopwatch();
var temp = SubscriberDB
.GetSubscribers(model.Filter, model.PagingInfo);
// Inspect SQL expression here
stopwatch.Start();
model.Subscribers = temp.ToList();
stopwatch.Stop(); // 9 seconds plus compared to < 1 second in Query Analyzer
When this code is run, the StopWatch shows an execution time of around 9 seconds. If I capture the generated SQL expression (just before it is evaluated with the .ToList() method) and paste it as a query into SQL Server Management Studio, the execution time drops to less than 1 second. For reference, here is the generated SQL expression:
SELECT [t2].[SubscriberId], [t2].[Email], [t3].[Reference] AS [DataSet], [t4].[Reference] AS [DataSource], [t2].[Created]
FROM (
SELECT [t1].[SubscriberId], [t1].[SubscriberDataSetId], [t1].[SubscriberDataSourceId], [t1].[Email], [t1].[Created], [t1].[ROW_NUMBER]
FROM (
SELECT ROW_NUMBER() OVER (ORDER BY [t0].[Email], [t0].[SubscriberDataSetId]) AS [ROW_NUMBER], [t0].[SubscriberId], [t0].[SubscriberDataSetId], [t0].[SubscriberDataSourceId], [t0].[Email], [t0].[Created]
FROM [dbo].[inbox_Subscriber] AS [t0]
WHERE [t0].[Email] LIKE '%_EMAIL_ADDRESS_%'
) AS [t1]
WHERE [t1].[ROW_NUMBER] BETWEEN 0 + 1 AND 0 + 20
) AS [t2]
INNER JOIN [dbo].[inbox_SubscriberDataSet] AS [t3] ON [t3].[SubscriberDataSetId] = [t2].[SubscriberDataSetId]
INNER JOIN [dbo].[inbox_SubscriberDataSource] AS [t4] ON [t4].[SubscriberDataSourceId] = [t2].[SubscriberDataSourceId]
ORDER BY [t2].[ROW_NUMBER]
If I remove the email filter clause, the controller's StopWatch shows a similar response time to the SQL Management Studio query, less than 1 second; so I am assuming that the basic SQL plumbing is working correctly and that the problem lies in the evaluation of the LINQ expression. I should also mention that this is quite a large database, with upwards of 1M rows in the subscriber table.
Can anyone throw any light on why there should be such a large (x10) performance differential, and what, if anything, can be done to address it?
Well, I'm not sure about that. A full LIKE scan over 1M rows can take quite some time. Is Email indexed? Can you run the query with 'EMAIL%' instead of '%EMAIL%' and see what happens?
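The point being that a leading wildcard defeats any index on the column, while a trailing-only wildcard allows an index seek. A minimal sketch against the table from the question (the index name is hypothetical):
-- hypothetical supporting index on the filtered column
CREATE INDEX IX_inbox_Subscriber_Email ON dbo.inbox_Subscriber (Email);

-- leading wildcard: no seek possible, every row's Email must be examined
SELECT SubscriberId FROM dbo.inbox_Subscriber WHERE Email LIKE '%example.com%';

-- trailing wildcard only: the optimizer can seek the index range
SELECT SubscriberId FROM dbo.inbox_Subscriber WHERE Email LIKE 'john%';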
I have a custom query that looks like this:
self.account.websites.find(:all,:joins => [:group_websites => {:group => :users}],:conditions=>["users.id =?",self])
where self is a User object.
I managed to generate the equivalent SQL for it.
Here is how it looks:
sql = "select * from websites INNER JOIN group_websites on group_websites.website_id = websites.id INNER JOIN groups on groups.id = group_websites.group_id INNER JOIN group_users ON (groups.id = group_users.group_id) INNER JOIN users on (users.id = group_users.user_id) where (websites.account_id = #{account_id} AND (users.id = #{user_id}))"
With a decent understanding of SQL and ActiveRecord, I assumed (as most would) that the result obtained from the query above would take longer to load than the result obtained from the find_by_sql(sql) one.
But surprisingly, when I ran the two, I found the ActiveRecord custom query beating find_by_sql in terms of load time.
Here are the test results:
ActiveRecord Custom Query load time
Website Load (0.9ms)
Website Columns(1.0ms)
find_by_sql load time
Website Load (1.3ms)
Website Columns(1.0ms)
I repeated the test again and again, and the result came out the same (with the custom query winning the battle).
I know the difference isn't big, but I still can't figure out why a plain find_by_sql query is slower than the custom query.
Can anyone shed some light on this?
Thanks anyway.
Regards,
Viren Negi
With the find case, the query is parameterized; this means the database can cache the query plan and will not need to parse and compile the query again.
With the find_by_sql case the entire query is passed to the database as a string. This means there is no caching that the database can do on the structure of the query, and it needs to be parsed and compiled on each occasion.
I think you can test this: try find_by_sql in this way (parameterized):
Website.find_by_sql(["select * from websites INNER JOIN group_websites on group_websites.website_id = websites.id INNER JOIN groups on groups.id = group_websites.group_id INNER JOIN group_users ON (groups.id = group_users.group_id) INNER JOIN users on (users.id = group_users.user_id) where (websites.account_id = ? AND (users.id = ?))", account_id, user_id])
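To see why this matters, compare what the database receives in each case (a simplified sketch):
-- interpolated literals: each distinct pair of values is a brand-new
-- statement text the server has to parse and plan
select * from websites where websites.account_id = 42;

-- placeholder: one statement text, so the server can cache the plan
-- and reuse it on every call
select * from websites where websites.account_id = ?;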
Well, the reason is probably quite simple: with custom SQL, the query string is sent straight to the database server for execution.
Remember that Ruby is an interpreted language; Rails therefore has to generate a new SQL query from the ORM meta-language you used before it can be sent to the actual database server for execution. I would say the additional 0.1 ms is the time taken by the framework to generate the query.
In Rails I have 2 tables:
bans(ban_id, admin_id)
ban_reasons(ban_reason_id, ban_id, reason_id)
I want to find all the bans for a certain admin where there is no record in the ban_reasons table. How can I do this in Rails without looping through all the ban records and filtering out all the ones with ban.ban_reasons.nil? I want to do this (hopefully) using a single SQL statement.
I just need the following (but I want to do it the "Rails" way):
SELECT bans.* FROM bans WHERE admin_id=1234 AND
ban_id NOT IN (SELECT ban_id FROM ban_reasons)
Your solution works great (only one request) but it's almost plain SQL:
bans = Ban.where("bans.ban_id NOT IN (SELECT ban_id FROM ban_reasons)")
You may also try the following, and let rails do part of the job:
bans = Ban.where("bans.ban_id NOT IN (?)", BanReason.select(:ban_id).map(&:ban_id).uniq)
ActiveRecord only gets you so far; everything beyond that should be done in raw SQL. The good thing about AR is that it makes it pretty easy to do that kind of thing.
However, since Rails 3, you can do almost everything with the AREL API, although raw SQL may or may not look more readable.
I'd go with raw SQL and here is another query you could try if yours doesn't perform well:
SELECT b.*
FROM bans b
LEFT JOIN ban_reasons br ON b.ban_id = br.ban_id
WHERE br.ban_reason_id IS NULL
Using the where_exists gem (which I'm the author of):
Ban.where(admin_id: 123).where_not_exists(:ban_reasons)
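Roughly, this produces an anti-join via a correlated NOT EXISTS, along these lines (column names taken from the schema in the question):
-- approximately the SQL generated by the where_not_exists call above
SELECT bans.*
FROM bans
WHERE bans.admin_id = 123
AND NOT EXISTS (
    SELECT 1
    FROM ban_reasons
    WHERE ban_reasons.ban_id = bans.ban_id
);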