Delphi: know if there's a new row after query refresh

I have an e-commerce orders table in MySQL, running on a VPS (CentOS 6).
My desktop application queries it and should notify the user when there is a new order.
I have a query (TFDQuery) that runs in the FormCreate event, and I refresh it with a TTimer.
Is there some NATIVE way to know whether a new row exists after the refresh?
What am I doing for now?
I count the rows with query.RecordCount: after the query runs in FormCreate, I store the count in a public variable; in the timer event I do the same thing with a local variable and compare.
public
  a: Integer;
...
FormCreate event, after the query has run:
  a := query.RecordCount;
Timer event:
var
  b: Integer;
begin
  query.Refresh();
  b := query.RecordCount;
  if b > a then
  begin
    // do what I want to do
    a := query.RecordCount;
  end;
end;
Well, everything works fine. But is this the right way? I've searched around for a case like mine, but I didn't find anything.
Is there some native way to do it?
I do have DevExpress components.

From your reference to TFDQuery, I assume you're using FireDAC, so in theory you should be able to do this using a TFDEventAlerter, which can receive events from various RDBMSs and feed them to your app as Delphi-style events.
See http://docwiki.embarcadero.com/RADStudio/XE8/en/Database_Alerts_%28FireDAC%29
Unfortunately, now that you've mentioned in a comment that your RDBMS is MySQL, I don't think TFDEventAlerter will help, because I don't believe MySQL is amongst the RDBMSs that TFDEventAlerter supports: I don't think MySQL can provide the types of notification TFDEventAlerter needs. But don't take my word for it, try it.
Btw, if your table rows have a row ID column, then a way to find out whether rows have been added by another user, with less load on the server than a full refresh, is to record the highest ID your query returns and periodically execute
SELECT COUNT(*) FROM mytable WHERE ID > :ID
and only refresh your query if that returns a value greater than zero.
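A minimal sketch of that polling idea, assuming a FireDAC connection named FDConnection1, an auto-increment id column on the orders table, and a form field FLastID: Integer holding the highest ID seen so far (all names here are illustrative, not from the question):

```pascal
procedure TForm1.TimerCheckTimer(Sender: TObject);
var
  NewRows: Integer;
begin
  // Cheap server-side check: count only rows newer than the last ID we saw
  NewRows := FDConnection1.ExecSQLScalar(
    'SELECT COUNT(*) FROM orders WHERE id > :id', [FLastID]);
  if NewRows > 0 then
  begin
    query.Refresh;  // fetch the new rows into the grid/dataset
    // Remember the new high-water mark for the next poll
    FLastID := FDConnection1.ExecSQLScalar('SELECT MAX(id) FROM orders');
    // ... notify the user here
  end;
end;
```

FLastID would be initialised once, right after the query is first opened in FormCreate.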
Btw, in some Delphi versions prior to XE8, there is a TFDEventAlerter demo:
C:\Users\Public\Documents\Embarcadero\Studio\14.0\Samples\Object Pascal\Database\FireDAC\Samples\Comp Layer\TFDEventAlerter.
but this demo seems to be missing from XE8 (in my set-up, at any rate) and the ones from earlier versions won't compile in XE8 because of changes made in the FireDAC.Stan.Intf unit.


Delphi and FireDAC: query Active vs query Open

Most of my application's forms use 2 or more DB grids: basically, the user clicks a record in the main grid and gets child results in a secondary grid.
All my primary DBGrids' FDQueries (that is, the ones with a SELECT) are set up on the form, but none are "active"; they fire on FormShow.
I noticed that whether I write:
FDQuery1.Active := True;
or
FDQuery1.Open;
the result is the same: my rows are displayed in the main DBGrid.
Accordingly, I call Close or Active := False on FormClose.
But there must be a difference between those approaches, and that is the subject of my question: what is the difference between Query.Open and Query.Active := True?
What should I use, if there is any significant difference?
Thanks in advance
Math, getting less and less noob as you take the time to answer my questions :)
SIDE NOTE: my INSERT and UPDATE queries are set up with a Clear, then SQL.Add, then parameter declarations, and finally an ExecSQL.
Active is a property, Open is a method. Setting Active to True calls the Open method; setting it to False calls Close. Active is useful because it can be used to check whether the dataset is actually open:
if Qry.Active then DoSomething;
SIDE NOTE: my INSERT and UPDATE queries are set up with a Clear, then SQL.Add, then parameter declarations, and finally an ExecSQL
There is no difference between Active and Open (see whosrdaddy's comment). They do the same thing: the dataset becomes active and returns the result of the SELECT statement.
You can also use the Active property to check whether the dataset is active, for example:
if not MyQuery.Active then
  MyQuery.Open; // or MyQuery.Active := True;
ExecSQL executes queries that do not return a cursor to data (such as INSERT, UPDATE, DELETE, and CREATE TABLE).
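A short sketch contrasting the two call styles, assuming a FireDAC query named FDQuery1 (the table and field names are illustrative):

```pascal
// SELECT returns a cursor, so use Open (or Active := True)
FDQuery1.SQL.Text := 'SELECT id, name FROM customers';
FDQuery1.Open;  // equivalent to FDQuery1.Active := True

// INSERT returns no cursor, so use ExecSQL
FDQuery1.Close;
FDQuery1.SQL.Text := 'INSERT INTO customers (name) VALUES (:name)';
FDQuery1.ParamByName('name').AsString := 'Acme';
FDQuery1.ExecSQL;
```

Calling Open on a statement that returns no cursor raises an error, which is the practical reason the two methods exist.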

Key value for this row was changed or deleted at the data store. The local row is now deleted

Migrating 1.5 million lines of D7, BDE, Paradox to XE2, ADO, MS-SQL.
We have a TDBLookupComboBox that works fine. We provide the user with an ellipsis button so they can add or delete records from the combo box's ListSource table while the combo box is visible.
If the user clicks the ellipsis, we let them edit the table and then we refresh the combo box's DataSource, like this:
EditTable.ShowModal; // user edits ListSource.Dataset table
Form1.DBComboBox1.ListSource.DataSet.Refresh;
This worked fine in the Paradox world.
In the SQL/ADO world, if the user deletes a record from the ListSource, the Refresh statement above raises:
Key value for this row was changed or deleted at the data store.
The local row is now deleted.
This occurs even if the record the user deleted was not the currently selected item in the combo box.
We don't understand why this happens now but not in the Paradox version.
Our solution has been (after the user edits) to close and reopen the ListSource dataset as shown below, but this is clumsy (and we'll have to replicate it in almost 100 places where we do this kind of thing).
Here's our current fix:
var
  KeyBeforeUserEdit: Integer;
begin
  KeyBeforeUserEdit := Form1.DBComboBox1.KeyValue;
  EditTable.ShowModal; // user edits ListSource.Dataset table
  Form1.DBComboBox1.ListSource.DataSet.Close;
  Form1.DBComboBox1.ListSource.DataSet.Open;
  if Form1.DBComboBox1.ListSource.DataSet.Locate('UniqueKey', KeyBeforeUserEdit, []) then
    Form1.DBComboBox1.KeyValue := KeyBeforeUserEdit;
end;
Any alternate suggestions or explanations why this is necessary?
I can't know for sure what is going on, but you may be able to simplify your migration (albeit not good practice) in the following way.
ShowModal is a virtual function, so you can override it in the class EditTable belongs to (TEditTable?), provided you have that source. Within that unit, add Form1's unit to the uses clause in the implementation section (if it is not already there) and add your override as follows:
function TEditTable.ShowModal: Integer;
var
  KeyBeforeUserEdit: Integer;
begin
  KeyBeforeUserEdit := Form1.DBComboBox1.KeyValue;
  Result := inherited ShowModal; // user edits ListSource.Dataset table
  Form1.DBComboBox1.ListSource.DataSet.Close;
  Form1.DBComboBox1.ListSource.DataSet.Open;
  if Form1.DBComboBox1.ListSource.DataSet.Locate('UniqueKey', KeyBeforeUserEdit, []) then
    Form1.DBComboBox1.KeyValue := KeyBeforeUserEdit;
end;
It is a bit of a kludge but may be pragmatic and save a lot of work.
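Since the close/reopen/re-locate sequence recurs in so many places, it could also be factored into one shared helper. A sketch, assuming each call site has a TDBLookupComboBox and knows its unique key field name (both parameters are illustrative):

```pascal
procedure ReopenAndRestore(Combo: TDBLookupComboBox; const KeyField: string);
var
  SavedKey: Variant;
begin
  SavedKey := Combo.KeyValue;      // remember the current selection
  Combo.ListSource.DataSet.Close;  // full close/open instead of Refresh
  Combo.ListSource.DataSet.Open;
  // restore the selection only if the row still exists after editing
  if Combo.ListSource.DataSet.Locate(KeyField, SavedKey, []) then
    Combo.KeyValue := SavedKey;
end;
```

Each of the ~100 call sites would then shrink to:

EditTable.ShowModal;
ReopenAndRestore(Form1.DBComboBox1, 'UniqueKey');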

How to refresh modified records to CloneSource before ApplyUpdates?

So I have a ClientDataSet (cdsM1) with a nested detail (cdsD1). I need to print it before doing ApplyUpdates, so I clone them (cdsMclone and cdsDclone) and filter the master clone to show only one master record.
After printing, I need to update the record. At first I tried something like this:
cdsMclone.Edit;
cdsMclone.FieldByName('DATEPRINTED').AsString := DateTimeToStr(Now);
cdsMclone.Post;
cdsMclone.ApplyUpdates(0);
But after this, if I change anything more in the source ClientDataSet, cdsM1.ApplyUpdates(0) generates a conflict (and by this I mean one that you should respond to in an OnReconcileError), which should not be shown to the user. Also, with this approach I am sending at least two requests to the database.
One workaround is to use the CloneSource property. Example:
cdsMclone.CloneSource.Edit;
cdsMclone.CloneSource.FieldByName('DATEPRINTED').AsString := DateTimeToStr(Now);
cdsMclone.CloneSource.Post;
This way, I can call cdsM1.ApplyUpdates(0) without worries. But I really don't like this: the code seems to change something it should not touch directly. Also, what would the code look like if in the future I need to change something in the nested detail?
Is there any other way to push changes from the cloned ClientDataSet back to the source?
I only have the Rave 5 version for D7, but the following works for me, without having to use a filter on the master table, CloneCursor or anything else like that, to produce a master-detail report on only the current row in the master. So it should avoid your problem and obviate the need for a workaround.
The MasterConnFirst and MasterConnValidateRow event handlers shown below are set in the Delphi IDE on the TRvDataSetConnection for the master.
TForm1 = class(TForm)
  [...]
  cdsMaster: TClientDataSet;
  cdsDetail: TClientDataSet;
  [...]
public
  { Public declarations }
  Recs: Integer;
  BM: TBookmark;
end;

procedure TForm1.btnReportClick(Sender: TObject);
begin
  BM := cdsMaster.GetBookmark; // place a bookmark on the current Master row
  Recs := 0;                   // counter for Master records processed by the report
  RvProject.Execute;
end;

procedure TForm1.MasterConnFirst(Connection: TRvCustomConnection);
begin
  // The following moves cdsMaster to its current row, causing the RvReport
  // to skip the Master rows preceding it.
  cdsMaster.GotoBookmark(BM);
  cdsMaster.FreeBookmark(BM);
end;

procedure TForm1.MasterConnValidateRow(Connection: TRvCustomConnection;
  var ValidRow: Boolean);
begin
  // This counts the number of Master records processed and returns
  // ValidRow := False if the current Master row has already been processed,
  // causing the Master rows after the current one to be skipped by the RvReport.
  Inc(Recs);
  if Recs > 1 then
    ValidRow := False;
end;
Notes:
The TRvDataSetConnection also has an OnEOF event with an EOF parameter which, according to the Rave 5 Developers Guide, you're supposed to be able to set to True once the current master row has been processed, to make the RvReport think the master has no more rows you are interested in. However, in Rave 5.0.4, the version that came with D7, even assigning a handler to this event causes the report to generate nothing. Presumably this is a bug in Rave 5.0.4, and it may have been fixed in a later version. It's a pity the OnEOF handler doesn't seem to work, as using a ValidateRow handler seems a bit inefficient, considering the report engine still has to iterate over the master records following the current one.
There are two properties of the TRvDataSetConnection, DataIndex and DataRows, which are supposed to provide an even simpler way of telling it which records to process: set DataIndex to the current record's RecNo and DataRows to 1. DataIndex works fine, but the DataRows setting seems to be ignored; presumably that is another 5.0.4 bug, and maybe it works properly in later versions.
If your clones are created using CloneCursor(), then the original and the clone share the same underlying dataset. They are not separate sets of data, only two different views onto the same data.
You do not provide details of the filter used to restrict your original dataset to one record. However, I suspect that your change to the record conflicts with the conditions of that filter, leading to the conflict error you are seeing.
Using the clone avoids this because the cloned dataset does not have this filter applied. It is a separate view with its own filters, hence there is no conflict.
There is no issue with modifying the underlying dataset through the cloned cursor in this way. As stated, the clone and the clone source operate over the same dataset, so this would appear to achieve precisely what you wish.
It is an aspect of your application that would benefit from some documentation, however, to make the intention and dependencies involved more immediately apparent to those who maintain the code in months and years to come.
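A minimal sketch of the clone relationship described above, assuming two TClientDataSet components and an ORDER_ID key (the filter expression and field names are illustrative):

```pascal
// cdsMclone becomes a second cursor over cdsM1's data, not a copy of it
cdsMclone.CloneCursor(cdsM1, False);

// A filter on the clone affects only this view, not cdsM1
cdsMclone.Filter := 'ORDER_ID = 42';
cdsMclone.Filtered := True;

// Editing via CloneSource changes the shared data once; a single
// cdsM1.ApplyUpdates(0) later sends it without a reconcile conflict
cdsMclone.CloneSource.Edit;
cdsMclone.CloneSource.FieldByName('DATEPRINTED').AsString := DateTimeToStr(Now);
cdsMclone.CloneSource.Post;
```

Because both cursors sit on one dataset, there is only one delta to apply, which is why the double ApplyUpdates conflict disappears.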

DataSnap using AutoInc key and refresh current record only after insert

I've not been able to find an answer to this anywhere. I'm using Delphi XE7 with TClientDataSet, DataSnap and SQL Server. I need to insert a record, apply updates, and then refresh that record so I can get the Id and assign it to my object. It seems a pretty basic requirement, but on the contrary it is proving to be a royal pain.
I've found the obvious stuff on EDN, SO and Dr Bob:
http://edn.embarcadero.com/article/20847
DataSnap and the autoinc field
http://www.drbob42.com/examines/examinC0.htm
However, these focus on a "Refresh" of the TClientDataSet, which re-fetches the entire table/query. While this does resolve the Id field itself (good!), it also moves the cursor off the record which was just inserted, so I'm not able to get its Id and assign it to my object. Also, for performance over HTTP I don't really want to re-fetch the entire table every time a record is inserted; if there are 10,000 records this will consume too much bandwidth and be ridiculously slow.
Consider the following code:
function TRepository<I>.Insert(const AEntity: I): I;
begin
  FDataSet.DisableControls;
  try
    FDataSet.Insert;
    AssignEntityToDataSet(AEntity); // sets all the relevant fields
    FDataSet.Post;
    FDataSet.ApplyUpdates(-1);
    FDataSet.Refresh; // <-- I tried RefreshRecord here but it cannot resolve the record
    AEntity.Id := FDataSet.FieldByName('Id').AsInteger; // <-- THIS NOW POINTS TO THE WRONG ROW
  finally
    FDataSet.EnableControls;
  end;
end;
Does anyone know how to achieve this? I need to be able to refresh and stay on the current record; otherwise I do not know the Id of the record just created, and the GUI cannot stay focused on the current record.
Hopefully it's something obvious I'm missing.
Cheers,
Rick
Assuming you can get your hands on the new ID inside the AfterUpdateRecord event of your DataSetProvider, your event handler may look like this (the current record of DeltaDS is the one just inserted into SourceDS):
if UpdateKind = ukInsert then
begin
  DeltaDS.FindField('Id').NewValue := <TheNewID>;
end;
Make sure to have the poPropogateChanges option set in the provider. This will transfer the changed Id field back to the ClientDataSet.
Now you can get rid of the FDataSet.Refresh call.
SQL Server does allow you to get the last identity it generated in several ways, so there's no need to "refresh" the record/query, which means re-issuing a SELECT and can generate undesirable side effects. You can use SELECT SCOPE_IDENTITY() or an OUTPUT clause. If the Delphi database driver supports it, TField.AutoGenerateValue should accomplish that task automatically (see http://docwiki.embarcadero.com/Libraries/XE7/en/Data.DB.TField.AutoGenerateValue).
Otherwise you have to put the new value into your delta after reading it (see Raabe's answer; this has to be done on the DataSnap server, which actually talks to the database), so that it is sent back to the client. You also need to set TField.ProviderFlags properly to ensure updates are applied correctly (see http://docwiki.embarcadero.com/RADStudio/XE7/en/Influencing_How_Updates_Are_Applied); usually you don't want such fields to appear in an UPDATE.
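A sketch of how the two answers combine on the server side, assuming an ADO-backed DataSnap provider whose TADOConnection is named Conn (component names and the 'Id' field are illustrative). Since the provider's INSERT and this query run as separate batches, the connection-scoped @@IDENTITY is used here instead of SCOPE_IDENTITY(), with the usual caveat that @@IDENTITY can be skewed by triggers that insert into other identity tables:

```pascal
procedure TServerModule.ProviderAfterUpdateRecord(Sender: TObject;
  SourceDS: TDataSet; DeltaDS: TCustomClientDataSet; UpdateKind: TUpdateKind);
var
  NewId: Integer;
begin
  if UpdateKind = ukInsert then
  begin
    // Ask SQL Server for the identity generated on this connection
    NewId := Conn.Execute('SELECT @@IDENTITY').Fields[0].Value;
    // Write it into the delta; with poPropogateChanges set on the provider,
    // it travels back into the ClientDataSet's newly inserted record
    DeltaDS.FindField('Id').NewValue := NewId;
  end;
end;
```

With this in place the client-side FDataSet.Refresh call can be removed, and the cursor stays on the inserted row.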

How to prevent Delphi ADO from loading the entire table into memory?

I am not a Delphi programmer, but I got an old Delphi 7 application that I need to fix, and it is using ADO.
The database table (MS Access) contains over 100,000 rows, and when I set ADOTable.Active := True it starts to load the entire table into RAM, which takes a lot of memory and time.
How can I prevent ADO from loading the entire table? I tried to set MaxRecords but it does not help.
Basically, all we do at program startup is:
// Connect to database
DataModule.MyADOConnection.Connected := True;
DataModule.MeasurementsADOTable.MaxRecords := 1;
// Open data tables
DataModule.MeasurementsADOTable.Active := True;
After setting Active := True it starts to load all the measurements into RAM, and that takes TIME!
We are using the MSDASQL.1 provider. Perhaps it does not support the MaxRecords property?
How do I add a limiting query to this data object so it only does "SELECT TOP 1 * FROM Measurements"?
You could use a TADOQuery to limit the result set with a SQL query. Or you could use TADOTable and set CursorLocation to a server-side cursor to prevent the client from loading the complete result set into memory.
You could use that ADOTable with a server-side OpenForwardOnly cursor and a TClientDataSet with PacketRecords set to a nonzero value. This worked wonderfully when I had to write an app to pump data from MSSQL to Oracle in a customized way, with tables with millions of records.
EDIT -> It would be something along these lines:
procedure ConfigCDSFromAdoQuery(p_ADOQ: TADOQuery; p_CDS: TClientDataSet; p_Prov: TDatasetProvider);
begin
  if p_ADOQ.Active then
    p_ADOQ.Close;
  p_ADOQ.CursorLocation := clUseServer;
  p_ADOQ.CursorType := ctOpenForwardOnly;
  p_Prov.DataSet := p_ADOQ;
  p_CDS.SetProvider(p_Prov);
  p_CDS.PacketRecords := 100;
  p_CDS.Open;
end;
I've done this all in code, but most of it you can do at design time.
This article is BDE-specific, but it applies to ADO and most client data-access libraries.
http://dn.codegear.com/article/28160
I would recommend using TADODataSet (it's "closer" to the ADO layer than TADOQuery) and selecting only the data the client needs by providing a custom search form (date range, list of specific items, etc.).
Good luck
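A sketch of that suggestion, assuming a TADODataSet and a date-range search form (the component, table, field, and date-picker names are illustrative):

```pascal
// Fetch only the rows the user asked for, instead of the whole table
MeasurementsADODataSet.Close;
MeasurementsADODataSet.CommandText :=
  'SELECT * FROM Measurements WHERE MeasuredAt BETWEEN :FromDate AND :ToDate';
MeasurementsADODataSet.Parameters.ParamByName('FromDate').Value := dtpFrom.Date;
MeasurementsADODataSet.Parameters.ParamByName('ToDate').Value := dtpTo.Date;
MeasurementsADODataSet.Open;
```

Parameterising the range keeps the result set small regardless of how large the table grows, which is the real fix for the memory problem.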
On the data module where MeasurementsADOTable currently resides, drop a TADOQuery and name it "MeasurementsADOQuery".
Set the Connection property of MeasurementsADOQuery to MyADOConnection (assuming this is the case, based on the little code snippet provided).
I'm also assuming that you are displaying a grid or otherwise using a DataSource: change the DataSource component's DataSet property from MeasurementsADOTable to MeasurementsADOQuery.
Edit the actual query to be executed by setting the SQL property of MeasurementsADOQuery. (At runtime, before opening: MeasurementsADOQuery.SQL.Text := 'select top 10 * from measurements order by whatever')
Analyze/change all references in code from MeasurementsADOTable to MeasurementsADOQuery.
Don't make the ADOTable active at startup; turning it to True later is one way, but it still won't really help. Use a TADODataSet instead and populate it as needed at runtime with your command text. Only relevant data will be retrieved, making it much faster.
Use a TADOQuery.
If you do not need any rows and just want to insert a new row, use a SQL command like this:
'select * from myTable where id = -1'
Since Id is an autonumber, no rows will be returned.
Or:
'select * from myTable where 1 = -1'
But I think this is not a good way to insert data; using a TADOCommand is surely much better.
If you want X rows:
'select top X * from myTable'
Furthering Fabrico's answer above: I have a legacy application with a table of 177,000 rows and 212 columns. On trying to open this table I got the error 'table already open', and no records were available for update. Setting
Table.CursorLocation := clUseServer;
fixed this issue for me.
I have found ADO + Access with Delphi to be painfully slow for lots of things (big table reads like you're describing, but also inserts, etc.). My answer became: quit using ADO and Access altogether. I never did understand why it performed so poorly, especially when earlier technologies seemed not to.
