Delphi kbmMemTable filter speed problem

Hi, I am using kbmMemTable in a small project and have come across a small speed issue I cannot seem to fix.
Basically I have a boolean field in a table of about 100 records. If I iterate through the records setting the value of the field to True it does it very quickly; however, if I set a filter on the table and then iterate through the filtered records it takes about 10 times longer, even though there may be only 10 records to iterate through.
Anyone got any ideas?
The code I am using is:
DM1.DS1.Enabled := False;
with DM1.DS1.DataSet do
begin
  First;
  while not Eof do
  begin
    Edit;
    Fields[18].AsBoolean := TickState;
    // FieldByName('Selected').AsBoolean := TickState;
    Post;
    Next;
  end;
end;
DM1.DS1.Enabled := True;
I do have an index on the field; I have also tried it without an index.
Thanks,
Colin

There is a way to use a filter on a kbmMemTable and make it work really fast...
Set kbmMem.Filtered := True;
and don't use the Filter property; instead use the OnFilterRecord event:
procedure TForm1.kbmMemFilterRecord(DataSet: TDataSet;
  var Accept: Boolean);
begin
  Accept := DataSet.Fields[18].AsBoolean;
  // When you iterate the table you will only see the rows
  // having True in the 'Selected' field.
end;
And yes, don't forget to call DisableControls before the while loop:
with kbmMem do
try
  DisableControls;
  Filtered := True;
  First;
  while not Eof do
  begin
    // do your stuff here
    Next;
  end;
finally
  EnableControls;
end;

This is a suggestion for the loop; it should not take any longer than with an unfiltered kbmMemTable:
with kbmMemTable do
begin
  First;
  while not Eof do
  begin
    // do something, but don't change the position of the record pointer!
    // If you do some writing to the record, be sure to
    // enable AutoReposition in your kbmMemTable.
    Next;
  end;
end;
Disabling the DataSource is not such a good option. Every component attached to the DataSource is then "empty" and must be refreshed, and you get a lot of problems if you use recursion or disable a DataSource more than once; the same applies when you re-enable it. With DisableControls you signal to all attached components that they must not update their data; with EnableControls that condition ends and the controls are refreshed. Another advantage is that a counter is incremented with every DisableControls and decremented with every EnableControls, so you can call them multiple times (for example in a recursion) and only the last call to EnableControls finally enables the controls.
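To illustrate that counting behaviour, here is a minimal sketch (reusing the 'Selected' field from the question's commented-out line; the procedure name is just an example). Nesting is safe because each DisableControls only increments the counter and each EnableControls decrements it:
procedure TickAll(ADataSet: TDataSet; TickState: Boolean);
begin
  ADataSet.DisableControls;        // increments the internal disable counter
  try
    ADataSet.First;
    while not ADataSet.Eof do
    begin
      ADataSet.Edit;
      ADataSet.FieldByName('Selected').AsBoolean := TickState;
      ADataSet.Post;
      ADataSet.Next;
    end;
  finally
    ADataSet.EnableControls;       // decrements; controls repaint only when the counter reaches zero
  end;
end;
Calling TickAll from inside another routine that has already called DisableControls is harmless for exactly this reason.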

Related

Delphi DBGrid date format for Firebird timestamp field

I display the content of a Firebird database in a TDBGrid. The database has a TIMESTAMP field that I would like to display in date/time format
'YYYY/MM/DD HH:mm:ss' (right now it is displayed as 'YYMMDD HHmmss').
How can I achieve this?
I tried this:
procedure TDataModule1.IBQuery1AfterOpen(DataSet: TDataSet);
begin
  TDateTimeField(IBQuery1.FieldByName('timestamp_')).DisplayFormat := 'YYYY/MM/DD HH:mm:ss';
end;
But this causes side effects in other parts of the program, so it's not an option. For example, at the IBQuery1.Open statement I get a '...timestamp_ not found...' exception in the method I clear the database with:
function TfrmLogger.db_events_clearall: integer;
begin
  result := -1;
  try
    with datamodule1.IBQuery1 do begin
      Close;
      with SQL do begin
        Clear;
        Add('DELETE FROM MEVENTS')
      end;
      if not Prepared then
        Prepare;
      Open; // Exception here
      Close;
      Result := 1;
    end;
  except
    on E: Exception do begin
      ShowMessage(E.ClassName);
      ShowMessage(E.Message);
      Datamodule1.IBQuery1.Close;
    end;
  end;
end;
I get the same exception message when trying to open the query for writing into the database.
EDIT:
I have modified the database-clearing function as follows:
function TfrmLogger.db_events_clearall: integer;
var
  IBQuery: TIBQuery;
  IBTransaction: TIBTransaction;
  DataSource: TDataSource;
begin
  result := -1;
  // Implicit local db objects creation
  IBQuery := TIBQuery.Create(nil);
  IBQuery.Database := datamodule1.IBdbCLEVENTS;
  DataSource := TDataSource.Create(nil);
  DataSource.DataSet := IBQuery;
  IBTransaction := TIBTransaction.Create(nil);
  IBTransaction.DefaultDatabase := datamodule1.IBdbCLEVENTS;
  IBQuery.Transaction := IBTransaction;
  try
    with IBQuery do begin
      SQL.Text := 'DELETE FROM MSTEVENTS';
      ExecSQL;
      IBTransaction.Commit;
      result := 1;
    end;
  except
    on E: Exception do
    begin
      ShowMessage(E.ClassName + ^M^J + E.Message);
      IBTransaction.Rollback;
    end;
  end;
  FreeAndNil(IBQuery);
  FreeAndNil(DataSource);
  FreeAndNil(IBTransaction);
end;
After clearing the database I can still load the records into the DBGrid, as if the database had not been updated. After restarting the program I can see that the records have been deleted.
The whole function TfrmLogger.db_events_clearall seems very dubious.
You do not provide SQL_DELETE_ROW, but judging by the answer this does not seem to be a SELECT request returning a resultset. So most probably it should NOT be run by .Open but instead by .Execute or .ExecSQL or something like that.
UPD. SQL_DELETE_ROW = 'DELETE FROM MEVENTS'; was added, confirming my prior and further expectations. Almost. The constant name suggests you want to delete ONE ROW, while the query text says you delete ALL ROWS - which one is correct, I wonder?
Additionally, since there is no resultset, there is nothing to .Close after .Exec... - but you may check .RowsAffected, if there is such a property in DBX, to see how many rows were actually scheduled to be deleted.
Additionally, no, this function DOES NOT delete rows, it only schedules them to be deleted. When dealing with SQL you do have to invest time and effort into learning about TRANSACTIONS, otherwise you will soon drown in side effects.
In particular, here you have to COMMIT the deleting transaction. For that you either have to explicitly create, start and bind a transaction to the IBQuery, or find out which transaction was implicitly used by IBQuery1 and .Commit it - and .Rollback it on exceptions.
Yes, boring, and all that. And you may hope for IBX to be smart enough to do commits for you once in a while. But without isolating data changes with transactions you will be bound to get hardly reproducible "side effects" coming from all kinds of race conditions.
Example
FieldDefs.Clear; // frankly, I do not quite recall if IBX has those, but probably it does
Fields.Clear;    // forget the customizations to the fields, and the fields as well
Open;            // no exception here
Close;
Halt;            // << insert this line
Result := 1;
Try this, and I bet your table would not get cleared, despite the query being "opened" and "closed" without error.
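To make the deletion actually stick, the transaction that ran it has to be committed. A minimal sketch, assuming IBQuery1 already has a TIBTransaction attached (IBX requires one) and reusing the names from the question:
with Datamodule1.IBQuery1 do
begin
  SQL.Text := 'DELETE FROM MEVENTS';
  if not Transaction.InTransaction then
    Transaction.StartTransaction;
  try
    ExecSQL;                // no resultset, so ExecSQL rather than Open
    Transaction.Commit;     // make the deletion visible outside this transaction
  except
    Transaction.Rollback;   // undo on failure
    raise;
  end;
end;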
The whole With SQL do begin monster can be replaced with the one-liner SQL.Text := SQL_DELETE_ROW;. Learn what the TStrings class is in Delphi - it is used in very many places in the Delphi libraries, so knowing its services and features will save you a lot of time.
There is no point in Prepare-ing a one-time query that you execute and forget. Preparation is for queries where you DO NOT CHANGE the SQL.Text but only change the PARAMETERS and then re-run the query with THE SAME TEXT but different values.
Okay, sometimes I do use (misuse?) explicit preparation to make sure the library fetches the parameter datatypes from the server, but in your example there is neither: the code does not use parameters, and you do not run the query many times with the same never-changing SQL.Text. Thus the Prepare becomes noise, making the code longer to type and harder to read.
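For contrast, here is a hypothetical case where Prepare does pay off: the SQL.Text never changes, only a parameter value does (the EVENT_ID column, the qDeleteEvent query and the EventIDs array are assumptions, not taken from the question):
var
  i: Integer;
  EventIDs: array of Integer;  // assumed list of keys to delete
begin
  qDeleteEvent.SQL.Text := 'DELETE FROM MEVENTS WHERE EVENT_ID = :EVENT_ID';
  qDeleteEvent.Prepare;                              // parsed and planned once
  for i := Low(EventIDs) to High(EventIDs) do
  begin
    qDeleteEvent.ParamByName('EVENT_ID').AsInteger := EventIDs[i];
    qDeleteEvent.ExecSQL;                            // re-executed with a new value each time
  end;
end;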
Try ShowMessage(E.ClassName + ^M^J + E.Message) or just Application.ShowException(E) - there is no point in popping up TWO modal windows instead of one.
Datamodule1.IBQuery1.Close; - this is actually the place for rolling back the transaction, rather than merely closing a query which was not open anyway.
Now, the very idea of pushing TWO (or more?) SQL requests through ONE Delphi query object is questionable per se. When you make customizations to a query, such as setting DisplayFormat or assigning field event handlers, that query is well worth being left persistently customized. You may even set DisplayFormat at design time, why not.
There is little point in jockeying with one single TIBQuery object - have as many as you need. As it stands, you have to reason pervasively and accurately about WHICH text is inside IBQuery1 in every function of your program.
That again creates the potential for future side effects. Imagine you have some place where you do function1; function2; and later you decide you need to swap them and do function2; function1;. Can you do it? What if function2 changes IBQuery1.SQL.Text and function1 depends on the prior text? What then?
So, basically, sort your queries. There are queries that live across function calls - those deserve a dedicated query object each and should not be reused for different SQL. And there are "one-time" queries that are only used inside one function and never outside it, like SQL_DELETE_ROW - those objects you may reuse, if done with care. But it is still better to remake such functions so that their queries are local variables, visible to nothing but the function itself.
PS. It seems you've got stuck with the IBX library, so I suggest you take a look at this extension: http://www.loginovprojects.ru/download.php?getfilename=uploads/other/ibxfbutils.zip
Among other things it provides generic insert/delete functions, which create and delete temporary query objects internally, so you don't have to think about that.
Transaction management is still yours to keep in mind and control.

Delphi DBGrid showing compressed rows

I am having the strangest of issues with Delphi's DBGrid.
I noticed that sometimes, and I mean only sometimes (it is completely random), when I load rows into a Delphi DBGrid, the grid does not show the data.
Instead it shows a couple of compressed rows; basically the rows are so narrow in height that the information cannot even be read.
What would be the cause of this? And how can one fix it?
Update
I have finally been able to catch the rows doing it myself, to get an image.
As you can see, the rows are technically showing, as one is selected. But it is as if they are being compressed very close together so that the grid appears to be empty.
Please see the image below:
ANY IDEAS would be awesome as to what is causing this, and how to prevent it.
This problem occurred to me too, and I think I have solved it.
In my situation I was calling ADOQuery.Open() inside a TThread, and this ADOQuery was bound to a DataSource, which in turn was bound to a DBGrid. I suspected there might be something wrong with executing it in a secondary thread, so I played around with the ADOQuery a little.
Here's what solved my problem. Before calling ADOQuery.Open() and before starting the new thread, I set DataSource.DataSet := nil; and assign Thread.OnTerminate := RefreshGridFinished;. Then I start the new TThread, in which ADOQuery.Open() is eventually called. When the TThread finishes, this handler assigns the fetched, complete ADOQuery (the DataSet) back to the DataSource:
procedure TMyForm.RefreshGridFinished(Sender: TObject);
begin
  TThread.Synchronize(TThread(Sender),
    procedure
    begin
      DataSource.DataSet := ADOQuery; // assign the fetched dataset
    end);
  if TThread(Sender).FatalException <> nil then
  begin
    Exit;
  end;
  Thread := nil; // class field
end;
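For completeness, here is a rough sketch of the other half described above - detaching the grid and starting the fetch thread. The names (TMyForm, Thread, ADOQuery, DataSource, RefreshGrid) follow the answer or are assumptions; CreateAnonymousThread is just one way to do it, and ADO needs COM initialised in the worker thread:
procedure TMyForm.RefreshGrid;
begin
  DataSource.DataSet := nil;                   // grid shows nothing while fetching
  Thread := TThread.CreateAnonymousThread(
    procedure
    begin
      CoInitialize(nil);                       // ADO is COM-based; each thread needs this (Winapi.ActiveX)
      try
        ADOQuery.Close;
        ADOQuery.Open;                         // the slow fetch now runs off the UI thread
      finally
        CoUninitialize;
      end;
    end);
  Thread.OnTerminate := RefreshGridFinished;   // reattach the dataset in the handler above
  Thread.Start;                                // anonymous threads free themselves on termination
end;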

Refresh Nested DataSet with poFetchDetailsOnDemand

Is there a way to refresh only the detail DataSet without reloading the whole master dataset?
this is what I've tried so far:
DM.ClientDataSet2.Refresh;
DM.ClientDataSet2.RefreshRecord;
I have also tried:
DM.ClientDataSet1.Refresh;
But the method above refreshes the entire master dataset, not just the current record.
And the following code does not seem to do anything:
DM.ClientDataSet1.RefreshRecord;
Is there a workaround or a proper way to do what I want? (maybe an interposer...)
Additional Info:
ClientDataSet1 = Master Dataset
ClientDataSet2 = Detail DataSet, defined as follows:
object ClientDataSet2: TClientDataSet
  Aggregates = <>
  DataSetField = ClientDataSet1ADOQuery2
  FetchOnDemand = False
  .....
end
Provider properties:
object DataSetProvider1: TDataSetProvider
  DataSet = ADOQuery1
  Options = [poFetchDetailsOnDemand]
  UpdateMode = upWhereKeyOnly
  Left = 24
  Top = 104
end
Googling finds numerous articles that say that it isn't possible at all with nested ClientDataSets without closing and re-opening the master CDS, which the OP doesn't want to do in this case. However ...
The short answer to the question is yes, in the reasonably simple case I've tested, and it's quite straightforward, if a bit long-winded; getting the necessary steps right took a while to figure out.
The code is below and includes comments explaining how it works and a few potential problems and how it avoids or works around them. I have only tested it with TAdoQueries feeding the CDSs' Provider.
When I started looking into all this, it soon became apparent that with the usual master + detail set-up, although Providers + CDSs are happy to refresh the master data from the server, they simply will not refresh the detail records once they've been read from the server for the first time since the cdsMaster was opened. This may be by design, of course.
I don't think I need to post a DFM to go with the code. I simply have AdoQueries set up in the usual master-detail way (with the detail query having the master's PK as a parameter), a DataSetProvider pointed at the master AdoQuery, a master CDS pointed at the provider, and a detail CDS pointed at the DataSetField of the cdsMaster. To experiment and see what's going on, there are DBGrids and DBNavigators for each of these datasets.
In brief, the way the code below works is to temporarily filter the AdoQuery master and the CDS master down to the current row and then force a refresh of their data and of the detail data for the current master row. Doing it this way, unlike any other I tried, results in the detail rows nested in the cdsMaster's DataSetField getting refreshed.
Btw, the other blind alleys I tried included with and without poFetchDetailsOnDemand set to true, ditto cdsMaster.FetchDetailsOnDemand. Evidently "FetchDetailsOnDemand" doesn't mean ReFetchDetailsOnDemand!
I ran into a problem or two getting my "solution" working, the stickiest one being described in this SO question:
Refreshing a ClientDataSet nested in a DataSetField
I've verified that this works correctly with a Sql Server 2000(!) back-end, including picking up row data changes fired at the server from ISqlW. I've also verified, using Sql Server's Profiler, that the network traffic in a refresh only involves the single master row and its details.
Delphi 7 + Win7 64-bit, btw.
procedure TForm1.cdsMasterRowRefresh(MasterPK : Integer);
begin
  // The following operations will cause the cursor on the cdsMaster to scroll
  // so we need to check and set a flag to avoid re-entrancy
  if DoingRefresh then Exit;
  DoingRefresh := True;
  try
    // Filter the cdsMaster down to the single row which is to be refreshed.
    cdsMaster.Filter := MasterPKName + ' = ' + IntToStr(MasterPK);
    cdsMaster.Filtered := True;
    cdsMaster.Refresh;
    Inc(cdsMasterRefreshes); // just a counter to assist debugging
    // release the filter
    cdsMaster.Filtered := False;
    // clearing the filter may cause the cdsMaster cursor to move, so ...
    cdsMaster.Locate(MasterPKName, MasterPK, []);
  finally
    DoingRefresh := False;
  end;
end;

procedure TForm1.qMasterRowRefresh(MasterPK : Integer);
begin
  try
    // First, filter the AdoQuery master down to the cdsMaster current row
    qMaster.Filter := MasterPKName + ' = ' + IntToStr(MasterPK);
    qMaster.Filtered := True;
    // At this point Ado is happy to refresh only the current master row from the server
    qMaster.Refresh;
    // NOTE:
    // The reason for the following operations on the qDetail AdoQuery is that I noticed
    // during testing situations where this dataset would not be up-to-date at this point
    // in the refreshing operations, so we update it manually. The reason I do it manually
    // is that simply calling qDetail's Refresh provoked the Ado "Insufficient key column
    // information for updating or refreshing" despite its query not involving a join
    // and the underlying table having a PK
    qDetail.Parameters.ParamByName(MasterPKName).Value := MasterPK;
    qDetail.Close;
    qDetail.Open;
    // With the master and detail rows now re-read from the server, we can update
    // the cdsMaster
    cdsMasterRowRefresh(MasterPK);
  finally
    // Now, we can clear the filter
    qMaster.Filtered := False;
    qMaster.Locate(MasterPKName, MasterPK, []);
    // Obviously, if qMaster were filtered in the first place, we'd need to reinstate that later on
  end;
end;

procedure TForm1.RefreshcdsMasterAndDetails;
var
  MasterPK : Integer;
begin
  if cdsMaster.ChangeCount > 0 then
    raise Exception.Create(Format('cdsMaster has %d change(s) pending.', [cdsMaster.ChangeCount]));
  MasterPK := cdsMaster.FieldByName(MasterPKName).AsInteger;
  cdsDetail.DisableControls;
  cdsMaster.DisableControls;
  qDetail.DisableControls;
  qMaster.DisableControls;
  try
    try
      qMasterRowRefresh(MasterPK);
    except
      // Add exception handling here according to taste
      // I haven't encountered any during debugging/testing so:
      raise;
    end;
  finally
    qMaster.EnableControls;
    qDetail.EnableControls;
    cdsMaster.EnableControls;
    cdsDetail.EnableControls;
  end;
end;

procedure TForm1.cdsMasterAfterScroll(DataSet: TDataSet);
begin
  RefreshcdsMasterAndDetails;
end;

procedure TForm1.cdsMasterAfterPost(DataSet: TDataSet);
// NOTE: The reason that this, in addition to cdsMasterAfterScroll, calls RefreshcdsMasterAndDetails is
// because RefreshcdsMasterAndDetails only refreshes the master + detail AdoQueries for the current
// cdsMaster row. Therefore in the case where the current cdsMaster row or its detail(s)
// have been updated, this row needs the refresh treatment before we leave it.
begin
  cdsMaster.ApplyUpdates(-1);
  RefreshcdsMasterAndDetails;
end;

procedure TForm1.btnRefreshClick(Sender: TObject);
begin
  RefreshcdsMasterAndDetails;
end;

procedure TForm1.cdsDetailAfterPost(DataSet: TDataSet);
begin
  cdsMaster.ApplyUpdates(-1);
end;

How to force a Client DataSet to recalculate calculated and internal calculated fields?

I have a ClientDataSet with a few fkInternalCalc fields. The CDS is not linked to any provider; instead it is filled on the fly. How can I force the CDS to recalculate all the "calculable" fields? I cannot call Refresh() because there is no provider to refresh the data from. The only way I have come up with so far is to navigate through all the records, which is not the best approach.
PS: I have read this question and this post, but I'm hoping for a more elegant way.
I achieve that with a helper (stripped here to the essentials), which allows calling the protected methods without any hack. Make sure to check for DataSet.State = dsInternalCalc inside OnCalcFields when handling fkInternalCalc fields.
type
  TClientDataSetHelper = class helper for TClientDataSet
  public
    function AssureEditing: Boolean;
    procedure InternalCalc;
  end;

function TClientDataSetHelper.AssureEditing: Boolean;
begin
  Result := not (State in [dsEdit, dsInsert]);
  if Result then
    Edit;
end;

procedure TClientDataSetHelper.InternalCalc;
var
  needsPost: Boolean;
  saveState: TDataSetState;
begin
  needsPost := AssureEditing;
  saveState := SetTempState(dsInternalCalc);
  try
    RefreshInternalCalcFields(ActiveBuffer);
  finally
    RestoreState(saveState);
  end;
  if needsPost then
    Post;
end;
This can easily be expanded to normal calculated fields using CalculateFields, although that shouldn't be necessary, as calculated fields are recalculated whenever any other data field changes.
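A tiny usage sketch (MyCDS and the field name are assumptions): with the helper unit in scope, every TClientDataSet gains the method, so after changing data on the fly you can recompute the internal-calc fields of the current record without a provider round-trip.
MyCDS.Edit;
MyCDS.FieldByName('Quantity').AsInteger := 5;  // change an ordinary data field...
MyCDS.InternalCalc;                            // ...then recompute the fkInternalCalc fields
MyCDS.Post;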
This is a bit of a hack, but it works!
DBGrid.Height := 30;
DBGrid.Height := 200;    // refresh all rows after the first
CalculatedProc(DataSet); // refresh the first row's calculated fields (call your own OnCalcFields procedure here)

Delphi. How to Disable/Enable controls without triggering control events

I have a DataSet (TZQuery) which has several boolean fields that have TDBCheckBoxes assigned to them.
These CheckBoxes have OnClick events assigned to them, and those are triggered whenever I change the field values the checkboxes are bound to.
The problem is that I do not need these events triggered during many of the operations I perform on the dataset.
I've tried calling DataSet.DisableControls, but then the events are fired right after I call DataSet.EnableControls.
So my question is: is there a way to disable triggering the events of data-aware controls?
Edit (bigger picture):
If an exception happens while, let's say, saving data, I have to load the default values (or the values I had before saving). While loading that data, all these events (of the TDBCheckBoxes and other data-aware controls) are triggered, and they do all sorts of operations that create lag and sometimes even unwanted changes to the data. I'm looking for a universal way to disable them all for a short period of time.
Building on Guillem's post:
Turn off everything:
Traverse each component on the form with the for-loop shown below, changing the properties to the desired value.
If you want to revert to the original property values later, then you must save the original values (as OldEvent is used below).
Edit: The code below shows the key concept being discussed. If components are being added or deleted at run-time, or if you'd like to use the absolute minimum of memory, then use a dynamic array and, as Pieter suggests, store pointers to the components rather than indexing into them.
const
  MAX_COMPONENTS_ON_PAGE = 100; // arbitrarily larger than what you'd expect
                                // (use a dynamic array if this worries you)
var
  OldEvent: array[0..MAX_COMPONENTS_ON_PAGE - 1] of TNotifyEvent; // save original values here
  i: Integer;
begin
  for i := 0 to ComponentCount - 1 do
  begin
    if (Components[i] is TCheckBox) then
    begin
      OldEvent[i] := TCheckBox(Components[i]).OnClick; // remember old state
      TCheckBox(Components[i]).OnClick := nil;
    end
    else if (Components[i] is TEdit) then
    begin
      OldEvent[i] := TEdit(Components[i]).OnClick; // remember old state
      TEdit(Components[i]).OnClick := nil;
    end;
  end;
Revert to former values
  for i := 0 to ComponentCount - 1 do
  begin
    if (Components[i] is TCheckBox) then
      TCheckBox(Components[i]).OnClick := OldEvent[i]
    else if (Components[i] is TEdit) then
      TEdit(Components[i]).OnClick := OldEvent[i];
  end;
There may be a way to fold all of the if-statements into one generic test that answers "Does this component have an OnClick event?" -- but I don't know what it is.
Hopefully someone will constructively criticize my answer (rather than just down voting it.) But, hopefully what I've shown above will be workable.
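One possible generic test, offered here as a sketch rather than part of the original answer, is to go through published-property RTTI (the TypInfo unit): any component that publishes an OnClick event can then be handled by the same code, reusing the OldEvent array from above, without one if-branch per class.
uses
  TypInfo;
var
  i: Integer;
  M, Blank: TMethod;
begin
  Blank.Code := nil;
  Blank.Data := nil;
  for i := 0 to ComponentCount - 1 do
    if IsPublishedProp(Components[i], 'OnClick') then
    begin
      M := GetMethodProp(Components[i], 'OnClick');
      OldEvent[i] := TNotifyEvent(M);                 // remember the old handler
      SetMethodProp(Components[i], 'OnClick', Blank); // detach it
    end;
  // restore later with:
  // SetMethodProp(Components[i], 'OnClick', TMethod(OldEvent[i]));
end;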
One way to do this is the following:
var
  Event: TNotifyEvent;
begin
  Event := myCheckbox.OnClick;
  try
    myCheckbox.OnClick := nil;
    // your code here
  finally
    myCheckbox.OnClick := Event;
  end;
end;
HTH
The internal design of TCustomCheckBox is such that it triggers the Click method every time the Checked property is changed, be it by actually clicking it or by setting it in code. And this is what happens here when you call EnableControls, because the control gets updated to display the value of the linked field in your dataset.
TButtonControl (which is what TCustomCheckBox inherits from) has the property ClicksDisabled. Use this instead of (or in addition to) the DisableControls/EnableControls calls. Unfortunately it is protected and not made public by TCustomCheckBox, but you can use a small hack to access it:
type
  TButtonControlAccess = class(TButtonControl)
  public
    property ClicksDisabled;
  end;

...

TButtonControlAccess(MyCheckBox1).ClicksDisabled := True;
// do some dataset stuff
TButtonControlAccess(MyCheckBox1).ClicksDisabled := False;
Of course you can put this into a method that checks all components and sets this property if the control inherits from TCustomCheckBox or matches some other criteria - a possible shape for that is sketched below.
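A sketch of such a method (the method name is an assumption), reusing the TButtonControlAccess type declared above:
procedure TForm1.SetCheckBoxClicksDisabled(ADisabled: Boolean);
var
  i: Integer;
begin
  for i := 0 to ComponentCount - 1 do
    if Components[i] is TCustomCheckBox then
      TButtonControlAccess(Components[i]).ClicksDisabled := ADisabled;
end;

// Usage around the dataset work:
SetCheckBoxClicksDisabled(True);
try
  // ... DisableControls, reload the data, EnableControls ...
finally
  SetCheckBoxClicksDisabled(False);
end;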
