Delphi: is dbExpress faster than FireDAC?

I'm running a MySQL server on my network (MariaDB 10.3.24) and have run a performance test with dbExpress and FireDAC on the same data, on the same machine, with no other users on the database. I'm using Delphi 10.1 and made no changes to the connection or query component setup.
My findings (the table holds 261,000 records in total):
Reading 100,000 records without a WHERE clause:
FireDAC: 184 sec
dbExpress: 93 sec
Reading 100,000 records with an indexed WHERE clause:
FireDAC: 160 sec
dbExpress: 86 sec
All my programs are written with FireDAC. Is there a simple way to speed up FireDAC, or do I need to switch to dbExpress to get decent performance?
My test (identical for dbExpress and FireDAC):
var
  start, slut: TDateTime;
  n: Integer;
begin
  start := Now;
  listbox1.Items.Clear;
  sqlq.Close;
  sqlq.SQLConnection := sqlcon;
  sqlq.SQL.Clear;
  sqlq.SQL.Add('select * from forsendelser where kundenummer="test" limit ' + spinedit1.Text);
  sqlq.Open;
  while not sqlq.Eof do
  begin
    listbox1.Items.Add(sqlq.FieldByName('stregkode').AsString);
    sqlq.Next;
  end;
  sqlq.Close;
  n := SecondsBetween(Now, start);
  edit2.Text := n.ToString;
end;

There are several things you can do to improve the performance of this code.
Start by not updating ListBox.Items during the loop: each time an item is added or deleted, the control repaints, and that isn't needed while the loop is running.
Second, stop using FieldByName inside the loop. It searches the dataset's field list on every iteration, which isn't needed. Fetch the field once before the loop, store it in a variable, and access it through that variable inside the loop.
This should improve performance considerably.
var
  start: TDateTime;
  n: Integer;
  Fld: TField;
begin
  start := Now;
  ListBox1.Items.BeginUpdate;
  try
    ListBox1.Items.Clear;
    sqlq.Close;
    sqlq.SQLConnection := sqlcon;
    sqlq.SQL.Text := 'select * from forsendelser where kundenummer="test" limit ' + spinedit1.Text;
    sqlq.Open;
    Fld := sqlq.FieldByName('stregkode');
    while not sqlq.Eof do
    begin
      ListBox1.Items.Add(Fld.AsString);
      sqlq.Next;
    end;
    sqlq.Close;
  finally
    ListBox1.Items.EndUpdate;
  end;
  n := SecondsBetween(Now, start);
  edit2.Text := n.ToString;
end;
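If you want to keep FireDAC, it's also worth noting that a TFDQuery fetches rows on demand in fairly small rowsets by default, while dbExpress datasets are unidirectional. A hedged sketch of fetch options that often narrow the gap, assuming sqlq is a TFDQuery (the values are illustrative, not tuned):
// Sketch only: FireDAC fetch options that often narrow the gap with dbExpress.
// Assumes sqlq is a TFDQuery; tune the values for your own data.
sqlq.FetchOptions.Mode := fmAll;          // fetch the whole result set in one go
sqlq.FetchOptions.RowsetSize := 1000;     // rows per round trip when fetching on demand
sqlq.FetchOptions.Unidirectional := True; // forward-only, like dbExpress; skips record buffering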

Related

Why does scrolling through ADOTable get slower and slower?

I want to read the entire table from an MS Access file and I'm trying to do it as fast as possible. When testing a big sample, I found that the loop counter advances faster while reading the first records of the table than it does for the last ones. Here's a sample that demonstrates this:
procedure TForm1.Button1Click(Sender: TObject);
const
  MaxRecords = 40000;
  Step = 5000;
var
  I, J: Integer;
  Table: TADOTable;
  T: Cardinal;
  Ts: TCardinalDynArray;
begin
  Table := TADOTable.Create(nil);
  Table.ConnectionString :=
    'Provider=Microsoft.ACE.OLEDB.12.0;' +
    'Data Source=BigMDB.accdb;' +
    'Mode=Read|Share Deny Read|Share Deny Write;' +
    'Persist Security Info=False';
  Table.TableName := 'Table1';
  Table.Open;
  J := 0;
  SetLength(Ts, MaxRecords div Step);
  T := GetTickCount;
  for I := 1 to MaxRecords do
  begin
    Table.Next;
    if (I mod Step) = 0 then
    begin
      T := GetTickCount - T;
      Ts[J] := T;
      Inc(J);
      T := GetTickCount;
    end;
  end;
  Table.Free;
//  Chart1.SeriesList[0].Clear;
//  for I := 0 to Length(Ts) - 1 do
//  begin
//    Chart1.SeriesList[0].Add(Ts[I]/1000, Format(
//      'Records: %s %d-%d %s Duration:%f s',
//      [#13, I * Step, (I + 1)*Step, #13, Ts[I]/1000]));
//  end;
end;
And the result on my PC (timing chart not reproduced here):
The table has two string fields, one double and one integer. It has no primary key or index. Why does this happen, and how can I prevent it?
I can reproduce your results using an AdoQuery with an MS SQL Server dataset of similar size to yours.
However, after doing a bit of line-profiling, I think I've found the answer, and it's slightly counter-intuitive. I'm sure everyone who does DB programming in Delphi is used to the idea that looping through a dataset tends to be much quicker if you surround the loop with calls to DisableControls/EnableControls. But who would bother to do that if there are no db-aware controls attached to the dataset?
Well, it turns out that in your situation, even though there are no DB-aware controls, the speed increases hugely if you use DisableControls/EnableControls regardless.
The reason is that TCustomADODataSet.InternalGetRecord in AdoDB.pas contains this:
if ControlsDisabled then
  RecordNumber := -2
else
  RecordNumber := Recordset.AbsolutePosition;
and according to my line profiler, the while not AdoQuery1.Eof do AdoQuery1.Next loop spends 98.8% of its time executing the assignment RecordNumber := Recordset.AbsolutePosition!
The calculation of Recordset.AbsolutePosition is hidden, of course, on the "wrong side" of the Recordset interface, but the fact that the time to call it apparently increases the further you go into the recordset makes it reasonable, in my opinion, to speculate that it's calculated by counting from the start of the recordset's data.
Of course, ControlsDisabled returns True if DisableControls has been called and has not been undone by a call to EnableControls. So, retest with the loop surrounded by DisableControls/EnableControls, and hopefully you'll get a similar result to mine. It looks like you were right that the slowdown isn't related to memory allocations.
Using the following code:
procedure TForm1.btnLoopClick(Sender: TObject);
var
  I: Integer;
  T: Integer;
  Step: Integer;
begin
  Memo1.Lines.BeginUpdate;
  I := 0;
  Step := 4000;
  if cbDisableControls.Checked then
    AdoQuery1.DisableControls;
  T := GetTickCount;
{.$define UseRecordSet}
{$ifdef UseRecordSet}
  while not AdoQuery1.Recordset.Eof do
  begin
    AdoQuery1.Recordset.MoveNext;
    Inc(I);
    if I mod Step = 0 then
    begin
      T := GetTickCount - T;
      Memo1.Lines.Add(IntToStr(I) + ':' + IntToStr(T));
      T := GetTickCount;
    end;
  end;
{$else}
  while not AdoQuery1.Eof do
  begin
    AdoQuery1.Next;
    Inc(I);
    if I mod Step = 0 then
    begin
      T := GetTickCount - T;
      Memo1.Lines.Add(IntToStr(I) + ':' + IntToStr(T));
      T := GetTickCount;
    end;
  end;
{$endif}
  if cbDisableControls.Checked then
    AdoQuery1.EnableControls;
  Memo1.Lines.EndUpdate;
end;
I get the following results (DisableControls not called except where noted; times in ms per step):

Using CursorLocation = clUseClient

Records   AdoQuery.Next   Recordset.MoveNext   Next + DisableControls
  4000         157               16                      15
  8000         453               16                      15
 12000         687                0                      32
 16000         969               15                      31
 20000        1250               16                      31
 24000        1500                0                      16
 28000        1703               15                      31
 32000        1891               16                      31
 36000        2187               16                      16
 40000        2438                0                      15
 44000        2703               15                      31
 48000        3203               16                      32

=======================================

Using CursorLocation = clUseServer

Records   AdoQuery.Next   Recordset.MoveNext   Next + DisableControls
  4000        1031              454                     563
  8000        1016              468                     562
 12000        1047              469                     500
 16000        1234              484                     532
 20000        1047              454                     546
 24000        1063              484                     547
 28000         984              531                     563
 32000         906              485                     500
 36000        1016              531                     578
 40000        1000              547                     500
 44000         968              406                     562
 48000        1016              375                     547
Calling AdoQuery1.Recordset.MoveNext goes directly into the MDAC/ADO layer, of course, whereas AdoQuery1.Next involves all the overhead of the standard TDataSet model. As Serge Kraikov said, changing the CursorLocation certainly makes a difference and doesn't exhibit the slowdown we noticed, though it's obviously significantly slower than using clUseClient and calling DisableControls. I suppose it depends on exactly what you're trying to do whether you can take advantage of the extra speed of clUseClient with Recordset.MoveNext.
When you open a table, the ADO dataset internally creates special data structures to navigate the dataset forward and backward: the dataset CURSOR. During navigation, ADO stores a list of the records already visited, to provide bidirectional navigation.
It seems the ADO cursor code uses a quadratic-time O(n²) algorithm to maintain this list.
But there is a workaround: use a server-side cursor:
Table.CursorLocation := clUseServer;
I tested your code with this fix and got linear fetch time: fetching each successive chunk of records takes the same time as the previous one.
PS: Some other data access libraries provide special "unidirectional" datasets. These datasets can only traverse forward and don't store already-traversed records at all, so you get constant memory consumption and linear fetch time.
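dbExpress's TSQLQuery is one example of such a unidirectional dataset. A minimal sketch, assuming SQLConnection1 is already configured:
// Minimal sketch of a unidirectional traversal with dbExpress's TSQLQuery.
// Only the current record is buffered, so memory stays constant and
// fetch time stays linear.
var
  Q: TSQLQuery;
begin
  Q := TSQLQuery.Create(nil);
  try
    Q.SQLConnection := SQLConnection1;
    Q.SQL.Text := 'SELECT * FROM Table1';
    Q.Open;
    while not Q.Eof do
    begin
      // process the current record here; backward navigation is not possible
      Q.Next;
    end;
  finally
    Q.Free;
  end;
end;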
DAO is native to Access and (IMHO) is typically faster.
Whether or not you switch, use the GetRows method; both DAO and ADO support it.
There is no looping: you can dump the entire recordset into an array with a couple of lines of code. Air code (VBA-style):
yourrecordset.MoveLast
yourrecordset.MoveFirst
yourarray = yourrecordset.GetRows(yourrecordset.RecordCount)
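The same call is reachable from Delphi through the dataset's underlying Recordset. A sketch; note that EmptyParam is declared in the Variants unit, and Delphi's imported GetRows requires all three parameters:
// Sketch: GetRows from Delphi via the underlying ADO Recordset.
// EmptyParam (Variants unit) stands in for the optional VB parameters.
var
  Rows: OleVariant;
  RecCount, I: Integer;
begin
  AdoQuery1.Recordset.MoveLast;   // force a full fetch so RecordCount is valid
  AdoQuery1.Recordset.MoveFirst;
  RecCount := AdoQuery1.Recordset.RecordCount;
  // The result is a two-dimensional variant array indexed [field, record]
  Rows := AdoQuery1.Recordset.GetRows(RecCount, EmptyParam, EmptyParam);
  for I := 0 to RecCount - 1 do
    Memo1.Lines.Add(VarToStr(Rows[0, I])); // first field of each record
end;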

Delphi 2009 - create a delay of over an hour?

My program needs to send about 600 emails. As my ISP only allows 400 per hour, I need to send, say, 300 of them, wait an hour, and then send another 300. I don't really want my program hung for an hour, as the user might want to do something else with it while waiting. Better still, they might even want to shut it down.
Is there a way to delay for an hour while keeping the program responsive, or even allowing it to be shut down and woken up to continue?
I found this helpful: What is the best way to program a delay in Delphi? But that answer declares a variable
var
  SE: TSimpleEvent;
which Delphi 2009 does not understand.
BTW, if the answer involves threads, please could you explain carefully or even give code, as I have never used threads before (not knowingly, anyway!).
A simple solution is often the best. You could use a standard TTimer with an interval of one second and count down from 3600; when the counter reaches zero, an hour has passed.
This is a dummy implementation showing the idea:
procedure TForm1.FormCreate(Sender: TObject);
begin
  Timer1.Tag := 1; // first batch goes out on the first tick
end;

procedure Send300EMails;
begin
  // Dummy
end;

procedure TForm1.Timer1Timer(Sender: TObject);
begin
  Timer1.Tag := Timer1.Tag - 1;
  if Timer1.Tag = 0 then
  begin
    Timer1.Enabled := False;
    Timer1.Tag := SecsPerHour; // 3600 seconds
    Send300EMails;
    Timer1.Enabled := True;
  end;
end;
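If you do want to go the thread route: the TSimpleEvent declaration from the linked answer compiles fine in Delphi 2009 once SyncObjs is in the uses clause, which is most likely all that was missing. A hedged sketch of a cancellable one-hour wait in a worker thread (the class and field names are illustrative, and real code would need to synchronize any VCL access):
uses
  Classes, SyncObjs;

type
  TMailThread = class(TThread)
  private
    FStopEvent: TSimpleEvent; // signalled when the app wants to shut down
  protected
    procedure Execute; override;
  public
    constructor Create;
    destructor Destroy; override;
    procedure Stop;
  end;

constructor TMailThread.Create;
begin
  inherited Create(False);
  FStopEvent := TSimpleEvent.Create;
end;

destructor TMailThread.Destroy;
begin
  FStopEvent.Free;
  inherited;
end;

procedure TMailThread.Stop;
begin
  FStopEvent.SetEvent; // wakes the WaitFor below immediately
end;

procedure TMailThread.Execute;
begin
  Send300EMails; // first batch
  // Wait up to an hour, but return immediately if Stop is called,
  // so the program stays responsive and can shut down cleanly.
  if FStopEvent.WaitFor(3600 * 1000) = wrTimeout then
    Send300EMails; // second batch, the hour elapsed without cancellation
end;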

How to reduce CPU usage when scanning for folders/sub-folders/files?

I have developed an application that scans basically everywhere for a file or list of files.
When I scan small folders, say 10,000 files and subfolders, there is no problem. But when I scan, for instance, my entire Users folder with more than 100,000 items, it is very heavy on my processor: it takes about 40% of my processor's power.
Is there a way to optimize this code so that it uses less CPU?
procedure GetAllSubFolders(sPath: String);
var
  Path: String;
  Rec: TSearchRec;
begin
  try
    Path := IncludeTrailingBackslash(sPath);
    if FindFirst(Path + '*.*', faAnyFile, Rec) = 0 then
    try
      repeat
        Application.ProcessMessages;
        if (Rec.Name <> '.') and (Rec.Name <> '..') then
        begin
          if (ExtractFileExt(Path + Rec.Name) <> '') and
             (ExtractFileExt(Path + Rec.Name).ToLower <> '.lnk') and
             (DirectoryExists(Path + Rec.Name + '\') = False) then
          begin
            if Pos(Path + Rec.Name, main.Memo1.Lines.Text) = 0 then
            begin
              main.ListBox1.Items.Add(Path + Rec.Name);
              main.Memo1.Lines.Add(Path + Rec.Name);
            end;
          end;
          GetAllSubFolders(Path + Rec.Name);
        end;
      until FindNext(Rec) <> 0;
    finally
      FindClose(Rec);
    end;
  except
    on e: Exception do
      ShowMessage(e.Message);
  end;
end;
My app searches for all the files in a selected folder and its subfolders, zips them, and copies them to another location you specify.
The Application.ProcessMessages call is there to make sure the application doesn't look like it's hanging, so the user doesn't close it, because finding 100,000 files can take an hour or so.
I am concerned about the processor usage; memory is not really affected.
Note: the memo is there to make sure the same files are not selected twice.
I see the following performance problems:
The call to Application.ProcessMessages is somewhat expensive. You are polling for messages rather than using a blocking wait, i.e. GetMessage. As well as the performance issue, the use of Application.ProcessMessages is generally an indication of poor design for various reasons and one should, in general, avoid the need to call it.
A non-virtual list box performs badly with a lot of files.
Using a memo control (a GUI control) to store a list of strings is exceptionally expensive.
Every time you add to the GUI controls they update and refresh which is very expensive.
The evaluation of Memo1.Lines.Text is extraordinarily expensive.
The use of Pos is likewise massively expensive.
The use of DirectoryExists is expensive and spurious. The attributes returned in the search record contain that information.
I would make the following changes:
Move the search code into a thread to avoid the need for ProcessMessages. You'll need to devise some way to transport the information back to the main thread for display in the GUI.
Use a virtual list view to display the files.
Store the list of files that you wish to search for duplicates in a dictionary which gives you O(1) lookup. Take care with case-insensitivity of file names, an issue that you have perhaps neglected so far. This replaces the memo.
Check whether an item is a directory by using Rec.Attr; that is, check that Rec.Attr and faDirectory <> 0, as in the sketch below.
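A minimal sketch of those last two points (TDictionary lives in Generics.Collections, available since Delphi 2009; the Seen dictionary replaces the memo, and the names here are illustrative):
// Sketch of the attribute test and the O(1) duplicate check.
procedure ScanFolder(const sPath: string; Seen: TDictionary<string, Boolean>);
var
  Path, FullName: string;
  Rec: TSearchRec;
begin
  Path := IncludeTrailingBackslash(sPath);
  if FindFirst(Path + '*', faAnyFile, Rec) = 0 then
  try
    repeat
      if (Rec.Name <> '.') and (Rec.Name <> '..') then
      begin
        FullName := Path + Rec.Name;
        if (Rec.Attr and faDirectory) <> 0 then
          ScanFolder(FullName, Seen)                 // recurse into directories only
        else if not Seen.ContainsKey(LowerCase(FullName)) then
          Seen.Add(LowerCase(FullName), True);       // lower-case: case-insensitive names
      end;
    until FindNext(Rec) <> 0;
  finally
    FindClose(Rec);
  end;
end;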
I agree with the answer which says you would do best to do what you're doing in a background thread and I don't want to encourage you to persist in doing it in your main thread.
However, if you go to a command prompt and do this:
dir c:\*.* /s > dump.txt & notepad dump.txt
you may be surprised quite how quickly Notepad pops into view.
So there are a few things you could do to speed up your GetAllSubFolders even if you keep it in your main thread, e.g. bracket the code with calls to main.Memo1.Lines.BeginUpdate and main.Memo1.Lines.EndUpdate, and likewise main.ListBox1.Items.BeginUpdate and EndUpdate. This stops these controls being updated while it executes, which is actually what your code spends most of its time doing (that, and the "if Pos(...)" business I've commented on below). And, if you haven't gathered already, Application.ProcessMessages is evil (mostly).
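That bracketing would look something like this (a sketch; StartPath stands in for whatever folder you pass):
// Sketch: suspend repaints of both controls for the duration of the scan.
main.Memo1.Lines.BeginUpdate;
main.ListBox1.Items.BeginUpdate;
try
  GetAllSubFolders(StartPath); // StartPath is illustrative
finally
  main.ListBox1.Items.EndUpdate;
  main.Memo1.Lines.EndUpdate;
end;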
I did some timings on my D: drive, which is a 500 GB SSD with 263,562 files in 35,949 directories.
The code in your question: 6777 secs
Doing a dir to Notepad as per the above: 15 secs
The code below, in the main thread: 9.7 secs
The reason I've included the code below in this answer is that you'll find it much easier to execute in a thread, because it gathers its results into a TStringList whose contents you can then assign to your memo and listbox once the thread has completed.
A few comments on the code in your question, which I imagine you might have got from somewhere.
It pointlessly recurses even when the current entry in Rec is a plain file. The code below only recurses if the current Rec entry is a directory.
It apparently tries to avoid duplicates with the "if Pos(...)" business, which shouldn't be necessary (except perhaps if a symbolic link, e.g. one created with the MkLink command, points elsewhere on the drive), and it does so in a highly inefficient manner, namely by searching for the filename in the memo contents, which get longer and longer as it finds more files. In the code below, the stringlist is set up to discard duplicates, and its Sorted property is set to True so that the duplicate check can binary-search the contents rather than scan them serially.
It calculates Path + Rec.Name six times for each thing it finds, which is avoidably inefficient at run time and inflates the source code. This is a minor point, though, compared with the first two.
Code:
function GetAllSubFolders(sPath: String): TStringList;

  procedure GetAllSubFoldersInner(sPath: String);
  var
    Path, AFileName, Ext: String;
    Rec: TSearchRec;
    Done: Boolean;
  begin
    Path := IncludeTrailingBackslash(sPath);
    if FindFirst(Path + '*.*', faAnyFile, Rec) = 0 then
    begin
      Done := False;
      while not Done do
      begin
        if (Rec.Name <> '.') and (Rec.Name <> '..') then
        begin
          AFileName := Path + Rec.Name;
          Ext := ExtractFileExt(AFileName).ToLower; // retained from the original; unused here
          if not ((Rec.Attr and faDirectory) = faDirectory) then
            Result.Add(AFileName)
          else
            GetAllSubFoldersInner(AFileName);
        end;
        Done := FindNext(Rec) <> 0;
      end;
      FindClose(Rec);
    end;
  end;

begin
  Result := TStringList.Create;
  Result.BeginUpdate;
  Result.Sorted := True;
  Result.Duplicates := dupIgnore; // don't add duplicate filenames to the list
  GetAllSubFoldersInner(sPath);
  Result.EndUpdate;
end;
procedure TMain.Button1Click(Sender: TObject);
var
  T1, T2: Integer;
  TL: TStringList;
begin
  T1 := GetTickCount;
  TL := GetAllSubfolders('D:\');
  try
    Memo1.Lines.BeginUpdate;
    try
      Memo1.Lines.Text := TL.Text;
    finally
      Memo1.Lines.EndUpdate;
    end;
    T2 := GetTickCount;
    Caption := Format('GetAll: %d, Load: %d, Files: %d', [T2 - T1, GetTickCount - T2, TL.Count]);
  finally
    TL.Free;
  end;
end;

Why is my code so slow?

Top-posted (sorry) answer, for those who don't have time to get into it but may have similar problems.
Rule #1, as always: move as much as you can out of loops.
2. Move the TField var := ADODataSet.FieldByName() lookups out of the loop.
3. Put ADODataSet.DisableControls and ADODataSet.EnableControls around the loop.
4. Use stringGrid.Rows[r].BeginUpdate and EndUpdate on each row (it cannot be done on the whole control).
Each of these shaved off a few seconds, but I got it down to "faster than the eye can see" by changing
loop
  stringGrid.RowCount := stringGrid.RowCount + 1;
end loop
to setting stringGrid.RowCount := ADODataSet.RecordCount once, before the loop.
+1 and heartfelt thanks to all who helped.
(Now I will go and see what I can do to optimize drawing a TChart, which is also slow ;-)
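Concretely, the combined fix looked roughly like this (a sketch rather than the exact code; the control and field names are from the question below):
// Sketch of the combined fix; not the exact production code.
var
  InputTempField: TField;
  Row: Integer;
begin
  ADODataSet.DisableControls; // point 3
  try
    InputTempField := ADODataSet.FieldByName('inputTemperature'); // point 2: look up once
    // size the grid once, before the loop (adjust for fixed rows as needed)
    TestRunDataStringGrid.RowCount := ADODataSet.RecordCount;
    Row := 0;
    while not ADODataSet.Eof do
    begin
      TestRunDataStringGrid.Cells[1, Row] := FloatToStr(InputTempField.AsFloat);
      Inc(Row);
      ADODataSet.Next;
    end;
  finally
    ADODataSet.EnableControls; // point 3
  end;
end;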
With about 3,600 rows in the table, this takes 45 seconds to populate the string grid. What am I doing wrong?
ADODataSet := TADODataSet.Create(nil);
ADODataSet.Connection := AdoConnection;
ADODataSet.CommandText := 'SELECT * FROM measurements';
ADODataSet.CommandType := cmdText;
ADODataSet.Open;
while not ADODataSet.Eof do
begin
  TestRunDataStringGrid.RowCount := TestRunDataStringGrid.RowCount + 1;
  measurementDateTime := UnixToDateTime(ADODataSet.FieldByName('time_stamp').AsInteger);
  DoSQlCommandWithResultSet('SELECT * FROM start_time_stamp', AdoConnection, resultSet);
  startDateTime := UnixToDateTime(StrToInt64(resultSet.Strings[0]));
  elapsedTime := measurementDateTime - startDateTime;
  TestRunDataStringGrid.Cells[0, Pred(TestRunDataStringGrid.RowCount)] := FormatDateTime('hh:mm:ss', elapsedTime);
  TestRunDataStringGrid.Cells[1, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('inputTemperature').AsFloat);
  TestRunDataStringGrid.Cells[2, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('outputTemperature').AsFloat);
  TestRunDataStringGrid.Cells[3, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('flowRate').AsFloat);
  TestRunDataStringGrid.Cells[4, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('waterPressure').AsFloat * convert);
  TestRunDataStringGrid.Cells[5, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('waterLevel').AsFloat);
  TestRunDataStringGrid.Cells[6, Pred(TestRunDataStringGrid.RowCount)] := FloatToStrWithPrecision(ADODataSet.FieldByName('cod').AsFloat);
  ADODataSet.Next;
end;
ADODataSet.Close;
ADODataSet.Free;
Update:
function DoSQlCommandWithResultSet(const command: String; AdoConnection: TADOConnection; resultSet: TStringList): Boolean;
var
  i: Integer;
  AdoQuery: TADOQuery;
begin
  Result := True;
  resultSet.Clear;
  AdoQuery := TADOQuery.Create(nil);
  try
    AdoQuery.Connection := AdoConnection;
    AdoQuery.SQL.Add(command);
    AdoQuery.Open;
    i := 0;
    while not AdoQuery.Eof do
    begin
      resultSet.Add(AdoQuery.Fields[i].Value);
      i := i + 1;
      AdoQuery.Next;
    end;
  finally
    AdoQuery.Close;
    AdoQuery.Free;
  end;
end;
You are executing the command SELECT * FROM start_time_stamp 3,600 times, but it does not appear to me that it is correlated with your outer loop in any way. Why not execute it once before the loop?
That SELECT command appears to return only a single column of a single record, yet you use "*" to load all columns, and no WHERE clause to limit the results to a single row (if there's more than one row in the table).
You use only a limited number of columns from Measurements, but you retrieve all columns with "*".
You don't show the contents of DoSQlCommandWithResultSet, so it's not clear if there's a problem in that routine.
It's not clear whether the problem is in your database access or the string grid. Comment out all the lines pertaining to the string grid and run the program. How long does the database access alone take?
In addition to Larry Lustig's points:
In general, FieldByName is a comparatively slow method, and you are calling it in the loop for the same fields. Move the field lookups out of the loop and store the references in variables, like: InputTempField := ADODataSet.FieldByName('inputTemperature');
You are resizing the grid inside the loop with TestRunDataStringGrid.RowCount := TestRunDataStringGrid.RowCount + 1. This is a case where you should use ADODataSet.RecordCount before the loop instead: TestRunDataStringGrid.RowCount := ADODataSet.RecordCount.
It is good practice to call ADODataSet.DisableControls before the loop and ADODataSet.EnableControls after it. This matters even more for ADO datasets, whose implementation is not optimal, so these calls really help.
Depending on the DBMS you are using, you may be able to improve fetching performance by setting a larger "rowset size". I'm not sure how that's controlled in ADO; probably setting ADODataSet.CacheSize to a greater value will help. There are cursor settings too :) A sketch follows.
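A hedged sketch of those points combined (the CacheSize value of 500 is arbitrary; CacheSize maps to the underlying ADO Recordset's row cache):
// Sketch: larger ADO cache plus DisableControls around the loop.
ADODataSet.CacheSize := 500;  // rows fetched from the provider per batch; tune for your data
ADODataSet.Open;
ADODataSet.DisableControls;   // stop data-aware notifications during the loop
try
  TestRunDataStringGrid.RowCount := ADODataSet.RecordCount; // size the grid once
  while not ADODataSet.Eof do
  begin
    // ... fill the grid cells using cached TField references ...
    ADODataSet.Next;
  end;
finally
  ADODataSet.EnableControls;
end;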
Instead of calling ADODataSet.FieldByName('FieldName') inside the loop, you should declare a local variable of type TField for each field, assign ADODataSet.FindField('FieldName') to each variable, and use the variables inside the loop. FieldByName searches a list with every call.
Update:
procedure TForm1.Button1Click(Sender: TObject);
var
  InputTemp, OutputTemp: TField;
begin
  ADODataSet := TADODataSet.Create(nil);
  try
    ADODataSet.Connection := ADOConnection;
    ADODataSet.CommandText := 'SELECT * FROM measurements';
    ADODataSet.Open;
    InputTemp := ADODataSet.FindField('inputTemperature');
    OutputTemp := ADODataSet.FindField('outputTemperature');
    // assign more fields here
    while not ADODataSet.Eof do
    begin
      // do something with the fields, for example:
      // GridCell := Format('%3.2f', [InputTemp.AsFloat]);
      // GridCell := InputTemp.AsString;
      ADODataSet.Next;
    end;
  finally
    ADODataSet.Free;
  end;
end;
Another option would be to drop the TADODataSet component on the form (or use a TDataModule) and define the fields at design time.
In addition to Larry Lustig's answer, consider using data-aware controls instead, like the TDBGrid component.
If you aren't using data-aware controls, you should call TestRunDataStringGrid.BeginUpdate before the loop and TestRunDataStringGrid.EndUpdate after it. Without this, your grid is constantly redrawing after each modification (adding a new row, updating a cell).
Another tip is to set AdoQuery.LockType := ltReadOnly before opening the query.
You could also try an instrumenting profiler instead of a sampling profiler to get better results: sampling profilers miss a lot of detail, and most of the time they take fewer than 1000 samples per second, and 1000 is already low; that's only good for a quick overview.
Instrumenting profilers:
AQTime (commercial)
AsmProfiler (open source): http://code.google.com/p/asmprofiler/wiki/AsmProfilerInstrumentingMode

Delphi: Problems with TList of Frames

I'm having a problem with an interface that consists of a number of frames (normally 25) within a TScrollBox.
There are 2 problems, and I am hoping that one is a consequence of the other...
Background:
When the application starts up, I create 25 frames, each containing approx. 20 controls, which are then populated with the default information. The user can then click on a control to limit the search to a subset of the information, at which point I free and recreate my frames (as the search may return fewer than 25 records).
The problem:
If I quit the application after the initial search, it takes approx. 5 seconds to return to Delphi. After the second search (and the dispose/recreate of the frames), it takes approx. 20 seconds.
While I could rewrite the application to only create the frames once, I would like to understand what is going on.
Here is my create routine:
procedure TMF.CreateFrame(i: Integer; var FrameBottom: Integer);
var
  NewFrame: TSF;
begin
  NewFrame := TSF.Create(Self);
  NewFrame.Name := 'SF' + IntToStr(i);
  if i = 0 then
    NewFrame.Top := 8
  else
    NewFrame.Top := FrameBottom + 8;
  FrameBottom := NewFrame.Top + NewFrame.Height;
  NewFrame.Parent := ScrollBox1;
  FrameList.Add(NewFrame);
end;
And here is my delete routine:
procedure TMF.ClearFrames;
var
  i: Integer;
  SF: TSF;
begin
  for i := 0 to MF.FrameList.Count - 1 do
  begin
    SF := FrameList[i];
    SF.Free;
  end;
  FrameList.Clear;
end;
What am I missing?
As you are taking control of the frames' memory by Free'ing them yourself, there's no need to pass Self as the owner parameter to the constructor. Pass nil instead, to prevent the owner also trying to free the frame.
Also, I don't like the look of your ClearFrames routine. Try this instead:
while FrameList.Count > 0 do
begin
  TSF(FrameList[0]).Free;
  FrameList.Delete(0);
end;
FrameList.Clear;
If you want to know why your app is taking so long to do something, try profiling it, for example by running Sampling Profiler against your program. The help file explains how to limit the profiling to a specific section of your app, which you could use to get sampling results on just the clearing or creating parts. This should show you where you're actually spending most of your time and take a lot of the guesswork out of it.
