I have binary data that needs to be stored in a BLOB field in a SQL database.
In the case of an UPDATE (storing into the database), the binary data arrives as a string (BDS2006, no Unicode).
When the BLOB field is READ, the binary data needs to be returned as a string.
Therefore, I have used these two pieces of code (qry is a TQuery):
READ:
var
  s: string;
begin
  qry.SQL.Text := 'SELECT BlobField FROM Table WHERE ID=xxx';
  qry.Open;
  if qry.RecordCount > 0 then
  begin
    qry.First;
    s := qry.FieldByName('BlobField').AsString;
  end;
end;
UPDATE:
var
  s: string;
begin
  s := ...binary data...
  qry.SQL.Text := 'UPDATE Table SET BlobField=:blobparam WHERE ID=xxx';
  qry.ParamByName('blobparam').AsBlob := s;
  qry.ExecSQL;
end;
I'm not sure if that's the right/good/ok way to do it, but it has worked fine for a couple of years.
Now there is a problem with a specific set of binary data, which after being UPDATE'd into the database and then READ from the database is changed/corrupted.
When comparing the param value before ExecSQL with the value of s after reading, the last byte of the data (1519 bytes total in this case) is changed from 02h to 00h.
Since I am not sure whether my code works correctly, I have tried using TBlobStream to check whether the results change.
READ:
var
  s: string;
  bs: TStream;
  st: TStringStream;
begin
  qry.SQL.Text := 'SELECT BlobField FROM Table WHERE ID=xxx';
  qry.Open;
  if qry.RecordCount > 0 then
  begin
    qry.First;
    st := TStringStream.Create('');
    try
      bs := qry.CreateBlobStream(qry.FieldByName('BlobField'), bmRead);
      try
        bs.Position := 0;
        st.CopyFrom(bs, bs.Size);
      finally
        bs.Free;
      end;
      st.Position := 0;
      s := st.ReadString(st.Size);
    finally
      st.Free;
    end;
  end;
end;
UPDATE:
var
  s: string;
  st: TStringStream;
begin
  s := ...binary data...
  st := TStringStream.Create(s);
  try
    st.Position := 0;
    qry.SQL.Text := 'UPDATE Table SET BlobField=:blobparam WHERE ID=xxx';
    qry.ParamByName('blobparam').LoadFromStream(st, ftBlob);
    qry.ExecSQL;
  finally
    st.Free;
  end;
end;
The result is the same, the last byte of the read data is corrupted.
What could be my problem?
EDIT:
Using only streams produces the same problem.
I found that this only happens if the data is exactly 1519 bytes long. Then, and only then, the last byte is set to 0, no matter what it was before. Of course there might be other cases that trigger the problem, but this is one I can reproduce every time.
If I add one more byte to the end, making it 1520 bytes, everything works fine.
I just don't see anything special here that could cause it.
I agree with Gerry that the trailing NULL looks like a string problem.
Your modified code still writes the data using TStringStream. Have you tried writing the data using a TBlobStream, and seeing if that makes a difference?
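Something like this, for instance (a sketch only, untested; it assumes the row has already been located and that the query is editable, e.g. RequestLive set on the TQuery):
var
  bs: TStream;
begin
  qry.Edit;
  bs := qry.CreateBlobStream(qry.FieldByName('BlobField'), bmWrite);
  try
    if Length(s) > 0 then
      bs.WriteBuffer(s[1], Length(s)); // write the raw bytes of the (Ansi)string
  finally
    bs.Free;
  end;
  qry.Post;
end;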
Alternatively, add some packing bytes at the end of the problem data, to check if it is related to a specific size/boundary issue. Or try replacing the problem data with a fixed test pattern, to narrow the problem down.
FWIW I have used blobs without problem for a long time, but have never treated them as strings.
Good luck narrowing the issue down.
UPDATE: looks to me like your code is fine, but you are running into somebody else's bug somewhere in the database/data access software. What database/driver/access code are you using?
I'm using a Firebird 2.5 server to write to a database file (BD.fbd). My Delphi XE8 project has a data module (DMDados) with:
SQLConnection (conexao)
TSQLQuery1 (QueryBDPortico_Inicial) + TDataSetProvider1 (DSP_BDPortico_Inicial) + TClientDataSet1 (cdsBDPortico_Inicial)
TSQLQuery2 (QueryConsulta) (just for executing SQL strings)
My database file has this table:
PORTICO_INICIAL
The table has these fields (all integer):
NPORTICO
ELEMENTO
ID
None of those fields is a primary key, because I will have repeated values in some cases. The connection to the file is OK. The client dataset is open when I run the code, and QueryConsulta is opened when needed.
My code, triggered by a button, has to delete all the table's records (if any exist) and then fill the table with integer numbers created by a loop.
On the first try the code works just fine, but when I press the button a second time I get the error 'Unable to find record. No key specified', and when I then check the records the table is empty.
I tried changing the ProviderFlags of my query, but that made no difference. I checked the field names, the table name and the SQL text, but found nothing.
My suspicion is that when my code deletes the records, the old values stay in memory, so when ApplyUpdates is called with the new values the database uses the old values to find the new record's place, causing this error.
procedure monta_portico();
var
  I, K, L, M: Integer;
begin
  with DMDados do
  begin
    QueryConsulta.SQL.Text := 'DELETE FROM PORTICO_INICIAL;';
    QueryConsulta.ExecSQL();
    K := 1;
    for I := 1 to 10 do
    begin
      L := I * 100;
      for M := 1 to 3 do
      begin
        cdsBDPortico_Inicial.Insert;
        cdsBDPortico_Inicial.FieldByName('NPORTICO').AsInteger := M + L;
        cdsBDPortico_Inicial.FieldByName('ELEMENTO').AsInteger := M;
        cdsBDPortico_Inicial.ApplyUpdates(0);
        K := K + 1;
      end;
    end;
  end;
end;
I want that every time I use the code above, it first deletes all records in the table and then fills it again with the loop.
When I use the code for the first time it does what I want, but the second time it just deletes the records and cannot fill the table with the values.
Update I've added some example code below. Also, when I wrote the original version of this answer, I'd forgotten that one of the TDataSetProvider Options is poAllowMultiRecordUpdates, but I'm not sure that's involved in your problem.
The error message Unable to find record. No key specified is generated by the DataSetProvider, so it isn't directly connected to your
QueryConsulta.SQL.Text := 'DELETE FROM PORTICO_INICIAL;'
because that bypasses the DataSetProvider. The error is coming from a failed attempt to ApplyUpdates on the CDS. Try changing your call to it to
Assert(cdsBDPortico_Inicial.ApplyUpdates(0) = 0);
That will show you when the error occurs, because the return value of ApplyUpdates is the number of errors that occurred during the call.
You say
will have repeated values in some cases
If that's true when the problem occurs, it's because you are hitting a fundamental limitation of the way a DataSetProvider works. To apply the updates to the source dataset, it has to generate SQL to send back to the source dataset (TSqlQuery1) which uniquely identifies the row to update in the source data, and that is impossible if the source dataset contains duplicated rows.
Basically, you need to re-think your code so that the source dataset rows are all unique. Once you've done that, setting the DSP's UpdateMode to upWhereAll should avoid the problem. It would be best for the source dataset to have a primary key, of course.
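For example (a one-line sketch using the component name from your q; set it at design time or anywhere before ApplyUpdates is called):
// upWhereAll makes the generated WHERE clause compare every field, not just a key
DSP_BDPortico_Inicial.UpdateMode := upWhereAll;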
A quick work-around would be to use CDS.Locate in the loop where you insert the records, to see if it can locate an already-existing record with the values you're about to add.
Btw, sorry for raising the point about the ProviderFlags. It's irrelevant if there are duplicated rows, because whatever they are set to, the DSP will still fail to update a single record.
In case it helps, here is some code which might help populate your table
in a way which avoids getting duplicates. It only populates the first two
columns, as in the code you show in your q.
function RowExists(ADataSet: TDataSet; FieldNames: String; Values: Variant): Boolean;
begin
  Result := ADataSet.Locate(FieldNames, Values, []);
end;

procedure TForm1.PopulateTable;
var
  Int1,
  Int2: Integer;
  i: Integer;
  RowData: Variant;
begin
  CDS1.IndexFieldNames := 'Int1;Int2';
  for i := 1 to 100 do begin
    Int1 := Round(Random(100));
    Int2 := Round(Random(100));
    RowData := VarArrayOf([Int1, Int2]);
    if not RowExists(CDS1, 'Int1;Int2', RowData) then
      CDS1.InsertRecord([Int1, Int2]);
  end;
  CDS1.First;
  Assert(CDS1.ApplyUpdates(0) = 0);
end;
Split the problem into small parts using functions and procedures.
Create an instance of TSQLQuery, execute the SQL statements, and destroy the instance when you finish with it...
procedure DeleteAll;
var
  Qry: TSQLQuery;
begin
  Qry := TSQLQuery.Create(nil);
  try
    Qry.SQLConnection := DMDados.conexao;
    Qry.SQL.Text := 'DELETE FROM PORTICO_INICIAL;';
    Qry.ExecSQL;
  finally
    Qry.Free;
  end;
end;
You can even execute it directly from TSQLConnection in one line...
DMDados.conexao.ExecuteDirect('DELETE FROM PORTICO_INICIAL;')
procedure monta_portico();
var
  I, K, L, M: Integer;
begin
  with DMDados do
  begin
    DeleteAll;
    K := 1;
    for I := 1 to 10 do
    begin
      L := I * 100;
      for M := 1 to 3 do
      begin
        cdsBDPortico_Inicial.Insert;
        cdsBDPortico_Inicial.FieldByName('NPORTICO').AsInteger := M + L;
        cdsBDPortico_Inicial.FieldByName('ELEMENTO').AsInteger := M;
        cdsBDPortico_Inicial.ApplyUpdates(0);
        K := K + 1;
      end;
    end;
  end;
end;
Just a few observations; the primary answers have already been given, but they don't deal with the secondary problems.
cdsBDPortico_Inicial.FieldByName('NPORTICO').AsInteger := M+L;
FieldByName is a slow function: it does a linear search over an array of objects, with an uppercased string comparison against each one. You had better call it only once per field, rather than over and over inside the loop.
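A sketch of the idea, using the field names from the question (the lookups are hoisted out of the loop; Post is used here rather than a per-row ApplyUpdates - see the next point):
var
  fldNPortico, fldElemento: TField;
  I, M: Integer;
begin
  fldNPortico := cdsBDPortico_Inicial.FieldByName('NPORTICO'); // looked up once
  fldElemento := cdsBDPortico_Inicial.FieldByName('ELEMENTO');
  for I := 1 to 10 do
    for M := 1 to 3 do
    begin
      cdsBDPortico_Inicial.Insert;
      fldNPortico.AsInteger := M + I * 100;
      fldElemento.AsInteger := M;
      cdsBDPortico_Inicial.Post;
    end;
end;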
cdsBDPortico_Inicial.ApplyUpdates(0);
Again, applying updates is relatively slow - it requires a roundtrip to the server through the internal guts of the DataSnap library - so why do it so often?
BTW, you delete rows from the SQL table - but where do you delete rows from cdsBDPortico_Inicial? I do not see that code.
Were I in your shoes, I would write something like this (granted, I am not a big fan of DataSnap and CDS):
procedure monta_portico();
var
  Qry: TSqlQuery;
  _p_EL, _p_NP: TParam;
  Tra: TDBXTransaction;
  I, K, L, M: Integer;
begin
  Tra := nil;
  Qry := TSqlQuery.Create(DMDados.conexao); // this way the query would have an owner
  try // thus even if I screw up and forget to free it - someone eventually would
    Qry.SqlConnection := DMDados.conexao;
    Tra := Qry.SqlConnection.BeginTransaction;
    // think about making a special function that would create a query
    // and set some of its properties - like connection, transaction, preparation, etc -
    // so you would not repeat yourself again and again, risking mistyping
    Qry.Sql.Text := 'DELETE FROM PORTICO_INICIAL'; // you do not need ';' for one statement, it is not a script, not a PSQL block here
    Qry.ExecSql;
    Qry.Sql.Text := 'INSERT INTO PORTICO_INICIAL(NPORTICO,ELEMENTO) '
      + 'VALUES (:NP,:EL)';
    Qry.Prepared := True;
    _p_EL := Qry.ParamByName('EL'); // cache objects, do not repeat linear searches
    _p_NP := Qry.ParamByName('NP'); // for simple queries you can even do ... := Qry.Params[0]
    K := 1;
    for I := 1 to 10 do
    begin
      L := I * 100;
      for M := 1 to 3 do
      begin
        _p_NP.AsInteger := M + L;
        _p_EL.AsInteger := M;
        Qry.ExecSQL;
        Inc(K); // why? you seem to never use it
      end;
    end;
    Qry.SqlConnection.CommitFreeAndNil(Tra);
  finally
    if nil <> Tra then Qry.SqlConnection.RollbackFreeAndNil(Tra);
    Qry.Destroy;
  end;
end;
This procedure does not populate cdsBDPortico_Inicial - but do you really need it?
If you do - maybe you can re-read it from the database: there could be other programs that added rows into the table too.
Or you can insert many rows and then apply them all in one go, before committing the transaction (often abbreviated tx), but even then, do not call FieldByName more than once.
Also, think through the logical blocks of your program's work in advance - those very transactions, temporary TSQLQuery objects, etc.
However boring and tedious it is now, you will bring yourself many more spaghetti troubles if you don't. Adding this logic retroactively, after you have many small functions calling one another in unpredictable order, is very hard.
Also, if you make the Firebird server auto-assign the ID field (and your program does not need any special values in ID and will be OK with Firebird-made values), then the following command might serve you yet better: INSERT INTO PORTICO_INICIAL(NPORTICO,ELEMENTO) VALUES (:NP,:EL) RETURNING ID
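A sketch of how that might be consumed from dbExpress (untested; with Firebird an INSERT ... RETURNING behaves like a singleton SELECT, so the query is opened rather than ExecSQL'd - check that your driver supports this):
var
  NewID: Integer;
begin
  Qry.SQL.Text := 'INSERT INTO PORTICO_INICIAL(NPORTICO,ELEMENTO) '
    + 'VALUES (:NP,:EL) RETURNING ID';
  Qry.ParamByName('NP').AsInteger := 101;
  Qry.ParamByName('EL').AsInteger := 1;
  Qry.Open;                         // the insert executes and returns one row
  NewID := Qry.Fields[0].AsInteger; // the server-assigned ID
  Qry.Close;
end;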
I want to read an entire table from an MS Access file, and I'm trying to do it as fast as possible. When testing a big sample I found that the loop counter increases faster while reading the top records than the last records of the table. Here's a sample code that demonstrates this:
procedure TForm1.Button1Click(Sender: TObject);
const
  MaxRecords = 40000;
  Step = 5000;
var
  I, J: Integer;
  Table: TADOTable;
  T: Cardinal;
  Ts: TCardinalDynArray;
begin
  Table := TADOTable.Create(nil);
  Table.ConnectionString :=
    'Provider=Microsoft.ACE.OLEDB.12.0;'+
    'Data Source=BigMDB.accdb;'+
    'Mode=Read|Share Deny Read|Share Deny Write;'+
    'Persist Security Info=False';
  Table.TableName := 'Table1';
  Table.Open;
  J := 0;
  SetLength(Ts, MaxRecords div Step);
  T := GetTickCount;
  for I := 1 to MaxRecords do
  begin
    Table.Next;
    if ((I mod Step) = 0) then
    begin
      T := GetTickCount - T;
      Ts[J] := T;
      Inc(J);
      T := GetTickCount;
    end;
  end;
  Table.Free;
//  Chart1.SeriesList[0].Clear;
//  for I := 0 to Length(Ts) - 1 do
//  begin
//    Chart1.SeriesList[0].Add(Ts[I]/1000, Format(
//      'Records: %s %d-%d %s Duration:%f s',
//      [#13, I * Step, (I + 1)*Step, #13, Ts[I]/1000]));
//  end;
end;
And the result on my PC (chart omitted; it showed each successive chunk of records taking longer to read than the previous one).
The table has two string fields, one double and one integer. It has no primary key or index field. Why does this happen, and how can I prevent it?
I can reproduce your results using an AdoQuery with an MS SQL Server dataset of similar size to yours.
However, after doing a bit of line-profiling, I think I've found the answer, and it's slightly counter-intuitive. I'm sure everyone who does DB programming in Delphi is used to the idea that looping through a dataset tends to be much quicker if you surround the loop with calls to Disable/EnableControls. But who would bother to do that if there are no db-aware controls attached to the dataset?
Well, it turns out that in your situation, even though there are no DB-aware controls, the speed increases hugely if you use Disable/EnableControls regardless.
The reason is that TCustomADODataSet.InternalGetRecord in AdoDB.Pas contains this:
if ControlsDisabled then
  RecordNumber := -2
else
  RecordNumber := Recordset.AbsolutePosition;
and according to my line profiler, the while not AdoQuery1.Eof do AdoQuery1.Next loop spends 98.8% (!) of its time executing the assignment
RecordNumber := Recordset.AbsolutePosition;
The calculation of Recordset.AbsolutePosition is hidden, of course, on the "wrong side" of the Recordset interface, but the fact that the time to call it apparently increases the further you go into the recordset makes it reasonable, imo, to speculate that it's calculated by counting from the start of the recordset's data.
Of course, ControlsDisabled returns true if DisableControls has been called and not undone by a call to EnableControls. So, retest with the loop surrounded by Disable/EnableControls and hopefully you'll get a similar result to mine. It looks like you were right that the slowdown isn't related to memory allocations.
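For reference, the canonical idiom looks like this (a sketch; the try/finally guarantees the controls are re-enabled even if the loop raises an exception):
AdoQuery1.DisableControls;
try
  while not AdoQuery1.Eof do
  begin
    // ... process the current record ...
    AdoQuery1.Next;
  end;
finally
  AdoQuery1.EnableControls;
end;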
Using the following code:
procedure TForm1.btnLoopClick(Sender: TObject);
var
  I: Integer;
  T: Integer;
  Step: Integer;
begin
  Memo1.Lines.BeginUpdate;
  I := 0;
  Step := 4000;
  if cbDisableControls.Checked then
    AdoQuery1.DisableControls;
  T := GetTickCount;
{.$define UseRecordSet}
{$ifdef UseRecordSet}
  while not AdoQuery1.Recordset.Eof do begin
    AdoQuery1.Recordset.MoveNext;
    Inc(I);
    if I mod Step = 0 then begin
      T := GetTickCount - T;
      Memo1.Lines.Add(IntToStr(I) + ':' + IntToStr(T));
      T := GetTickCount;
    end;
  end;
{$else}
  while not AdoQuery1.Eof do begin
    AdoQuery1.Next;
    Inc(I);
    if I mod Step = 0 then begin
      T := GetTickCount - T;
      Memo1.Lines.Add(IntToStr(I) + ':' + IntToStr(T));
      T := GetTickCount;
    end;
  end;
{$endif}
  if cbDisableControls.Checked then
    AdoQuery1.EnableControls;
  Memo1.Lines.EndUpdate;
end;
I get the following results (with DisableControls not called except where noted):
Using CursorLocation = clUseClient
AdoQuery.Next      AdoQuery.RecordSet.MoveNext      AdoQuery.Next + DisableControls
4000:157 4000:16 4000:15
8000:453 8000:16 8000:15
12000:687 12000:0 12000:32
16000:969 16000:15 16000:31
20000:1250 20000:16 20000:31
24000:1500 24000:0 24000:16
28000:1703 28000:15 28000:31
32000:1891 32000:16 32000:31
36000:2187 36000:16 36000:16
40000:2438 40000:0 40000:15
44000:2703 44000:15 44000:31
48000:3203 48000:16 48000:32
=======================================
Using CursorLocation = clUseServer
AdoQuery.Next      AdoQuery.RecordSet.MoveNext      AdoQuery.Next + DisableControls
4000:1031 4000:454 4000:563
8000:1016 8000:468 8000:562
12000:1047 12000:469 12000:500
16000:1234 16000:484 16000:532
20000:1047 20000:454 20000:546
24000:1063 24000:484 24000:547
28000:984 28000:531 28000:563
32000:906 32000:485 32000:500
36000:1016 36000:531 36000:578
40000:1000 40000:547 40000:500
44000:968 44000:406 44000:562
48000:1016 48000:375 48000:547
Calling AdoQuery1.Recordset.MoveNext calls directly into the MDAC/ADO layer, of course, whereas AdoQuery1.Next involves all the overhead of the standard TDataSet model. As Serge Kraikov said, changing the CursorLocation certainly makes a difference and doesn't exhibit the slowdown we noticed, though obviously it's significantly slower than using clUseClient and calling DisableControls. I suppose it depends on exactly what you're trying to do whether you can take advantage of the extra speed of using clUseClient with RecordSet.MoveNext.
When you open a table, the ADO dataset internally creates special data structures to navigate the dataset forwards/backwards - a "dataset cursor". During navigation, ADO stores the list of already-visited records to provide bidirectional navigation.
It seems the ADO cursor code uses a quadratic-time O(n^2) algorithm to store this list.
But there is a workaround - use a server-side cursor:
Table.CursorLocation := clUseServer;
I tested your code using this fix and got linear fetch time - fetching each next chunk of records takes the same time as the previous one.
PS: Some other data access libraries provide special "unidirectional" datasets - these datasets can only traverse forwards and don't even store the records already traversed - so you get constant memory consumption and linear fetch time.
DAO is native to Access and (IMHO) is typically faster.
Whether or not you switch, use the GetRows method. Both DAO and ADO support it.
There is no looping. You can dump the entire recordset into an array with a couple of lines of code. Air code:
yourrecordset.MoveLast
yourrecordset.MoveFirst
yourarray = yourrecordset.GetRows(yourrecordset.RecordCount)
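That snippet is VBA-flavoured air code; a rough Delphi equivalent might look like this (untested - in Delphi's ADO import the optional parameters must be passed explicitly as EmptyParam, and the result is a two-dimensional variant array indexed [field, record]):
var
  Rows: OleVariant;
begin
  // requires Variants (for EmptyParam) and ADODB in the uses clause
  AdoTable1.Recordset.MoveLast;  // force an accurate RecordCount
  AdoTable1.Recordset.MoveFirst;
  Rows := AdoTable1.Recordset.GetRows(
    AdoTable1.Recordset.RecordCount, EmptyParam, EmptyParam);
end;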
I need the RecordCount of a TADOTable before opening it. So I run this query:
ADOQuery1.SQL.Text := 'SELECT Count(*) AS Record_Count FROM Table1';
ADOQuery1.Open;
But this takes 9.7 s (average of 10 runs) on my PC for a table of over 5 million records. That time is nothing compared to even touching a table of this size, but I think this kind of data should be stored somewhere in the .mdb file. Is there any other way to get the RecordCount of a table directly from the file?
I'm using Delphi XE5 and the file is in Microsoft Access 2003 format.
No, SELECT Count(*) is the only and correct way to do it.
But: almost 10s for this query is awfully slow, especially if this is on your local PC.
Does the table have a Primary Key? If yes, what data type is it?
I created a simple table with an Autonumber Primary Key and inserted 5 million records.
SELECT COUNT(*) FROM tBig takes less than 1/10 of a second to execute.
Well, I searched a little more and found it: you can read the number of records in a table directly from the schema:
function ReadRecordCountFromSchema(const Connection: TADOConnection;
  const TableName: string): Integer;
var
  CardinalityField,
  NameField: TField;
  DataSet: TADODataSet;
begin
  DataSet := TADODataSet.Create(nil);
  Result := -1;
  try
    Connection.OpenSchema(siStatistics, EmptyParam, EmptyParam, DataSet);
    CardinalityField := DataSet.FieldByName('CARDINALITY'); { do not localize }
    NameField := DataSet.FieldByName('TABLE_NAME'); { do not localize }
    while not DataSet.EOF do
    begin
      if CompareText(NameField.AsString, TableName) = 0 then
      begin
        Result := CardinalityField.AsInteger;
        Break;
      end;
      DataSet.Next;
    end;
  finally
    DataSet.Free;
  end;
end;
Actually TADOConnection.GetTableNames gave me the idea.
Edit:
As Andre451 mentioned, it's better to pass TableName to OpenSchema as a restriction:
function ReadRecordCountFromSchema(const Connection: TADOConnection;
  const TableName: string): Integer;
var
  DataSet: TADODataSet;
begin
  DataSet := TADODataSet.Create(nil);
  Result := -1;
  try
    Connection.OpenSchema(siStatistics,
      VarArrayOf([Unassigned, Unassigned, TableName]),
      EmptyParam, DataSet);
    if DataSet.RecordCount > 0 then
      Result := DataSet.FieldByName('CARDINALITY').AsInteger;
  finally
    DataSet.Free;
  end;
end;
I agree with Andre451. The only other thing I would say is that you should specify the field in the Count statement instead of using the *.
SELECT COUNT(PrimaryKey) FROM tBig; would do it.
Today I met a very strange bug.
I have the following code:
var
  i: integer;
...
for i := 0 to FileNames.Count - 1 do
begin
  ShowMessage(IntToStr(i) + ' from ' + IntToStr(FileNames.Count - 1));
  FileName := FileNames[i];
  ...
end;
ShowMessage('all');
The FileNames list has one element, so I expect the loop to execute once and to see
0 from 0
all
It is a thing I have done thousands of times :).
But in this case I see a second loop iteration when code optimization is switched on:
0 from 0
1 from 0
all
Without code optimization the loop iterates correctly.
For the moment I don't even know which direction to take with this issue (and the upper loop bound stays unchanged, yes).
So any suggestion will be very helpful. Thanks.
I am using the Delphi 2005 (Update 2) compiler.
Considering the QC report referred to by LU RD, and my own experience with D2005, here are a few workarounds. I recall using the while-loop solution myself.
1. Rewrite the for loop as a while loop:
var
  i: integer;
begin
  i := 0;
  while i < FileNames.Count do
  begin
    ...
    inc(i);
  end;
end;
2. Leave the for-loop control variable alone, free of any other processing, and use a separate variable, which you increment in the loop, for string manipulation and FileNames indexing:
var
  ctrl, indx: integer;
begin
  indx := 0;
  for ctrl := 0 to FileNames.Count - 1 do
  begin
    // use indx for string manipulation and FileNames indexing
    inc(indx);
  end;
end;
3. You hinted at a workaround yourself in saying that without code optimization the loop iterates correctly.
Assuming you have optimization on, turn it off with {$O-} before the procedure/function and back on with {$O+} after it, as sketched below. Note: the optimization directive can only be applied around whole procedures/functions.
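A sketch of that third workaround (the routine name is a placeholder; FileNames and FileName are from the question):
{$O-} // optimization off for the whole routine
procedure ProcessFileNames;
var
  i: integer;
begin
  for i := 0 to FileNames.Count - 1 do
  begin
    FileName := FileNames[i];
    // ...
  end;
end;
{$O+} // back on for the rest of the unit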
OK, it seems to me that I have solved the problem and can explain it.
Unfortunately, I cannot build a test that reproduces the bug, and I cannot show the real code, which is under NDA. So I must use a simplified example again.
The problem is in a DLL which is used by my code. Consider the following data structure:
type
  TData = packed record
    Count: integer;
  end;
  TPData = ^TData;
and a function defined in the DLL:
Calc: function(Data: TPData): integer; stdcall;
In my code I try to process data records which are taken from a list (TList):
var
  i: integer;
  Data: TData;
begin
  for i := 0 to List.Count - 1 do
  begin
    Data := TPData(List[i])^;
    Calc(@Data);
  end;
and in the case when optimization is on, I see a second iteration in the loop from 0 to 0.
If I rewrite the code as
var
  i: integer;
  Data, Data2: TData;
begin
  for i := 0 to List.Count - 1 do
  begin
    Data := TPData(List[i])^;
    Data2 := TPData(List[i])^;
    Calc(@Data2);
  end;
everything works as expected.
The DLL itself was developed by another programmer, so I asked him to take care of it.
What was unexpected for me is that a local procedure's stack can be corrupted in such an unusual way, without access violations or other similar errors (presumably the DLL writes more than SizeOf(TData) bytes, trampling whatever happens to live next to Data on the stack). BTW, the Data and Data2 variables contain correct values.
Maybe my experience will be useful to someone. Thanks to all who helped me, and please excuse my mistakes.
I want to wipe a file that the user selected in my program. I wrote this sample code:
var
  aFile: TFileStream;
const
  FileAddr: String = 'H:\Akon.mp3';
  Buf: Byte = 0;
begin
  if FileExists(FileAddr) then
  begin
    // Open the given file in a file stream & rewrite it
    aFile := TFileStream.Create(FileAddr, fmOpenReadWrite);
    try
      aFile.Seek(0, soFromBeginning);
      while aFile.Position <> aFile.Size do
        aFile.Write(Buf, 1);
    finally
      aFile.Free;
      ShowMessage('Finish');
    end;
  end;
end;
As you can see, I overwrite the given file with 0 (null) bytes. This code works correctly, but it is very slow on large files. I would like to do this in multithreaded code, but I tried some test code and could not get it working; for example, creating 4 threads that share this work to speed the process up.
Is there any way to speed up this process?
I don't know if it helps you, but I think you could do better (than multithreading) by writing a larger buffer to the file.
For example, you could allocate a buffer 16 KB wide and write it directly to the FileStream; you only have to take care with the last part of the file, for which you write just a portion of the full buffer.
Believe me, it will be really faster...
OK, I'll bite:
const
  FileAddr: String = 'H:\Akon.mp3';
var
  aFile: TFileStream;
  Buf: array[0..1023] of Byte;
  Remaining, NumBytes: Integer;
begin
  if FileExists(FileAddr) then
  begin
    // Open given file in file stream & rewrite it
    aFile := TFileStream.Create(FileAddr, fmOpenReadWrite);
    try
      FillChar(Buf, SizeOf(Buf), 0);
      Remaining := aFile.Size;
      while Remaining > 0 do begin
        NumBytes := SizeOf(Buf);
        if NumBytes > Remaining then // clamp the final write to the bytes remaining
          NumBytes := Remaining;
        aFile.WriteBuffer(Buf, NumBytes);
        Dec(Remaining, NumBytes);
      end;
    finally
      aFile.Free;
      ShowMessage('Finish');
    end;
  end;
end;
Multiple threads won't help you here. Your constraint is disk access, primarily because you're writing only 1 byte at a time.
Declare Buf as an array of bytes, and initialise it with FillChar or ZeroMemory. Then change your while loop as follows:
while ((aFile.Position + SizeOf(Buf)) < aFile.Size) do
begin
  aFile.Write(Buf, SizeOf(Buf));
end;
if (aFile.Position < aFile.Size) then
begin
  aFile.Write(Buf, aFile.Size - aFile.Position);
end;
You should learn from the slowness of the above code that:
1. Writing one byte at a time is the slowest and worst way to do it; you incur a huge overhead and reduce your overall performance.
2. Even writing 1k (1024 bytes) at a time would be a vast improvement, and writing a larger amount will be faster still, until you reach the point of diminishing returns, which I would guess lies somewhere between a 200k and 500k write buffer. The only way to find out when it stops mattering for your application is to test, test, and test.
3. Checking Position against Size so often is completely superfluous. If you read the size once and then write the correct number of bytes, keeping count in a local variable, you will save yourself more overhead and improve performance; i.e. use Inc(LPosition, BufSize) to increment a local variable LPosition: Integer by the buffer size BufSize (see the sketch below).
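Putting those three points together, a sketch might look like this (a 256 KB buffer is assumed here; the right size is whatever your own testing shows):
const
  BufSize = 256 * 1024;
var
  aFile: TFileStream;
  Buf: array of Byte;
  LPosition, LSize: Int64;
  NumBytes: Integer;
begin
  SetLength(Buf, BufSize);
  FillChar(Buf[0], BufSize, 0);
  aFile := TFileStream.Create(FileAddr, fmOpenReadWrite);
  try
    LSize := aFile.Size; // read the size once, before the loop
    LPosition := 0;
    while LPosition < LSize do
    begin
      NumBytes := BufSize;
      if NumBytes > LSize - LPosition then
        NumBytes := Integer(LSize - LPosition); // partial buffer for the tail
      aFile.WriteBuffer(Buf[0], NumBytes);
      Inc(LPosition, NumBytes);
    end;
  finally
    aFile.Free;
  end;
end;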
Not sure if this meets your requirements, but it works and it's fast.
var
  aFile: TFileStream;
const
  FileAddr: String = 'H:\Akon.mp3';
begin
  if FileExists(FileAddr) then
  begin
    aFile := TFileStream.Create(FileAddr, fmOpenReadWrite);
    try
      aFile.Size := 0; // truncating the stream discards the file's contents in one call
    finally
      aFile.Free;
      ShowMessage('Finish');
    end;
  end;
end;
So will something along these lines (declaring b outside the function will improve performance in the loop, especially when dealing with a large file). I assume that in the real app the filename would be a variable:
const
  b: Byte = 0;

procedure MyProc;
var
  aFile: TFileStream;
  Buf: array of Byte;
  len: Integer;
  FileAddr: String;
begin
  FileAddr := 'C:\testURL.txt';
  if FileExists(FileAddr) then
  begin
    // open for read/write; fmCreate would truncate the file to zero bytes
    // before its size could be read
    aFile := TFileStream.Create(FileAddr, fmOpenReadWrite);
    try
      len := aFile.Size;
      SetLength(Buf, len);
      FillChar(Buf[0], len, b);
      aFile.Position := 0;
      aFile.Write(Buf[0], len); // pass the array data, not the dynamic-array reference
    finally
      aFile.Free;
      ShowMessage('Finish');
    end;
  end;
end;