delphi TFileStream "out of memory" - delphi

I am having trouble with some Delphi code that uses TFileStream to read chunks of data from a file to a dynamic array. The original objective in writing the code is to compare the contents of two files that have the same size but potentially different date and time stamps to see if the contents are the same. This is done by reading the data from each file of the pair into separate dynamic arrays and comparing each byte of one array with the corresponding byte of the other.
The code makes multiple calls to TFileStream.Read. After about 75 calls, the program crashes with an 'Out of Memory' error.
It does not seem to matter how large the blocks of data being read are; it appears to be the number of calls that produces the error.
The code is a function I have written that is called elsewhere whenever the program encounters two files that it needs to compare (which, for reasons that I won't go into, could be forty or fifty different file pairs). The 'Out of Memory' error occurs whether a single file is being read in small blocks or multiple files are being read in their entirety. It seems to be the number of calls that determines the error.
While I realize that there might be more elegant ways of comparing the files than what I have shown below, what I would really like to know is what is wrong with my use of the TFileStream and/or SetLength calls that is causing the memory problems. I have tried freeing the memory after every call (as shown in the code) and it seems to make no difference.
I would be grateful if someone could explain what is going wrong.
function Compare_file_contents(SPN,TPN : String; SourceFileSize : int64) : boolean;
var
SF : TFileStream; //First file of pair for comparison
TF : TFileStream; //Second file of pair
SourceArray : TBytes; // Buffer array to receive first file data
TargetArray : TBytes; //Buffer array to receive second file data
ArrayLength : int64; //Length of dynamic array
Position : int64; //Position within files to start each block of data read
TestPosition : int64; //Position within dynamic arrays to compare each byte
MaxArrayLength : integer; //Maximum size for the buffer arrays
LastRun : Boolean; //End first repeat loop
begin
{ The comparison has an arbitrary upper boundary of 100 MB to avoid slowing the
the overall program. The main files bigger than this will be *.pst files that
will most likely have new dates every time the program is run, so it will take
about the same time to copy the files as it does to read and compare them, and
it will have to be done every time.
The function terminates when it is confirmed that the files are not the same.
If the source file is bigger than 100 MB, it is simply assumed that they are
not identical, thus Result = False. Also, LongInt integers (=integers) have
a range of -2147483648..2147483647, so files bigger than 2 GB will have
overflowed to a negative number. Hence the check to see if the file size is
less than zero.
The outer repeat ... until loop terminates on LastRun, but LastRun should only
be set if SecondLastRun is True, because it will skip the final comparisons in
the inner repeat ... until loop otherwise. }
Result := True;
LastRun := False;
MaxArrayLength := 1024*1024;
if (SourceFileSize > 100*1024*1024) or (SourceFileSize < 0) then Result := False
else
begin
{ The comparison is done by using TFileStream to open and read the data from
the source and target files as bytes to dynamic arrays (TBytes). Then a repeat
loop is used to compare individual bytes until a difference is found or all
of the information has been compared. If a difference is found, Result is
set to False. }
if SourceFileSize > MaxArrayLength then ArrayLength := MaxArrayLength
else ArrayLength := SourceFileSize;
SF := TFileStream.Create(SPN,fmOpenRead);
TF := TFileStream.Create(TPN,fmOpenRead);
Position := 0;
SetLength(SourceArray,ArrayLength);
SetLength(TargetArray,ArrayLength);
try
SF.Read(SourceArray,ArrayLength);
TF.Read(TargetArray,ArrayLength);
Position := SF.Position;
finally
SF.Free;
TF.Free;
end;
repeat
TestPosition := 0;
repeat
if SourceArray[TestPosition] <> TargetArray[TestPosition] then
Result := False;
Inc(TestPosition);
until (Result = False) or (TestPosition = ArrayLength);
if SourceFileSize > Position then
begin
if SourceFileSize - Position - MaxArrayLength > 0 then
ArrayLength := MaxArrayLength
else ArrayLength := SourceFileSize - Position;
SF := TFileStream.Create(SPN,fmOpenRead);
TF := TFileStream.Create(TPN,fmOpenRead);
SF.Position := Position;
TF.Position := Position;
try
SF.Read(SourceArray,ArrayLength);
TF.Read(TargetArray,ArrayLength);
Position := SF.Position;
finally
SF.Free;
TF.Free;
end;
end else LastRun := True;
until (Result = False) or LastRun;
Finalize(SourceArray);
Finalize(TargetArray);
end;
end; { Compare_file_contents }

This routine seems to be far more complicated than it needs to be. Rather than trying to debug it, I offer you my routine that compares streams.
function StreamsEqual(Stream1, Stream2: TStream): Boolean;
const
OneKB = 1024;
var
Buffer1, Buffer2: array [0..4*OneKB-1] of Byte;
SavePos1, SavePos2: Int64;
Count: Int64;
N: Integer;
begin
if Stream1.Size<>Stream2.Size then begin
Result := False;
exit;
end;
SavePos1 := Stream1.Position;
SavePos2 := Stream2.Position;
Try
Stream1.Position := 0;
Stream2.Position := 0;
Count := Stream1.Size;
while Count <> 0 do begin
N := Min(SizeOf(Buffer1), Count);
Stream1.ReadBuffer(Buffer1, N);
Stream2.ReadBuffer(Buffer2, N);
if not CompareMem(@Buffer1, @Buffer2, N) then begin
Result := False;
exit;
end;
dec(Count, N);
end;
Result := True;
Finally
Stream1.Position := SavePos1;
Stream2.Position := SavePos2;
End;
end;
If you wish to add your 100MB size check to this function, it's obvious where and how to do it.
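For instance, one obvious placement would be a small wrapper (a minimal sketch; the wrapper name and the 100 MB constant are taken from the question, not from my routine):

function StreamsEqualWithLimit(Stream1, Stream2: TStream): Boolean;
const
  MaxCompareSize = 100*1024*1024; // the same arbitrary 100 MB limit as in the question
begin
  // Treat oversized files as "not identical" and skip the byte comparison entirely
  if Stream1.Size > MaxCompareSize then
    Result := False
  else
    Result := StreamsEqual(Stream1, Stream2);
end;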
The routine above uses a stack allocated buffer. In contrast your version allocates on the heap. Perhaps your version leads to heap fragmentation.
I realise that this does not answer the direct question that you asked. However, it does solve your problem. I hope this proves useful.

Related

Delphi: fast stream to sha256 for large file (100 MB)

I have a function that generates the SHA256 hash for a stream. This function takes around 5 seconds for a 100 MB TFileStream. Any tips to make it faster?
function GetStreamToHashSHA256Hex(const Content: TStream): string;
const
//ChunkSize = $F000; // 61440
ChunkSize = 1024*1024; // 1 mb
var
aHashSHA2: THashSHA2;
aBytes: TBytes;
aBytesRead: Integer;
begin
aHashSHA2 := THashSHA2.create;
SetLength(aBytes, ChunkSize);
try
//Content.Seek(0, soBeginning);
Content.Position := 0;
repeat
aBytesRead := Content.Read(aBytes, ChunkSize);
if (aBytesRead = 0) then Break; // Done
aHashSHA2.Update(aBytes, aBytesRead);
until False;
//Content.Seek(0, soBeginning);
Content.Position := 0;
Result := aHashSHA2.HashAsString;
finally
aHashSHA2.Reset;
aBytes := nil;
end;
end;
There is nothing in this code that can be improved upon. You already read the file in large chunks. The only opportunity to improve performance is in the hash implementation itself.
In other words, you might try alternative hash implementations to see if others are faster. An obvious place to start is with those from the Synopse project.
You should also compare the performance of your code with that obtained using a respectable command line hash program. This will give you a feel for what sort of performance is attainable.
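If you want to put a number on your own routine before comparing, a minimal timing sketch using TStopwatch from System.Diagnostics might look like this (the console output and procedure name are placeholders; GetStreamToHashSHA256Hex is the function from the question):

uses
  System.Classes, System.SysUtils, System.Diagnostics;

procedure BenchmarkHash(const AFileName: string);
var
  FS: TFileStream;
  SW: TStopwatch;
  Hash: string;
begin
  FS := TFileStream.Create(AFileName, fmOpenRead or fmShareDenyWrite);
  try
    SW := TStopwatch.StartNew;                  // start timing
    Hash := GetStreamToHashSHA256Hex(FS);       // the function from the question
    SW.Stop;
    Writeln(Format('%s  (%d ms)', [Hash, SW.ElapsedMilliseconds]));
  finally
    FS.Free;
  end;
end;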

How to insert string at index in TMemoryStream?

How can I insert a string at a specified index in a TMemoryStream? For example, if you inserted the string
"a" into an existing string "b" at index 0, the existing content would move forward, giving "ab", and so on. This is what TStringBuilder.Insert does.
Expand the stream so that there is room for the text to be inserted. So, if the text to be inserted has length N, then you need to make the stream N bytes larger.
Copy all the existing content, starting from the insertion point, to the right to make room for the insertion. A call to Move will get this done. You'll be moving this text N bytes to the right.
Write the inserted string at the insertion point.
I'm assuming an 8 bit encoding. If you use a 16 bit encoding, then the stream needs to be grown by 2N bytes, and so on.
You will find that this is a potentially expensive operation. If you care about performance you will do whatever you can to avoid ever having to do this.
P.S. I'm sorry if I have offended any right-to-left language readers with my Anglo-centric assumption that strings run from left to right!
You asked for some code. Here it is:
procedure TMyStringBuilder.Insert(Index: Integer; const Str: string);
var
N: Integer;
P: PChar;
begin
N := Length(Str);
if N=0 then
exit;
FStream.Size := FStream.Size + N*SizeOf(Char);
P := PChar(FStream.Memory);
Move((P + Index)^, (P + Index + N)^, FStream.Size - (Index + N)*SizeOf(Char));
Move(Pointer(Str)^, (P + Index)^, N*SizeOf(Char));
end;
Note that I wrote this code, and then looked at the code in TStringBuilder. It's pretty much identical to that!
The fact that the code you end up writing for this operation is identical to that in TStringBuilder should cause you to stop and contemplate. It's very likely that this new string builder replacement class that you are building will end up with the same implementation as the original. It's highly likely that your replacement will perform no better than the original, and quite plausible that the performance of the replacement will be worse.
It looks a little to me as though you are optimising prematurely. According to your comments below you have not yet timed your code to prove that time spent in TStringBuilder methods is your bottleneck. That's really the first thing that you need to do.
Assuming that you do this timing, and prove that TStringBuilder methods are your bottleneck, you then need to identify why that code is performing below par. And then you need to work out how the code could be improved. Simply repeating the implementation of the original class is not going to yield any benefits.
To move the existing data to make room for the new string data, you can use pointer operations and the Move procedure for speed. But this only has to be done if the insertion index is lower than the size of the original stream. If the index is larger than the stream size then you could: (1) expand the stream size to accommodate the index and fill the extra room with zero values or spaces, or (2) reduce the index value to the stream size, so the string is simply appended at the end of the stream.
Depend on your needs, you could: (1) make a class derived from TMemoryStream, or (2) make a function to process an instance of TMemoryStream. Here's the first case:
type
TExtMemoryStream = class(TMemoryStream)
public
procedure InsertString(Index: Integer; const S: string);
end;
procedure TExtMemoryStream.InsertString(Index: Integer; const S: string);
var
SLength, OldSize: Integer;
Src, Dst, PointerToS: ^Char;
begin
if Index > Size then Index := Size;
SLength := Length(S);
OldSize := Size;
SetSize(Size + SLength);
Src := Memory; Inc(Src, Index);
Dst := Src; Inc(Dst, SLength);
Move(Src^, Dst^, OldSize - Index);
PointerToS := @S[1];
Move(PointerToS^, Src^, SLength);
end;
or the second case:
procedure InsertStringToMemoryStream(MS: TMemoryStream;
Index: Integer; const S: string);
var
SLength, OldSize: Integer;
Src, Dst, PointerToS: ^Char;
begin
if Index > MS.Size then Index := MS.Size;
SLength := Length(S);
OldSize := MS.Size;
MS.SetSize(MS.Size + SLength);
Src := MS.Memory; Inc(Src, Index);
Dst := Src; Inc(Dst, SLength);
Move(Src^, Dst^, OldSize - Index);
PointerToS := @S[1];
Move(PointerToS^, Src^, SLength);
end;
There, hope it helps :)

How to save / load this data?

I have a few items I need to save; they are all values for a map. Each time, a user can either load an old map or create and save a new one. But I am unsure how, or what process to use, to save all the data.
I have
TCube array (1..10000)
Row, Column : integers
NumberOfBlocks : integers
So when a user clicks save, it should save the array data (I am not sure it matters, but not all 10,000 cubes are created), along with the three integers.
Full code of the creation of the TCUBE array and integers.
procedure TForm2.createMap(r:integer;c:integer);
var
i : integer;
rows,columns,columnssave : integer;
x,y,z : single;
player : tmodel3d;
begin
columns := r;
rows := c;
i:=1;
x := 0;
y := 0;
z := 0;
NumberOfRows := rows;
NumberOfColumns := columns;
camera1.Position.X := rows/2;
camera1.Position.Y := columns/2;
dummy1.Position.X := rows/2;
dummy1.Position.Y := columns/2;
while rows>0 do
begin
columnssave := columns ;
while columns >0 do
begin
CreateCube[i]:=tcube.Create(self);
CreateCube[i].Visible := true;
CreateCube[i].Position.x := x;
CreateCube[i].Position.Y := y;
CreateCube[i].Position.Z := z;
CreateCube[i].Material.Texture.CreateFromFile(gamedir+'\pics\Grass.bmp');
CreateCube[i].Material.Modulation := TTextureMode.tmReplace;
CreateCube[i].Parent := viewport3d1;
CreateCube[i].OnClick := cubeClick;
CreateCube[i].OnMouseDown := mousedown;
y:= y+1;
i:=i+1;
//z:=z-1;
columns := columns -1;
end;
y:=0;
x:= x+1;
z:=0;
columns := columnssave;
rows:= rows-1;
end;
totalblocks := i;
setblocks := false;
label2.Text := inttostr(totalblocks);
end;
Thanks
Glen
There are lots of ways to save data out to a file. Normally I use NativeXML to read and write XML documents. This probably isn't the best way but it works for me and has been suitable in my projects so far.
It's human readable so you can see what's going on. Great for debugging when something isn't working right.
It's structured. That helps when you need to save data for objects contained by other objects. For example, in your game world, there is a village, the village contains 10 buildings, building 1 contains 7 lamps, building 2 contains 3 swords, etc.
It's a standard and commonly used format. Your game data files can be read and manipulated in other programs.
The XML file can be encrypted as a way to prevent users from modifying the data.
Example of what the save code could look like:
procedure TMyGame.SaveToFile(const FileName: string);
var
XML : TNativeXML;
RootNode : TXmlNode;
CubeNode : TXmlNode;
c1 : Integer;
begin
XML := TNativeXML.CreateName('root');
try
RootNode := XML.Root;
for c1 := 0 to NumberOfCubes-1 do
begin
CubeNode := RootNode.NodeNew('Cube');
CubeNode.NodeNew('XPos').ValueAsInteger := 10;
CubeNode.NodeNew('YPos').ValueAsInteger := 2;
end;
XML.SaveToFile(FileName);
finally
XML.Free;
end;
end;
I strongly advocate for having human-readable data, such as with an XML or .ini file. If you need to block the user from being able to edit/modify the data, you can do that by "signing" the data and adding a validation key. The key can be generated by your program by taking your sensitive data, appending something that only you know, and then generating an MD5 hash or encrypting it. The MD5 result, or the first 100 bytes of the encrypted string, becomes your key which you would store in the .ini or whatever.
ex:
[MAPSTUFF]
SecretCoordinates=1,2,3,PIRATE TREASURE IS HERE
SignedKey=00042055215e2c6c0d751cb5b086e653
The SignedKey is actually an MD5 of the SecretCoordinates string + :XYZZY:
i.e. I ran this through MD5:
1,2,3,PIRATE TREASURE IS HERE:XYZZY:
and got:
00042055215e2c6c0d751cb5b086e653
So to check in the program, you read SecretCoordinates from the .ini file and add :XYZZY: to it (in the program, so the user can't see it; though with a debugger they still can), then see if you get the same MD5. If it's a match, you're good. If it doesn't match, then the SecretCoordinates string has been tampered with, and you reject it. Very simple.
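A rough sketch of that signing step, assuming a Delphi recent enough to have the System.Hash unit (older versions would need a third-party MD5 routine; the ':XYZZY:' suffix is the example secret from above and the function name is mine):

uses System.Hash, System.SysUtils;

function SignValue(const Value: string): string;
const
  Secret = ':XYZZY:'; // the part only your program knows
begin
  // MD5 of the sensitive data plus the secret suffix becomes the SignedKey
  Result := THashMD5.GetHashString(Value + Secret);
end;

// Verification: recompute and compare with the stored key, e.g.
//   Accept := SameText(SignValue(SecretCoordinates), SignedKey);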
One of the best ways to save this type of data is by creating a record and saving the record. Then it's not something human readable either. So first create a new type:
TCubeSave = record
number : integer;
x : single;
y : single;
z : single;
texture : string[20];
end;
In this case we want to save the number in the array, along with the x, y, z data and the texture.
If you include any string field, make sure it is a fixed-length string (such as string[20] here) or you will get an error.
Next assign your variables and file and make it write ready.
var
myfile : File of TCubeSave;
cube : TcubeSave;
i : integer;
begin
assignfile(myfile, gamedir+'\maps\test.map');
ReWrite(myfile);
i:=0;
Now write each component of the array,
while (i < totalblocks) do
begin
Check if the component exists, in case you have removed it or it was not created.
if FindComponent('cubename'+inttostr(i)) <> nil then
begin
Write the data into the cube record and then write the record to the file:
cube.number := i;
cube.x := createcube[i].Position.X;
cube.y := createcube[i].Position.Y;
cube.z := createcube[i].Position.Z;
cube.texture := createcube[i].Material.Texture.ToString;
Write(myfile, cube);
end;
i := i+1;
end;
close the file when done.
CloseFile(myfile);
showmessage('Map Saved');
end;
When ready to read the file back in:
// Reopen the file in read only mode
FileMode := fmOpenRead;
Reset(myFile);
// Display the file contents
while not Eof(myFile) do
begin
Read(myFile, cube);
i:= cube.number;
CreateCube[i]:=tcube.Create(self);
createCube[i].Position.x := cube.x;
createcube[i].position.y := cube.y;
// etc. for the remaining fields
end;
// Close the file
CloseFile(myFile);

Sorting Racers in timing application

I am creating an application which uses the AMB MyLaps decoder P3 Protocols.
I can't get my head around a way to sort the racers out based on laps and lap times. For example, the person in 1st has done 3 laps, the person in 2nd has done 2 laps. But then how do I order a situation where 2 people are on the same lap?
This is the record I'm using to hold the information:
type
TTimingRecord = record
position: integer;
transId: integer;
racerName: string;
kartNumber: integer;
lastPassingN: integer;
laps: integer;
lastRTCTime: TDateTime;
bestTimeMs: Extended;
lastTimeMs: Extended;
gapTimeMs: Extended;
splitTimeMs: Extended;
timestamp: TDateTime;
end;
A new record is created for each racer.
The code I'm currently using is:
procedure sortRacers();
var
Pos, Pos2: Integer;
Temp: TTimingRecord;
GapTime: Extended;
begin
for Pos := 0 to length(DriversRecord)-1 do
begin
for Pos2 := 0 to Length(DriversRecord)-2 do
begin
if(DriversRecord[Pos2].laps < DriversRecord[Pos2+1].laps)then
begin
Temp := DriversRecord[Pos2];
DriversRecord[Pos2] := DriversRecord[Pos2+1];
DriversRecord[Pos2+1] := Temp;
end
else if DriversRecord[Pos2].laps = DriversRecord[Pos2+1].laps then
begin
if DriversRecord[Pos2].lastRTCTime > DriversRecord[Pos2+1].lastRTCTime then
begin
Temp := DriversRecord[Pos2];
DriversRecord[Pos2] := DriversRecord[Pos2+1];
DriversRecord[Pos2+1] := Temp;
end;
end;
end;
end;
for pos := 1 to length(DriversRecord) -1 do //Gap Time
begin
if DriversRecord[Pos].laps = DriversRecord[0].laps then
begin
DriversRecord[Pos].gapTimeMs := DriversRecord[Pos].lastRTCTime - DriversRecord[0].lastRTCTime;
DriversRecord[Pos].splitTimeMs := DriversRecord[Pos].lastRTCTime - DriversRecord[Pos-1].lastRTCTime;
end;
end;
end;
But doesn't work too well :)
I'm assuming from your comment to the question, that you have decomposed the problem into sorting and comparing, and that you have got the sorting part covered. Which leaves order comparison.
You need a function that will perform a lexicographic order comparison based first on the number of laps completed, and secondly on the time since the start of this lap. Basically it will look like this:
function CompareRacers(const Left, Right: TTimingRecord): Integer;
begin
Result := CompareValue(Left.laps, Right.laps);
if Result=0 then
Result := CompareDateTime(Left.lastRTCTime, Right.lastRTCTime);
end;
You'll find CompareValue in Math and CompareDateTime in DateUtils.
What I'm not sure about is what the sense of the lastRTCTime values is. You may need to negate the result of the call to CompareDateTime to get the result you desire.
Result := -CompareDateTime(Left.lastRTCTime, Right.lastRTCTime);
Also, what happens if there is overtaking during the lap? Presumably you won't be able to detect that until the racers complete the current lap.
Instead of implementing the sort algorithm yourself, try this technique (if you have a compatible Delphi version): Best way to sort an array.
And your comparison function could look like this:
uses Types;
function CustomSort(const Left, Right: TTimingRecord): Integer;
begin
If (left.laps > right.laps) then
result := GreaterThanValue
else
if (left.laps < right.laps) then
result := LessThanValue
else
begin
// Same laps count... Test on LastRTCTime
if (left.lastRTCTime < right.lastRTCTime) then
result := GreaterThanValue
else
if (left.lastRTCTime > right.lastRTCTime) then
result := LessThanValue
else
result := EqualsValue;
end;
end;
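For completeness, wiring CustomSort into the generics-based sort that the linked answer describes could look like this (assuming DriversRecord is a dynamic array of TTimingRecord and a Delphi version with generics and anonymous methods):

// Requires Generics.Collections and Generics.Defaults in the uses clause
TArray.Sort<TTimingRecord>(DriversRecord,
  TComparer<TTimingRecord>.Construct(
    function(const Left, Right: TTimingRecord): Integer
    begin
      Result := CustomSort(Left, Right);
    end));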
It might be easier to look at this as 2 separate sorts.
Obviously you know the bubble-sort method, so I will not go into that.
Make 2 passes on your sorting.
1st, you sort the laps.
2nd, you run through the entire list of sorted laps. Find the begin point and end point in the array for identical lap values. Do the sorting again between those begin and end points, but this time compare only the secondary value. Iterate through all identical secondary values if the count of identical values is larger than 1.
This code sorts data using an index, which is much faster than bubble sort.
It is dynamic and provides the ability to sort from a start point to an end point in an array.
The code itself is bigger than bubble sort, but not many algorithms can compete with it on speed.
The code (once you understand how it works) can easily be modified to suit most kinds of sorting. On an array of 65536 strings it only needs to do about 17 compares per insertion.
It costs a few more CPU cycles per compare than bubble sort, but it is still among the fastest methods.
Searching the index is as fast as a B-tree. The actual sorting is perhaps slower, but the data is easier to manage afterwards, with no real need to balance a tree.
Enjoy.
Note: The routine is not the full solution to the actual problem, but it provides the beginning of an extremely fast approach.
TYPE
DynamicIntegerArray = ARRAY OF INTEGER;
DynamicStringArray = ARRAY OF STRING;
VAR
BinSortLo, BinSortMid, BinSortHi : INTEGER;
FUNCTION FindMid:INTEGER;
BEGIN
FindMid:=BinSortLo+((BinSortHi-BinSortLo) DIV 2);
END;
PROCEDURE ShiftIndexUpAndStorePointer(VAR ArrParamIndex: DynamicIntegerArray; HighBound:INTEGER);
VAR
x : INTEGER;
BEGIN
FOR x:=HighBound-1 DOWNTO BinSortMid DO ArrParamIndex[x+1] := ArrParamIndex[x];// Shift the index.
ArrParamIndex[BinSortMid]:=HighBound;// Store the pointer to index at its sorted place
END;
PROCEDURE BinarySortUp(CONST ArrParam:DynamicStringArray; VAR ArrParamIndex: DynamicIntegerArray; CONST LoBound,HighBound:INTEGER); OVERLOAD;
VAR
TempVar : STRING;
BEGIN
BinSortLo:=LoBound; BinSortHi:=HighBound; BinSortMid:=FindMid;
TempVar := ArrParam[HighBound];
REPEAT
IF TempVar>ArrParam[ArrParamIndex[BinSortMid]] THEN BinSortLo:=BinSortMid ELSE BinSortHi:=BinSortMid;
BinSortMid:=FindMid;
UNTIL (BinSortMid=BinSortLo); {OR (BinSortMid=BinSortHi);}
IF TempVar>ArrParam[ArrParamIndex[BinSortMid]] THEN INC(BinSortMid);// We always need a last check just in case.
ShiftIndexUpAndStorePointer(ArrParamIndex,HighBound);
END;
PROCEDURE DynamicCreateIndex(CONST ArrParam:DynamicStringArray; VAR ArrParamIndex: DynamicIntegerArray; CONST LoBound,HighBound:INTEGER);
VAR
x : INTEGER;
BEGIN
FOR x:=LoBound TO HighBound DO
BinarySortUp(ArrParam,ArrParamIndex,LoBound,x);
END;
BEGIN
{
1. Create your STRING ARRAY as a DynamicStringArray.
2. Create your INDEX ARRAY as a DynamicIntegerArray.
3. Set the size of these arrays to any INTEGER size and fill the strings with data.
4. Run a call to DynamicCreateIndex(YourStringArray, YourIndexArray, 0, SizeOfArray - 1).
Now you have a sorted Index of all the strings.
}
END.
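As a concrete illustration, here is a small (hypothetical) usage of the routine above; it builds an index over four strings and then walks the strings in sorted order (console output assumed):

procedure DemoSortIndex;
var
  Names     : DynamicStringArray;
  NameIndex : DynamicIntegerArray;
  i         : INTEGER;
begin
  SetLength(Names, 4);
  Names[0] := 'delta'; Names[1] := 'alpha'; Names[2] := 'charlie'; Names[3] := 'bravo';
  SetLength(NameIndex, Length(Names));                    // index array, same size as the data
  DynamicCreateIndex(Names, NameIndex, 0, High(Names));   // build the sorted index
  for i := 0 to High(NameIndex) do                        // walk the index to visit the data in sorted order
    WriteLn(Names[NameIndex[i]]);                         // alpha, bravo, charlie, delta
end;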

case insensitive Pos

Is there any comparable function like Pos that is not case-sensitive in D2010 (unicode)?
I know I can use Pos(AnsiUpperCase(FindString), AnsiUpperCase(SourceString)) but that adds a lot of processing time by converting the strings to uppercase every time the function is called.
For example, on a 1000000 loop, Pos takes 78ms while converting to uppercase takes 764ms.
str1 := 'dfkfkL%&/s"#<.676505';
for i := 0 to 1000000 do
PosEx('#<.', str1, 1); // Takes 78ms
for i := 0 to 1000000 do
PosEx(AnsiUpperCase('#<.'), AnsiUpperCase(str1), 1); // Takes 764ms
I know that to improve the performance of this specific example I can convert the strings to uppercase first before the loop, but the reason why I'm looking to have a Pos-like function that is not case-sensitive is to replace one from FastStrings. All the strings I'll be using Pos for will be different so I will need to convert each and every one to uppercase.
Is there any other function that might be faster than Pos + convert the strings to uppercase?
The built-in Delphi functions to do that are AnsiStrings.ContainsText for AnsiStrings and StrUtils.ContainsText for Unicode strings.
In the background however, they use logic very similar to your logic.
No matter which library they come from, functions like that will always be slow: to be as compatible with Unicode as possible, they need quite a lot of overhead. And since they are called inside the loop, that costs a lot.
The only way to circumvent that overhead, is to do those conversions outside the loop as much as possible.
So: follow your own suggestion, and you have a really good solution.
--jeroen
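Concretely, with the benchmark from the question, that means hoisting the conversions out of the loop (the UpperNeedle/UpperStr1 names are mine):

// Done once, before the loop (UpperNeedle, UpperStr1: string):
UpperNeedle := AnsiUpperCase('#<.');
UpperStr1   := AnsiUpperCase(str1);
// The loop itself then only searches:
for i := 0 to 1000000 do
  PosEx(UpperNeedle, UpperStr1, 1);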
This version of my previous answer works in both D2007 and D2010.
In Delphi 2007 the CharUpCaseTable is 256 bytes; in Delphi 2010 it is 128 KB (65536 * 2 bytes).
The reason is Char size. In the older version of Delphi my original code only supported the current locale character set at initialization. My InsensPosEx is about 4 times faster than your code. Certainly it is possible to go even faster, but we would lose simplicity.
type
TCharUpCaseTable = array [Char] of Char;
var
CharUpCaseTable: TCharUpCaseTable;
procedure InitCharUpCaseTable(var Table: TCharUpCaseTable);
var
n: cardinal;
begin
for n := 0 to Length(Table) - 1 do
Table[Char(n)] := Char(n);
CharUpperBuff(@Table, Length(Table));
end;
function InsensPosEx(const SubStr, S: string; Offset: Integer = 1): Integer;
var
n: Integer;
SubStrLength: Integer;
SLength: Integer;
label
Fail;
begin
Result := 0;
if S = '' then Exit;
if Offset <= 0 then Exit;
SubStrLength := Length(SubStr);
SLength := Length(s);
if SubStrLength > SLength then Exit;
Result := Offset;
while SubStrLength <= (SLength-Result+1) do
begin
for n := 1 to SubStrLength do
if CharUpCaseTable[SubStr[n]] <> CharUpCaseTable[s[Result+n-1]] then
goto Fail;
Exit;
Fail:
Inc(Result);
end;
Result := 0;
end;
//...
initialization
InitCharUpCaseTable({var}CharUpCaseTable);
I have also faced the problem of converting FastStrings, which used a Boyer-Moore (BM) search to gain some speed, for D2009 and D2010. Since many of my searches are looking for a single character only, and most of these are looking for non-alphabetic characters, my D2010 version of SmartPos has an overload version with a WideChar as the first argument, and does a simple loop through the string to find it. I use uppercasing of both arguments to handle the few case-insensitive cases. For my applications, I believe the speed of this solution is comparable to FastStrings.
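A minimal sketch of the kind of single-character overload described above (not the answerer's actual code; note that the RTL's UpCase only folds ASCII letters, which is fine for the mostly non-alphabetic characters mentioned):

function SmartPos(Ch: Char; const S: string; Offset: Integer = 1): Integer;
var
  i: Integer;
begin
  Ch := UpCase(Ch);                 // uppercase once, outside the loop
  for i := Offset to Length(S) do
    if UpCase(S[i]) = Ch then
      Exit(i);                      // first case-insensitive match
  Result := 0;                      // not found
end;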
For the 'string find' case, my first pass was to use SearchBuf and do the uppercasing and accept the penalty, but I have recently been looking into the possibility of using a Unicode BM implementation. As you may be aware, BM does not scale well or easily to charsets of Unicode proportions, but there is a Unicode BM implementation at Soft Gems. This pre-dates D2009 and D2010, but looks as if it would convert fairly easily. The author, Mike Lischke, solves the uppercasing issue by including a 67kb Unicode uppercasing table, and this may be a step too far for my modest requirements. Since my search strings are usually short (though not as short as your single three-character example) the overhead for Unicode BM may also be a price not worth paying: the BM advantage increases with the length of the string being searched for.
This is definitely a situation where benchmarking with some real-world application-specific examples will be needed before incorporating that Unicode BM into my own applications.
Edit: some basic benchmarking shows that I was right to be wary of the "Unicode Tuned Boyer-Moore" solution. In my environment, UTBM results in bigger code, longer time. I might consider using it if I needed some of the extras this implementation provides (handling surrogates and whole-words only searches).
Here's one that I wrote and have been using for years:
function XPos( const cSubStr, cString :string ) :integer;
var
nLen0, nLen1, nCnt, nCnt2 :integer;
cFirst :Char;
begin
nLen0 := Length(cSubStr);
nLen1 := Length(cString);
if nLen0 > nLen1 then
begin
// the substr is longer than the cString
result := 0;
end
else if nLen0 = 0 then
begin
// null substr not allowed
result := 0;
end
else
begin
// the outer loop finds the first matching character....
cFirst := UpCase( cSubStr[1] );
result := 0;
for nCnt := 1 to nLen1 - nLen0 + 1 do
begin
if UpCase( cString[nCnt] ) = cFirst then
begin
// this might be the start of the substring...at least the first
// character matches....
result := nCnt;
for nCnt2 := 2 to nLen0 do
begin
if UpCase( cString[nCnt + nCnt2 - 1] ) <> UpCase( cSubStr[nCnt2] ) then
begin
// failed
result := 0;
break;
end;
end;
end;
if result > 0 then
break;
end;
end;
end;
Why not just convert both the substring and the source string to lower or upper case within the regular Pos call? The result will effectively be case-insensitive because both arguments are in the same case. Simple and lightweight.
The Jedi Code Library has StrIPos and thousands of other useful functions to complement Delphi's RTL. When I still worked a lot in Delphi, JCL and its visual brother JVCL were among the first things I added to a freshly installed Delphi.
Instead of 'AnsiUpperCase' you can use a lookup table; it is much faster.
I have reshaped my old code. It is very simple and also very fast.
Check it:
type
TAnsiUpCaseTable = array [AnsiChar] of AnsiChar;
var
AnsiTable: TAnsiUpCaseTable;
procedure InitAnsiUpCaseTable(var Table: TAnsiUpCaseTable);
var
n: cardinal;
begin
for n := 0 to SizeOf(TAnsiUpCaseTable) -1 do
begin
AnsiTable[AnsiChar(n)] := AnsiChar(n);
CharUpperBuff(@AnsiTable[AnsiChar(n)], 1);
end;
end;
function UpCasePosEx(const SubStr, S: string; Offset: Integer = 1): Integer;
var
n :integer;
SubStrLength :integer;
SLength :integer;
label
Fail;
begin
SLength := length(s);
if (SLength > 0) and (Offset > 0) then begin
SubStrLength := length(SubStr);
result := Offset;
while SubStrLength <= SLength - result + 1 do begin
for n := 1 to SubStrLength do
if AnsiTable[SubStr[n]] <> AnsiTable[s[result + n -1]] then
goto Fail;
exit;
Fail:
inc(result);
end;
end;
result := 0;
end;
initialization
InitAnsiUpCaseTable(AnsiTable);
end.
I think converting to upper or lower case before Pos is the best way, but you should try to call the AnsiUpperCase/AnsiLowerCase functions as little as possible.
On this occasion I couldn't find any approach that was even as good as, let alone better than Pos() + some form of string normalisation (upper/lowercase conversion).
This is not entirely surprising: when I benchmarked the Unicode string handling in Delphi 2009, I found that the Pos() RTL routine has improved significantly since Delphi 7, explained in part by the fact that aspects of the FastCode libraries have been incorporated into the RTL for some time now.
The FastStrings library on the other hand has not - iirc - been significantly updated for a long time now. In tests I found that many FastStrings routines have in fact been overtaken by the equivalent RTL functions (with a couple of exceptions, explained by the unavoidable overhead incurred by the additional complications of Unicode).
The "Char-Wise" processing of the solution presented by Steve is the best so far imho.
Any approach that involves normalising the entire strings (both string and sub-string) risks introducing errors in any character-based position in the results due to the fact that with Unicode strings a case conversion may result in a change in the length of the string (some characters convert to more/fewer characters in a case conversion).
These may be rare cases but Steve's routine avoids them and is only about 10% slower than the already quite fast Pos + Uppercase (your benchmarking results don't tally with mine on that score).
Often the simple solution is the one you'd want to use:
if AnsiPos(AnsiUpperCase('needle'), AnsiUpperCase('The Needle in the haystack')) <> 0 then
DoSomething;
Reference:
http://www.delphibasics.co.uk/RTL.asp?Name=ansipos
http://www.delphibasics.co.uk/RTL.asp?Name=UpCase
Any program on Windows can call a shell API function, which keeps your code size down. As usual, read the program from the bottom up. This has been tested with ASCII strings only, not wide strings.
program PrgDmoPosIns; {$AppType Console} // demo case-insensitive Pos function for Windows
// Free Pascal 3.2.2 [2022/01/02], Win32 for i386
// FPC.EXE -vq -CoOr -Twin32 -oPrgStrPosDmo.EXE PrgStrPosDmo.LPR
// -vq Verbose: Show message numbers
// -C Code generation:
// o Check overflow of integer operations
// O Check for possible overflow of integer operations - Integer Overflow checking turns on Warning 4048
// r Range checking
// -Twin32 Target 32 bit Windows operating systems
// 29600 bytes code, 1316 bytes data, 35,840 bytes file
function StrStrIA( pszHaystack, pszNeedle : PChar ) : PChar; stdcall; external 'shlwapi.dll'; // dynamic link to Windows API's case-INsensitive search
// https://learn.microsoft.com/en-us/windows/win32/api/shlwapi/nf-shlwapi-strstria
// "FPC\3.2.2\Source\Packages\winunits-base\src\shlwapi.pp" line 557
function StrPos( strNeedle, strHaystk : string ) : SizeInt; // return the position of Needle within Haystack, or zero if not found
var
intRtn : SizeInt; // function result
ptrHayStk , // pointers to
ptrNeedle , // search strings
strMchFnd : PChar ; // pointer to match-found string, or null-pointer/empty-string when not found
bolFnd : boolean; // whether Needle was found within Haystack
intLenHaystk , // length of haystack
intLenMchFnd : SizeInt; // length of the match-found remainder of the haystack
begin
strHayStk := strHayStk + #0 ; // strings passed to API must be
strNeedle := strNeedle + #0 ; // null-terminated
ptrHayStk := Addr( strHayStk[ 1 ] ) ; // set pointers to point at first characters of
ptrNeedle := Addr( strNeedle[ 1 ] ) ; // null-terminated strings, so API gets C-style strings
strMchFnd := StrStrIA( ptrHayStk, ptrNeedle ); // call Windows to perform search; match-found-string now points inside the Haystack
bolFnd := ( strMchFnd <> '' ) ; // variable is True when match-found-string is not null/empty
if bolFnd then begin ; // when Needle was yes found in Haystack
intLenMchFnd := Length( strMchFnd ) ; // get length of the match-found remainder
intLenHaystk := Length( strHayStk ) ; // get length of haystack
intRtn := intLenHaystk - intLenMchFnd; // set function result to the position of needle within haystack, which is the difference in lengths
end else // when Needle was not found in Haystack
intRtn := 0 ; // set function result to tell caller needle does not appear within haystack
StrPos := intRtn ; // pass function result back to caller
end; // StrPos
procedure TstOne( const strNeedle, strHayStk : string ); // run one test with this Needle
var
intPos : SizeInt; // found-match location of Needle within Haystack, or zero if none
begin
write ( 'Searching for : [', strNeedle, ']' ); // bgn output row for this test
intPos := StrPos( strNeedle, strHaystk ); // get Needle position
writeln(' StrPos is ' , intPos ); // end output row for this test
end; // TstOne
procedure TstAll( ); // run all tests with various Needles
const
strHayStk = 'Needle in a Haystack'; // all tests will search in this string
begin
writeln( 'Searching in : [', strHayStk, ']' ); // emit header row
TstOne ( 'Noodle' , strHayStk ); // test not-found
TstOne ( 'Needle' , strHayStk ); // test found at yes-first character
TstOne ( 'Haystack' , strHayStk ); // test found at not-first character
end; // TstAll
begin // ***** MAIN *****
TstAll( ); // run all tests
end.
function TextPos(const ASubText, AText: UnicodeString): Integer;
var
res: Integer;
begin
{
Locates a substring in a given text string without case sensitivity.
Returns the index of the first occurence of ATextin AText,
or zero if the text was not found
}
res := FindNLSString(LOCALE_USER_DEFAULT, FIND_FROMSTART or LINGUISTIC_IGNORECASE, PWideChar(AText), Length(AText), PWideChar(ASubText), Length(ASubText), nil);
Result := (res+1); //convert zero-based to one-based index, and -1 not found to zero.
end;
And in case you don't have the definitions:
function FindNLSString(Locale: LCID; dwFindNLSStringFlags: DWORD; lpStringSource: PWideChar; cchSource: Integer; lpStringValue: PWideChar; cchValue: Integer; cchFound: PInteger): Integer; stdcall; external 'Kernel32.dll';
const
FIND_FROMSTART = $00400000; // search the source string starting at the beginning
LINGUISTIC_IGNORECASE = $00000010; // use a linguistically appropriate 'ignore case'
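For reference, a trivial (hypothetical) usage mirroring the earlier AnsiPos example:

if TextPos('needle', 'The Needle in the haystack') > 0 then // returns 5 here
  DoSomething;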
