I'm getting this error message under heavy load. Here is a code extract and a message from my error log.
I tried everything I could think of. Any suggestion would be greatly appreciated.
Procedure tCacheInMemory.StreamValue(Name: String; IgnoreCase: Boolean; Var Stream: TStringStream);
Var
  i: Integer;
Begin
  i := 0;
  Try
    If Not active Then
      exit;
    arrayLock.BeginRead;
    Try
      i := Search(Name);
      If i > -1 Then Begin
        If fItems[i].value = Nil Then
          exit;
        fItems[i].value.Position := 0;
        Stream.Position := 0;
        Stream.CopyFrom(fItems[i].value, fItems[i].value.Size);
      End;
    Finally
      arrayLock.EndRead;
    End;
  Except { ...execution jumps to here }
    On E: Exception Do Begin
      x.xLogError('LogErrorCacheInMemory.txt', 'StreamValue:' + E.Message + ' ItemsCount:' + IntToStr( High(fItems)) + 'Memory:' + IntToStr(x.GetMemoryInfoMemory) + endLn + 'StreamSize : ' + IntToStr(fItems[i].value.Size) + ' i=' + IntToStr(i) + 'Name: ' + Name);
      Clear;
    End
  End;
End;
Log Entries:
3/10/2011 10:52:59 AM: StreamValue:Stream read error ItemsCount:7562 Memory:240816
StreamSize : 43 i=7506 Name: \\xxxxxxxx\WebRoot\\images\1x1.gif
3/10/2011 12:39:14 PM: StreamValue:Stream read error ItemsCount:10172 Memory:345808
StreamSize : 849 i=10108 Name: \\xxxxxxxx\WebRoot\\css\screen.add.css
3/10/2011 3:45:29 PM: StreamValue:Stream read error ItemsCount:11200 Memory:425464
StreamSize : 3743 i=11198 Name: \\xxxxxxxx\WebRoot\\JS\ArtWeb.js
P.S.
arrayLock: TMultiReadExclusiveWriteSynchronizer;
fItems: Array Of rCache;
Type
  rCache = Record
    Name: String;
    value: TStringStream;
    expired: TDateTime;
  End;
And the calling function:
Function tCacheInMemory.CacheCheck(cName: String; Out BlobStream: TStringStream): Boolean;
Begin
  Result := False;
  If Not IfUseCache Then
    exit;
  BlobStream.SetSize(0);
  BlobStream.Size := 0;
  StreamValue(trim(cName), True, BlobStream);
  If BlobStream.Size > 0 Then
    Result := True;
End;
You're not using correct locking. You're acquiring a read lock on the array of cache entries, but once you find the item you want, you modify it. First, you explicitly modify it by assigning its Position property, and then you implicitly modify it by reading from it, which modifies its Position property again. When other code attempts to read from that same cache item, you'll have interference. If the source stream's Position property changes between the time the destination stream calculates how many bytes are available and the time it actually requests to read those bytes, you'll get a stream-read error.
I have a couple pieces of advice related to this:
Don't use streams as a storage device in the first place. You're apparently holding the contents of files. You're not going to change those, so you don't need a data structure designed for making sequential changes. Instead, just store the data in simple arrays of bytes: TBytes. (Also, use of TStringStream in particular introduces confusion over whether those strings' encodings are important. A simple file cache shouldn't be concerned with string encodings at all. If you must use a stream, use a content-agnostic class like TMemoryStream.)
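For illustration, a minimal sketch of what that could look like, reusing your rCache record and read lock (CopyValueTo is a hypothetical helper, and TBytes is simply array of Byte, declared in SysUtils in recent Delphi versions):
Type
  rCache = Record
    Name: String;
    value: TBytes;        // raw file contents; no Position to fight over
    expired: TDateTime;
  End;

Procedure tCacheInMemory.CopyValueTo(i: Integer; Stream: TStream);
Begin
  // Reading a dynamic array does not mutate it, so holding the read lock is enough here.
  If Length(fItems[i].value) > 0 Then
    Stream.WriteBuffer(fItems[i].value[0], Length(fItems[i].value));
End;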
Don't quell an exception that you haven't actually handled. In this code, you're catching all exception types, logging some information, clearing the cache, and then proceeding as though everything is normal. But you haven't done anything to resolve the problem that triggered the exception, so everything is not normal. Since you're not really handling the exception, you need to make sure it propagates to the caller. Call raise after you call Clear. (And when you log the exception, make sure you log the exception's ClassName value as well as its message.)
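A hedged sketch of the handler with both of those changes (the shortened log line is just for illustration):
Except
  On E: Exception Do Begin
    x.xLogError('LogErrorCacheInMemory.txt',
      'StreamValue: ' + E.ClassName + ': ' + E.Message + ' Name: ' + Name);
    Clear;
    raise;  // re-raise so the caller knows the lookup failed
  End;
End;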
It looks like something external is locking your stream files.
You could use Process Monitor to see what is blocking them.
Another thing you can try is to open the stream in read-deny-write mode (please show us how you open the stream).
Something like this:
Stream := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite) ;
Edit 1: Disregard the strike through part: you are using TStringStream.
I'll keep the answer just in case anyone ever gets this kind of error when using TFileStream.
Edit 2: Yuriy posted this interesting addendum, but I'm not sure it will work, as the BlobStream is not initialized, just like Robert Love suspected:
Function TCacheInMemory.CacheCheck(cName: String; Out BlobStream: TStringStream): Boolean;
Begin
  Result := False;
  Try
    If Not IfUseCache Then
      exit;
    BlobStream.SetSize(0);
    BlobStream.Size := 0;
    StreamValue(trim(cName), True, BlobStream);
    If BlobStream.Size > 0 Then
      Result := True;
  Except
    On E: Exception Do
    Begin
      x.xLogError('LogErrorCacheInMemory.txt', 'CheckCacheOutStream:' + E.Message + ' ItemsCount:' + IntToStr( High(fItems)) + 'Memory:' + IntToStr(x.GetMemoryInfoMemory));
    End;
  End;
End;
--jeroen
I have an external message coming in every second.
The message payload is saved in a ClientDataSet and displayed in a dbGrid.
No data base is involved. RAM storage only.
This works fine,
BUT,
I have intermittent problems when the dataset is empty and populated the first time.
The code is as follows:
procedure TCtlCfg_DM_WarningsFaults_frm.DecodeRxFrame(Protocol: TProtocolSelection;
// Try without var: VAR Frame : CAN_Driver.TCAN_Frame);
Frame : CAN_Driver.TCAN_Frame);
var
OldRecNo : integer;
// OldIxname : string;
// bMark : TBookMark;
WasFiltered : Boolean;
IdBitFields : TCanId_IdBitFields;
Msg : TCan_Msg;
MsgType : integer;
GlobalNode : TCanId_GlobalNode;
LocalNode : TCanId_LocalNode;
SubNode : TCanId_SubNode;
EntryType : integer;
SubSystemType : integer;
SubSystemDeviceId : integer;
IsActive : Boolean;
IsAcked : Boolean;
begin
with cdsWarningsFaults do
begin
if not Active then Exit;
Msg := Frame.Msg;
IdBitFields := DecodeCanId(Protocol, Frame.ID);
if IdBitFields.SubNode <> cSubNode_Self then Exit; // Ignore non controller/slave messages
if IdBitFields.AddressMode <> cCanId_AddrMode_CA then Exit;
MsgType := IdBitFields.MessageType;
if MsgType <> cMsg_CTL_CA_Broadcast_WarningAndFaultList then Exit;
if Frame.MsgLength < 5 then Exit;
GlobalNode := IdBitFields.GlobalNode;
LocalNode := IdBitFields.LocalNode;
SubNode := IdBitFields.SubNode;
// Silent exit if wrong node
if GlobalNode <> fNodeFilter.GlobalNode then Exit;
if LocalNode <> fNodeFilter.LocalNode then Exit;
if SubNode <> fNodeFilter.SubNode then Exit;
EntryType := Msg[1];
SubSystemType := Msg[2];
IsActive := (Msg[3] = 1);
SubSystemDeviceId := Msg[4];
IsAcked := (Msg[8] = 1);
DisableControls; // 2007-12-03/AJ Don't move the scrollbars during the update
OldRecNo := RecNo;
// OldIxName := IndexName; // Save current index
// IndexName := IndexDefs.Items[0].Name;
WasFiltered := Filtered; // Save filter status
Filtered := False;
try
try
if Findkey([GlobalNode, LocalNode, SubNode, EntryType, SubSystemType, SubSystemDeviceId]) then
begin // Update record
Edit;
FieldByName('fIsActive').AsBoolean := IsActive;
FieldByName('fIsAcked').AsBoolean := IsAcked;
FieldByName('fTimeout').AsDateTime := GetDatabaseTimeoutAt;
Post;
MainForm.AddToActivityLog('CtlCfg_DM_WF: DecodeRxFrame: Efter Edit. N=' + IntToStr(GlobalNode) + ' ' +
IntToStr(LocalNode) + ' ' +
IntToStr(SubNode) +
' RecCnt=' + IntToStr(RecordCount) + ' ET=' + IntToStr(EntryType) + ' SST=' + IntToStr(subSystemType) + ' SSD=' + IntToStr(SubSystemDeviceId), False);
end
else
begin // Create new record
Append;
MainForm.AddToActivityLog('CtlCfg_DM_WF: DecodeRxFrame: Efter Append. N=' + IntToStr(GlobalNode) + ' ' +
IntToStr(LocalNode) + ' ' +
IntToStr(SubNode) +
' RecCnt=' + IntToStr(RecordCount) + ' ET=' + IntToStr(EntryType) + ' SST=' + IntToStr(subSystemType) + ' SSD=' + IntToStr(SubSystemDeviceId), False);
try
FieldByName('fGlobalNode').AsInteger := GlobalNode;
FieldByName('fLocalNode').AsInteger := LocalNode;
FieldByName('fSubNode').AsInteger := SubNode;
FieldByName('fEntryType').AsInteger := EntryType;
FieldByName('fSubSystemType').AsInteger := SubSystemType;
FieldByName('fSubSystemDeviceId').AsInteger := SubSystemDeviceId;
FieldByName('fIsActive').AsBoolean := IsActive;
FieldByName('fIsAcked').AsBoolean := IsAcked;
FieldByName('fTimeout').AsDateTime := GetDatabaseTimeoutAt;
finally
try
Post; // Why doesn't this Post stick so that it becomes an Edit next time?
except
MainForm.AddToActivityLog('CtlCfg_DM_WF: DecodeRxFrame: Exception efter Post.', True);
end;
MainForm.AddToActivityLog('CtlCfg_DM_WF: DecodeRxFrame: Efter Post. N=' + IntToStr(GlobalNode) + ' ' +
IntToStr(LocalNode) + ' ' +
IntToStr(SubNode) +
' RecCnt=' + IntToStr(RecordCount) + ' ET=' + IntToStr(EntryType) + ' SST=' + IntToStr(subSystemType) + ' SSD=' + IntToStr(SubSystemDeviceId), False);
end;
end;
except
on E: Exception do
begin
MainForm.AddToActivityLog('Post exception message: [' + E.Message + ']', False);
MainForm.AddToActivityLog('Post exception class: [' + E.ClassName + ']', False);
MainForm.AddToActivityLog('Post exception Error code: [' + IntToStr(EDBClient(E).ErrorCode) + ']', False);
MainForm.AddToActivityLog('Post exception ReadOnly is: [' + BoolToStr(ReadOnly) + ']', False);
MainForm.AddToActivityLog('Post exception CanModify is: [' + BoolToStr(CanModify) + ']', False);
MainForm.AddToActivityLog('DecodeRxFrame: Exception inside FindKey block', False);
Cancel;
end;
end;
finally
// IndexName := OldIxName; // Restore previous index
Filtered := WasFiltered; // Restore filter state
if (OldRecNo >= 1) and (OldRecNo <= RecordCount) then RecNo := OldRecNo;
EnableControls;
end;
end;
//MainForm.AddToActivityLog('DecodeRxFrame: Exit ur proceduren', False);
end;
The problem is when the record does not already exist,
and I need to Append a new record.
It often works fine, but many times it seems the POST does not work,
and the append is repeated a few or many times when new data comes in.
Suddenly the append works, and subsequent updates are done using Edit,
and as far as I can tell, after that it works forever.
The issue is intermittent, and the number of tries needed to succeed varies.
It feels like a timing issue, but I cannot figure it out.
Any ideas greatly appreciated.
Thanks,
Anders J
As mentioned in my comment a lot can be figured out about how the code flows using an extract of the logs. (Also as a side-note, sometimes you need to be careful of the reliability of your logging system. Your logging is at least partially home-brew, so I have no idea what it means when you arbitrarily pass True/False values to the AddToActivityLog method.)
However, I am still able to offer some guidance to help you identify your problem. I also have some general comments to help you improve your code.
You're not shy to use logging to narrow down your problem: this is a good thing!
However, your technique could use a little improvement. You're trying to determine what's going wrong around the Post method. Given such a clear goal, your logging seems surprisingly haphazard.
You should do the following:
//Log the line before Post is called. It confirms when it is called.
//Log important state information to check you're ready to post "correctly"
//In this it's especially useful to know the State of the dataset (dsEdit/dsInsert).
Post;
//Log the line after Post to confirm it completed.
//Note that "completed" and "succeeded" aren't always the same, so...
//Again, you want to log important state information (in this case dsBrowse).
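A hedged sketch of that pattern, reusing your own cdsWarningsFaults and AddToActivityLog (GetEnumName comes from the TypInfo unit):
MainForm.AddToActivityLog('Before Post, State=' +
  GetEnumName(TypeInfo(TDataSetState), Ord(cdsWarningsFaults.State)), False);
Post;
MainForm.AddToActivityLog('After Post, State=' +
  GetEnumName(TypeInfo(TDataSetState), Ord(cdsWarningsFaults.State)) +
  ' RecordCount=' + IntToStr(cdsWarningsFaults.RecordCount), False);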
If you had this logging, you might (for example) be able to tell us that:
Before calling Post dataset is in dsInsert state.
And (assuming no exceptions as you say): after calling Post the dataset is still in dsInsert state.
NOTE: If it were in dsBrowse but Post was still considered "unsuccessful", the next step would be to log details of the record before and after Post.
So now: Post "completing" without the record being "posted" would give a few more things to look at:
What events are hooked to the data set? Especially consider events used for validation.
Since you're working with TClientDataSet, there's an important event you'll want to take a look at: OnPostError. DBClient uses this callback mechanism to notify the client of errors while posting.
If you log from OnPostError, I'm sure you'll get a better idea of the problem.
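A hedged sketch of such a handler (the method name is hypothetical; the signature is the standard TDataSetErrorEvent from the DB unit):
procedure TCtlCfg_DM_WarningsFaults_frm.cdsWarningsFaultsPostError(DataSet: TDataSet;
  E: EDatabaseError; var Action: TDataAction);
begin
  MainForm.AddToActivityLog('OnPostError: ' + E.ClassName + ': ' + E.Message, False);
  Action := daFail; // keep the default behaviour; we only wanted it logged
end;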
Finally I mentioned you have a lot of other problems with your code; especially the error handling.
Don't use with until you know how to use it correctly. When you know how to use it correctly, you'll also know there's never a good reason to use it. As it stands, your code is effectively 2 characters short of a subtle bug that would have been a nightmare to even realise existed, but would be a complete non-issue without with. (You declared and used a property called IsActive, differing by only 2 characters from TDataSet's Active. I'm sure you didn't realise this, and their difference is but an accident. However, if they had been the same, with would very quietly use the wrong one.)
You need to write smaller methods - MUCH smaller! Blobs of code like you have are a nightmare to debug and are excellent at hiding bugs.
Your exception handling is fundamentally wrong:
Your comment about logging and exception handling suggests that you've been simply adding what you can out of desperation. I think it pays to understand what's going on to keep your logging useful and avoid the clutter. Let's take a close look at the most problematic portion.
/_ try
/_ FieldByName('fGlobalNode').AsInteger := GlobalNode;
/_E FieldByName('fLocalNode').AsInteger := LocalNode;
| FieldByName('fSubNode').AsInteger := SubNode;
| FieldByName('fEntryType').AsInteger := EntryType;
| FieldByName('fSubSystemType').AsInteger := SubSystemType;
| FieldByName('fSubSystemDeviceId').AsInteger := SubSystemDeviceId;
| FieldByName('fIsActive').AsBoolean := IsActive;
| FieldByName('fIsAcked').AsBoolean := IsAcked;
| FieldByName('fTimeout').AsDateTime := GetDatabaseTimeoutAt;
|_ finally
/_ try
/_ Post;
/ except
| MainForm.AddToActivityLog(..., True);
| end;
|_ MainForm.AddToActivityLog(..., False);
/ end;
|
...
So, in the above code:
If no exceptions happen, you'd simply step from one line to the next.
But as soon as an exception happens, you jump to the next finally/except block.
The first problem is: why would you try to force a Post if you haven't finished setting your field values? It's a recipe for headaches when you end up with records that have only a fraction of the data they should have - unless you're lucky and Post fails because critical data is missing.
When finally finishes during an exception, code immediately jumps to the next finally/except in the call-stack.
Except is slightly different: it only gets called if something did go wrong. (Whereas finally is guaranteed to be called with or without an exception.)
TIPS (good for 99% of exception-handling cases; see the sketch after these tips):
Only use try/finally for cleanup that must happen in both success and error cases.
Nest your try/finally blocks: the pattern is <Get Resource> try ... finally <Cleanup> end. (The only place to do another resource protection is inside the "...".)
Avoid except in most cases.
The main exception to the previous rule is when cleanup is needed only in the case of an error. In that case: do the cleanup and re-raise the exception.
Other than that: only implement an except block without re-raising if you can fully resolve the error condition. (Meaning the lines of code immediately after the exception swallower truly don't care about the previous exception.)
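A hedged sketch of both patterns (Stream, Obj, FileName and TSomething are placeholders):
// Cleanup that must always run:
Stream := TFileStream.Create(FileName, fmOpenRead);   // <Get Resource>
try
  // ... work with Stream ...
finally
  Stream.Free;                                        // <Cleanup>
end;

// Cleanup needed only on failure: clean up, then re-raise so the caller still sees the error.
Obj := TSomething.Create;
try
  // ... initialise Obj; anything here may raise ...
except
  Obj.Free;
  raise;
end;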
I am working on a little byte-patching program, but I encountered an error.
Copying the file before modifying it fails with no error (no copied output is seen), but the file patches successfully.
Here is the patch code:
procedure DoMyPatch();
var
  i: integer;
  FileName: string;
  input: TFileStream;
  FileByteArray, ExtractedByteArray: array of Byte;
begin
  FileName := 'Cute1.res';
  try
    input := TFileStream.Create(FileName, fmOpenReadWrite);
  except
    begin
      ShowMessage('Error Opening file');
      Exit;
    end
  end;
  input.Position := 0;
  SetLength(FileByteArray, input.size);
  input.Read(FileByteArray[0], Length(FileByteArray));
  for i := 0 to Length(FileByteArray) do
  begin
    SetLength(ExtractedByteArray, Length(OriginalByte));
    ExtractedByteArray := Copy(FileByteArray, i, Length(OriginalByte));
    // function that compares my array of bytes
    if CompareByteArrays(ExtractedByteArray, OriginalByte) = True then
    begin
      // Begin Patching
      CopyFile(PChar(FileName), PChar(ChangeFileExt(FileName, '.BAK')),
        true); // =======>>> fails at this point, no copied output is seen.
      input.Seek(i, SoFromBeginning);
      input.Write(BytetoWrite[0], Length(BytetoWrite)); // =====>>> patches successfully
      input.Free;
      ShowMessage('Patch Success');
      Exit;
    end;
  end;
  if Assigned(input) then
  begin
    input.Free;
  end;
  ShowMessage('Patch Failed');
end;
Side note: it copies fine if I close the file stream before attempting the copy.
By the way, I have tested it on Delphi 7 and XE7.
Thanks
You cannot copy the file because you locked it exclusively when you opened it for the file stream, which is why CopyFile fails.
You should close the file before attempting to call CopyFile. Which would require you to reopen the file to patch it. Or perhaps open the file with a different sharing mode.
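A hedged sketch of that restructuring: make the backup copy up front, while nothing has the file open, then open the stream and protect it with try/finally (Win32Check raises if CopyFile reports failure):
Win32Check(CopyFile(PChar(FileName), PChar(ChangeFileExt(FileName, '.BAK')), True));
input := TFileStream.Create(FileName, fmOpenReadWrite or fmShareDenyWrite);
try
  // ... search for OriginalByte and write BytetoWrite here ...
finally
  input.Free;
end;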
Some other comments:
The exception handling is badly implemented. Don't catch exceptions here. Let them float up to the high level.
Lifetime management is fluffed. You can easily leak as it stands. You need to learn about try/finally.
You overrun buffers. Valid indices for a dynamic array are 0 to Length(arr)-1 inclusive. Or use low() and high().
You don't check the value returned by CopyFile. Wrap it with a call to Win32Check.
The Copy function returns a new array. So you make a spurious call to SetLength. To copy the entire array use the one parameter overload of Copy.
Showing messages in this function is probably a mistake. Better to let the caller provide user feedback.
There are loads of other oddities in the code and I've run out of energy to point them all out. I think I got the main ones.
I am still having issues with the TComPort component, but this time it is not the component itself, it is the logic behind it. I have a device which sends some ASCII strings via the serial port, and I need to parse those strings. The problem is that the computer reacts very fast, so in the OnRxChar event it captures only a part of the string; the rest of the string arrives later, so parsing it when it is received is impossible.
I was thinking of writing a timer which checks whether there has been no serial activity for 10 seconds or more and then parses the string that I am saving into a buffer. But this method is unprofessional. Isn't there an idle event I can listen for? Waiting for the best solution to my problem. Thanks.
After using a number of serial-port-components, I've got the best results until now, by using CreateFile('\\?\COM1',GENERIC_READ or GENERIC_WRITE,0,nil,OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL,0), passing that handle to a THandleStream instance, and starting a dedicated thread to read from it. I know threads take a little more work than writing an event handler, but it still is the best way to handle any synchronization issues that arise from using serial ports.
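A minimal sketch of that setup ('COM1' is a placeholder; the dedicated reader thread itself is omitted):
var
  h: THandle;
  ComStream: THandleStream;
begin
  h := CreateFile('\\?\COM1', GENERIC_READ or GENERIC_WRITE, 0, nil,
    OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  if h = INVALID_HANDLE_VALUE then
    RaiseLastOSError;
  ComStream := THandleStream.Create(h);
  // Hand ComStream to a dedicated reader thread that loops on ComStream.Read
  // and pushes complete messages to the main thread (e.g. via a queue).
end;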
A typical handler for the OnRxChar event (with s declared as a local string):
procedure XXX.RXChar(Sender: TObject; Count: Integer);
var
  s: String;
begin
  ComPort.ReadStr(s, Count);
  Accumulator := Accumulator + s;
  if not AccumContainsPacketStart then
    Accumulator := ''
  else if AccumContainsPacketEndAfterStart then begin
    ExtractFullStringFromAccum;
    ParseIt;
  end;
end;
Note.
Most com-port components do not have a clue when to report back to the owner. Normally the thread that is responsible to gather the bytes from the port is informed by the OS that one or more bytes are ready to be processed. This information is then simply popped up to your level. So when you expect the message to be transferred, you get what the OS is giving you.
You have to buffer all incoming characters in a global buffer. When you get the final character in your message string, handle the message.
Here is an example where the message start is identified with a special character and the end of the message is identified with another character.
If your message is constructed in another way, I'm sure you can figure out how to adapt the code.
var
finalBuf: AnsiString;
{- Checking message }
Function ParseAndCheckMessage(const parseS: AnsiString) : Integer;
begin
Result := 0; // Assume ok
{- Make tests to confirm a valid message }
...
end;
procedure TMainForm.ComPortRxChar(Sender: TObject; Count: Integer);
var
i,err: Integer;
strBuf: AnsiString;
begin
ComPort.ReadStr(strBuf, Count);
for i := 1 to Length(strBuf) do
case strBuf[i] of
'$' :
finalBuf := '$'; // Start of package
#10 :
begin
if (finalBuf <> '') and (finalBuf[1] = '$') then // Simple validate check
begin
SetLength( finalBuf, Length(finalBuf) - 1); // Strips CR
err := ParseAndCheckMessage(finalBuf);
if (err = 0) then
{- Handle validated string }
else
{- Handle error }
end;
finalBuf := '';
end;
else
finalBuf := finalBuf + strBuf[i];
end;
end;
If your protocol has begin/end markers, you can use TComDataPacket to provide you full packets, when they are available.
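For example (hedged: the property and event names below are what I recall from the CPort sources and may differ in your version, so check your TComDataPacket declaration): point ComDataPacket1.ComPort at your port, set StartString to '$' and StopString to #10, and handle OnPacket:
procedure TForm1.ComDataPacket1Packet(Sender: TObject; const Str: string);
begin
  // Str only arrives here as one complete packet between the markers.
  ParseAndCheckMessage(Str); // reuse a validation routine like the one shown above
end;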
For a fixed number of characters, you can delay a few milliseconds before ReadStr to make sure the data has been completely sent. Example for 4 characters:
procedure TForm1.ComPort1RxChar(Sender: TObject; Count: Integer);
var
Str: String;
tegangan : real;
begin
sleep(100); //delay for 100ms
ComPort1.ReadStr(Str, 4);
...
I wrote a Delphi debug visualizer for TDataSet to display the values of the current row; source + screenshot: http://delphi.netcode.cz/text/tdataset-debug-visualizer.aspx . It works, but it is very slow. I did some optimization (how to get the field names), but it still takes 10 seconds to show only 20 fields, which is very bad.
The main problem seems to be the slow IOTAThread90.Evaluate used by the main code shown below; this call costs most of the time, and the line marked with ** takes about 80% of it. FExpression is the name of the TDataSet in the code.
procedure TDataSetViewerFrame.mFillData;
var
iCount: Integer;
I: Integer;
// sw: TStopwatch;
s: string;
begin
// sw := TStopwatch.StartNew;
iCount := StrToIntDef(Evaluate(FExpression+'.Fields.Count'), 0);
for I := 0 to iCount - 1 do
begin
s:= s + Format('%s.Fields[%d].FieldName+'',''+', [FExpression, I]);
// FFields.Add(Evaluate(Format('%s.Fields[%d].FieldName', [FExpression, I])));
FValues.Add(Evaluate(Format('%s.Fields[%d].Value', [FExpression, I]))); //**
end;
if s<> '' then
Delete(s, length(s)-4, 5);
s := Evaluate(s);
s:= Copy(s, 2, Length(s) -2);
FFields.CommaText := s;
{ sw.Stop;
s := sw.Elapsed;
Application.MessageBox(Pchar(s), '');}
end;
Now I have no idea how to improve performance.
That Evaluate needs to do a surprising amount of work. The compiler needs to compile it, resolving symbols to memory addresses, while evaluating properties may cause functions to be called, which needs the debugger to copy the arguments across into the debugee, set up a stack frame, invoke the function to be called, collect the results - and this involves pausing and resuming the debugee.
I can only suggest trying to pack more work into the Evaluate call. I'm not 100% sure how the interaction between the debugger and the evaluator (which is part of the compiler) works for these visualizers, but batching up as much work as possible may help. Try building up a more complicated expression before calling Evaluate after the loop. You may need to use some escaping or delimiting convention to unpack the results. For example, imagine what an expression that built the list of field values and returned them as a comma separated string would look like - but you would need to escape commas in the values themselves.
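A hedged sketch of that batching, mirroring what the question already does for the field names but using AsString so the evaluator only has to concatenate strings (s, I, iCount and FValues as in the question's mFillData; the '|' delimiter is arbitrary and would need escaping if it can occur in the data; StrictDelimiter needs Delphi 2006 or later):
s := '';
for I := 0 to iCount - 1 do
  s := s + Format('%s.Fields[%d].AsString+''|''+', [FExpression, I]);
if s <> '' then
  Delete(s, Length(s) - 4, 5);    // strip the trailing +'|'+
s := Evaluate(s);                 // one debugger round-trip for all values
s := Copy(s, 2, Length(s) - 2);   // remove the surrounding quotes
FValues.Delimiter := '|';
FValues.StrictDelimiter := True;
FValues.DelimitedText := s;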
Because Delphi is a different process from your debugged exe, you cannot directly use the memory pointers of your exe, so you need to use ".Evaluate" for everything.
You can use 2 different approaches:
Add a special debug-dump function to the executable, which does all the value retrieving in one call
Inject a special dll into the exe which does the same as 1 (more hacking, etc.)
I got option 1 working; 2 should also be possible, but it is a little bit more complicated and "ugly" because of the hacking tactics...
With the code below (just add it to your dpr) you can use:
Result := 'Dump=' + Evaluate('TObjectDumper.SpecialDump(' + FExpression + ')');
Demo code for option 1; change it for your TDataSet (maybe make a CSV string of all the values?):
unit Unit1;

interface

type
  TObjectDumper = class
  public
    class function SpecialDump(aObj: TObject): string;
  end;

implementation

class function TObjectDumper.SpecialDump(aObj: TObject): string;
begin
  Result := '';
  if aObj <> nil then
    Result := 'Special dump: ' + aObj.Classname;
end;

initialization
  // Dummy call, just to ensure it is linked, i.e. used by the compiler.
  TObjectDumper.SpecialDump(nil);
end.
Edit: in case someone is interested: I got option 2 working too (bpl injection)
I have not had a chance to play with the debug visualizers yet, so I do not know if this works, but have you tried using Evaluate() to convert FExpression into its actual memory address? If you can do that, then type-cast that memory address to a TDataSet pointer and use its properties normally without going through additional Evaluate() calls. For example:
procedure TDataSetViewerFrame.mFillData;
var
  DS: TDataSet;
  I: Integer;
  // sw: TStopwatch;
begin
  // sw := TStopwatch.StartNew;
  DS := TDataSet(StrToInt(Evaluate(FExpression))); // this line may need tweaking
  for I := 0 to DS.Fields.Count - 1 do
  begin
    with DS.Fields[I] do begin
      FFields.Add(FieldName);
      FValues.Add(VarToStr(Value));
    end;
  end;
  {
  sw.Stop;
  s := sw.Elapsed;
  Application.MessageBox(Pchar(s), '');
  }
end;
I'm using Delphi 2007 and have an application that reads log files from several places over an internal network and displays exceptions. Those directories sometimes contain thousands of log files. The application has an option to read only the log files from the n latest days, but it can also use any datetime range.
The problem is that the first time the log directory is read, it can be very slow (several minutes). The second time it is considerably faster.
I wonder how I can optimize my code to read the log files as fast as possible?
I'm using a TStringList (vCurrentFile) to store the file in memory.
It is loaded from a TFileStream, as I think this is faster.
Here is some code:
Refresh: The main loop to read log-files
// In this function the logfiles are searched for exceptions. If an exception is found it is stored in an object.
// The exceptions are then shown in the grid
procedure TfrmMain.Refresh;
var
FileData : TSearchRec; // Used for the file searching. Contains data of the file
vPos, i, PathIndex : Integer;
vCurrentFile: TStringList;
vDate: TDateTime;
vFileStream: TFileStream;
begin
tvMain.DataController.RecordCount := 0;
vCurrentFile := TStringList.Create;
memCallStack.Clear;
try
for PathIndex := 0 to fPathList.Count - 1 do // Loop 0. This loops until all directories are searched through
begin
if (FindFirst (fPathList[PathIndex] + '\*.log', faAnyFile, FileData) = 0) then
repeat // Loop 1. This loops while there are .log files in Folder (CurrentPath)
vDate := FileDateToDateTime(FileData.Time);
if chkLogNames.Items[PathIndex].Checked and FileDateInInterval(vDate) then
begin
tvMain.BeginUpdate; // To speed up the grid - delays the guichange until EndUpdate
fPathPlusFile := fPathList[PathIndex] + '\' + FileData.Name;
vFileStream := TFileStream.Create(fPathPlusFile, fmShareDenyNone);
vCurrentFile.LoadFromStream(vFileStream);
fUser := FindDataInRow(vCurrentFile[0], 'User'); // FindData Returns the string after 'User' until ' '
fComputer := FindDataInRow(vCurrentFile[0], 'Computer'); // FindData Returns the string after 'Computer' until ' '
Application.ProcessMessages; // Give some priority to the User Interface
if not CancelForm.IsCanceled then
begin
if rdException.Checked then
for i := 0 to vCurrentFile.Count - 1 do
begin
vPos := AnsiPos(MainExceptionToFind, vCurrentFile[i]);
if vPos > 0 then
UpdateView(vCurrentFile[i], i, MainException);
vPos := AnsiPos(ExceptionHintToFind, vCurrentFile[i]);
if vPos > 0 then
UpdateView(vCurrentFile[i], i, HintException);
end
else if rdOtherText.Checked then
for i := 0 to vCurrentFile.Count - 1 do
begin
vPos := AnsiPos(txtTextToSearch.Text, vCurrentFile[i]);
if vPos > 0 then
UpdateView(vCurrentFile[i], i, TextSearch)
end
end;
vFileStream.Destroy;
tvMain.EndUpdate; // Now the Gui can be updated
end;
until(FindNext(FileData) <> 0) or (CancelForm.IsCanceled); // End Loop 1
end; // End Loop 0
finally
FreeAndNil(vCurrentFile);
end;
end;
UpdateView method: Add one row to the displaygrid
{: Update the grid with one exception}
procedure TfrmMain.UpdateView(aLine: string; const aIndex, aType: Integer);
var
vExceptionText: String;
vDate: TDateTime;
begin
if ExceptionDateInInterval(aLine, vDate) then // Parse the date from the beginning of date
begin
if aType = MainException then
vExceptionText := 'Exception'
else if aType = HintException then
vExceptionText := 'Exception Hint'
else if aType = TextSearch then
vExceptionText := 'Text Search';
SetRow(aIndex, vDate, ExtractFilePath(fPathPlusFile), ExtractFileName(fPathPlusFile), fUser, fComputer, aLine, vExceptionText);
end;
end;
Method to decide if row is in daterange:
{: This compare exact exception time against the filters
#desc 2 cases: 1. Last n days
2. From - to range}
function TfrmMain.ExceptionDateInInterval(var aTestLine: String; out aDateTime: TDateTime): Boolean;
var
vtmpDate, vTmpTime: String;
vDate, vTime: TDateTime;
vIndex: Integer;
begin
aDateTime := 0;
vtmpDate := Copy(aTestLine, 0, 8);
vTmpTime := Copy(aTestLine, 10, 9);
Insert(DateSeparator, vtmpDate, 5);
Insert(DateSeparator, vtmpDate, 8);
if TryStrToDate(vtmpDate, vDate, fFormatSetting) and TryStrToTime(vTmpTime, vTime) then
aDateTime := vDate + vTime;
Result := (rdLatest.Checked and (aDateTime >= (Now - spnDateLast.Value))) or
(rdInterval.Checked and (aDateTime>= dtpDateFrom.Date) and (aDateTime <= dtpDateTo.Date));
if Result then
begin
vIndex := AnsiPos(']', aTestLine);
if vIndex > 0 then
Delete(aTestLine, 1, vIndex + 1);
end;
end;
Test if the filedate is inside range:
{: Returns true if the logtime is within filters range
#desc Purpose is to sort out logfiles that are no idea to parse (wrong day)}
function TfrmMain.FileDateInInterval(aFileTimeStamp: TDate): Boolean;
begin
Result := (rdLatest.Checked and (Int(aFileTimeStamp) >= Int(Now - spnDateLast.Value))) or
(rdInterval.Checked and (Int(aFileTimeStamp) >= Int(dtpDateFrom.Date)) and (Int(aFileTimeStamp) <= Int(dtpDateTo.Date)));
end;
The problem is not your logic, but the underlying file system.
Most file systems get very slow when you put many files in a directory. This is very bad with FAT, but NTFS also suffers from it, especially if you have thousands of files in a directory.
The best you can do is reorganize those directory structures, for instance by age.
Then have at most a couple of hundred files in each directory.
--jeroen
How fast do you want to be? If you want to be really fast, then you need to use something besides Windows networking to read the files. The reason is that if you want to read the last line of a log file (or all the lines since the last time you read it), then you need to read the whole file again.
In your question you said the problem is that it is slow to enumerate your directory listing. That is your first bottleneck. If you want to be really fast, then you need to either switch to HTTP or add some sort of log server to the machine where the log files are stored.
The advantage of using HTTP is you can do a range request and just get the new lines of the log file that were added since you last requested it. That will really improve performance since you are transferring less data (especially if you enable HTTP compression) and you also have less data to process on the client side.
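A hedged sketch of the range-request idea using Indy's TIdHTTP (uses the IdHTTP unit; the URL and the FLastSize bookkeeping are hypothetical, and you would need to handle the case where the file was truncated or replaced):
var
  Http: TIdHTTP;
  NewText: string;
begin
  Http := TIdHTTP.Create(nil);
  try
    // Ask only for the bytes appended since the last poll.
    Http.Request.CustomHeaders.Values['Range'] := 'bytes=' + IntToStr(FLastSize) + '-';
    NewText := Http.Get('http://logserver/logs/app.log'); // hypothetical URL
    FLastSize := FLastSize + Length(NewText);
    // Parse only NewText for exceptions here.
  finally
    Http.Free;
  end;
end;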
If you add a log server of some sort, then that server can do the processing on the server side, where it has native access to the data, and only return the rows that are in the date range. A simple way of doing that may be to just put your logs into a SQL database of some sort, and then run queries against it.
So, how fast do you want to go?