Opening a Firebird database file on a network share - delphi

I thought converting a mapped drive letter to a UNC path would be enough to be able to open a .GDB file,
but alas:
function ConvertToUNCPath(AMappedDrive: string): string;
var
  lRemoteString: array[0..255] of Char;
  lpRemote: PChar;
  lStringLen: Cardinal;
begin
  // WNetGetConnection is declared in the Windows (Winapi.Windows) unit
  lpRemote := @lRemoteString;
  lStringLen := 255;
  if WNetGetConnection(PChar(ExtractFileDrive(AMappedDrive)),
                       lpRemote,
                       lStringLen) = NO_ERROR then
    Result := lRemoteString
  else
    Result := ''; // No mapping found
end;
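Note that the function resolves only the drive letter, so it returns just the share root; for example (this is exactly the mapping from my test setup described further down):
// P: is mapped to \\tt2012server\persoonlijk on my machine
lUNC := ConvertToUNCPath('P:\Jan\KLANTEN.GDB');
// lUNC is now '\\tt2012server\persoonlijk'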
function TDataModuleData.OpenGDBDatabase(AGDBName: string): Boolean;
var
  lDlgLogin: TFrmLogin;
  p: Integer;
  lUNC,
  lErrMsg: String;
begin
  Result := False;
  with FDConnection do // TFDConnection
  begin
    Close;
    TxOptions.Isolation := xiDirtyRead;
    p := Pos(':', AGDBName);
    if p = 2 then
    begin
      lUNC := ConvertToUNCPath(Copy(AGDBName, 1, 2));
      if lUNC <> '' then
      begin
        lUNC := Copy(lUNC, 3);
        p := Pos('\', lUNC);
        AGDBName := Copy(lUNC, p) + Copy(AGDBName, 3);
        lUNC := Copy(lUNC, 1, p - 1);
      end;
    end;
    DriverName := S_FD_IBId;
    Params.Database := AGDBName;
    if lUNC <> '' then
      Params.Add('Server=' + lUNC)
    else
      Params.Add('Server=localhost'); // Not strictly necessary
    Params.UserName := 'SYSDBA';
    Params.Password := 'masterkey';
    try
      Open;
      Result := Connected;
    except
      on E: Exception do
      begin
        lErrMsg := LowerCase(E.Message);
      end;
    end;
  end;
end;
Depending on how I parse the ConvertToUNCPath result I get different error messages:
[firedac][phys][ib]unavailable database
[firedac][phys][ib]i/o error during "createfile (open)" operation for file "persoonlijk\jan\klanten.gdb"'#$D#$A'error while trying to open file'#$D#$A'the system cannot find the path specified.
The part of the code using ConvertToUNCPath successfully converts e.g. P:\Jan\KLANTEN.GDB to \\tt2012server\persoonlijk\Jan\KLANTEN.GDB.
How can I open a GDB file when the path points to a mapped drive letter?
Added: I tried these hardcoded variations, they all fail:
// lUNC := '\\2012server'; // Unable to complete network request to host
lUNC := 'tt2012server';
//AGDBName := '\\tt2012server\persoonlijk\jan\klanten.gdb';
//AGDBName := 'tt2012server\persoonlijk\jan\klanten.gdb';
//AGDBName := '\persoonlijk\jan\klanten.gdb';
//AGDBName := 'persoonlijk\jan\klanten.gdb';
//AGDBName := '\jan\klanten.gdb';
//AGDBName := 'jan\klanten.gdb';
//AGDBName := 'p:\jan\klanten.gdb'; (original input)
(P: maps to \\tt2012server\persoonlijk)
Added:
Sorry, I was not clear in my initial text: this is not about connecting to a database on a remote server per se. I just want my local 'DB inspection' tool to be able to open a GDB file if someone places it in my network share for inspection (instead of having to copy it to local disk first).
The only intention of using WNetGetConnection was to resolve the drive letter to a UNC path (it's some code I found on the web).

1. Firebird explicitly denies attempts to open database files on non-local disks
Firebird is a database server, and as such it focuses on performance and reliability.
http://www.firebirdfaq.org/faq46/
Performance means lots of data is cached, both cached for reading and cached for writing.
Reliability means Firebird has to obtain firm guarantees from the OS that:
a. no other process will tinker with the database file while the server has some of its data cached for reading;
b. at any moment the server may flush any data from its cache to the file, and that data is guaranteed to end up persistently written to the storage media.
Network-connected disks void both guarantees, and consequently Firebird Server refuses to trust them.
You may hack the Firebird configuration or source files at your own discretion to remove this safety check and open network-shared files, if you really need that more than safety and speed.
But the proper solution is to install a Firebird server on the machine whose disks actually hold the database file.
2. Connection String is not a database file name
AGDBName := '\\tt2012server\persoonlijk\jan\klanten.gdb'
This does NOT mean "the local Firebird server should connect to the tt2012server machine using LOCAL_SYSTEM credentials and read the database file from the persoonlijk shared resource", as you probably intended it to mean.
http://www.firebirdfaq.org/faq260/
If anything, the Windows LOCAL_SYSTEM account is explicitly barred from most network operations, to contain intruders and viruses. Even if you hack Firebird into opening network files, Windows would most probably prohibit the access anyway, unless you set up Windows to run the Firebird Server service under some account other than the default LOCAL_SYSTEM.
Anyway, what the \\tt2012server\persoonlijk\jan\klanten.gdb connection string actually means is a request for your application to connect to tt2012server using the WNET (a.k.a. Microsoft Named Pipes) protocol and to find a Firebird server on that machine communicating over WNET, as opposed to TCP/IP.
Judging by the error you quote - lUNC := '\\2012server'; // Unable to complete network request to host - that tt2012server computer probably does not have a Firebird Server running and accepting Named Pipes connections.
The WNET protocol is considered obsolete and will most probably be removed from future Firebird Server versions. As of now it works, but few people use it, so little up-to-date experience exists in that area. It is suggested that you use the TCP/IP protocol, not WNET, to connect your application to the Firebird Server running on the tt2012server machine.
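For illustration only, a minimal FireDAC sketch of such a TCP/IP connection could look like this (the D:\Data\klanten.gdb path is a placeholder - it must be the file's location as seen locally on tt2012server, not via the share):
// Hedged sketch: assumes a Firebird/InterBase server is installed and running on tt2012server
FDConnection.Close;
FDConnection.DriverName := S_FD_IBId;
FDConnection.Params.Add('Server=tt2012server');        // plain host name -> TCP/IP, no \\ prefix
FDConnection.Params.Database := 'D:\Data\klanten.gdb'; // hypothetical path, local to the server
FDConnection.Params.UserName := 'SYSDBA';
FDConnection.Params.Password := 'masterkey';
FDConnection.Open;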
PS. This question has duplicates:
Connecting to Firebird database from Windows local network
ibase_connect: remote computer host and shared db file from windows
PPS. Firebird is a multi-generation database engine.
Consequently, there are no "dirty read" transactions possible in the InterBase/Yaffil/Firebird family.
TxOptions.Isolation := xiDirtyRead; - this line will not work as written. Most probably it will silently be mapped to the "READ COMMITTED" transaction class; less probably it will raise an explicit error.
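If you want the setting to reflect what the server can actually do, a one-line sketch using FireDAC's read-committed option would be:
// Request the closest isolation level Firebird actually supports, instead of a dirty read
FDConnection.TxOptions.Isolation := xiReadCommitted;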

Related

FireDAC "unable to complete network request to host"

The full error text is Remote error: [FireDAC][Phys][FB]Unable to complete network request to host "dataserver16". Error writing data to the connection. It seems that others have had this problem and, once they sorted it, it went away; in my case, however, the problem is sporadic.
My Datasnap ISAPI.dll which contains the FireDAC Firebird connection, is running on an IIS server on a different machine to the one where the database is hosted (dataserver16) but on the same subnet. I know everything is configured correctly, because the application works to expectations about 70% of the time! The other 30% of the time, my Datasnap client receives this error (as passed back from the dll).
IMHO it looks like there is a network issue. If the connection is established and you can read and write data over it, the setup itself seems to be correct.
Have you tried running a ping from your source system to the target and logging it, so you can see whether the whole connection to the server disappears?
Open a command window as Admin and type:
Ping {TARGET} -t >> c:\ping.log
Then wait until the error appears and check the log file to see whether your target was available the whole time.
For more help we need more background information, like the Firebird version, whether you are able to reproduce the error, and the source code showing how you set up your connection.
For completeness, I am posting my solution here. Perhaps others will gain benefit from this answer. The solution is to perform retries of the Firebird connection. The way I did it, is every TSQLQuery's BeforeOpen event handler is wired to the same method. This has improved reliability considerably (even if it slowed it down a little). The code for FireDAC is similar. Both DBX and FireDac work equally well here.
const
  retrycount = 3;

procedure TServerMethodsDBX.QueryBeforeOpen(DataSet: TDataSet);
begin
  TryConnect(TSQLQuery(DataSet).SQLConnection);
  // ...
end;

procedure TServerMethodsDBX.TryConnect(SQLConn: TSQLConnection);
var
  i: Integer;
  Error: String;
begin
  i := 0;
  SQLConn.Close;
  while (not SQLConn.Connected) and (i < retrycount) do
  begin
    try
      SQLConn.Connected := True;
    except
      on e: Exception do
      begin
        Error := Error + ' ' + e.Message;
        Sleep(500);
        Inc(i);
      end;
    end;
  end;
  if i = retrycount then
    LogMessage('Tryconnect error: ' + Error);
end;
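For reference, a rough FireDAC equivalent of TryConnect might look like this (a sketch only - TServerMethodsFD and its TFDConnection parameter are assumed names, not part of the project above; LogMessage is the same helper):
procedure TServerMethodsFD.TryConnect(FDConn: TFDConnection);
var
  i: Integer;
  Error: String;
begin
  i := 0;
  FDConn.Close;
  // Retry the connection a few times before giving up, as with the DBX version
  while (not FDConn.Connected) and (i < retrycount) do
  begin
    try
      FDConn.Connected := True;
    except
      on e: Exception do
      begin
        Error := Error + ' ' + e.Message;
        Sleep(500);
        Inc(i);
      end;
    end;
  end;
  if i = retrycount then
    LogMessage('Tryconnect error: ' + Error);
end;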

Max length TSQLConnection.Params values

Hello fellow StackOverflowers,
Currently I'm facing a situation where it seems that there is a maximum length for the Database property of a TSQLConnection object in Delphi.
When I open the connection to my database I get the following error when I use a rather long (154 chars) database name:
dbExpress Error: [0x0015]: Connection failed
SQL Server Error: unrecognized database parameter block
wrong version of database parameter block
When I relocate my database file to another location (and with that reduce the length of the path) it will connect to the database.
I am currently using the Object Inspector to set the connection properties of the TSQLConnection object.
Basically, my question comes down to this:
Does a TSQLConnection have a maximum length for the values set in the Params property? And if so, what is the maximum length of these values?
Update
I've found two ways to open a copy of Employee.Gdb in a folder with a 160-character name ('abcdefghij0123456789' x 8).
What I did firstly was to edit the DBXConnections.Ini file and changed the Database parameter in the [IBConnection] section to read
Database=localhost:D:\abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890\employee.gdb
Then, I can successfully connect to it, open the Employee.Gdb and make changes to the Customer table. I have verified the changes in IBConsole just in case the copy of Employee.Gdb wasn't the one I assumed it was.
Subsequently, I've found that I can create and open the db in code using Delphi Seattle and Interbase XE7, as follows:
function LongPath: String;
begin
  // DupeString lives in the StrUtils unit
  Result := 'D:\' + DupeString('abcdefghij0123456789', 8);
end;

function LongDBName: String;
begin
  Result := LongPath + '\Employee.Gdb';
end;

procedure TForm1.OpenDB;
var
  Ini: TMemIniFile;
const
  scDBXConIni = 'C:\Users\Public\Documents\Embarcadero\Studio\dbExpress\17.0\dbxconnections.ini';
  scSourceDB = 'D:\Delphi\Databases\Interbase\Employee.Gdb';
begin
  Ini := TMemIniFile.Create(scDBXConIni);
  try
    // First, blank out the Database value in the IBConnection section
    // of DBXConnections.Ini
    Ini.WriteString('IBConnection', 'Database', '');
    Ini.UpdateFile;
    // Next, create the long-named directory and copy Employee.Gdb to it
    if not DirectoryExists(LongPath) then
      MkDir(LongPath);
    Assert(CopyFile(PChar(scSourceDB), PChar(LongDBName), False));
    // Set LoadParamsOnConnect to False so that the SqlConnection uses
    // the value of the Database we are about to give it
    SqlConnection1.LoadParamsOnConnect := False;
    SqlConnection1.Params.Values['Database'] := LongDBName;
    SqlConnection1.Connected := True;
    // Open the CDS to view the data
    CDS1.Open;
  finally
    Ini.Free;
  end;
end;
The critical step in doing it this way is setting LoadParamsOnConnect to False, which I confess I'd overlooked in earlier attempts to get this code to work.
I've got some earlier versions of Delphi on this machine, so if you're not using Seattle and the above code doesn't work for you, tell me which one you are using and I'll see if I can try that.
[Original answer]
Actually, I think that this may be an error occurring in one of the DBX DLLs.
I created a folder with a 160-character name, then copied the demo Employee.Gdb database into it. Interbase XE7's IBConsole can open the db without error. So could a small test project constructed with IBX components in Delphi Seattle.
However, with an equivalent DBX project, when I use the code below
procedure TForm1.Button1Click(Sender: TObject);
begin
  SqlConnection1.Params.Values['database'] := 'D:\abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890abcdefghij01234567890\employee.gdb';
  SqlConnection1.Connected := True;
end;
I get an error in
procedure TDBXDynalinkConnection.DerivedOpen;
var
  Count: TInt32;
  Names: TWideStringArray;
  Values: TWideStringArray;
  IsolationLevel: Longint;
  DBXError: TDBXErrorCode;
begin
  Count := FConnectionProperties.Properties.Count;
  FConnectionProperties.GetLists(Names, Values);
  CheckResult(FMethodTable.FDBXConnection_Connect(FConnectionHandle, Count, Names, Values));
  DBXError := FMethodTable.FDBXConnection_GetIsolation(FConnectionHandle, IsolationLevel);
The error message is:
'I/O error for file "database.gdb"
Error while trying to open file
The operation completed successfully'
and the Database param of the SqlConnection is left at the value 'Database.Gdb', which is not the value I specified, of course, nor was it the value specified in the params in the IDE, which was 'd:\delphi\databases\interbase\employee.gdb'.
I wondered if I could work around this problem by SUBSTing a drive to the 'abcdefg ...' path. I tried that and opening the database as "x:\employee.gdb" , but I get the same error in my DBX app, and also IBConsole cannot access the db either.
I think you need a shorter physical path!
This is related to MS SQL Server:
As a general guideline, long path names greater than 160 characters
might cause problems.
from Microsoft TechNet - https://technet.microsoft.com/en-us/library/ms165768(v=sql.105).aspx

Connecting from a Delphi app to an InterbaseXE7 server on another machine

I have XE8 and the version of InterbaseXE7 that comes with it installed on two machines, A & B. Using IBX or DBX I can connect to the IB server running on the same machine and access its databases without any problem. Btw, I am not a regular IB user.
I had no luck at all connecting from a Delphi app on machine A to an IB database on machine B: I got all manner of errors, including a mystifying one about not being able to find the file specified (despite doing a DIR from a CMD prompt to verify that I had the name right). Eventually I discovered that in those circumstances (connecting to a remote server), the database name has to be capitalized in the Delphi app on A exactly as it is on the db host B.
So, assuming there is no way to configure IB and/or IBX to avoid this case-sensitivity, how can I programmatically retrieve a list of the database names, correctly capitalized, on B (assuming I have no access to B's file-system) from a Delphi app on A?
I've tried using the TIBServerProperties component to do this but using code like this:
procedure TForm1.btnPropertiesClick(Sender: TObject);
var
  S: String;
begin
  IBServerProperties1.Active := True;
  IBServerProperties1.FetchDatabaseInfo;
  S := IBServerProperties1.DatabaseInfo.DbName[0];
  Caption := S;
end;
, the database names are returned from the IB host server in all capitals, which obviously doesn't solve the problem of finding their correct capitalizations.
It turns out that the TIBServerProperties can get DB Aliases from a remote server with the correct capitalization, but not using the DatabaseInfo property. The information can be obtained from its AliasInfo property instead (one of those things that's kind-of obvious with the benefit of hindsight), as shown below.
procedure TForm1.btnPropertiesClick(Sender: TObject);
var
  S: String;
  i: Integer;
begin
  IBServerProperties1.Active := True;
  IBServerProperties1.FetchAliasInfo;
  for i := 0 to IBServerProperties1.AliasCount - 1 do
  begin
    S := IBServerProperties1.AliasInfo[i].Alias; // <- the .Alias has the
                                                 // same capitalization as on the server
    S := S + ' ' + IBServerProperties1.AliasInfo[i].DBPath;
    Memo2.Lines.Add(S);
  end;
end;
, which is good enough for my immediate purpose.
I'd still be interested to know, though, if there is an IB configuration parameter or similar that avoids the case-sensitivity that provoked my q.

Check remote port access using Delphi - Telnet style

I deploy my application in environments heavily stricken with firewalls. Frequently I find myself using Telnet to check if a port is open and accessible in the network.
Now I would like to implement an equivalent functionality of the command, Telnet [domainname or ip] [port], in Delphi.
Is it adequate that I just attempt to open and close a TCP/IP socket without sending or receiving any data?
Is there any risk that I might crash the arbitrary application/service listening on the other end?
Here's my code:
function IsPortActive(AHost: string; APort: Word): Boolean;
var
  IdTCPClient: TIdTCPClient;
begin
  IdTCPClient := TIdTCPClient.Create(nil);
  try
    try
      IdTCPClient.Host := AHost;
      IdTCPClient.Port := APort;
      IdTCPClient.Connect;
    except
      // Ignore exceptions
    end;
  finally
    Result := IdTCPClient.Connected;
    IdTCPClient.Disconnect;
    FreeAndNil(IdTCPClient);
  end;
end;
If you just want to check whether the port is open, then you can use this:
function IsPortActive(AHost: string; APort: Word): Boolean;
var
  IdTCPClient: TIdTCPClient;
begin
  Result := False;
  try
    IdTCPClient := TIdTCPClient.Create(nil);
    try
      IdTCPClient.Host := AHost;
      IdTCPClient.Port := APort;
      IdTCPClient.Connect;
      Result := True;
    finally
      IdTCPClient.Free;
    end;
  except
    // Ignore exceptions
  end;
end;
But that only tells you if any server app has opened the port. If you want to make sure that YOUR server app opened the port, then you will have to actually communicate with the server and make sure its responses are what you are expecting. For this reason, many common server protocols provide an initial greeting so clients can identify the type of server they are connected to. You might consider adding a similar greeting to your server, if you are at liberty to make changes to your communication protocol.
Simply opening a connection to the server does not impose any risk of crashing it; all it does is momentarily occupy a slot in the server's client list. However, if you actually send data, and the server app you are connected to is not your app, then you do run a small risk if the server cannot handle arbitrary data that does not conform to its expected protocol. But that is pretty rare. Sending a small command is not uncommon and usually pretty safe: you will either get back a reply (which may be in a format that does not conform to your protocol, so just assume failure), or you may not get any reply at all (for instance if the server is waiting for more data, or simply is not designed to return a reply), in which case you can simply time out the reading and assume failure.
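To illustrate both ideas, a sketch only - the greeting text and timeout values here are assumptions, not part of any real protocol:
// Hedged sketch: checks not just that the port is open, but that whatever answers
// sends the greeting we expect. 'MYSERVER READY' is a made-up greeting; substitute
// whatever your own protocol actually sends.
function IsMyServerActive(AHost: string; APort: Word): Boolean;
var
  IdTCPClient: TIdTCPClient;
  Greeting: string;
begin
  Result := False;
  IdTCPClient := TIdTCPClient.Create(nil);
  try
    try
      IdTCPClient.Host := AHost;
      IdTCPClient.Port := APort;
      IdTCPClient.ConnectTimeout := 5000; // ms
      IdTCPClient.ReadTimeout := 5000;    // ms, so a silent listener does not hang us
      IdTCPClient.Connect;
      Greeting := IdTCPClient.IOHandler.ReadLn;
      // An empty or unexpected greeting (including a timed-out read) counts as failure
      Result := Pos('MYSERVER READY', Greeting) = 1;
    except
      // Ignore exceptions (refused connection, timeouts, etc.)
    end;
  finally
    IdTCPClient.Free;
  end;
end;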

Real-time logging of a service application

I have a service application which I will be soon implementing a log file. Before I start writing how it saves the log file, I have another requirement that a small simple form application should be available to view the log in real-time. In other words, if the service writes something to the log, not only should it save it to the file, but the other application should immediately know and display what was logged.
A dirty solution would be for this app to constantly open this file and check for recent changes, and load anything new. But this is very sloppy and heavy. On the other hand, I could write a server/client socket pair and monitor it through there, but it's a bit of an overload I think to use TCP/IP for sending one string. I'm thinking of using the file method, but how would I make this in a way that wouldn't be so heavy? In other words, suppose the log file grows to 1 million lines. I don't want to load the entire file, I just need to check the end of the file for new data. I'm also OK with a delay of up to 5 seconds, but that would contradict the "Real-time".
The only methods of reading/writing a file I am familiar with consist of keeping the file open/locked and reading its entire contents; I have no clue how to read only portions from the end of a file, or how to protect it when both applications attempt to access it.
What you are asking for is exactly what I do in one of my company's projects.
It has a service that hosts an out-of-process COM object so all of our apps can write messages to a central log file, and then a separate viewer app that uses that same COM object to receive notifications directly from the service whenever the log file changes. The COM object lets the viewer know where the log file is physically located so the viewer can open the file directly when needed.
For each notification that is received, the viewer checks the new file size and then reads only the new bytes that have been written since the last notification (the viewer keeps track of what the previous file size was). In an earlier version, I had the service actually push each individual log entry to the viewer directly, but under heavy load that is a lot of traffic to sift through, so I ended up taking that feature out and let the viewer handle reading the data instead, that way it can read multiple log entries at one time more efficiently.
Both the service and the viewer have the log file open at the same time. When the service creates/opens the log file, it sets the file to read/write access with read-only sharing. When the viewer opens the file, it sets the file to read-only access with read/write sharing (so the service can still write to it).
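The tail-read itself is simple. A stripped-down sketch of the idea (not the actual project code - the file name handling and the ALastSize bookkeeping are illustrative; uses Classes and SysUtils):
// Hypothetical viewer-side routine: opens the log read-only with full sharing,
// seeks past what was already read, and returns only the newly appended bytes.
function ReadNewLogData(const ALogFile: string; var ALastSize: Int64): AnsiString;
var
  FS: TFileStream;
  NewBytes: Integer;
begin
  Result := '';
  // Read-only access, full sharing, so the service can keep writing
  FS := TFileStream.Create(ALogFile, fmOpenRead or fmShareDenyNone);
  try
    if FS.Size < ALastSize then
      ALastSize := 0; // the file was recreated or truncated: start over
    NewBytes := Integer(FS.Size - ALastSize); // new chunks are assumed to be small
    if NewBytes > 0 then
    begin
      SetLength(Result, NewBytes);
      FS.Position := ALastSize;
      FS.ReadBuffer(Result[1], NewBytes);
      ALastSize := FS.Size;
    end;
  finally
    FS.Free;
  end;
end;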
Needless to say, both service and viewer are run on the same machine so they can access the same local file (no remote files are used). Although the service does have a feature that forwards log entries via TCP/IP to a remote instance of the service running on another machine (then the viewer running on that machine can see them).
Our Open Source TSynLog class matches most of your needs - it's already stable and proven (used in real world applications, including services).
It features mainly fast logging (with a set of levels, not a hierarchy of levels), exception interception with stack trace, and custom logging (including serialization of objects as JSON within the log).
You even get some additional features, like a client-side method profiler and a log viewer.
Log files are locked during generation: you can read them, not modify them.
Works from Delphi 5 up to XE2, fully Open Source and with daily updates.
This may sound like a completely nutty answer, but...
I use Gurock Software's SmartInspect: http://www.gurock.com/smartinspect/
It's great because you can send pictures, variables, whatever, and it logs them all. So while you only want text at the moment, it's great for watching your app in real time, even on remote machines, and it can also send everything to a local file.
It may be a useful answer to your problem, or a red herring - it's a little unconventional - but it has additional features you may feel are worth incorporating later (for instance, it's great for capturing info should something go horribly wrong).
Years ago I wrote a circular buffer binary-file trace logging system, that avoided the problem of an endlessly growing file, while giving me the capabilities that I wanted, such as being able to see a problem if I wanted to, but otherwise, being able to just ignore the trace buffer.
However, if you want a continuous online system, then I would not use files at all.
I used files because I really did want file-like persistence, with no listener app having to be running - I wanted the logging to happen whether or not anybody was around to "listen" right now. I didn't use an endlessly growing text log because I was worried about using up hundreds of megabytes on log files and filling up my 250 MB hard drive. One hardly has concerns like that in the era of 1 TB hard disks.
As David says, the client server solution is best, and is not complex really.
But you might prefer files, as I did, in my case way back. I only launched my viewer app as a post-mortem tool that I ran AFTER a crash. This was before there was MadExcept or anything like it, so I had some apps that just died, and I wanted to know what had happened.
Before my circular buffer, I would use a debug view tool like sys-internals DebugView and OutputDebugString, but that didn't help me when the crash happened before I launched DebugView.
File-based logging (binary) is one of the few times I allowed myself to create binary files. I normally hate hate hate binary files. But you just try to make a circular buffer without using a fixed length binary record.
Here's a sample unit. If I was writing this now instead of in 1997, I would have not used a "File of record", but hey, there it is.
To extend this unit so it could be used to be the realtime viewer, I would suggest that you simply check the datetime stamp on the binary file and refresh every 1-5 seconds (your choice) but only when the datetime stamp on the binary trace file has changed. Not hard, and not exactly a heavy load on the system.
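As a rough sketch of that polling approach, using the TTraceBinFile class listed below (the form, timer, and field names here are illustrative, not part of the unit):
// Hypothetical timer handler for the viewer: reload only when the
// trace file's last-write time has changed since the previous check.
procedure TViewerForm.RefreshTimerTimer(Sender: TObject);
var
  CurrentStamp: TDateTime;
begin
  if not FileAge(FTraceFile.Filename, CurrentStamp) then
    Exit; // file missing or inaccessible; try again on the next tick
  if CurrentStamp <> FLastStamp then
  begin
    FLastStamp := CurrentStamp;
    FTraceFile.CloseFile;                  // force a re-read of the header record
    FTraceFile.LoadTrace(MemoTrace.Lines); // re-read the circular buffer
  end;
end;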
This unit is used for the logger and for the viewer, it is a class that can read from, and write to, a circular buffer binary file on disk.
unit trace;

{$Q-}
{$I-}

interface

uses
  Classes;

const
  traceBinMsgLength = 255; // binary record message length
  traceEOFMARKER = $FFFFFFFF;

type
  TTraceRec = record
    index: Cardinal;
    tickcount: Cardinal;
    msg: array[0..traceBinMsgLength] of AnsiChar;
  end;

  PTraceBinRecord = ^TTraceRec;

  TTraceFileOfRecord = file of TTraceRec;

  TTraceBinFile = class
    FFilename: string;
    FFileMode: Integer;
    FTraceFileInfo: string;
    FStorageSize: Integer;
    FLastIndex: Integer;
    FHeaderRec: TTraceRec;
    FFileRec: TTraceRec;
    FAutoIncrementValue: Cardinal;
    FBinaryFileOpen: Boolean;
    FBinaryFile: TTraceFileOfRecord;
    FAddTraceMessageWhenClosing: Boolean;
  public
    procedure InitializeFile;
    procedure CloseFile;
    procedure Trace(msg: string);
    procedure OpenFile;
    procedure LoadTrace(traceStrs: TStrings);
    constructor Create;
    destructor Destroy; override;
    property Filename: string read FFilename write FFilename;
    property TraceFileInfo: string read FTraceFileInfo write FTraceFileInfo;
    // Default 1000 rows.
    // change storageSize to the size you want your circular file to be before
    // you create and write it. Remember to set the value to the same number before
    // trying to read it back, or you'll have trouble.
    property StorageSize: Integer read FStorageSize write FStorageSize;
    property AddTraceMessageWhenClosing: Boolean
      read FAddTraceMessageWhenClosing write FAddTraceMessageWhenClosing;
  end;
implementation

uses
  SysUtils;

procedure SetMsg(pRec: PTraceBinRecord; msg: AnsiString);
var
  n: Integer;
begin
  n := Length(msg);
  if (n >= traceBinMsgLength) then
  begin
    msg := Copy(msg, 1, traceBinMsgLength);
    n := traceBinMsgLength;
  end;
  StrCopy({Dest} pRec^.msg, {Source} PAnsiChar(msg));
  pRec^.msg[n] := Chr(0); // ensure nul char termination
end;

function IsBlank(var aRec: TTraceRec): Boolean;
begin
  Result := (aRec.msg[0] = Chr(0));
end;
procedure TTraceBinFile.CloseFile;
begin
  if FBinaryFileOpen then
  begin
    if FAddTraceMessageWhenClosing then
    begin
      Trace('*END*');
    end;
    System.CloseFile(FBinaryFile);
    FBinaryFileOpen := False;
  end;
end;

constructor TTraceBinFile.Create;
begin
  FLastIndex := 0; // lastIndex=0 means blank file.
  FStorageSize := 1000; // default.
end;

destructor TTraceBinFile.Destroy;
begin
  CloseFile;
  inherited;
end;
procedure TTraceBinFile.InitializeFile;
var
  eofRec: TTraceRec;
  t: Integer;
begin
  Assert(FStorageSize > 0);
  Assert(Length(FFilename) > 0);
  Assign(FBinaryFile, Filename);
  FFileMode := fmOpenReadWrite;
  Rewrite(FBinaryFile);
  FBinaryFileOpen := True;
  FillChar(FHeaderRec, SizeOf(TTraceRec), 0);
  FillChar(FFileRec, SizeOf(TTraceRec), 0);
  FillChar(eofRec, SizeOf(TTraceRec), 0);
  FLastIndex := 0;
  FHeaderRec.index := FLastIndex;
  FHeaderRec.tickcount := StorageSize;
  SetMsg(@FHeaderRec, FTraceFileInfo);
  Write(FBinaryFile, FHeaderRec);
  for t := 1 to StorageSize do
  begin
    Write(FBinaryFile, FFileRec);
  end;
  SetMsg(@eofRec, 'EOF');
  eofRec.index := traceEOFMARKER;
  Write(FBinaryFile, eofRec);
end;
procedure TTraceBinFile.Trace(msg: string);
// Write a trace message in circular file.
begin
  if (not FBinaryFileOpen) then
    Exit;
  if (FFileMode = fmOpenRead) then
    Exit; // not open for writing!
  Inc(FLastIndex);
  if (FLastIndex > FStorageSize) then
    FLastIndex := 1; // wrap around to 1 not zero! Very important!
  Seek(FBinaryFile, 0);
  FHeaderRec.index := FLastIndex;
  Write(FBinaryFile, FHeaderRec);
  FillChar(FFileRec, SizeOf(TTraceRec), 0);
  Seek(FBinaryFile, FLastIndex);
  Inc(FAutoIncrementValue);
  if FAutoIncrementValue = 0 then
    FAutoIncrementValue := 1;
  FFileRec.index := FAutoIncrementValue;
  SetMsg(@FFileRec, msg);
  Write(FBinaryFile, FFileRec);
end;
procedure TTraceBinFile.OpenFile;
begin
  if FBinaryFileOpen then
  begin
    System.CloseFile(FBinaryFile);
    FBinaryFileOpen := False;
  end;
  if FileExists(FFilename) then
  begin
    // System.FileMode := fmOpenRead;
    FFileMode := fmOpenRead;
    AssignFile(FBinaryFile, FFilename);
    System.Reset(FBinaryFile); // open in current mode
    System.Seek(FBinaryFile, 0);
    Read(FBinaryFile, FHeaderRec);
    FLastIndex := FHeaderRec.index;
    FTraceFileInfo := string(FHeaderRec.msg);
    FBinaryFileOpen := True;
  end
  else
  begin
    InitializeFile; // Creates the file.
  end;
end;
procedure TTraceBinFile.LoadTrace(traceStrs: TStrings);
var
  ReadAtIndex: Integer;
  Safety: Integer;

  procedure NextReadIndex;
  begin
    Inc(ReadAtIndex);
    if (ReadAtIndex > FStorageSize) then
      ReadAtIndex := 1; // wrap around to 1 not zero! Very important!
  end;

begin
  Assert(Assigned(traceStrs));
  traceStrs.Clear;
  if not FBinaryFileOpen then
  begin
    OpenFile;
  end;
  ReadAtIndex := FLastIndex;
  NextReadIndex;
  Safety := 0; // prevents endless looping.
  while True do
  begin
    if (ReadAtIndex = FLastIndex) or (Safety > FStorageSize) then
      Break;
    Seek(FBinaryFile, ReadAtIndex);
    Read(FBinaryFile, FFileRec);
    if FFileRec.msg[0] <> Chr(0) then
    begin
      traceStrs.Add(FFileRec.msg);
    end;
    Inc(Safety);
    NextReadIndex;
  end;
end;

end.
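For completeness, a bare-bones writer-side usage sketch (the path, messages, and buffer size are just examples):
// Hypothetical logger-side usage of the unit above.
var
  Log: TTraceBinFile;
begin
  Log := TTraceBinFile.Create;
  try
    Log.Filename := 'C:\Logs\service.trc';   // example path
    Log.TraceFileInfo := 'My service trace'; // stored in the header record
    Log.StorageSize := 5000;                 // 5000 fixed-size slots, then wrap around
    Log.InitializeFile;                      // creates/rewrites the circular file for writing
    Log.Trace('Service started');
    Log.Trace('Something happened worth logging');
  finally
    Log.Free; // destructor calls CloseFile
  end;
end;
The viewer side would instead call OpenFile (which opens an existing file read-only) and then LoadTrace into a TStrings, as in the timer sketch above.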
Look at this article.
TraceTool 12.4: The Swiss-Army Knife of Trace
My suggestion would be to implement your logging in such a way that the log file "rolls over" on a daily basis. E.g. at midnight, your logging code renames your log file (e.g. MyLogFile.log) to a dated/archive version (e.g. MyLogFile-30082012.log), and starts a new empty "live" log (e.g. again MyLogFile.log).
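A minimal sketch of that roll-over step (the names and date format are examples only; uses SysUtils and DateUtils):
// Hypothetical roll-over routine, called once when the date changes:
// rename the live log to a dated archive name and let the logger
// recreate an empty MyLogFile.log on its next write.
procedure RollOverLog(const ALiveLog: string);
var
  ArchiveName: string;
begin
  ArchiveName := ChangeFileExt(ALiveLog, '') + '-' +
    FormatDateTime('ddmmyyyy', Yesterday) + '.log';
  if FileExists(ALiveLog) then
    RenameFile(ALiveLog, ArchiveName);
end;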
Then it's simply a question of using something like BareTail to monitor your "live"/daily log file.
I accept this may not be the most network-efficient approach, but it's reasonably simple and meets your "live" requirement.
