I'm totally new to Delphi, so I guess learning by doing something should be OK (my hope!).
My idea came about because I often have to repeat the same task:
creating the same directories, which sometimes contain files and are sometimes left empty...
So my conclusion was to automate it in some way.
Assume a Memo containing the following:
config.xml|enc
/skin
/data/defines.dat:blub
/temp
The base path inside which all of the above should be created:
C:\users\BGates\test
":blub" is just placeholders e.g. :blub can contain any text which comes from another memo in my application which means that later defines.dat is filled with the text blub contains...
As you can see sometimes I use | and sometimes : for the placeholder...
So from the information above I would like to parse the contents of the memo to create a directory structure like this:
C:\users\BGates\test\
  config.xml
  skin
  data
    defines.dat (where defines.dat will contain the content that comes from blub)
  temp
My problem is parsing the memo, especially how to decide whether an entry is a folder, a folder inside another folder, a file in the root, a file inside a folder, and so on...
There might be an easier way (I was reading about CSV files and such, but then my tool would be hard to understand for a user who doesn't know what a CSV file needs to look like), while my example above probably feels more familiar to them...
Could someone please show me an example of how to parse it in a correct (best-practice) way, so I can learn from it?
There are routines in the SysUtils unit that make file path parsing a lot easier. Have a look at ExtractFileName and ExtractFilePath, for starters. Also, if you're using a recent version of Delphi (D2010 or any of the XE line), the IOUtils unit contains a set of helper methods under the TPath record that simplify working with paths.
For example, if I wanted to deal with the line /data/defines.dat:blub, I'd do something like this:
function NormalizePath(const name: string): string;
begin
  result := StringReplace(name, '/', '\', [rfReplaceAll]);
end;

procedure ProcessLine(line: string);
var
  path, filename, data: string;
  colonPos: integer;
begin
  colonPos := pos(':', line);
  if colonPos > 0 then
  begin
    data := copy(line, colonPos + 1, MAXINT); // text after the ':' placeholder
    delete(line, colonPos, MAXINT);           // strip it from the path part
  end;
  line := TPath.Combine(BASE_PATH, NormalizePath(line));
  if ExtractFileExt(line) = '' then
    path := line
  else begin
    path := ExtractFilePath(line);
    filename := line;
  end;
  ForceDirectories(path); // ensure that the folder exists
  if filename <> '' then
    TFile.WriteAllText(filename, data);
end;
Note: I just wrote this off the top of my head. It may contain bugs. Don't trust it without testing it first. Also, this uses functionality from IOUtils, and things will be a little bit trickier if you don't have it in your version of Delphi. But this should give you the general idea of how to deal with the problem you're trying to solve.
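For completeness, here is a rough sketch of driving ProcessLine over the memo's lines; MemoStructure and BASE_PATH are placeholder names for your own memo and base folder, and the '|' marker from your first line would need the same treatment as ':' (not handled here):
const
  BASE_PATH = 'C:\users\BGates\test'; // placeholder; point this at your own base folder

procedure TForm1.BuildStructure;
var
  i: integer;
  line: string;
begin
  for i := 0 to MemoStructure.Lines.Count - 1 do
  begin
    line := Trim(MemoStructure.Lines[i]);
    if line <> '' then
      ProcessLine(line); // creates the folder and, if a ':' part is present, the file
  end;
end;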
Related
I have a text file like this:
"01","AAA","AAAAA"
"02","BBB","BBBBB","BBBBBBBB"
"03","CCC"
"04","DDD","DDDDD"
I want to load this text file's data into a temp table in a Sybase DB, so I need to build a program that reads the text file line by line until EOF. If the file is small, reading line by line is fast, but if the file is too big (it can be more than 500 MB), reading line by line is too slow. I think the line-by-line method is not suitable for huge text files, so I need to find another solution to load the data into the DB. Any suggestions?
Example code:
var
  myFile : TextFile;
  text : string;
begin
  // Open the Test.txt file for reading
  AssignFile(myFile, 'Test.txt');
  Reset(myFile);
  // Read the file contents line by line
  while not Eof(myFile) do
  begin
    ReadLn(myFile, text);
    TempTable.Append;
    TempTable.FieldByName('Field1').AsString := Copy(text, 2, 2);
    TempTable.FieldByName('Field2').AsString := Copy(text, 7, 3);
    TempTable.FieldByName('Field3').AsString := Copy(text, 13, 5);
    TempTable.FieldByName('Field4').AsString := Copy(text, 21, 8);
    TempTable.Post;
  end;
  // Close the file
  CloseFile(myFile);
end;
Some general tips:
Ensure your TempTable is in memory, or use a fast database engine - take a look at SQLite3 or other options (like Firebird embedded, NexusDB or ElevateDB) as possible database alternatives;
If you do not use a TTable but a true database, ensure you nest the inserts within a transaction;
For a true database, check whether you can use an Array DML feature, which is much faster for inserting a lot of data into a remote database (like Sybase) - such Array DML is handled for instance by FireDAC, AFAIK;
The FieldByName('...') method is known to be very slow: use local TField variables instead;
When using a TextFile, assign a bigger temporary buffer;
If you are using newer Unicode versions of Delphi (2009+), using TextFile is not the best option.
So your code may be:
var
  myFile : TextFile;
  myFileBuffer: array[word] of byte;
  text : string;
  Field1, Field2, Field3, Field4: TField;
begin
  // Set Field* local variables for speed within the main loop
  Field1 := TempTable.FieldByName('Field1');
  Field2 := TempTable.FieldByName('Field2');
  Field3 := TempTable.FieldByName('Field3');
  Field4 := TempTable.FieldByName('Field4');
  // Open the Test.txt file for reading
  AssignFile(myFile, 'Test.txt');
  SetTextBuf(myFile, myFileBuffer); // use a 64 KB read buffer
  Reset(myFile);
  // Read the file contents line by line
  while not Eof(myFile) do
  begin
    ReadLn(myFile, text);
    TempTable.Append;
    Field1.AsInteger := StrToInt(Copy(text, 2, 2));
    Field2.AsString := Copy(text, 7, 3);
    Field3.AsString := Copy(text, 13, 5);
    Field4.AsString := Copy(text, 21, 8);
    TempTable.Post;
  end;
  // Close the file
  CloseFile(myFile);
end;
You can achieve very high speed with embedded engines, with almost no size limit but your storage. See for instance how fast we can add content to a SQLite3 database in our ORM: about 130,000-150,000 rows per second into a database file, including all ORM marshalling. I also found that SQLite3 generates much smaller database files than the alternatives. If you want fast retrieval of any field, do not forget to define INDEXes in your database, if possible after the insertion of the row data (for better speed). For SQLite3, there is already an ID/RowID integer primary key available, which I suppose maps your first data field; this ID/RowID primary key is already indexed by SQLite3. By the way, our ORM now supports FireDAC / AnyDAC and its advanced Array DML feature.
Text files normally have a very small buffer. Look into using the SetTextBuf function to increase your performance.
var
  myFile : TextFile;
  text : string;
  myFileBuffer: array[1..32768] of byte;
begin
  // Open the Test.txt file for reading
  AssignFile(myFile, 'Test.txt');
  SetTextBuf(myFile, myFileBuffer); // 32 KB read buffer
  Reset(myFile);
  // Read the file contents
  while not Eof(myFile) do
  begin
    ReadLn(myFile, text);
  end;
  // Close the file
  CloseFile(myFile);
end;
In addition to what has already been said, I would also avoid using any TTable component. You would be better off using a TQuery-type component (depending on the access layer you're using). Something like this:
// Set up once, e.g. at form creation:
qryImport.SQL.Text := 'Insert Into MyTable Values (:Field1, :Field2, :Field3, :Field4)';

procedure ImportRecord(const pField1, pField2, pField3, pField4: String);
begin
  qryImport.Close;
  qryImport.Params[0].AsString := pField1;
  qryImport.Params[1].AsString := pField2;
  qryImport.Params[2].AsString := pField3;
  qryImport.Params[3].AsString := pField4;
  qryImport.ExecSQL;
end;
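To tie this to the earlier tip about transactions, a rough usage sketch; Database1 stands in for whatever connection component your access layer provides (the method names below are BDE-style and may differ in your layer):
Database1.StartTransaction;
try
  while not Eof(myFile) do
  begin
    ReadLn(myFile, text);
    ImportRecord(Copy(text, 2, 2), Copy(text, 7, 3),
      Copy(text, 13, 5), Copy(text, 21, 8));
  end;
  Database1.Commit;
except
  Database1.Rollback;
  raise;
end;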
Hope this helps.
Another approach would be to use memory-mapped files (you can google or go to torry.net to find implementations). It would not work well for files over 1 GB (in Win32; in Win64 you can map virtually any file). It would turn your whole file into a PAnsiChar that you can scan like one large buffer, searching for #10 and #13 (alone or in pairs) and thus splitting strings manually.
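A rough sketch of that approach using the raw Windows API (error handling omitted, names are mine); the mapped view is treated as one big PAnsiChar buffer and split on CR/LF manually:
uses
  Windows, SysUtils;

procedure ScanMappedFile(const FileName: string);
var
  hFile, hMap: THandle;
  pData, pLineStart, pCur, pEnd: PAnsiChar;
  size: Cardinal;
  line: AnsiString;
begin
  hFile := CreateFile(PChar(FileName), GENERIC_READ, FILE_SHARE_READ, nil,
    OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  hMap := CreateFileMapping(hFile, nil, PAGE_READONLY, 0, 0, nil);
  pData := MapViewOfFile(hMap, FILE_MAP_READ, 0, 0, 0);
  size := GetFileSize(hFile, nil);
  try
    pCur := pData;
    pEnd := pData + size;
    pLineStart := pCur;
    while pCur < pEnd do
    begin
      if pCur^ in [#10, #13] then
      begin
        SetString(line, pLineStart, pCur - pLineStart);
        // process "line" here (parse the fields, insert into the database, ...)
        if (pCur^ = #13) and (pCur + 1 < pEnd) and ((pCur + 1)^ = #10) then
          Inc(pCur); // skip the LF of a CR/LF pair
        pLineStart := pCur + 1;
      end;
      Inc(pCur);
    end;
    // a last line without a trailing line break would still need handling here
  finally
    UnmapViewOfFile(pData);
    CloseHandle(hMap);
    CloseHandle(hFile);
  end;
end;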
If you use (or don't mind starting to use) the JEDI JVCL, it has a TJvCSVDataSet that allows you to use your CSV file like any other dataset in Delphi, including being able to define persistent fields and use "standard" Delphi database functionality:
var
  i: Integer;
begin
  JvCSVDataSet1.FileName := 'MyFile.csv';
  JvCSVDataSet1.Open;
  while not JvCSVDataSet1.Eof do
  begin
    TempTable.Append; // Append posts the previously appended row automatically;
                      // no need to call Post inside the loop.
    // Assumes TempTable has the same number of fields in the same order
    for i := 0 to JvCSVDataSet1.FieldCount - 1 do
      TempTable.Fields[i].AsString := JvCSVDataSet1.Fields[i].AsString;
    JvCSVDataSet1.Next;
  end;
  // Post the last row appended when the loop above exited
  if TempTable.State in dsEditModes then
    TempTable.Post;
end;
In Delphi 7 you can use the Turbo Power SysTools TStAnsiTextStream to read and write in a line-oriented way, but using the thread-safe TStream implementation rather than the unsafe old Pascal file interface. In later Delphi versions you will find something similar in the standard RTL (although the implementations differ a little), but Delphi 7 didn't offer much for text file manipulation.
I feel like this should be easy, but google is totally failing me at the moment. I want to open a file, or create it if it doesn't exist, and write to it.
The following
AssignFile(logFile, 'Test.txt');
Append(logFile);
throws an error on the second line when the file doesn't exist yet, which I assume is expected. But I'm really failing at finding out how to a) test if the file exists and b) create it when needed.
FYI, working in Delphi XE.
You can use the FileExists function and then call Append if the file exists or Rewrite if it does not.
AssignFile(logFile, 'Test.txt');
if FileExists('Test.txt') then
  Append(logFile)
else
  Rewrite(logFile);
// do your stuff
CloseFile(logFile);
Any solution that uses FileExists to choose how to open the file has a race condition. If the file's existence changes between the time you test it and the time you attempt to open the file, your program will fail. Delphi doesn't provide any way to solve that problem with its native file I/O routines.
If your Delphi version is new enough to offer it, you can use TFile.Open with the fmOpenOrCreate open mode, which does exactly what you want; it returns a TFileStream.
Otherwise, you can use the Windows API function CreateFile to open your file instead. Set the dwCreationDisposition parameter to OPEN_ALWAYS, which tells it to create the file if it doesn't already exist.
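Rough sketches of both options; Test.txt is a placeholder path, and the IOUtils version assumes a Delphi version that ships TFile.Open (D2010/XE and later):
uses
  SysUtils, Classes, IOUtils, Windows;

var
  FS: TFileStream;
  h: THandle;
begin
  // IOUtils: opens the file, creating it if it does not exist, in a single call
  FS := TFile.Open('Test.txt', TFileMode.fmOpenOrCreate);
  try
    FS.Seek(0, soEnd); // move to the end so new data is appended
    // ... write to FS ...
  finally
    FS.Free;
  end;

  // Win32 API: OPEN_ALWAYS opens the file, creating it only if it is missing
  h := CreateFile('Test.txt', GENERIC_WRITE, 0, nil, OPEN_ALWAYS,
    FILE_ATTRIBUTE_NORMAL, 0);
  if h <> INVALID_HANDLE_VALUE then
  try
    SetFilePointer(h, 0, nil, FILE_END); // append at the end
    // ... WriteFile(h, ...) ...
  finally
    CloseHandle(h);
  end;
end;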
You should be using TFileStream instead. Here's a sample that will create a file if it doesn't exist, or write to it if it does:
var
  FS: TFileStream;
  sOut, sLine: string;
  i: Integer;
  Flags: Word;
begin
  Flags := fmOpenReadWrite;
  if not FileExists('D:\Temp\Junkfile.txt') then
    Flags := Flags or fmCreate;
  FS := TFileStream.Create('D:\Temp\Junkfile.txt', Flags);
  try
    FS.Position := FS.Size; // Will be 0 if the file was just created, end of text if not
    sOut := 'This is test line %d'#13#10;
    for i := 1 to 10 do
    begin
      sLine := Format(sOut, [i]); // format into a separate variable so the %d template is preserved
      FS.Write(sLine[1], Length(sLine) * SizeOf(Char));
    end;
  finally
    FS.Free;
  end;
end;
If you are just doing something simple, the IOUtils Unit is a lot easier. It has a lot of utilities for writing to files.
e.g.
procedure WriteAllText(const Path: string; const Contents: string); overload; static;
Creates a new file, writes the specified string to the file, and then closes the file. If the target file already exists, it is overwritten.
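For the append-to-a-log-file case in the question, where overwriting is not wanted, TFile.AppendAllText from the same unit creates the file when it is missing and appends otherwise; a minimal sketch (the path and text are placeholders):
uses
  IOUtils;

begin
  // Creates Test.txt if it does not exist, otherwise appends to it
  TFile.AppendAllText('Test.txt', 'log line' + sLineBreak);
end;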
You can also use the load/save feature in a TStringList to solve your problem.
This might be a bad solution, because the whole file will be loaded into memory, modified in memory and then saved back to disk (as opposed to your solution, where you write directly to the file). It's obviously a bad solution for multi-user situations.
But this approach is OK for smaller files, and it is easy to work with and easy understand.
const
  FileName = 'test.txt';
var
  strList: TStringList;
begin
  strList := TStringList.Create;
  try
    if FileExists(FileName) then
      strList.LoadFromFile(FileName);
    strList.Add('My new line');
    strList.SaveToFile(FileName);
  finally
    strList.Free;
  end;
end;
Hi,
We have a large number of remote computers that capture video onto disk drives. Each camera has its own unique directory, and there can be up to 16 directories on any one disk.
I'm trying to locate the oldest video file on the disk, but using FindFirst/FindNext to compare the file creation date/time takes forever.
Does anybody know of a more efficient way of finding the oldest file in a directory? We remotely connect to the PCs from a central HO location.
Regards, Pieter
-- Update
Thank you all for the answers. In the end I used the following.
Map a drive ('w:') to the remote computer using windows.WNetAddConnection2
//Execute dir on the remote computer using cmd.exe /c dir
//NOTE: Drive letters are relative to the remote computer. (psexec -w parameter)
psexec \\<IPAddress> -i /accepteula -w "c:\windows\system32" cmd.exe "/c dir q:\video /OD /TC /B > q:\dir.txt"
//Read the first line of "w:\dir.txt" to get the oldest file in that directory.
//Disconnect from the remote computer using windows.WNetCancelConnection2
You could also try FindFirstFileEx with the FindExInfoBasic parameter and, on Windows 7 or Server 2008 R2 or later, FIND_FIRST_EX_LARGE_FETCH, which should improve performance.
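A rough sketch of that call; the two constants are declared locally because older Delphi Windows units may not define them (newer RTLs already do), and the procedure name is mine:
uses
  Windows, SysUtils;

const
  FindExInfoBasic = 1;           // skip filling in the short (8.3) name
  FIND_FIRST_EX_LARGE_FETCH = 2; // larger directory buffers, Win7/2008 R2 and later

procedure ScanDirectoryFast(const Path: string);
var
  fd: TWin32FindData;
  h: THandle;
begin
  h := FindFirstFileEx(PChar(IncludeTrailingPathDelimiter(Path) + '*'),
    TFindexInfoLevels(FindExInfoBasic), @fd, FindExSearchNameMatch, nil,
    FIND_FIRST_EX_LARGE_FETCH);
  if h <> INVALID_HANDLE_VALUE then
  try
    repeat
      if (fd.dwFileAttributes and FILE_ATTRIBUTE_DIRECTORY) = 0 then
      begin
        // fd.ftCreationTime holds the creation time; keep the smallest one seen so far
      end;
    until not FindNextFile(h, fd);
  finally
    Windows.FindClose(h);
  end;
end;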
First, grab the RunDosAppPipedToTStrings routine from this page on how to run a DOS program and pipe its output to a TStrings. The example uses a TMemo's Lines property, but you can pass any TStrings in, such as TStringList. Note that this will fail silently if CreateProcess returns false. You might want to add an else case to the "if CreateProcess" block that raises an exception.
Then create a simple batch file in the same folder as your EXE. Call it getdir.bat. All it should say is:
dir %1
This produces a directory listing of whatever folder you pass to it. Unfortunately, "dir" is an internal shell command, not a program, so you can't invoke it directly. Wrapping it in a batch file gets around that. This is a bit of a hack, but it works. If you can find a better way to run DIR, so much the better.
You'll want to invoke RunDosAppPipedToTStrings with code that looks something like this:
procedure GetDirListing(dirname: string; list: TStringList);
const
  CMDNAME = '%s\getdir.bat "%s"';
var
  path: string;
begin
  list.Clear;
  path := ExcludeTrailingPathDelimiter(ExtractFilePath(ParamStr(0)));
  RunDosAppPipedToTStrings(Format(CMDNAME, [path, dirname]), list, false);
end;
Then all that's left to do is parse the output, extract date and time and filenames, sort by date and time, and grab the filename of the file with the lowest date. I'll leave that much to you.
If you can run something on the remote computer that can iterate over the directories, that will be the fastest approach. If you wanted to use Mason's example, try launching it with PsExec from SysInternals.
If you can only run an application locally then no, there's no faster way than FindFirst/FindNext, and anything else you do will boil down to that eventually. If your local computer is running Windows 7 you can use FindFirstFileEx instead, which has flags to indicate it should use larger buffers for the transfers and that it shouldn't read the 8.3 alias, which can help the speed a bit.
I had almost the same problem in the fax server software I developed. I had to send the faxes in the order they were received, out of thousands of files (all stored in one directory). The solution I adopted (which is slow to start but fast to run) is to make a sorted list of all the files using
SearchRec.Time
as the key. After a file is in the list, I set the file's attributes to include faSysFile:
NewAttributes := Attributes or faSysFile;
Now when I do a new search with
FileAttrs := faAnyFile and not (faDirectory or faSysFile);
only the files that do not have the faSysFile attribute are shown, so I can add the newly arrived files to the list.
Now you have a list with all the files sorted by time.
Don't forget: when you start your application, the first step is to remove the faSysFile attribute from the files in the folder so they can be processed again.
procedure FileSetSysAttr(AFileName: string);
var
  Attributes, NewAttributes: Word;
begin
  Attributes := FileGetAttr(AFileName);
  NewAttributes := Attributes or faSysFile;
  FileSetAttr(AFileName, NewAttributes);
end;

procedure FileUnSetSysAttr(AFileName: string);
var
  Attributes, NewAttributes: Word;
begin
  Attributes := FileGetAttr(AFileName);
  NewAttributes := Attributes and not faSysFile;
  FileSetAttr(AFileName, NewAttributes);
end;

procedure PathUnSetSysAttr(APathName: string);
var
  sr: TSearchRec;
  FileAttrs: Integer;
begin
  FileAttrs := faAnyFile and not faDirectory; // include faSysFile entries so they can be reset
  APathName := IncludeTrailingBackslash(APathName);
  if SysUtils.FindFirst(APathName + '*.*', FileAttrs, sr) = 0 then
  try
    repeat
      if (sr.Attr and faDirectory) = 0 then
        FileUnSetSysAttr(APathName + sr.Name);
    until SysUtils.FindNext(sr) <> 0;
  finally
    SysUtils.FindClose(sr);
  end;
end;
I know this is not the best solution, but it works for me.
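For reference, a rough sketch of building the initial time-sorted list with a plain TStringList (names are mine); the DOS-style SearchRec.Time is zero-padded so that a simple lexical sort gives chronological order:
procedure BuildSortedFileList(const APath: string; List: TStringList);
var
  sr: TSearchRec;
begin
  List.Clear;
  if SysUtils.FindFirst(IncludeTrailingBackslash(APath) + '*.*',
    faAnyFile and not faDirectory, sr) = 0 then
  try
    repeat
      if (sr.Attr and faDirectory) = 0 then
        // key: DOS timestamp padded to 10 digits, then the file name after '='
        List.Add(Format('%.10d=%s', [sr.Time, sr.Name]));
    until SysUtils.FindNext(sr) <> 0;
  finally
    SysUtils.FindClose(sr);
  end;
  List.Sort; // the oldest file is now List[0]; its name is the part after '='
end;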
I have a problem adding strings to a TStringList. I've searched other posts but couldn't find an answer to this.
What I'm trying to do is add a large number of strings to a TStringList (more than 14000), but somewhere in the process I get an EAccessViolation. Here's the code I'm using:
procedure TForm1.FormCreate(Sender: TObject);
begin
  List := TStringList.Create;
  List.Duplicates := dupAccept;
end;

procedure TForm1.ButtonStartClick(Sender: TObject);
begin
  List.Clear;
  List.Add('125-AMPLE');
  List.Add('TCUMSON');
  List.Add('ATLV 4300');
  List.Add('150T-15');
  List.Add('TDL-08ZE');
  List.Add('RT20L');
  List.Add('SIN LINEA');
  List.Add('TIARA');
  List.Add('FL200ZK1');
  List.Add('FL250ZK1');
  List.Add('SIN LINEA');
  List.Add('CENTAURO-70 S.P.');
  List.Add('CORSADO');
  { This list continues to about 14000 strings... }
  List.Add('VOSJOD 2');
  List.Add('Z 125');
  List.Add('ZUMY');
  List.Add('NEW AGE 125');
  List.Add('SIN LINEA');
end;

procedure TForm1.FormClose(Sender: TObject; var Action: TCloseAction);
begin
  FreeAndNil(List);
end;
What's wrong with this code? The list contains duplicate strings, so I set the Duplicates property to dupAccept. I was able to load the list using LoadFromFile, but I don't want to have a text file outside my application.
I hope you can help me!!! Please tell me if you need any further information.
Thank you very much. I really appreciate your help.
The suggestions to use an external file are on the mark here. However, your post indicates you'd rather not have an external file. I would then suggest you link the file to the executable as a resource. You can easily do this by following these steps:
Place all the strings into a text file called stringdata.txt (or whatever name you choose). Then create an .rc file (again, any name you choose) and put the following into it (STRING_DATA can be any identifier you choose):
STRING_DATA RCDATA "stringdata.txt"
Create a .res file from the .rc:
BRCC32 <name of rc>.rc
Now reference this file from the source code. Place the following someplace in the unit:
{$R <name of res>.res}
Instead of loading from a file stream, load from a resource stream:
StringData := TResourceStream.Create(HInstance, 'STRING_DATA', RT_RCDATA);
try
  List.LoadFromStream(StringData);
finally
  StringData.Free;
end;
If you do command-line automated builds, I would suggest you keep the .rc file under source control and build the .res during the build process. This way you can also keep the stringdata.txt file under source control and any edits are automatically caught on the next build without having to explicitly build the .res file each time the .txt file changes.
What Delphi version are you using? Some older versions had a bug in the memory manager that can cause an access violation when trying to reallocate an array to a size that's too large.
Try adding FastMM4 to your project to replace the old memory manager and see if that helps.
Also, you're probably better off keeping the list in an external file. Yes, it's another file, but it also means that you can change the list without having to recompile the entire program. This also makes creating (and distributing!) updates easier.
Mason is probably right about the cause of the AV; this is quite a large array to grow.
On a side note, when doing such long processing on a string list, it's recommended to surround it with BeginUpdate/EndUpdate to avoid firing any update events.
Even if you don't have any now, they might be added later and you'll get problems.
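A minimal sketch of that pattern, using the List from the question:
List.BeginUpdate;
try
  List.Add('125-AMPLE');
  List.Add('TCUMSON');
  // ... the remaining Add calls ...
finally
  List.EndUpdate;
end;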
Set List.Capacity to the number of items you plan to add, immediately after you create the list (see the sketch below). Alternatively, place the list in an RC file (named other than with the name of your project) and add it to your project. This gets compiled into your application, but does not involve executable code to create the list.
I would also worry about compiler integrity with a 14,000 line procedure. People have found other cases where going beyond anything reasonable breaks the compiler in various ways.
You may also want to try THashedStringList, could see a speed boost (although not in this function), although I'm not sure if the add method is a whole lot different.
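Setting the capacity is a one-liner right after the list is created; a sketch assuming roughly 14000 entries:
List := TStringList.Create;
List.Capacity := 14000; // pre-allocate so Add never has to grow the array mid-way
List.Duplicates := dupAccept;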
Try using the following instead of your code to add the strings to the StringList:
var
  Str: string;
begin
  Str := '125-AMPLE' + #13#10;
  Str := Str + 'TCUMSON' + #13#10;
  Str := Str + 'ATLV 4300' + #13#10;
  Str := Str + '150T-15' + #13#10;
  ................
  List.Text := Str;
end;
How can I delete all files matching a pattern within a given directory?
For example: delete all *.jpg files within DirectoryName.
procedure TForm1.Button1Click(Sender: TObject);
begin
  DeleteFiles(ExtractFilePath(ParamStr(0)), '*.jpg');
end;

procedure DeleteFiles(APath, AFileSpec: string);
var
  lSearchRec: TSearchRec;
  lPath: string;
begin
  lPath := IncludeTrailingPathDelimiter(APath);
  if FindFirst(lPath + AFileSpec, faAnyFile, lSearchRec) = 0 then
  begin
    try
      repeat
        if (lSearchRec.Attr and faDirectory) = 0 then
          SysUtils.DeleteFile(lPath + lSearchRec.Name);
      until SysUtils.FindNext(lSearchRec) <> 0;
    finally
      SysUtils.FindClose(lSearchRec); // Free resources on successful find
    end;
  end;
end;
In more recent versions of Delphi, you would probably use the classes in System.IOUtils, which essentially wrap FindFirst, FindNext etc.:
procedure DeleteFilesMatchingPattern(const Directory, Pattern: string);
var
  FileName: string;
begin
  for FileName in TDirectory.GetFiles(Directory, Pattern) do
    TFile.Delete(FileName);
end;
You can use the SHFileOperation function. The nice thing about using SHFileOperation is that you have the option of deleting the files to the recycle bin, and you get the normal API animations, so the user will know what is going on. The downside is that the delete will take a little longer than Jeff's code.
There are several wrappers out there. I use this free wrapper from BP Software. The entire wrapper file is only 220 lines and is easy to read and use. I don't install this as a component. I have found it easier to add this unit to my project and just Create and free the object as needed.
Update: The download link for the BP Software site is no longer valid. There is an older version on the Embarcadero website.
TSHFileOp (1.3.5.1) (3 KB), May 31, 2006 - TComponent that is a wrapper for the SHFileOperation API to copy, move, rename, or delete (with recycle-bin support) a file system object.
The file name parameter for SHFileOperation supports MS DOS style wildcards. So you can use the component like this:
FileOps := TSHFileOp.Create(Self);
FileOps.FileList.Add(DirectoryName + '\*.jpg');
FileOps.HWNDHandle := Self.Handle;
FileOps.Action := faDelete;
FileOps.SHOptions := [ofAllowUndo, ofNoConfirmation, ofFilesOnly, ofSimpleProgress];
FileOps.Execute;
I usually show the "Are you sure" message myself so I always pass the ofNoConfirmation flag so Windows does not ask again.
If you don't want to delete every jpg file, or you need to delete from multiple directories, you can add full file names or different paths with wildcards to the FileList string list before calling Execute.
Here is the MSDN Page for SHFileOperation
Note that SHFileOperation has been replaced by IFileOperation starting with Windows Vista. I have continued to use SHFileOperation on Windows Vista without any problems.