FindFirstFile matches wildcards that shouldn't match - delphi

On some of my Windows 10 machines, FindFirstFile matches files that definitely should not match. Assume the following program in Delphi:
{$apptype console}
uses
  Windows;
var
  FindHandle: THandle;
  FindData: WIN32_FIND_DATA;
begin
  FindHandle := FindFirstFile('*.qqq', FindData);
  if FindHandle <> INVALID_HANDLE_VALUE then
  begin
    try
      repeat
        Writeln(PChar(@FindData.cFileName[0]));
      until not FindNextFile(FindHandle, FindData);
    finally
      FindClose(FindHandle);
    end;
  end;
end.
and four files:
a.qqq
b.qqqt
c.qqqx
c.qqq123
The output I expect to get is just a.qqq. But what actually happens is, all four files get printed out. I get the same result with CMD's dir *.qqq, too, so it's not just Delphi doing weird stuff, but PowerShell's dir *.qqq works as expected. What could possibly be causing this behavior? And particularly, if it is some specific settings in the OS (which seems to be indicated by the fact that I don't get this result on all machines, just some), is there something I can do from within my program to enforce the expected behavior regardless of the OS settings?

The reason is that the underlying Windows function checks both the long and the short file names. The FindFirstFile documentation states:
The search includes the long and short file names.
You can see that when you add the /X parameter to the dir call in CMD:
dir /X *.qqq
A possible solution is to add a second filter check for each found name that only considers the long name.
Incidentally, that is what Delphi does in TDirectory.GetFiles, which sometimes makes it a better alternative to a hand-written routine.
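For example, here is a minimal sketch of such a post-filter built on the program above, re-checking each long name with MatchesMask from the Masks unit (the mask string is simply the same one passed to FindFirstFile):
uses
  Windows, Masks;
var
  Name: string;
...
repeat
  // FindFirstFile may have matched on the short 8.3 name, so re-check the long one
  Name := PChar(@FindData.cFileName[0]);
  if MatchesMask(Name, '*.qqq') then
    Writeln(Name);
until not FindNextFile(FindHandle, FindData);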

Adding large resources with UpdateResource

Nowhere in the Windows documentation do I see a reference to a size limit to the resources one can add using UpdateResource, but it seems I have stumbled upon one - and it's tiny!
I was developing a Windows Ribbon app and wanted to programmatically build and attach the resource. Linking the resource using a $R directive worked just dandy, but I kept getting memory junk when attaching the very same thing from code.
I have managed to reduce it to a simple example using a string resource:
Handle := BeginUpdateResource(PChar(DestFileName), True);
try
  AddResource(Handle, 'STRING', 'ManyXs', StrUtils.DupeString('X', 1000));
finally
  EndUpdateResource(Handle, False);
end;
And AddResource is defined as:
procedure TForm2.AddResource(Handle: NativeUInt; ResType, ResName, Value: string);
begin
  if not UpdateResource(Handle, PChar(ResType), PChar(ResName), 1033,
    PChar(Value), Value.Length * SizeOf(Char)) then
    RaiseLastOSError;
end;
Please ignore my hard-coded language for the moment.
When I inspect the resource subsequent to calling this, I see a thousand Xs. Fabulous.
I can change the resource to 1990 Xs and it's fine. The moment it goes to 1991, I get nonsense written to the DLL. The size of the resource is correctly indicated as 3982 (1991 * 2 because it's Unicode), but the contents are just a dump of stuff from memory.
I can view larger resources with my resource editor, and the IDE routinely inserts larger resources (Delphi forms, for example), so I'm definitely missing something.
I've tried the following, despite not thinking any of them would make a difference (they didn't):
1. Using just large memory buffers instead of strings
2. Using the Ansi version of the UpdateResource function
3. Many different resource types - what I really need to get working is UIFILE
4. Looking for other functions in the API (I found none)
5. Combinations of 1, 2 and 3
Any ideas?
Update:
Inspired by the comments and Jolyon's answer, I tried a few more things.
First, I tried in Delphi XE7 and XE5 as well (the original was in XE6). I don't have XE2 installed anymore, so I cannot confirm what Sertak has said. I'll find out if someone else in my office still has it installed.
Second, here is the memory buffer version:
procedure TForm2.AddResource(Handle: NativeUInt; const ResType, ResName, Value: string);
var
  Buffer: Pointer;
  BuffLen: Integer;
begin
  BuffLen := Value.Length * SizeOf(Char);
  GetMem(Buffer, BuffLen);
  try
    StrPCopy(PChar(Buffer), Value);
    if not UpdateResource(Handle, PChar(ResType), PChar(ResName), 1033,
      Buffer, BuffLen) then
      RaiseLastOSError;
  finally
    FreeMem(Buffer);
  end;
end;
I actually had a previous version of this code where I dumped the contents of that pointer into a file before the call to UpdateResource and the file saved correctly but the resource still saved junk. Then I did this version, which doesn't involve strings at all:
procedure TForm2.AddResource(Handle: NativeUInt; const ResType, ResName: string;
  C: AnsiChar; Len: Integer);
var
  Buffer: Pointer;
  BuffLen: Integer;
begin
  BuffLen := Len;
  GetMem(Buffer, BuffLen);
  try
    FillMemory(Buffer, Len, Byte(C));
    if not UpdateResource(Handle, PChar(ResType), PChar(ResName), 1033,
      Buffer, BuffLen) then
      RaiseLastOSError;
  finally
    FreeMem(Buffer);
  end;
end;
With this version I still have the same problem when I use 3982 Xs. Of course, I'm now using single-byte characters, which is why the count is double. But I have the exact same issue.
I did notice a difference between the versions in the output of TDUMP though. For versions 1 (strings) and 2 (string copied to buffer), my resource size is suddenly indicated as FFFFFF90 when I use 1991 characters. With version 3 (no strings), the size is the actual hex value of whatever size I used.
The fact that you are getting "junk" data, but data of the right size, leads me to suspect that the PChar() cast of the string value is yielding an incorrect address. This normally should not be a problem, but I wonder if the issue is some strange behaviour resulting from passing the result of a function directly as a parameter of a method? A behaviour which, for some strange reason, is only triggered when the string involved reaches a certain size, perhaps indicating some edge-case optimization behaviour.
This might also explain difficulties in reproducing the problem if it is some combination of optimization (and/or other compiler settings) in some specific version of Delphi.
I would suggest trying to eliminate this possibility by creating your new resource string in an explicit variable and passing that to the AddResource() method. I would also suggest that you be explicit in your parameter semantics: since the string involved is not modified, nor intended to be modified, in the AddResource() method, formally declare it as a const parameter.
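For instance, a minimal sketch of that suggestion (the Data variable and the 1991-character test value are placeholders of mine, not code from the question):
procedure TForm2.AddResource(Handle: NativeUInt; const ResType, ResName, Value: string);
begin
  if not UpdateResource(Handle, PChar(ResType), PChar(ResName), 1033,
    PChar(Value), Length(Value) * SizeOf(Char)) then
    RaiseLastOSError;
end;

var
  Data: string;
begin
  Data := StrUtils.DupeString('X', 1991);        // build the value in an explicit variable
  AddResource(Handle, 'STRING', 'ManyXs', Data); // pass the variable, not a function result
end;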
You do mention having tried an alternative approach using "memory buffers". If the above suggestions do not resolve the problem, perhaps it would be helpful to post a minimal example that reproduces the problem using those, to eliminate any possible influence on things by the rather more exotic "string" type.

FileGetDate works some times, thoughts?

Some files this works with and others it does not.
var
  Src: Integer;
  FileDate: LongInt;
begin
  Src := FileOpen(SrcPath, fmOpenRead);
  FileDate := FileGetDate(Src); // Crash here with FileDate = -1
  ...
  FileSetDate(Dest, FileDate);
I have checked Attributes for files that work and some that do not and they are identical.
Same for "Security," identical.
"Src" is a valid Integer for the ones that work and the ones that do not work.
The only thing I can see is that the full path to the ones that do not work can be 130 characters or longer. But I renamed some folders and shortened that to 118, and still no good.
Got me baffled. In a 2000+ file copy process, just 149 all in the same sub-Folder crash at this FileGetDate.
Any suggestions?
Thanks
The call to FileGetDate returns -1. The documentation says this:
The return value is -1 if the handle is invalid.
In other words, the handle returned by your call to FileOpen is not valid. You don't check for any errors in the code. Your code makes the assumption that all the calls succeed. The failure mode for FileOpen is that it returns -1. You are not checking the return value of FileOpen. You must add code to do so.
Note that the documentation for FileOpen says:
Note: We do not encourage the use of the non-native Delphi language file handlers such as FileOpen. These routines map to system routines and return OS file handles, not normal Delphi file variables. These are low-level file access routines. For normal file operations use AssignFile, Rewrite, and Reset instead.
So even ancient legacy Pascal I/O is to be preferred to FileOpen.
Frankly, if you want to work with files and get meaningful error diagnostics, you should use the Win32 API. Call CreateFile and if it fails, check GetLastError to find out why. There are lots of ways in which a file open request can fail and realistically only you can work out what the reason is for your files. We don't have the files at hand, only you do.
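For instance, a sketch of that approach (SrcPath is the variable from the question; the rest is plain Win32):
var
  H: THandle;
begin
  H := CreateFile(PChar(SrcPath), GENERIC_READ, FILE_SHARE_READ, nil,
    OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  if H = INVALID_HANDLE_VALUE then
    RaiseLastOSError; // raises EOSError carrying the GetLastError code and message
  try
    // ... GetFileTime, read the data, etc. ...
  finally
    CloseHandle(H);
  end;
end;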
Finally, you say that you are writing a file copy routine. The system already provides such a thing, and you would be far better off using it. You are spending a lot of effort re-inventing the wheel. What's more, writing a good file copy function is hard. The one that the system provides is known to work. Your version is liable to be inferior.
To copy a single file you can use CopyFile or CopyFileEx. But you are copying multiple files and SHFileOperation is the API for that.
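For a single file, that would look something like this (Src and Dest are assumed to hold full paths):
if not CopyFile(PChar(Src), PChar(Dest), False) then // False allows overwriting an existing target
  RaiseLastOSError; // CopyFile also preserves the source's attributes and last-write time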
Three thoughts:
The first is that something else has exclusive access to the file and you simply cannot open it at all. Check that your opened file handle is valid.
The second thought is that some files can have VERY damaged time stamps on them. I am not sure how it happens, I just know that it does.
Finally, according to the documentation, -1 is a valid date value on Linux; you do not mention what file system your source files are stored on.
Here is the implementation of FileGetDate() in Delphi 5:
function FileGetDate(Handle: Integer): Integer;
var
  FileTime, LocalFileTime: TFileTime;
begin
  if GetFileTime(THandle(Handle), nil, nil, @FileTime) and
    FileTimeToLocalFileTime(FileTime, LocalFileTime) and
    FileTimeToDosDateTime(LocalFileTime, LongRec(Result).Hi,
    LongRec(Result).Lo) then Exit;
  Result := -1;
end;
That is 3 different points of failure that could happen on any given input file handle:
1. does GetFileTime() fail?
2. does FileTimeToLocalFileTime() fail?
3. does FileTimeToDosDateTime() fail?
Unless FileOpen() fails (which you are not checking for - it returns -1 if it cannot open the file), it is unlikely (but not impossible) that #1 or #2 is failing. But #3 does have a documented caveat:
The MS-DOS date format can represent only dates between 1/1/1980 and 12/31/2107; this conversion fails if the input file time is outside this range.
It is not likely that you encounter files with timestamps in the year 2108 and later, but you can certainly encounter files with timestamps in the year 1979 and earlier.
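A quick diagnostic sketch of my own (not part of the original answer) to check whether a given handle's last-write time falls outside that range; Src is the handle from the question's code:
var
  LastWrite: TFileTime;
  SysTime: TSystemTime;
begin
  if GetFileTime(THandle(Src), nil, nil, @LastWrite) and
    FileTimeToSystemTime(LastWrite, SysTime) then
    if (SysTime.wYear < 1980) or (SysTime.wYear > 2107) then
      Writeln('Timestamp outside the MS-DOS range: year ', SysTime.wYear);
end;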
All 4 functions (counting the CreateFile() function called inside of FileOpen()) report an error code via GetLastError(), so you can do this:
var
  Src: Integer;
  FileDate: LongInt;
begin
  Src := FileOpen(SrcPath, fmOpenRead);
  Win32Check(Src <> -1);
  FileDate := FileGetDate(Src);
  Win32Check(FileDate <> -1);
  ...
  Win32Check(FileSetDate(Dest, FileDate) = 0);
Win32Check() calls RaiseLastWin32Error() if the input parameter is false. RaiseLastWin32Error() raises an EOSError exception containing the actual error code in its ErrorCode property.
If FileGetDate() fails, obviously you won't know which Win32 function actually failed. That is where the debugger comes into play. Enable Debug DCUs in your Project Options to allow you to step into the VCL/RTL source code. Find a file that fails, call FileGetDate() on it, and step through its source code to see which of the three API functions is actually failing.
Similarly for FileSetDate(), which also calls 3 API functions internally:
function FileSetDate(Handle: Integer; Age: Integer): Integer;
var
  LocalFileTime, FileTime: TFileTime;
begin
  Result := 0;
  if DosDateTimeToFileTime(LongRec(Age).Hi, LongRec(Age).Lo, LocalFileTime) and
    LocalFileTimeToFileTime(LocalFileTime, FileTime) and
    SetFileTime(Handle, nil, nil, @FileTime) then Exit;
  Result := GetLastError;
end;
If FileSetDate() fails, is it because:
1. DosDateTimeToFileTime() failed?
2. LocalFileTimeToFileTime() failed?
3. SetFileTime() failed?

How can I strip Windows drive lettering out of my reformed path? Delphi or Pascal

Further to this answered question I have another sticky problem. I am coding in Free Pascal, but Delphi solutions will probably work too.
In brief, I have a string value of concatenated paths that is formed by taking a source directory and recreating that tree in a destination directory. e.g.
C:\SourceDir\SubDirA
becomes
F:\DestinationDir\SourceDir\SubDirA.
However, the solution I have for the Linux version of my program (as posted in the link above) doesn't quite work with the Windows version, because I end up with:
F:\DestionationDir\C:SourceDir\SubDirA.
which is invalid.
So I came up with this "only run in Windows" code to remove the central drive letter of the reformed path, but leave the initial one at the start, by saying "look at the string starting from the 4th character in from the left; if you find 'C:', delete it", so that the path becomes F:\DestinationDir\SourceDir\SubDirA.
{$IFDEF Windows} // Only do this for the Windows version
k := posex('C:', FinalisedDestDir, 4); // Find 'C:' in the middle of the concatenated path and return its position as k
Delete(FinalisedDestDir, k, 2); // Delete the 2 chars 'C:' of 'C:\' if found, leaving the '\' to keep the path valid
{$ENDIF}
Now, that works fine IF the C: is the source of the chosen directory. But obviously if the user is copying data from another drive (such as E:, F:, G: or whatever else drive up to Z:) it will not work.
So my question is, how do I code it so that it says "if any drive letter a: to z: is found after the 4th character from the left, delete it"? Whilst any solution that works "will do", ideally I need a fast solution. The best solution would be to not have it in there in the first place, but given the solution I posted in reply to my earlier post, I can't work out how not to have it in, due to the procedure I use to form it.
Here is some code I use in my application:
function CombinePath(const BaseDir, Path: string): string;
begin
  if IsPathDelimiter(Path, 1) then
    Result := ExcludeTrailingBackSlash(BaseDir) + Path
  else
    Result := IncludeTrailingBackSlash(BaseDir) + Path;
end;
function MapRootPath(const Path, NewPath: string): string;
var
  Drive, RelativePath: string;
begin
  Drive := ExtractFileDrive(Path); // e.g: "C:"
  RelativePath := ExtractRelativePath(Drive, Path); // e.g: "Program Files\MyApp"
  Result := CombinePath(NewPath, RelativePath);
end;
Usage:
ShowMessage(MapRootPath('C:\SourceDir\SubDirA', 'F:\DestionationDir'));
// result is "F:\DestionationDir\SourceDir\SubDirA"
I offer you two solutions:
Normalize your paths before concatenating folders
You know the saying: prevention is always better than the cure. Why don't you "normalize" the paths before you do your concatenations? This gives you the chance to:
Delete the drive letter from the path if it starts with one. If the second char in the string is ':' and the third is '\', you know it's a Windows path containing a drive letter, and you may delete the first 3 characters.
Deal with UNC names, which you didn't mention: \\ComputerName\ShareName\SubFolder
Fix slashes so they conform to your current platform
Remove the drive letter later
This is ugly (because you shouldn't get into this situation in the first place), but you can always look for ':\' - not for 'C:'. The ':' is not valid in folder or file names on Windows, so if you find one you know it is preceded by exactly one character, and that's the drive letter. Get the index of ':\', subtract 1, and delete 2 chars from that index.
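A minimal sketch of that idea, written against the path shape from the question (FinalisedDestDir is the question's variable and PosEx comes from StrUtils; searching for the ':' alone, rather than ':\', also covers the case where no backslash follows it, and starting at the 4th character leaves the destination's own drive untouched):
k := PosEx(':', FinalisedDestDir, 4); // any ':' past the destination's own drive marks a stray drive letter
if k > 1 then
  Delete(FinalisedDestDir, k - 1, 2); // remove the drive letter and the ':' that follows it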
I don't know Free Pascal, but for this problem you could use a regular expression such as [A-Za-z]\: to find such strings. I see from the Free Pascal wiki that it supports regular expressions: http://wiki.freepascal.org/Regexpr.
You might also see if some of the SysUtils functions can help, including:
ExpandFileName()
IncludeTrailingSlashes()
etc..
Putting your path in a "normalized" form - like these functions might be able to do for you - makes it trivial to convert between Linux and Windows path conventions.
Just a thought...
Thanks for all the help with this. Special thanks to kobik, whose code sample is very clear and easy to follow. That is certainly one way to do it, but whilst I was waiting for replies I came up with the following, which also seems to work quite well for me:
type
  TRange = 'A'..'Z';
...
{$IFDEF Windows}
// Due to the nonsense of Windows, we have to allow for drive lettering.
for DriveLetter in TRange do
begin
  k := posex(DriveLetter + ':', FinalisedDestDir, 4); // Find e.g. 'C:' in the middle of the concatenated path and return its position, leaving the first 'C:\' at the start in place
  Delete(FinalisedDestDir, k, 2); // Delete 'C:' of 'C:\' if found, leaving the '\'
end;
{$ENDIF}

How Can I Efficiently Read The First Few Lines of Many Files in Delphi

I have a "Find Files" function in my program that will find text files with the .ged suffix that my program reads. I display the found results in an explorer-like window that looks like this:
I use the standard FindFirst / FindNext methods, and this works very quickly. The 584 files shown above are found and displayed within a couple of seconds.
What I'd now like to do is add two columns to the display that shows the "Source" and "Version" that are contained in each of these files. This information is found usually within the first 10 lines of each file, on lines that look like:
1 SOUR FTM
2 VERS Family Tree Maker (20.0.0.368)
Now I have no problem parsing this very quickly myself, and that is not what I'm asking.
What I need help with is simply how to most quickly load the first 10 or so lines from these files so that I can parse them.
I have tried to do a StringList.LoadFromFile, but it takes too much time loading the large files, such as those above 1 MB.
Since I only need the first 10 lines or so, how would I best get them?
I'm using Delphi 2009, and my input files might or might not be Unicode, so this needs to work for any encoding.
Followup: Thanks Antonio,
I ended up doing this which works fine:
var
  CurFileStream: TStream;
  Buffer: TBytes;
  Value: string;
  Encoding: TEncoding;
begin
  CurFileStream := TFileStream.Create(folder + FileName, fmOpenRead);
  try
    SetLength(Buffer, 256);
    SetLength(Buffer, CurFileStream.Read(Buffer[0], 256)); // keep only what was actually read
    Encoding := nil; // must be nil so GetBufferEncoding detects the encoding from the BOM
    TEncoding.GetBufferEncoding(Buffer, Encoding);
    Value := Encoding.GetString(Buffer);
    ...
    (parse through Value to get what I want)
    ...
  finally
    CurFileStream.Free;
  end;
Use TFileStream and, with the Read method, read the number of bytes needed. Here is an example of reading bitmap info, which is also stored at the beginning of the file:
http://www.delphidabbler.com/tips/19
Just open the file yourself for block reading (not using TStringList builtin functionality), and read the first block of the file, and then you can for example load that block to a stringlist with strings.SetText() (if you are using block functions) or simply strings.LoadFromStream() if you are loading your blocks using streams.
I would personally just go with FileRead/FileWrite block functions, and load the block into a buffer. You could also use similair winapi functions, but that's just more code for no reason.
The OS reads files in blocks, which are at least 512 bytes on almost any platform/filesystem, so you can read 512 bytes first (and hope that you got all 10 lines, which will be true if your lines are generally short enough). This will be (practically) as fast as reading 100 or 200 bytes.
Then, if you notice that your strings object has fewer than 10 lines, just read the next 512-byte block and try to parse again. (Or just go with 1024, 2048 and so on; on many systems that will probably be as fast as 512-byte blocks, as filesystem cluster sizes are generally larger than 512 bytes.)
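A rough sketch of that block-and-retry idea (the function name, the TFileStream/TStringList usage and the ANSI-text assumption are mine, not the poster's code; the BOM-sniffing approach from the follow-up above would be needed for Unicode files):
uses
  Classes, SysUtils;

function ReadFirstLines(const FileName: string; MinLines: Integer): TStringList;
var
  Stream: TFileStream;
  Buffer: TBytes;
  BytesToRead, BytesRead: Integer;
begin
  Result := TStringList.Create;
  Stream := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  try
    BytesToRead := 512; // one disk block to start with
    repeat
      SetLength(Buffer, BytesToRead);
      Stream.Position := 0;
      BytesRead := Stream.Read(Buffer[0], BytesToRead);
      SetLength(Buffer, BytesRead); // keep only what was actually read
      Result.Text := TEncoding.Default.GetString(Buffer); // crude: assumes ANSI text
      BytesToRead := BytesToRead * 2; // not enough lines yet? try a bigger block next pass
    until (Result.Count > MinLines) or (BytesRead < BytesToRead div 2); // enough lines, or whole file read
  finally
    Stream.Free;
  end;
end;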
PS. Also, using threads or asynchronous functionality in winapi file functions (CreateFile and such), you could load that data from files asynchronously, while the rest of your application works. Specifically, the interface will not freeze during reading of large directories.
This will make the loading of your information appear faster, (since the file list will load directly, and then some milliseconds later the rest of the information will come up), while not actually increasing the real reading speed.
Do this only if you have tried the other methods and you feel like you need the extra boost.
You can use a TStreamReader to read individual lines from any TStream object, such as a TFileStream. For even faster file I/O, you could use Memory-Mapped Views with TCustomMemoryStream.
Okay, I deleted my first answer. Using Remy's first suggestion above, I tried again with built-in stuff. What I don't like here is that you have to create and free two objects. I think I would make my own class to wrap this up:
var
  fs: TFileStream;
  tr: TTextReader;
  filename: String;
begin
  filename := 'c:\temp\textFileUtf8.txt';
  fs := TFileStream.Create(filename, fmOpenRead);
  tr := TStreamReader.Create(fs);
  try
    Memo1.Lines.Add(tr.ReadLine);
  finally
    tr.Free;
    fs.Free;
  end;
end;
If anybody is interested in what I had here before, it had the problem of not working with unicode files.
Sometimes old-school Pascal style is not that bad.
Even though non-OO file access doesn't seem to be very popular anymore, ReadLn(F, xxx) still works pretty well in situations like yours.
The code below loads the information (filename, source and version) into a TDictionary so that you can look it up easily, or you can use a listview in virtual mode and look stuff up in this list when the OnData event fires.
Warning: code below does not work with unicode.
program Project101;

{$APPTYPE CONSOLE}

uses
  IoUtils, Generics.Collections, SysUtils;

type
  TFileInfo = record
    FileName,
    Source,
    Version: String;
  end;

function LoadFileInfo(var aFileInfo: TFileInfo): Boolean;
var
  F: TextFile;
begin
  Result := False;
  AssignFile(F, aFileInfo.FileName);
  {$I-}
  Reset(F);
  {$I+}
  if IOResult = 0 then
  begin
    ReadLn(F, aFileInfo.Source);
    ReadLn(F, aFileInfo.Version);
    CloseFile(F);
    Exit(True)
  end
  else
    WriteLn('Could not open ', aFileInfo.FileName);
end;

var
  FileInfo: TFileInfo;
  Files: TDictionary<string, TFileInfo>;
  S: String;
begin
  Files := TDictionary<string, TFileInfo>.Create;
  try
    for S in TDirectory.GetFiles('h:\WINDOWS\system32', '*.xml') do
    begin
      WriteLn(S);
      FileInfo.FileName := S;
      if LoadFileInfo(FileInfo) then
        Files.Add(S, FileInfo);
    end;
    // showing file information...
    for FileInfo in Files.Values do
      WriteLn(FileInfo.Source, ' ', FileInfo.Version);
  finally
    Files.Free
  end;
  WriteLn;
  WriteLn('Done. Press any key to quit . . .');
  ReadLn;
end.

Quickest way to find the oldest file in a directory using Delphi

Hi
We have a large number of remote computers that capture video onto disk drives. Each camera has its own unique directory and there can be up to 16 directories on any one disk.
I'm trying to locate the oldest video file on the disk but using FindFirst/FindNext to compare the File Creation DateTime takes forever.
Does anybody know of a more efficient way of finding the oldest file in a directory? We remotely connect to the pc's from a central HO location.
Regards, Pieter
-- Update
Thank you all for the answers. In the end I used the following.
Map a drive ('w:') to the remote computer using windows.WNetAddConnection2
//Execute dir on the remote computer using cmd.exe /c dir
//NOTE: Drive letters are relative to the remote computer. (psexec -w parameter)
psexec \\<IPAddress> -i /accepteula -w "c:\windows\system32" cmd.exe "/c dir q:\video /OD /TC /B > q:\dir.txt"
//Read the first line of "w:\dir.txt" to get the oldest file in that directory.
//Disconnect from the remote computer using windows.WNetCancelConnection2
You could also try FindFirstFileEx with FindExInfoBasic parameter, and on Windows 7 or Server 2008 R2 or later, FIND_FIRST_EX_LARGE_FETCH which should improve performance.
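A sketch of what that call can look like from Delphi, tracking the oldest creation time while it walks the directory (the UNC path is a placeholder, and the two constants are declared locally in case the Windows unit of your Delphi version does not have them yet):
const
  FindExInfoBasic = 1;           // do not retrieve the short (8.3) names
  FIND_FIRST_EX_LARGE_FETCH = 2; // use larger directory buffers (Windows 7 / Server 2008 R2+)
var
  FindData: TWin32FindData;
  H: THandle;
  OldestTime: TFileTime;
  OldestName: string;
begin
  OldestName := '';
  OldestTime.dwLowDateTime := 0;
  OldestTime.dwHighDateTime := 0;
  H := FindFirstFileEx('\\camera-pc\d$\camera01\*.*', TFindexInfoLevels(FindExInfoBasic),
    @FindData, FindExSearchNameMatch, nil, FIND_FIRST_EX_LARGE_FETCH);
  if H <> INVALID_HANDLE_VALUE then
  try
    repeat
      if (FindData.dwFileAttributes and FILE_ATTRIBUTE_DIRECTORY) = 0 then
        if (OldestName = '') or (CompareFileTime(FindData.ftCreationTime, OldestTime) < 0) then
        begin
          OldestTime := FindData.ftCreationTime;
          OldestName := PChar(@FindData.cFileName[0]);
        end;
    until not FindNextFile(H, FindData);
  finally
    Windows.FindClose(H);
  end;
  // OldestName now holds the file with the earliest creation time, or '' if the folder was empty
end;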
First, grab the RunDosAppPipedToTStrings routine from this page on how to run a DOS program and pipe its output to a TStrings. The example uses a TMemo's Lines property, but you can pass any TStrings in, such as TStringList. Note that this will fail silently if CreateProcess returns false. You might want to add an else case to the "if CreateProcess" block that raises an exception.
Then create a simple batch file in the same folder as your EXE. Call it getdir.bat. All it should say is:
dir %1
This produces a directory listing of whatever folder you pass to it. Unfortunately, "dir" is a DOS keyword command, not a program, so you can't invoke it directly. Wrapping it in a batch file gets around that. This is a bit of a hack, but it works. If you can find a better way to run DIR, so much the better.
You'll want to invoke RunDosAppPipedToTStrings with code that looks something like this:
procedure GetDirListing(dirname: string; list: TStringList);
const
  CMDNAME = '%s\getdir.bat "%s"';
var
  path: string;
begin
  list.Clear;
  path := ExcludeTrailingPathDelimiter(ExtractFilePath(ParamStr(0)));
  RunDosAppPipedToTStrings(format(CMDNAME, [path, dirname]), list, false);
end;
Then all that's left to do is parse the output, extract date and time and filenames, sort by date and time, and grab the filename of the file with the lowest date. I'll leave that much to you.
If you can run something on the remote computer that can iterate over the directories, that will be the fastest approach. If you wanted to use Mason's example, try launching it with PsExec from SysInternals.
If you can only run an application locally then no, there's no faster way than FindFirst/FindNext, and anything else you do will boil down to that eventually. If your local computer is running Windows 7 you can use FindFirstFileEx instead, which has flags to indicate it should use larger buffers for the transfers and that it shouldn't read the 8.3 alias, which can help the speed a bit.
I had almost the same problem in the fax server software I developed. I had to send the faxes in the order they were received, from thousands of files all stored in one directory. The solution I adopted (which is slow to start but fast to run) is to make a sorted list of all the files using
SearchRec.Time
as the key. After a file is in the list, I set its attributes to include faSysFile:
NewAttributes := Attributes or faSysFile;
Now when I do a new search with
FileAttrs := (faAnyFile and not faDirectory);
only the files that are not faSysFile are shown, so I can add the newly arriving files to the list.
Now you have a list with all the files sorted by time.
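A rough sketch of building that time-keyed list with FindFirst/FindNext (CameraDir, the zero-padded name=value layout and the TStringList are my assumptions, not the poster's actual code):
var
  sr: TSearchRec;
  List: TStringList;
  OldestFile: string;
begin
  List := TStringList.Create;
  try
    if SysUtils.FindFirst(IncludeTrailingBackslash(CameraDir) + '*.*',
      faAnyFile and not faDirectory, sr) = 0 then
    try
      repeat
        if (sr.Attr and faDirectory) = 0 then
          // zero-padded DOS timestamp as the key, so a plain sort orders the list by time
          List.Add(Format('%.10d=%s', [sr.Time, sr.Name]));
      until SysUtils.FindNext(sr) <> 0;
    finally
      SysUtils.FindClose(sr);
    end;
    List.Sort; // oldest file first
    if List.Count > 0 then
      OldestFile := List.ValueFromIndex[0];
  finally
    List.Free;
  end;
end;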
Don't forget: when you start your application, the first step is to remove the faSysFile attribute from the files in the folder so they can be processed again.
procedure FileSetSysAttr(AFileName: string);
var
  Attributes, NewAttributes: Word;
begin
  Attributes := FileGetAttr(AFileName);
  NewAttributes := Attributes or faSysFile;
  FileSetAttr(AFileName, NewAttributes);
end;

procedure FileUnSetSysAttr(AFileName: string);
var
  Attributes, NewAttributes: Word;
begin
  Attributes := FileGetAttr(AFileName);
  NewAttributes := Attributes and not faSysFile;
  FileSetAttr(AFileName, NewAttributes);
end;

procedure PathUnSetSysAttr(APathName: string);
var
  sr: TSearchRec;
  FileAttrs: Integer;
begin
  FileAttrs := (faAnyFile and not faDirectory) and (faAnyFile or faSysFile);
  APathName := IncludeTrailingBackslash(APathName);
  if SysUtils.FindFirst(APathName + '*.*', FileAttrs, sr) = 0 then
  try
    repeat
      if (sr.Attr and faDirectory) = 0 then
        FileUnSetSysAttr(APathName + sr.Name);
    until SysUtils.FindNext(sr) <> 0;
  finally
    SysUtils.FindClose(sr);
  end;
end;
I know this is not the best solution, but it works for me.
