Does anybody know how to unload a file from the cache?
I write a file to disk and then want to read it back, but Windows is giving me the file from the cache.
begin
...
{-- Write file --}
AssignFile(F, FileName);
Rewrite(F, 1);
BlockWrite(F, Buf[0], Chunk);
CloseFile(F); { FLUSH }
some code...
then.....
{-- Read file --}
AssignFile(F, FileName);
Reset(F, 1);
BlockRead(F, Buf[0], Chunk); <----------- getting file from cache
CloseFile(F);
end;
I am trying to determine the write/read speed of a drive.
Some code to demonstrate the use of FILE_FLAG_NO_BUFFERING and to test how it affects your reading time:
uses
  Windows, Classes, MMSystem;
function GetTimeForRead(ABuffered: boolean): single;
const
FileToRead = // name of file with maybe 500 MByte size
var
FlagsAndAttributes: DWORD;
FileHandle: THandle;
SrcStream, DestStream: TStream;
Ticks: DWord;
begin
if ABuffered then
FlagsAndAttributes := FILE_ATTRIBUTE_NORMAL
else
FlagsAndAttributes := FILE_FLAG_NO_BUFFERING;
FileHandle := CreateFile(FileToRead, GENERIC_READ, FILE_SHARE_READ, nil,
OPEN_EXISTING, FlagsAndAttributes, 0);
if FileHandle = INVALID_HANDLE_VALUE then begin
Result := 0.0;
exit;
end;
SrcStream := THandleStream.Create(FileHandle);
try
DestStream := TMemoryStream.Create;
try
DestStream.Size := SrcStream.Size;
Sleep(0);
Ticks := timeGetTime;
DestStream.CopyFrom(SrcStream, SrcStream.Size);
Result := 0.001 * (timeGetTime - Ticks);
finally
DestStream.Free;
end;
finally
SrcStream.Free;
end;
end;
procedure TForm1.Button1Click(Sender: TObject);
var
i: integer;
begin
Button1.Enabled := FALSE;
try
Update;
Memo1.Lines.Clear;
for i := 1 to 5 do begin
Memo1.Lines.Add(Format('Time for buffered file read: %.3f s',
[GetTimeForRead(TRUE)]));
end;
for i := 1 to 5 do begin
Memo1.Lines.Add(Format('Time for unbuffered file read: %.3f s',
[GetTimeForRead(FALSE)]));
end;
finally
Button1.Enabled := TRUE;
end;
end;
Running this code with a file of 420 MByte size gives on my system:
Time for buffered file read: 3,974 s
Time for buffered file read: 0,922 s
Time for buffered file read: 0,937 s
Time for buffered file read: 0,937 s
Time for buffered file read: 0,938 s
Time for unbuffered file read: 3,922 s
Time for unbuffered file read: 4,000 s
Time for unbuffered file read: 4,016 s
Time for unbuffered file read: 4,062 s
Time for unbuffered file read: 3,985 s
I think that you have misunderstood the concept of flushing a file.
Flushing a file does not remove it from the disk cache, it causes the content of the file stream's buffer to be written to the file.
(The stream is automatically flushed when you close it. Opening a file and flushing it without writing anything to it has no effect whatsoever.)
You can look into the FILE_FLAG_NO_BUFFERING flag for reading the file, but it seems from the documentation that it has no effect on files on a hard drive.
MSDN: CreateFile
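If what you actually want to measure is how long a write takes to reach the disk, you can flush the operating system's cache for the handle before you stop the timer. A minimal sketch, assuming the write goes through a TFileStream and using timeGetTime from MMSystem for timing, as in the benchmark above:
uses
  Windows, SysUtils, Classes, MMSystem;

function GetTimeForWrite(const AFileName: string; const Buf; Count: Integer): single;
var
  FS: TFileStream;
  Ticks: DWORD;
begin
  FS := TFileStream.Create(AFileName, fmCreate);
  try
    Ticks := timeGetTime;
    FS.WriteBuffer(Buf, Count);
    // Ask Windows to push any cached data for this handle out to the device
    // before the timer is stopped; otherwise you only measure the cache.
    if not FlushFileBuffers(FS.Handle) then
      RaiseLastOSError;
    Result := 0.001 * (timeGetTime - Ticks);
  finally
    FS.Free;
  end;
end;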
You'll need to use the Win32 API directly, specifically CreateFile with the FILE_FLAG_NO_BUFFERING flag. It forces the OS to read from the disk instead of the cache, and it has the side effect of also clearing the cache for that file, so the next read without the flag hits the disk too, although that read does repopulate the cache.
File caching is an OS level operation, so is independent of whether you use Delphi or any other language.
If you give us some idea of why you want to ensure you are not reading from cache, it may be easier to help.
The code should be fine, unless you're using invalid values for the Chunk variable. For example, if Chunk = 0 then nothing is read into the buffer, so the buffer keeps its old value (which could be the same data you just wrote to disk).
I have written a routine in Delphi Tokyo which takes multiple files (such as CSVs) and merges them together, giving the user the option to ignore the first line of every file except the first one (CSV files often have header/column-name lines, and when merging I only want one copy of the header). The issue I am having is that even though I am only reading the various input files, if a file is open in another process (specifically Excel), my app gives an error: "Cannot open file . The process cannot access the file because it is being used by another process."
I am using TStreamReader. How do I tell TStreamReader that it should open the file read-only, and continue even if the file is open elsewhere?
Code below:
procedure glib_MergeTextFiles(const InFileNames: array of string; const OutFileName: string;
HasHeader: Boolean = True;
KeepHeader: Boolean = True);
var
I: Integer;
InStream: TStreamReader;
OutStream: TStreamWriter;
Line: string;
IsFirstLine: Boolean;
begin
// Create our output stream
OutStream := TStreamWriter.Create(OutFileName, False, TEncoding.UTF8);
try
for I := 0 to high(InFileNames) do
begin
InStream := TStreamReader.Create(InFileNames[I], TEncoding.UTF8);
IsFirstLine := True;
try
while not InStream.EndOfStream do
begin
Line := InStream.ReadLine;
if IsFirstLine then { First Line }
begin
if HasHeader = False then
begin
OutStream.WriteLine(Line);
end
else
begin
// Is First Line, Has Header
if I = 0 then {is first file}
OutStream.WriteLine(Line);
end;
end
else
begin
OutStream.WriteLine(Line);
end;
IsFirstLine := False;
end;
finally
InStream.Free;
end;
end;
finally
OutStream.Free;
end;
end;
The problem is with the sharing mode. By default, the stream reader creates a file stream that is opened for reading only but specifies no sharing mode, so it opens the file for exclusive access. However, to open a file for reading when it is already open elsewhere, the file must have been opened to allow shared read access, which is what the FILE_SHARE_READ flag does:
FILE_SHARE_READ (0x00000001)
Enables subsequent open operations on a file or device to request read access.
Otherwise, other processes cannot open the file or device if they request read access.
If this flag is not specified, but the file or device has been opened for read access, the function fails.
You can pass your own file stream to the stream reader, opened with the mode you like:
var
I: Integer;
FileStream: TFileStream;
InStream: TStreamReader;
..
begin
...
FileStream := TFileStream.Create(InFileNames[I], fmOpenRead or fmShareDenyNone);
try
InStream := TStreamReader.Create(FileStream, TEncoding.UTF8);
try
..
Again, this requires that Excel does the same when it opens the file, but in my simple test it looks like it does.
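Putting it together, the reading part of the merge loop might look like this (a sketch; the header handling from the original routine stays as posted):
for I := 0 to High(InFileNames) do
begin
  // Open for reading without denying other processes any access.
  FileStream := TFileStream.Create(InFileNames[I], fmOpenRead or fmShareDenyNone);
  try
    InStream := TStreamReader.Create(FileStream, TEncoding.UTF8);
    IsFirstLine := True;
    try
      while not InStream.EndOfStream do
      begin
        Line := InStream.ReadLine;
        // ... header / first-line handling and OutStream.WriteLine(Line)
        //     exactly as in the original routine ...
        IsFirstLine := False;
      end;
    finally
      InStream.Free;
    end;
  finally
    FileStream.Free;
  end;
end;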
I want to know whether it is possible in Delphi to read a CD as a raw stream, directly from the logical disk drive device "C:\".
I hope I can use a TFileStream if I already have a valid file handle.
It is easiest to use THandleStream rather than TFileStream in my view. Like this:
procedure ReadFirstSector;
var
Handle: THandle;
Stream: THandleStream;
Buffer: array [1..512] of Byte;
b: Byte;
begin
Handle := CreateFile('\\.\C:', GENERIC_READ,
FILE_SHARE_READ or FILE_SHARE_WRITE, nil,
OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
if Handle=INVALID_HANDLE_VALUE then
RaiseLastOSError;
try
Stream := THandleStream.Create(Handle);
try
Stream.ReadBuffer(Buffer, SizeOf(Buffer));
for b in Buffer do
Writeln(AnsiChar(b));
finally
Stream.Free;
end;
finally
CloseHandle(Handle);
end;
end;
Beware that when using raw disk access you have to read in exact multiples of the sector size. The sectors on the disk I tested with are 512 bytes in size; I expect that CD sectors could very well be a different size.
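If you would rather not hard-code the sector size, you can ask the device for it with DeviceIoControl. A sketch follows; the IOCTL constant and the geometry record are declared locally because older Delphi versions do not ship them, and for a CD drive the corresponding IOCTL_CDROM_GET_DRIVE_GEOMETRY code exists as well:
const
  IOCTL_DISK_GET_DRIVE_GEOMETRY = $00070000;

type
  TDiskGeometry = record
    Cylinders: Int64;
    MediaType: DWORD;
    TracksPerCylinder: DWORD;
    SectorsPerTrack: DWORD;
    BytesPerSector: DWORD;
  end;

function GetSectorSize(DiskHandle: THandle): DWORD;
var
  Geometry: TDiskGeometry;
  BytesReturned: DWORD;
begin
  // Query the drive geometry; BytesPerSector is what raw reads must be
  // aligned to and sized in multiples of.
  if not DeviceIoControl(DiskHandle, IOCTL_DISK_GET_DRIVE_GEOMETRY, nil, 0,
      @Geometry, SizeOf(Geometry), BytesReturned, nil) then
    RaiseLastOSError;
  Result := Geometry.BytesPerSector;
end;
You would call this with the same Handle that CreateFile returned above, and size Buffer as a multiple of the result.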
I want to process a text file line by line. In the olden days I loaded the file into a TStringList:
slFile := TStringList.Create();
slFile.LoadFromFile(filename);
for i := 0 to slFile.Count-1 do
begin
oneLine := slFile.Strings[i];
//process the line
end;
The problem with that is that once the file gets to be a few hundred megabytes, I have to allocate a huge chunk of memory, when really I only need enough memory to hold one line at a time. (Plus, you can't really indicate progress while the system is locked up loading the file in step 1.)
Then I tried using the native, and recommended, file I/O routines provided by Delphi:
var
  f: TextFile;
  oneLine: string;
begin
  AssignFile(f, filename);
  Reset(f);
  while not Eof(f) do
  begin
    ReadLn(f, oneLine);
    //process the line
  end;
  CloseFile(f);
The problem with AssignFile is that there is no option to read the file without locking (i.e. fmShareDenyNone). The earlier TStringList example doesn't support reading without a lock either, unless you change it to LoadFromStream:
slFile := TStringList.Create;
stream := TFileStream.Create(filename, fmOpenRead or fmShareDenyNone);
slFile.LoadFromStream(stream);
stream.Free;
for i := 0 to slFile.Count-1 do
begin
oneLine := slFile.Strings[i];
//process the line
end;
So now, even though I've avoided holding a lock, I'm back to loading the entire file into memory.
Is there some alternative to AssignFile/ReadLn where I can read a file line by line without taking a sharing lock?
I'd rather not drop down to Win32 CreateFile/ReadFile and have to deal with allocating buffers and detecting CR, LF and CRLF line endings.
I thought about memory-mapped files, but there's the difficulty that the entire file may not fit (map) into virtual memory, so I'd have to map views (pieces) of the file a bit at a time. It starts to get ugly.
I just want Reset with fmShareDenyNone!
With recent Delphi versions, you can use TStreamReader. Construct it with your file stream, and then call its ReadLine method (inherited from TTextReader).
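A minimal sketch of that approach, combined with the fmShareDenyNone mode you asked for (assuming a Delphi version that ships TStreamReader):
procedure ProcessFileByLine(const FileName: string);
var
  FS: TFileStream;
  Reader: TStreamReader;
  Line: string;
begin
  // Open without taking a sharing lock; TStreamReader does the buffering
  // and the CR/LF handling for you.
  FS := TFileStream.Create(FileName, fmOpenRead or fmShareDenyNone);
  try
    Reader := TStreamReader.Create(FS, TEncoding.Default);
    try
      while not Reader.EndOfStream do
      begin
        Line := Reader.ReadLine;
        // process the line
      end;
    finally
      Reader.Free;
    end;
  finally
    FS.Free;
  end;
end;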
An option for all Delphi versions is to use Peter Below's StreamIO unit, which gives you AssignStream. It works just like AssignFile, but for streams instead of file names. Once you've used that function to associate a stream with a TextFile variable, you can call ReadLn and the other I/O functions on it just like any other file.
You can use this sample code:
type
  TTextStream = class(TObject)
private
FHost: TStream;
FOffset,FSize: Integer;
    FBuffer: array[0..1023] of Char; // assumes a pre-Unicode Delphi where Char is a single byte
FEOF: Boolean;
function FillBuffer: Boolean;
protected
property Host: TStream read FHost;
public
constructor Create(AHost: TStream);
destructor Destroy; override;
function ReadLn: string; overload;
function ReadLn(out Data: string): Boolean; overload;
property EOF: Boolean read FEOF;
property HostStream: TStream read FHost;
property Offset: Integer read FOffset write FOffset;
end;
{ TTextStream }
constructor TTextStream.Create(AHost: TStream);
begin
FHost := AHost;
FillBuffer;
end;
destructor TTextStream.Destroy;
begin
FHost.Free;
inherited Destroy;
end;
function TTextStream.FillBuffer: Boolean;
begin
FOffset := 0;
FSize := FHost.Read(FBuffer,SizeOf(FBuffer));
Result := FSize > 0;
  FEOF := not Result; // at end of file once nothing more could be read
end;
function TTextStream.ReadLn(out Data: string): Boolean;
var
Len, Start: Integer;
EOLChar: Char;
begin
Data:='';
Result:=False;
repeat
if FOffset>=FSize then
if not FillBuffer then
Exit; // no more data to read from stream -> exit
Result:=True;
Start:=FOffset;
while (FOffset<FSize) and (not (FBuffer[FOffset] in [#13,#10])) do
Inc(FOffset);
Len:=FOffset-Start;
    if Len > 0 then
    begin
      SetLength(Data, Length(Data) + Len);
      Move(FBuffer[Start], Data[Succ(Length(Data) - Len)], Len);
      // Do not clear Data when Len = 0: a line can span a buffer boundary,
      // and clearing would lose the part accumulated from the previous buffer.
    end;
until FOffset<>FSize; // EOL char found
EOLChar:=FBuffer[FOffset];
Inc(FOffset);
if (FOffset=FSize) then
if not FillBuffer then
Exit;
if FBuffer[FOffset] in ([#13,#10]-[EOLChar]) then begin
Inc(FOffset);
if (FOffset=FSize) then
FillBuffer;
end;
end;
function TTextStream.ReadLn: string;
begin
ReadLn(Result);
end;
Usage:
procedure ReadFileByLine(Filename: string);
var
sLine: string;
tsFile: TTextStream;
begin
tsFile := TTextStream.Create(TFileStream.Create(Filename, fmOpenRead or fmShareDenyWrite));
try
while tsFile.ReadLn(sLine) do
begin
//sLine is your line
end;
finally
tsFile.Free;
end;
end;
If you need support for ANSI and Unicode text in older Delphi versions, you can use my GpTextFile or GpTextStream.
It seems the FileMode variable does not apply to text files, but my tests showed that reading the file from multiple readers at the same time is not a problem. You didn't mention it in your question, but if you are not going to write to the text file while it is being read, you should be fine.
What I do is use a TFileStream but I buffer the input into fairly large blocks (e.g. a few megabytes each) and read and process one block at a time. That way I don't have to load the whole file at once.
It works quite quickly that way, even for large files.
I do have a progress indicator. As I load each block, I increment it by the fraction of the file that has additionally been loaded.
Reading one line at a time, without something to do your buffering, is simply too slow for large files.
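A rough sketch of that scheme; the block size and the progress arithmetic are illustrative, and the line splitting (including carrying a partial last line over to the next block) is left as a comment:
procedure ProcessInBlocks(const FileName: string);
const
  BlockSize = 4 * 1024 * 1024; // a few megabytes per read
var
  FS: TFileStream;
  Buffer: TBytes;
  BytesRead: Integer;
  TotalRead: Int64;
begin
  FS := TFileStream.Create(FileName, fmOpenRead or fmShareDenyNone);
  try
    SetLength(Buffer, BlockSize);
    TotalRead := 0;
    repeat
      BytesRead := FS.Read(Buffer[0], BlockSize);
      if BytesRead > 0 then
      begin
        Inc(TotalRead, BytesRead);
        // Split Buffer[0..BytesRead-1] into lines here, keeping any
        // incomplete trailing line to prepend to the next block.
        // Progress: TotalRead / FS.Size gives the fraction loaded so far.
      end;
    until BytesRead = 0;
  finally
    FS.Free;
  end;
end;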
I had the same problem a few years ago, especially the problem of the file being locked. What I did was use the low-level ReadFile from the Windows API. I know the question is old (my answer comes two years later), but perhaps my contribution can help someone in the future.
const
BUFF_SIZE = $8000;
var
  dwread: LongWord;
  hFile: THandle;
  apos: Integer;
  myEOF: Boolean;
  datafile: array [0..BUFF_SIZE-1] of char;

hFile := CreateFile(PChar(filename), GENERIC_READ, FILE_SHARE_READ or FILE_SHARE_WRITE,
  nil, OPEN_EXISTING, FILE_ATTRIBUTE_READONLY, 0);
SetFilePointer(hFile, 0, nil, FILE_BEGIN);
myEOF := false;
try
Readfile(hFile, datafile, BUFF_SIZE, dwread, nil);
while (dwread > 0) and (not myEOF) do
begin
if dwread = BUFF_SIZE then
begin
apos := LastDelimiter(#10#13, datafile);
if apos = BUFF_SIZE then inc(apos);
SetFilePointer(hFile, aPos-BUFF_SIZE, nil, FILE_CURRENT);
end
else myEOF := true;
Readfile(hFile, datafile, BUFF_SIZE, dwread, nil);
end;
finally
closehandle(hFile);
end;
For me the speed improvement appeared to be significant.
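The snippet above only repositions the file pointer at the last line break; extracting the individual lines from the buffer is left to the caller. A hypothetical helper for that step could look like this (ValidLen would be apos for a full buffer, or dwread for the final partial buffer):
procedure ProcessBufferLines(const Buffer: array of AnsiChar; ValidLen: Integer);
var
  LineStart, I: Integer;
  Line: AnsiString;
begin
  LineStart := 0;
  for I := 0 to ValidLen - 1 do
    if Buffer[I] in [#10, #13] then
    begin
      if I > LineStart then
      begin
        // Copy one line out of the buffer without the line-break characters.
        SetString(Line, PAnsiChar(@Buffer[LineStart]), I - LineStart);
        // process Line here
      end;
      LineStart := I + 1;
    end;
end;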
Why not simply read the lines of the file directly from the TFileStream itself, one at a time?
i.e. (in pseudocode):
readline:
while NOT EOF and (readchar <> EOL) do
appendchar to result
while NOT EOF do
begin
s := readline
process s
end;
One problem you may find with this is that, IIRC, TFileStream is not buffered, so performance over a large file is going to be sub-optimal. However, there are a number of solutions to the problem of non-buffered streams, including this one, that you may wish to investigate if this approach solves your initial problem.
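For reference, a literal (and deliberately naive) Delphi rendering of that pseudocode could look like the function below; it works on any TStream, but reading one character at a time from an unbuffered TFileStream is exactly where the performance caveat above bites:
function ReadLineFromStream(Stream: TStream; out Line: AnsiString): Boolean;
var
  C: AnsiChar;
begin
  Line := '';
  Result := False;
  while Stream.Read(C, SizeOf(C)) = SizeOf(C) do
  begin
    Result := True;        // we read at least one character of this line
    if C = #10 then
      Exit;                // LF ends the line
    if C <> #13 then
      Line := Line + C;    // drop the CR of a CRLF pair
  end;
  // Result stays True for a final line without a trailing line break,
  // and False once the end of the stream has been reached.
end;
You would then loop with while ReadLineFromStream(FS, s) do, processing s inside the loop.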
I found a way to write an AVI from BMP files:
http://www.delphi3000.com/articles/article_2770.asp?SK=
How can I write an AVI from an array or a TList of TBitmaps instead?
The key portion of the code you linked to is below, where IList is a TStrings with the names of all the files to include in the animation.
for i := 0 to IList.Count - 1 do begin
AssignFile(BFile, IList[i]);
Reset(BFile, 1);
Seek(BFile, m_bfh.bfOffBits);
BlockRead(BFile, m_MemBits[0], m_Bih.biSizeImage);
Seek(BFile, SizeOf(m_Bfh));
BlockRead(BFile, m_MemBitMapInfo[0], length(m_MemBitMapInfo));
CloseFile(BFile);
  if AVIStreamWrite(psCompressed, i, 1, @m_MemBits[0],
m_Bih.biSizeImage, AVIIF_KEYFRAME, 0, 0) <> 0 then begin
ShowMessage('Error during Write AVI File');
break;
end;
end;
It reads portions of the file from disk and writes them to the AVI stream. The important part is that it reads from the files. The in-memory representation of a TBitmap doesn't necessarily match the representation of a file, but it's easy to adapt the given code to store each bitmap temporarily in a memory stream; the stream then matches what the layout of the file would be. Suppose IList is now an array of TBitmap, as you suggested. Then we could use this:
var
ms: TMemoryStream;
ms := TMemoryStream.Create;
try
for i := 0 to Length(IList) - 1 do begin
IList[i].SaveToStream(ms);
ms.Position := m_bfh.bfOffBits;
ms.ReadBuffer(m_MemBits[0], m_Bih.biSizeImage);
ms.Position := SizeOf(m_Bfh);
ms.ReadBuffer(m_MemBitMapInfo[0], Length(m_MemBitMapInfo));
ms.Clear;
    if AVIStreamWrite(psCompressed, i, 1, @m_MemBits[0],
m_Bih.biSizeImage, AVIIF_KEYFRAME, 0, 0) <> 0 then begin
ShowMessage('Error during Write AVI File');
break;
end;
end;
finally
ms.Free;
end;
There's code earlier in your cited example that reads the first file in the list to populate the various records and size the arrays used here, but you should be able to make the same changes there as I have done to the code shown here.
My application opens a file, does transformations on the data, and saves it out to another file, or possibly the same file. The file size changes, and I don't know how big or small it is going to be until I see the data inside the first file.
At the moment I load the file into a dynamic array, do everything I need to do in there, then save it back. This looked good until I got to my testing stage, where I found that transforming multi-gigabyte files on a system with 128 MB of RAM caused some issues.
Here is my code:
procedure openfile(fname:string);
var
myfile: file;
filesizevalue:integer;
begin
AssignFile(myfile,fname);
filesizevalue := GetFileSize(fname);
Reset(myFile, 1);
SetLength(dataarray, filesizevalue);
BlockRead(myFile, dataarray[0], filesizevalue);
CloseFile(myfile);
end;
What I think I need is direct file access, to minimise RAM usage. Is that what I need, and can it be done in Delphi?
I'd look at using a TFileStream, perhaps one with buffering, but you really need to show what you are doing with the data, because otherwise it is hard to determine the best strategy. As gabr says, one option is memory-mapped files; the code is in his link, but since it is my code, I'll add it here too!
procedure TMyReader.InitialiseMapping(szFilename : string);
var
// nError : DWORD;
bGood : boolean;
begin
bGood := False;
m_hFile := CreateFile(PChar(szFilename), GENERIC_READ, 0, nil, OPEN_EXISTING, 0, 0);
if m_hFile <> INVALID_HANDLE_VALUE then
begin
m_hMap := CreateFileMapping(m_hFile, nil, PAGE_READONLY, 0, 0, nil);
if m_hMap <> 0 then
begin
m_pMemory := MapViewOfFile(m_hMap, FILE_MAP_READ, 0, 0, 0);
if m_pMemory <> nil then
begin
htlArray := Pointer(Integer(m_pMemory) + m_dwDataPosition);
bGood := True;
end
else
begin
// nError := GetLastError;
end;
end;
end;
if not bGood then
raise Exception.Create('Unable to map token file into memory');
end;
You can also map parts of the file directly into memory. That's definitely the most direct way. See What is the fastest way to Parse a line in Delphi for an example.
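A sketch of mapping only a window of a big file rather than the whole thing (error cleanup is omitted for brevity; the offset passed to MapViewOfFile must be a multiple of the system allocation granularity, typically 64 KB):
function MapFileWindow(const FileName: string; Offset: Int64; WindowSize: Cardinal;
  out FileHandle, MapHandle: THandle): Pointer;
var
  SysInfo: TSystemInfo;
begin
  GetSystemInfo(SysInfo);
  // The view offset has to be aligned to the allocation granularity.
  Assert(Offset mod SysInfo.dwAllocationGranularity = 0);

  FileHandle := CreateFile(PChar(FileName), GENERIC_READ, FILE_SHARE_READ, nil,
    OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
  if FileHandle = INVALID_HANDLE_VALUE then
    RaiseLastOSError;

  MapHandle := CreateFileMapping(FileHandle, nil, PAGE_READONLY, 0, 0, nil);
  if MapHandle = 0 then
    RaiseLastOSError;

  // Map only WindowSize bytes starting at Offset instead of the whole file.
  Result := MapViewOfFile(MapHandle, FILE_MAP_READ,
    Int64Rec(Offset).Hi, Int64Rec(Offset).Lo, WindowSize);
  if Result = nil then
    RaiseLastOSError;
  // Caller: UnmapViewOfFile(Result), then CloseHandle on both handles.
end;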
If the problem permits, you can use BlockRead and BlockWrite to read a chunk of the input file, process it, and then write that chunk to the output file. Something like this:
AssignFile(inFile, inFname);
Reset(inFile, 1);
AssignFile(outFile, outFname);
Rewrite(outFile, 1);
repeat
  BlockRead(inFile, buff, SizeOf(buff), bytesRead);
  ProcessBuffer(buff);
  BlockWrite(outFile, buff, bytesRead, bytesWritten);
until (bytesRead = 0) or (bytesWritten <> bytesRead);
CloseFile(inFile);
CloseFile(outFile);
The code presumes that you won't change the size of the buffer while processing it. If the transformation changes the size of the data, you will need to adjust the BlockWrite call and the loop condition accordingly.
I prefer to use TFileStream for this kind of processing. In this example I am assuming there is a constant ArraySize which is set to the size of a single array element. For example, if your "array" is an array of Integer, it would be set with:
ArraySize := SizeOf(Integer);
which would set ArraySize to 4.
Function LoadPos(inFIlename:string;ArrayPos:Int64;var ArrayBuff) : boolean;
var
fs : tFileStream;
begin
result := false;
fs := tFileStream.Create(inFilename,fmOpenRead);
try
// seek to the array position
fs.Seek( ArrayPos * ArraySize, soFromBeginning);
// load the element
result := fs.Read( ArrayBuff, ArraySize ) = ArraySize;
finally
fs.free;
end;
end;
The only problem with this approach is it only works for fixed size structures, variable length strings require a different approach.
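Under the same assumptions, the write-side counterpart simply seeks to the element's offset and overwrites it in place (a sketch, mirroring LoadPos above):
function SavePos(const inFilename: string; ArrayPos: Int64; const ArrayBuff): Boolean;
var
  fs: TFileStream;
begin
  Result := False;
  // fmOpenReadWrite leaves the rest of the file untouched; only the one
  // element at ArrayPos is overwritten.
  fs := TFileStream.Create(inFilename, fmOpenReadWrite);
  try
    fs.Position := ArrayPos * ArraySize;
    Result := fs.Write(ArrayBuff, ArraySize) = ArraySize;
  finally
    fs.Free;
  end;
end;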
I don't think you'll get more "direct" file access than that. Do you use all the data in the file? If not, you could perhaps use a stream and load only the data you need into memory. But if you use all the data, there's only one solution IMHO: read the file in chunks. How well that works depends heavily on the kind of transformation you want to apply. If the transformation is not local (that is, the data elements that have to be combined are not all in the same chunk), you are going to have problems.