I'm trying to download a file of about 500 MB, and at this size I get an out of memory error. I tried switching to a 64 bit application and that worked, but I need the download to work in a 32 bit application.
var
  Stream: TStream;
  fileStream: TFileStream;
  Buffer: PByte;
  BytesRead, BufSize: Integer;
  Size: Int64;
begin
  BufSize := 1024;
  fileStream := TFileStream.Create(GetCurrentDir + '\DownloadFile.zip', fmCreate);
  GetMem(Buffer, BufSize);
  try
    Stream := getDownload(Size);
    if (Size <> 0) then
    begin
      // copy the download stream to the file in BufSize chunks
      repeat
        BytesRead := Stream.Read(Buffer^, BufSize);
        if (BytesRead > 0) then
        begin
          fileStream.WriteBuffer(Buffer^, BytesRead);
        end;
        Application.ProcessMessages;
      until (BytesRead < BufSize);
      if (Size <> fileStream.Size) then
      begin
        Exit;
      end;
    end;
  finally
    FreeMem(Buffer, BufSize);
    fileStream.Free;
  end;
end;
function TServiceMethods.getDownload(out Size: Int64): TStream;
begin
Result := TFileStream.Create(GetCurrentDir+'\DownloadFile.zip', fmOpenRead
or fmShareDenyNone);
Size := Result.Size;
Result.Position := 0;
end;
Don't use a memory stream here. That forces the entire file into a contiguous block of memory, which as you discovered exhausts memory in a 32 bit process.
Instead, write the downloaded data directly to file. You can remove the intermediate memory stream and write directly to a file stream.
Of course, all of this assumes that getDownload returns a stream that performs the download as you read it. If getDownload reads the entire file into a memory stream then it suffers from the exact same problem as the code in this question.
I'm trying to reduce the number of lines in this code, but I can't tell whether the code I show is already the minimum number of lines that can achieve this:
function read_file(FileName: String): AnsiString;
var
F: File;
Buffer: AnsiString;
Size: Integer;
ReadBytes: Integer;
DefaultFileMode: Byte;
begin
Result := '';
DefaultFileMode := FileMode;
FileMode := 0;
AssignFile(F, FileName);
Reset(F, 1);
if (IOResult = 0) then
begin
Size := FileSize(F);
while (Size > 1024) do
begin
SetLength(Buffer, 1024);
BlockRead(F, Buffer[1], 1024, ReadBytes);
Result := Result + Buffer;
Dec(Size, ReadBytes);
end;
SetLength(Buffer, Size);
BlockRead(F, Buffer[1], Size);
Result := Result + Buffer;
CloseFile(F);
end;
FileMode := DefaultFileMode;
end;
Is there any way to reduce the line count further?
Like this:
function read_file(const FileName: String): AnsiString;
var
Stream: TFileStream;
begin
Stream := TFileStream.Create(FileName, fmOpenRead);
try
SetLength(Result, Stream.Size);
Stream.ReadBuffer(Pointer(Result)^, Stream.Size);
finally
Stream.Free;
end;
end;
In modern Delphi the TFile class has static methods that can do this as a one-liner, although not directly into an AnsiString (see the sketch after the list below).
As well as being shorter I perceive the following additional benefits:
Avoiding Pascal I/O in favour of modern streams.
Error handling by exceptions, taken care of by the stream class.
A single allocation of the string variable as opposed to repeated inefficient re-allocations.
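For illustration, here is a minimal TFile-based sketch using System.IOUtils. Note that TFile.ReadAllText returns a string (UnicodeString in modern Delphi), so converting the result to AnsiString would need an explicit, potentially lossy conversion:

uses
  System.IOUtils;

function read_file(const FileName: string): string;
begin
  // One-liner: reads the entire file into a string in a single call
  Result := TFile.ReadAllText(FileName);
end;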
If you must do this with Pascal I/O, use a single allocation:
SetLength(Result, FileSize(F));
BlockRead(F, Pointer(Result)^, Length(Result), ReadBytes);
If you insist on using old-style I/O, the following function is probably the smallest you can write while still handling errors appropriately (if handling errors appropriately means returning an empty string).
function read_file(const FileName: String): AnsiString;
var
  F: File;
  DefaultFileMode: Byte;
begin
  Result := '';
  DefaultFileMode := FileMode;
  try
    FileMode := 0;
    AssignFile(F, FileName);
    {$I-}
    Reset(F, 1);
    {$I+}
    if IOResult = 0 then
    try
      SetLength(Result, FileSize(F));
      if Length(Result) > 0 then
      begin
        {$I-}
        BlockRead(F, Result[1], Length(Result));
        {$I+}
        if IOResult <> 0 then
          Result := '';
      end;
    finally
      CloseFile(F);
    end;
  finally
    FileMode := DefaultFileMode;
  end;
end;
I've been using ZLib functions to compress/uncompress streams in memory. When I try to uncompress an invalid stream, it leaks memory. The following code leaks memory:
uses
Winapi.Windows, System.Classes, System.ZLib;
function DecompressStream(const AStream: TMemoryStream): Boolean;
var
ostream: TMemoryStream;
begin
ostream := TMemoryStream.Create;
try
AStream.Position := 0;
// ISSUE: Memory leak happening here
try
ZDecompressStream(AStream, ostream);
except
Exit(FALSE);
end;
AStream.Clear;
ostream.Position := 0;
AStream.CopyFrom(ostream, ostream.Size);
result := TRUE;
finally
ostream.Free;
end;
end;
var
s: TMemoryStream;
begin
ReportMemoryLeaksOnShutdown := TRUE;
s := TMemoryStream.Create;
try
DecompressStream(s);
finally
s.Free;
end;
end.
I try to decompress an empty TMemoryStream here, and at the end of execution a memory leak is reported. I'm testing on Delphi XE2.
Any ideas how to prevent this leak? In the real world there is a chance that my application will try to decompress an invalid stream and leak memory there.
QC: http://qc.embarcadero.com/wc/qcmain.aspx?d=120329 - claimed fixed starting with XE6
It's a bug in the Delphi RTL code. The implementation of ZDecompressStream raises exceptions and then fails to perform its tidy-up. Let's look at the code:
procedure ZDecompressStream(inStream, outStream: TStream);
const
bufferSize = 32768;
var
zstream: TZStreamRec;
zresult: Integer;
inBuffer: TBytes;
outBuffer: TBytes;
inSize: Integer;
outSize: Integer;
begin
SetLength(inBuffer, BufferSize);
SetLength(outBuffer, BufferSize);
FillChar(zstream, SizeOf(TZStreamRec), 0);
ZCompressCheck(InflateInit(zstream)); <--- performs heap allocation
inSize := inStream.Read(inBuffer, bufferSize);
while inSize > 0 do
begin
zstream.next_in := @inBuffer[0];
zstream.avail_in := inSize;
repeat
zstream.next_out := @outBuffer[0];
zstream.avail_out := bufferSize;
ZCompressCheck(inflate(zstream, Z_NO_FLUSH));
// outSize := zstream.next_out - outBuffer;
outSize := bufferSize - zstream.avail_out;
outStream.Write(outBuffer, outSize);
until (zstream.avail_in = 0) and (zstream.avail_out > 0);
inSize := inStream.Read(inBuffer, bufferSize);
end;
repeat
zstream.next_out := @outBuffer[0];
zstream.avail_out := bufferSize;
zresult := ZCompressCheck(inflate(zstream, Z_FINISH));
// outSize := zstream.next_out - outBuffer;
outSize := bufferSize - zstream.avail_out;
outStream.Write(outBuffer, outSize);
until (zresult = Z_STREAM_END) and (zstream.avail_out > 0);
ZCompressCheck(inflateEnd(zstream)); <--- tidy up, frees heap allocation
end;
I've taken this from my XE3, but I believe that it is essentially the same in all versions. I've highlighted the problem. The call to inflateInit allocates memory off the heap. It needs to be paired with a call to inflateEnd. Because ZCompressCheck raises exceptions in the face of errors, the call to inflateEnd never happens. And hence the code leaks.
The other calls to inflateInit and inflateEnd in that unit are correctly protected with try/finally. It just appears to be the use in this function that is erroneous.
My recommendation is that you replace the Zlib unit with a version that is implemented correctly.
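Until you are running a version where this is fixed, a patched copy of the routine might look like the sketch below. It is based on the listing above and assumes the same helpers the unit already declares (TZStreamRec, ZCompressCheck, InflateInit, inflate, inflateEnd); the only change is the try/finally that guarantees inflateEnd runs even when an exception is raised:

procedure ZDecompressStreamFixed(inStream, outStream: TStream);
const
  bufferSize = 32768;
var
  zstream: TZStreamRec;
  zresult: Integer;
  inBuffer: TBytes;
  outBuffer: TBytes;
  inSize: Integer;
  outSize: Integer;
begin
  SetLength(inBuffer, bufferSize);
  SetLength(outBuffer, bufferSize);
  FillChar(zstream, SizeOf(TZStreamRec), 0);
  ZCompressCheck(InflateInit(zstream));
  try
    inSize := inStream.Read(inBuffer, bufferSize);
    while inSize > 0 do
    begin
      zstream.next_in := @inBuffer[0];
      zstream.avail_in := inSize;
      repeat
        zstream.next_out := @outBuffer[0];
        zstream.avail_out := bufferSize;
        ZCompressCheck(inflate(zstream, Z_NO_FLUSH));
        outSize := bufferSize - zstream.avail_out;
        outStream.Write(outBuffer, outSize);
      until (zstream.avail_in = 0) and (zstream.avail_out > 0);
      inSize := inStream.Read(inBuffer, bufferSize);
    end;
    repeat
      zstream.next_out := @outBuffer[0];
      zstream.avail_out := bufferSize;
      zresult := ZCompressCheck(inflate(zstream, Z_FINISH));
      outSize := bufferSize - zstream.avail_out;
      outStream.Write(outBuffer, outSize);
    until (zresult = Z_STREAM_END) and (zstream.avail_out > 0);
  finally
    inflateEnd(zstream); // free the heap memory allocated by InflateInit
  end;
end;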
I want to implement a simple HTTP downloader using TIdHttp (Indy 10). I found two kinds of code examples on the internet. Unfortunately, neither of them satisfies me 100%. Here is the code; I would like some advice.
Variant 1
var
Buffer: TFileStream;
HttpClient: TIdHttp;
begin
Buffer := TFileStream.Create('somefile.exe', fmCreate or fmShareDenyWrite);
try
HttpClient := TIdHttp.Create(nil);
try
HttpClient.Get('http://somewhere.com/somefile.exe', Buffer); // wait until it is done
finally
HttpClient.Free;
end;
finally
Buffer.Free;
end;
end;
The code is compact and very easy to understand. The problem is that it allocates disk space when downloading begins. Another problem is that we cannot show the download progress in the GUI directly unless the code is executed in a background thread (alternatively, we can bind the HttpClient.OnWork event).
Variant 2:
const
RECV_BUFFER_SIZE = 32768;
var
HttpClient: TIdHttp;
FileSize: Int64;
Buffer: TMemoryStream;
begin
HttpClient := TIdHttp.Create(nil);
try
HttpClient.Head('http://somewhere.com/somefile.exe');
FileSize := HttpClient.Response.ContentLength;
Buffer := TMemoryStream.Create;
try
while Buffer.Size < FileSize do
begin
HttpClient.Request.ContentRangeStart := Buffer.Size;
if Buffer.Size + RECV_BUFFER_SIZE < FileSize then
HttpClient.Request.ContentRangeEnd := Buffer.Size + RECV_BUFFER_SIZE - 1
else
HttpClient.Request.ContentRangeEnd := FileSize;
HttpClient.Get(HttpClient.URL.URI, Buffer); // wait until it is done
Buffer.SaveToFile('somefile.exe');
end;
finally
Buffer.Free;
end;
finally
HttpClient.Free;
end;
end;
First we query the file size from the server, then we download the file contents in pieces. The retrieved contents are saved to disk once they have been received completely. The potential problem is that we have to send multiple GET requests to the server. I am not sure whether some servers (such as megaupload) limit the number of requests within a particular time period.
My expectations
The downloader should send only one GET-request to the server.
The disk space must not be allocated when the download begins.
Any hints are appreciated.
Variant #1 is the simplest, and is how Indy is meant to be used.
Regarding the disk allocation issue, you can derive a new class from TFileStream and override its SetSize() method to do nothing. TIdHTTP will still attempt to pre-allocate the file when appropriate, but it will not actually allocate any disk space. Writing to TFileStream will grow the file as needed.
Regarding status reporting, TIdHTTP has OnWork... events for that purpose. The AWorkCountMax parameter of OnWorkBegin will be the actual file size if it is known (the response is not chunked), or 0 if it is not known. The AWorkCount parameter of the OnWork event will be the cumulative number of bytes transferred so far. If the file size is known, you can display the total percentage by dividing AWorkCount by AWorkCountMax and multiplying by 100; otherwise just display the AWorkCount value by itself. If you want to display the transfer speed, you can calculate it from the difference between AWorkCount values and the time intervals between OnWork events.
Try this:
type
  TNoPresizeFileStream = class(TFileStream)
  protected
    procedure SetSize(const NewSize: Int64); override;
  end;

procedure TNoPresizeFileStream.SetSize(const NewSize: Int64);
begin
  // do nothing - skip the pre-allocation that TFileStream would normally perform
end;
.
type
TSomeClass = class(TSomething)
...
TotalBytes: Int64;
LastWorkCount: Int64;
LastTicks: LongWord;
procedure Download;
procedure HttpWorkBegin(ASender: TObject; AWorkMode: TWorkMode; AWorkCountMax: Int64);
procedure HttpWork(ASender: TObject; AWorkMode: TWorkMode; AWorkCount: Int64);
procedure HttpWorkEnd(ASender: TObject; AWorkMode: TWorkMode);
...
end;
procedure TSomeClass.Download;
var
Buffer: TNoPresizeFileStream;
HttpClient: TIdHttp;
begin
Buffer := TNoPresizeFileStream.Create('somefile.exe', fmCreate or fmShareDenyWrite);
try
HttpClient := TIdHttp.Create(nil);
try
HttpClient.OnWorkBegin := HttpWorkBegin;
HttpClient.OnWork := HttpWork;
HttpClient.OnWorkEnd := HttpWorkEnd;
HttpClient.Get('http://somewhere.com/somefile.exe', Buffer); // wait until it is done
finally
HttpClient.Free;
end;
finally
Buffer.Free;
end;
end;
procedure TSomeClass.HttpWorkBegin(ASender: TObject; AWorkMode: TWorkMode; AWorkCountMax: Int64);
begin
if AWorkMode <> wmRead then Exit;
// initialize the status UI as needed...
//
// If TIdHTTP is running in the main thread, update your UI
// components directly as needed and then call the Form's
// Update() method to perform a repaint, or Application.ProcessMessages()
// to process other UI operations, like button presses (for
// cancelling the download, for instance).
//
// If TIdHTTP is running in a worker thread, use the TIdNotify
// or TIdSync class to update the UI components as needed, and
// let the OS dispatch repaints and other messages normally...
TotalBytes := AWorkCountMax;
LastWorkCount := 0;
LastTicks := Ticks;
end;
procedure TSomeClass.HttpWork(ASender: TObject; AWorkMode: TWorkMode; AWorkCount: Int64);
var
  PercentDone: Double;
  ElapsedMS: LongWord;
  BytesTransferred: Int64;
  BytesPerSec: Double;
begin
  if AWorkMode <> wmRead then Exit;
  ElapsedMS := GetTickDiff(LastTicks, Ticks);
  if ElapsedMS = 0 then ElapsedMS := 1; // avoid EDivByZero error
  if TotalBytes > 0 then
    PercentDone := (Double(AWorkCount) / TotalBytes) * 100.0
  else
    PercentDone := 0.0;
  BytesTransferred := AWorkCount - LastWorkCount;
  // using just BytesTransferred and ElapsedMS, you can calculate
  // all kinds of speed stats - b/kb/mb/gb per sec/min/hr/day ...
  BytesPerSec := (Double(BytesTransferred) * 1000) / ElapsedMS;
  // update the status UI as needed...
  LastWorkCount := AWorkCount;
  LastTicks := Ticks;
end;
procedure TSomeClass.HttpWorkEnd(ASender: TObject; AWorkMode: TWorkMode);
begin
if AWorkMode <> wmRead then Exit;
// finalize the status UI as needed...
end;
Here is an example that shows how to use the component's OnWork event to show a progress bar:
Download a File from internet programatically with an Progress event using Delphi and Indy
You should not worry about the disk allocation. Disk space that is allocated is not actually written to, so it won't damage your disks. Be happy that it is allocated, so that no other process can claim that disk space and leave you running out of space!
Do not forget to add the else branch for Variant 2. Replace

if Buffer.Size + RECV_BUFFER_SIZE < FileSize then
  HttpClient.Request.ContentRangeEnd := Buffer.Size + RECV_BUFFER_SIZE - 1;

with

if Buffer.Size + RECV_BUFFER_SIZE < FileSize then
  HttpClient.Request.ContentRangeEnd := Buffer.Size + RECV_BUFFER_SIZE - 1
else
  HttpClient.Request.ContentRangeEnd := FileSize;
I know that I can efficiently truncate a file and remove bytes from the end of the file.
Is there a corresponding efficient way to truncate files by deleting content from the beginning of the file to a point in the middle of the file?
As I read the question you are asking to remove content from a file starting from the beginning of the file. In other words you wish to delete content at the start of the file and shift the remaining content down.
This is not possible. You can only truncate a file from the end, not from the beginning. You will need to copy the remaining content into a new file, or copy it down yourself within the same file.
However you do it, there is no efficient shortcut. You have to copy the data, for example as #kobik describes.
Raymond Chen wrote a nice article on this topic: How do I delete bytes from the beginning of a file?
Just for fun, here's a simple implementation of a stream based method to delete content from anywhere in the file. You could use this with a read/write file stream. I've not tested the code, I'll leave that to you!
// Min comes from the Math unit; the stream must allow both reading and writing
procedure DeleteFromStream(Stream: TStream; Start, Length: Int64);
var
Buffer: Pointer;
BufferSize: Integer;
BytesToRead: Int64;
BytesRemaining: Int64;
SourcePos, DestPos: Int64;
begin
SourcePos := Start+Length;
DestPos := Start;
BytesRemaining := Stream.Size-SourcePos;
BufferSize := Min(BytesRemaining, 1024*1024*16);//no bigger than 16MB
GetMem(Buffer, BufferSize);
try
while BytesRemaining>0 do begin
BytesToRead := Min(BufferSize, BytesRemaining);
Stream.Position := SourcePos;
Stream.ReadBuffer(Buffer^, BytesToRead);
Stream.Position := DestPos;
Stream.WriteBuffer(Buffer^, BytesToRead);
inc(SourcePos, BytesToRead);
inc(DestPos, BytesToRead);
dec(BytesRemaining, BytesToRead);
end;
Stream.Size := DestPos;
finally
FreeMem(Buffer);
end;
end;
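For completeness, here is a hypothetical usage with a read/write file stream (the file name is only an illustration):

var
  FS: TFileStream;
begin
  FS := TFileStream.Create('C:\temp\data.bin', fmOpenReadWrite or fmShareExclusive);
  try
    // delete the first 4096 bytes of the file in place
    DeleteFromStream(FS, 0, 4096);
  finally
    FS.Free;
  end;
end;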
A very simple solution would be to shift (move) blocks of data from the "target position offset"
towards BOF, and then trim (truncate) the leftovers:
--------------------------
|******|xxxxxx|yyyyyy|zzz|
--------------------------
BOF <-^ (target position offset)
--------------------------
|xxxxxx|yyyyyy|zzz|******|
--------------------------
^ EOF
Since #David posted code based on TStream, here is some code based on "low level" Pascal-style I/O:
function FileDeleteFromBOF(const FileName: string; const Offset: Cardinal): Boolean;
var
Buf: Pointer;
BufSize, FSize,
NumRead, NumWrite,
OffsetFrom, OffsetTo: Cardinal;
F: file;
begin
{$IOCHECKS OFF}
Result := False;
AssignFile(F, FileName);
try
FileMode := 2; // Read/Write
Reset(F, 1); // Record size = 1
FSize := FileSize(F);
if (IOResult <> 0) or (Offset >= FSize) then Exit;
BufSize := Min(Offset, 1024 * 64); // Max 64k - This value could be optimized
GetMem(Buf, BufSize);
try
OffsetFrom := Offset;
OffsetTo := 0;
repeat
Seek(F, OffsetFrom);
BlockRead(F, Buf^, BufSize, NumRead);
if NumRead = 0 then Break;
Seek(F, OffsetTo);
BlockWrite(F, Buf^, NumRead, NumWrite);
Inc(OffsetFrom, NumWrite);
Inc(OffsetTo, NumWrite);
until (NumRead = 0) or (NumWrite <> NumRead) or (OffsetFrom >= FSize);
// Truncate and set to EOF
Seek(F, FSize - Offset);
Truncate(F);
Result := IOResult = 0;
finally
FreeMem(Buf);
end;
finally
CloseFile(F);
end;
end;
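And a hypothetical call of the Pascal I/O version (again, the file name is only an example):

// remove the first 64 KB from the start of the file, in place
if FileDeleteFromBOF('C:\temp\capture.log', 64 * 1024) then
  ShowMessage('Leading data removed')
else
  ShowMessage('Operation failed');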
I am using a TStringBuilder class ported from .Net to Delphi 7.
And here is my code snippet:
procedure TForm1.btn1Click(Sender: TObject);
const
FILE_NAME = 'PATH TO A TEXT FILE';
var
sBuilder: TStringBuilder;
I: Integer;
fil: TStringList;
sResult: string;
randInt: Integer;
begin
randomize;
sResult := '';
for I := 1 to 100 do
begin
fil := TStringList.Create;
try
fil.LoadFromFile(FILE_NAME);
randInt := Random(1024);
sBuilder := TStringBuilder.Create(randInt);
try
sBuilder.Append(fil.Text);
sResult := sBuilder.AsString;
finally
sBuilder.free;
end;
mmo1.Text := sResult;
finally
FreeAndNil(fil);
end;
end;
showmessage ('DOne');
end;
I am experiencing access violation (AV) errors. I can alleviate the problem by creating the builder with a size that is a multiple of 1024; however, the error still occurs sometimes.
Am I doing something wrong?
Your code is fine. The TStringBuilder code you're using is faulty. Consider this method:
procedure TStringBuilder.Append(const AString : string);
var iLen : integer;
begin
iLen := length(AString);
if iLen + FIndex > FBuffMax then _ExpandBuffer;
move(AString[1],FBuffer[FIndex],iLen);
inc(FIndex,iLen);
end;
If the future length is too long for the current buffer size, the buffer is expanded. _ExpandBuffer doubles the size of the buffer, but once that's done, it never checks whether the new buffer size is sufficient. If the original buffer size is 1024, and the file you're loading is 3 KB, then doubling the buffer size to 2048 will still leave the buffer too small in Append, and you'll end up overwriting 1024 bytes beyond the end of the buffer.
Change the if to a while in Append.
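For reference, here is a sketch of the corrected method, assuming the _ExpandBuffer, FBuffer, FIndex and FBuffMax members shown in the port you are using:

procedure TStringBuilder.Append(const AString: string);
var
  iLen: Integer;
begin
  iLen := Length(AString);
  // keep expanding until the buffer can hold the appended text
  while iLen + FIndex > FBuffMax do
    _ExpandBuffer;
  Move(AString[1], FBuffer[FIndex], iLen);
  Inc(FIndex, iLen);
end;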