I have imported many Excel worksheets into many TFDMemTables one by one and modified them. Now I want to save them all to ONE file of any kind, so that I can keep maintaining the data with TFDMemTable and no longer need Excel. How can I combine many TFDMemTables into ONE object that can be saved to ONE file, not by simply appending the records?
I use Delphi 10.3 Community on Windows 10. The documentation recommends FireDAC, which is what I use.
In .NET I can gather multiple tables into ONE DataSet and write ONE XML file with the simple code below. But in Delphi a data table seems to be the same thing as a data set, so what is the class that holds many data tables, like a .NET DataSet?
DataSet.Tables.Add(Table);
DataSet.WriteXml(FileName);
The code below shows how to save a series of Excel files (workbooks) into rows
of an FDMemTable on the basis of one workbook per FDMemTable row. This is done
by saving the Excel file in a blob field of the FDMemTable.
The GetFiles method shows how to scan a folder for Excel files and save them to
the FDMemTable using the SaveFile method. The FDMemTable row includes the name
of the Excel file and the path to where it was found.
When the GetFiles method completes, it saves the FDMemTable contents to a file
which can then be copied elsewhere.
The WriteFiles method reads the Excel files from the FDMemTable and writes them
to a specified directory: this method also shows how to open the file using Excel.
Of course, the techniques shown here are not restricted to Excel files: by
adjusting the file mask in GetFiles, it could find and save files of any type.
uses [...] ShellAPI;
type
TForm1 = class(TForm)
[...]
private
ID : Integer; // used to generate the ID field for FDMemTable1
[...]
public
end;
procedure TForm1.GetFiles(Path : String);
// Find all files in a given directory and save them to FDMemTable1
var
SearchRec : TSearchRec;
Res : Integer;
FN : String;
begin
Path := Path + '\*.xl*';
Res := FindFirst(Path, faAnyFile, SearchRec);
if Res = 0 {SearchRec.Attr and faAnyFile = faAnyFile} then begin
repeat
SaveFile(ExtractFilePath(Path) + SearchRec.Name);
Res := FindNext(SearchRec);
until Res <> 0;
FindClose(SearchRec);
FN := ExtractFilePath(Application.ExeName) + 'Excelfiles.XML';
FDMemTable1.SaveToFile(FN, sfXML);
end;
end;
procedure TForm1.SaveFile(FileName : String);
// Save an individual file to FDMemTable1
var
APath,
AName : String;
begin
APath := ExtractFilePath(FileName);
AName := ExtractFileName(FileName);
inc(ID);
FDMemTable1.Insert;
FDMemTable1.FieldByName('ID').AsInteger := ID;
FDMemTable1.FieldByName('FilePath').AsString := APath;
FDMemTable1.FieldByName('FileName').AsString := AName;
TBlobField(FDMemTable1.FieldByName('FileData')).LoadFromFile(FileName);
FDMemTable1.Post;
end;
procedure TForm1.WriteFiles;
// Extract files from FDMemTable1 to a given directory
var
FileName : String;
begin
if not FDMemTable1.Active then
FDMemTable1.Open;
FDMemTable1.First;
while not FDMemTable1.Eof do begin
FileName := FDMemTable1.FieldByName('FileName').AsString;
TBlobField(FDMemTable1.FieldByName('FileData')).SaveToFile('C:\Temp\'+ FileName);
// Uncomment the following line to have the file opened in Excel; you'll need to add ShellAPI to your Uses list
// ShellExecute(Handle, 'Open', PChar('C:\Temp\' + Filename), '','',SW_SHOWNORMAL);
FDMemTable1.Next;
end;
end;
procedure TForm1.FormCreate(Sender: TObject);
var
FieldDef : TFieldDef;
begin
ID := 0;
FieldDef := FDMemTable1.FieldDefs.AddFieldDef;
FieldDef.Name := 'ID';
FieldDef.DataType := ftInteger;
FieldDef := FDMemTable1.FieldDefs.AddFieldDef;
FieldDef.Name := 'FilePath';
FieldDef.DataType := ftString;
FieldDef.Size := Max_Path;
FieldDef := FDMemTable1.FieldDefs.AddFieldDef;
FieldDef.Name := 'FileName';
FieldDef.DataType := ftString;
FieldDef.Size := Max_Path;
FieldDef := FDMemTable1.FieldDefs.AddFieldDef;
FieldDef.Name := 'FileData';
FieldDef.DataType := ftBlob;
FDMemTable1.CreateDataSet;
end;
procedure TForm1.btnSaveClick(Sender: TObject);
begin
GetFiles('D:\aaad7\aaaofficeauto');
end;
procedure TForm1.btnWriteFilesClick(Sender: TObject);
begin
WriteFiles;
end;
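If the saved XML file is reloaded in a later session, FDMemTable1 can be repopulated from it before calling WriteFiles. A minimal sketch, assuming the same file name that GetFiles used above (btnLoad is a hypothetical extra button):
procedure TForm1.btnLoadClick(Sender: TObject);
// Reload the previously saved table from disk
var
  FN : String;
begin
  FN := ExtractFilePath(Application.ExeName) + 'Excelfiles.XML';
  if FileExists(FN) then begin
    FDMemTable1.Close;
    FDMemTable1.LoadFromFile(FN, sfXML); // restores both structure and data
  end;
end;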
I don't know of another way: append all the data into a new dataset and export it using the SaveToFile() procedure.
I have a solution for you.
Use a TFDDataSet variable to store all the data from the Excel files:
DataSet: TFDDataSet
Each time you get a data table, merge it into DataSet via the TFDDataSet.MergeDataSet method.
// Getting data from Excel files.
// Suppose there are 10 excel files.
for I := 0 to 9 do
begin
// Stored Excel data to TempDataSet.
// ...
// Merge TempDataSet to DataSet.
DataSet.MergeDataSet(TempDataSet, dmDataAppend, mmAdd);
end;
Finally, use the TFDDataSet.SaveToFile method to write the XML file.
// Write to XML file.
DataSet.SaveToFile('C:\Data\DataSetData.xml', sfXML);
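A minimal self-contained sketch of the merge mechanics, using a hand-filled TFDMemTable in place of the Excel data (all names here are illustrative, not from the original code):
// uses FireDAC.Comp.Client, FireDAC.Comp.DataSet, FireDAC.Stan.Intf, Data.DB
procedure MergeExample;
var
  DataSet, TempDataSet: TFDMemTable;
begin
  DataSet := TFDMemTable.Create(nil);
  TempDataSet := TFDMemTable.Create(nil);
  try
    // Target dataset: same structure the merged rows will use.
    DataSet.FieldDefs.Add('Name', ftString, 50);
    DataSet.CreateDataSet;
    // Source dataset: in the real case this would be filled from an Excel sheet.
    TempDataSet.FieldDefs.Add('Name', ftString, 50);
    TempDataSet.CreateDataSet;
    TempDataSet.AppendRecord(['First sheet row']);
    // Append the source rows into the accumulating dataset.
    DataSet.MergeDataSet(TempDataSet, dmDataAppend, mmAdd);
    // Write everything out as one XML file.
    DataSet.SaveToFile('C:\Data\DataSetData.xml', sfXML);
  finally
    TempDataSet.Free;
    DataSet.Free;
  end;
end;
The same DataSet.LoadFromFile call can later restore the whole merged table from that single XML file.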
Related
I have a total of 3214 .doc files. I need to open the first file, copy its contents, paste it into a RichEdit, extract some text, insert it into a database, then move on to the next file and repeat the procedure.
So far I've managed to:
Open the 1st .doc/any 1 .doc only
Copy the content and paste it in the RichEdit
Extract the text from the RichEdit
Insert the extracted text into the database
Close the opened .doc and clear the content of RichEdit
I've loaded all 3214 filenames, in order, into a Memo.
Once I finish with the 1st file from the list, how do I make it move on to the next .doc from the list and do the same thing, repeating this until I finish all 3214 .doc files? I'm currently reading about loops but I can't figure it out yet.
Code so far:
procedure TForm1.Button4Click(Sender: TObject);
var
content: string;
StartPos: Integer;
endPos: Integer;
i: integer;
fname: string;
WordApp : Variant;
begin
WordApp := CreateOleObject('Word.Application');
for i := 1 to 1 do
fname := Memo1.Lines[i - 1];
WordApp.Visible := True;
WordApp.Documents.Open('C:\Users\tcsh\Desktop\all\'+fname);
WordApp.ActiveDocument.Select;
WordApp.Selection.Copy;
RichEdit1.Lines.Add(WordApp.Selection);
WordApp.documents.item(1).Close;
WordApp.Quit;
content:= RichEdit1.Text;
//<text extract code is here>
begin
//<sql code is here>
end;
RichEdit1.Clear;
Edit1.Clear;
Edit2.Clear;
Edit3.Clear;
Edit4.Clear;
Edit5.Clear;
Edit7.Clear;
Edit8.Clear;
//the TEdit's hold the extracted text so the sql can retrieve it from them and insert into the database
end;
for i := 1 to 1 do
Hmmm, that will only run once..
You may also want to try:
Create the WordApp object in each iteration,
Add a time delay between iterations using Sleep and Application.ProcessMessages (as LU RD points out, this is not necessary)
Code sample below:
for i := 0 to Memo1.Lines.Count - 1 do
begin
WordApp := CreateOleObject('Word.Application');
fname := Memo1.Lines[i];
WordApp.Visible := True;
WordApp.Documents.Open(fname);
WordApp.ActiveDocument.Select;
WordApp.Selection.Copy;
Memo2.Lines.Add(WordApp.Selection);
Memo2.Lines.Add('===');
WordApp.documents.item(1).Close;
WordApp.Quit;
//Sleep(1000); -> not needed
//Application.ProcessMessages;
end;
Try it with System.IOUtils.TDirectory.GetFiles
TDirectory.GetFiles('C:\temp\', '*.doc');
Here is an example
You'll find a few here on SO.
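A minimal sketch of how TDirectory.GetFiles could drive the same loop, assuming the documents live in C:\temp\ and reusing the controls from the sample above:
// uses System.IOUtils, System.Types, System.Win.ComObj
var
  DocFile: string;
  WordApp: Variant;
begin
  for DocFile in TDirectory.GetFiles('C:\temp\', '*.doc') do
  begin
    WordApp := CreateOleObject('Word.Application');
    WordApp.Visible := True;
    WordApp.Documents.Open(DocFile); // GetFiles returns full paths
    WordApp.ActiveDocument.Select;
    WordApp.Selection.Copy;
    Memo2.Lines.Add(WordApp.Selection);
    Memo2.Lines.Add('===');
    WordApp.Documents.Item(1).Close;
    WordApp.Quit;
  end;
end;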
Update
...
var
line: string;
...
for line in Memo1.Lines do begin
<your code per file>
ShowMessage(line);
end
I'm modifying a program that is written in Delphi 6.0.
I have a table in Oracle with a BLOB column named FILE_CONTENT.
I have already managed to upload an XML File that is about 100 KB. I have verified that the file content was correctly uploaded using SQL Developer.
The problem I have is when I try to download the file content from the DB back to a file. This is an example of the code I'm using to download it:
procedure TfrmDownload.Save();
var
fileStream: TFileStream;
bField: TBlobField;
begin
dmDigital.qrGetData.Open;
dmDigital.RequestLive := True;
bField := TBlobField(dmDigital.qrGetData.FieldByName('FILE_CONTENT'));
fileStream := TFileStream.Create('FILE.XML', fmCreate);
bField.SaveToStream(fileStream);
FlushFileBuffers(fileStream.Handle);
fileStream.Free;
dmDigital.qrGetData.Close;
end;
The previous code already downloads the file content to FILE.XML. I'm using RequestLive := True to be able to download a large BLOB (otherwise the file content is truncated to 32 KB max).
The resulting file is the same size as the original file. However, when I compare the downloaded file with the original one there are some differences (for example the last character is missing and other characters are also changed), therefore it seems to be a problem while downloading the content.
Do you know what could be wrong?
The problem seems to be related to Delphi code because I already tried with C# and the file content is downloaded correctly.
Don't use TBlobField.SaveToStream() directly, use TDataSet.CreateBlobStream() instead (which is what TBlobField.SaveToStream() uses internally anyway):
procedure TfrmDownload.Save;
var
fileStream: TFileStream;
bField: TField;
bStream: TStream;
begin
dmDigital.qrGetData.Open;
try
dmDigital.RequestLive := True;
bField := dmDigital.qrGetData.FieldByName('FILE_CONTENT');
bStream := bField.DataSet.CreateBlobStream(bField, bmRead);
try
fileStream := TFileStream.Create('FILE.XML', fmCreate);
try
fileStream.CopyFrom(bStream, 0);
FlushFileBuffers(fileStream.Handle);
finally
fileStream.Free;
end;
finally
bStream.Free;
end;
finally
dmDigital.qrGetData.Close;
end;
end;
TDataSet.CreateBlobStream() allows the DataSet to decide the best way to access the BLOB data. If the returned TStream is not delivering the data correctly, then either the TStream class implementation that CreateBlobStream() uses is broken, or the underlying DB driver is buggy. Try taking CopyFrom() out of the equation so you can verify the data as it is being retrieved:
procedure TfrmDownload.Save;
const
MaxBufSize = $F000;
var
Buffer: array of Byte;
N: Integer;
fileStream: TFileStream;
bField: TField;
bStream: TStream;
begin
dmDigital.qrGetData.Open;
try
dmDigital.RequestLive := True;
bField := dmDigital.qrGetData.FieldByName('FILE_CONTENT');
bStream := bField.DataSet.CreateBlobStream(bField, bmRead);
try
fileStream := TFileStream.Create('FILE.XML', fmCreate);
try
//fileStream.CopyFrom(bStream, 0);
SetLength(Buffer, MaxBufSize);
repeat
N := bStream.Read(PByte(Buffer)^, MaxBufSize);
if N < 1 then Break;
// verify data here...
fileStream.WriteBuffer(PByte(Buffer)^, N);
until False;
FlushFileBuffers(fileStream.Handle);
finally
fileStream.Free;
end;
finally
bStream.Free;
end;
finally
dmDigital.qrGetData.Close;
end;
end;
I'm working with a TDBAdvGrid from TMS. I am adding columns dynamically based on the length of a SQL query result. Everything works great, with the exception of saving the column widths.
Adding the columns:
//Captions
for i := 0 to oRow.Count - 1 do
begin
grdFieldData.Columns.Insert(1);
grdFieldData.Cells[i + 1, 0] := TabelList.Captions[i].Caption;
end;
//Data
for r := 0 to TabelList.Count - 1 do //rows
begin
for c := 0 to oRow.Count - 1 do //cols
begin
grdFieldData.Cells[c+1, r+1] := TabelList.Rows[r].Fields[c].Value;
end;
if r <> TabelList.Count - 1 then
grdFieldData.RowCount := grdFieldData.RowCount + 1;
end;
The save function is built into the TMS TDBAdvGrid. I have tried messing with all of its options, with no luck.
The table has 1 fixed row (for captions) and one empty row. The empty row is only there because the number of fixed rows must be smaller than the number of rows.
When saving the data to either an .ini file or the registry, it saves and loads the first column, but the dynamically added columns get written with the default value (64) and are never saved/loaded when I drag them to adjust the size. The .ini file looks like this:
[Recept]
Col0=20
Col1=97
Col2=64
Col3=64
Col4=64
Col5=64
Col6=64
Col7=64
Col8=64
Col9=64
Col10=64
Col11=64
Col12=64
Col13=64
Col14=64
When loaded back in, the added columns come out at the default width.
Does anyone know what I can do to make the columns save properly so the widths will be saved?
So, after a few hours of sitting with it, I decided that there was no reason to struggle so much with something built in when I could just build something simple myself.
In this instance, it simply saves to a chosen .ini file in a chosen directory. The filename and directory are currently hardcoded into each inherited class; a feature to change them could easily be implemented, perhaps even a directory selector hooked up to a button or something similar.
The code for loading (called on FormShow):
procedure TfrmReceptEditor.LoadColWidths;
var
Ini: TIniFile;
i: Integer;
path: String;
filename: String;
begin
inherited;
path := 'C:\';
filename := 'grid.ini';
Ini := TIniFile.Create(path + filename);
try
for i := 0 to grdFieldData.ColCount - 1 do
begin
grdFieldData.Columns.Items[i].Width := Ini.ReadInteger('Recept','col'+IntToStr(i),75);
end;
grdFieldData.FixedColWidth := 20;
finally
Ini.Free;
end
end;
And the code for saving the data (on FormClose):
procedure TfrmReceptEditor.SaveColWidths;
var
Ini: TIniFile;
i: Integer;
path: String;
filename: String;
begin
inherited;
path := 'C:\';
filename := 'grid.ini';
Ini := TIniFile.Create(path + filename);
try
for i := 0 to grdFieldData.ColCount - 1 do
begin
Ini.WriteInteger('Recept', 'col'+IntToStr(i), grdFieldData.Columns.Items[i].Width);
end;
finally
Ini.Free;
end;
end;
Some might want to build in a check that the .ini file exists (i.e. that the entered path is correct), but it will work smoothly even if the file doesn't exist: it will simply be created.
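A minimal sketch of such a guard, assuming the same hardcoded path and filename as above (the helper name is hypothetical):
// uses System.SysUtils
function TfrmReceptEditor.ColWidthIniExists: Boolean;
// True when C:\grid.ini is already present on disk
begin
  Result := FileExists('C:\' + 'grid.ini');
end;
LoadColWidths could exit early when this returns False so the design-time widths are kept; SaveColWidths needs no guard, because TIniFile creates the file on the first WriteInteger call.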
I would like to use SaveToStream to save a ClientDataSet ALONG WITH OTHER MATERIAL. Here is a short sample:
filename := ChangeFileExt(Application.ExeName, '.dat');
FS := TFileStream.Create(filename, fmCreate);
CDS.SaveToStream(FS);
ShowMessage('After save, position is ' + IntToStr(FS.Position));
{now write a longint}
L := 1234;
siz := SizeOf(L);
FS.Write(L, siz);
FS.Free;
But when I try to load this back in using LoadFromStream and display the position again after the ClientDataSet has been loaded, I see that the position is now 4 bytes AFTER where the ClientDataSet was originally saved. It seems that CDS.LoadFromStream just plows ahead and consumes whatever follows it. As a result, when I then try to read the longint, I get an end-of-file error.
It is not sufficient to just use CDS.SaveToStream at the end of creating a file, because what I'd really like to do is save TWO ClientDataSets to the file, one after the other, plus other material.
Ideas? Thanks.
[NB, this solution is essentially doubling up the work that (TLama's suggestion) "ReadDataPacket/WriteDataPacket" already does internally. I would use TLama's approach i.e. sub-class TClientDataSet to expose the above protected methods, and use the WriteSize parameter.]
Save the datasets to a temporary stream and then copy that to your destination stream with size information:
procedure InternalSaveToStream(AStream: TStream);
var
ATempStream: TMemoryStream;
ASize: Int64;
begin
ATempStream := TMemoryStream.Create;
// Save first dataset:
DataSet1.SaveToStream(ATempStream, dfBinary);
ASize := ATempStream.Size;
AStream.WriteData(ASize);
ATempStream.Position := 0;
AStream.CopyFrom(ATempStream, ASize);
ATempStream.Clear;
// Save second dataset:
DataSet2.SaveToStream(ATempStream, dfBinary);
ASize := ATempStream.Size;
AStream.WriteData(ASize);
ATempStream.Position := 0;
AStream.CopyFrom(ATempStream, ASize);
ATempStream.Clear;
FreeAndNil(ATempStream);
end;
To read back, first read the size and then copy that section of your source to a temporary stream again and load your dataset from that:
procedure InternalLoadFromStream(AStream: TStream);
var
ATempStream: TMemoryStream;
ASize: Int64;
begin
ATempStream := TMemoryStream.Create;
// Load first datset:
AStream.Read(ASize, SizeOf(ASize));
ATempStream.CopyFrom(AStream, ASize);
ATempStream.Position := 0;
DataSet1.LoadFromStream(ATempStream);
//...etc.
end;
I have problems retrieving a blob field containing images and then saving it to a file.
The downloaded file does not seem to be in the correct format; if I open
it with a hex editor it contains the word #R_BLOB#
What could be wrong?
bg := TAdsBlobStream.Create(adsQuery1.FieldByName('PAGE') as TBlobField, bmReadWrite);
f := TMemoryStream.Create;
bg.Position := 0;
f.CopyFrom(bg, bg.Size);
f.SaveToFile('c:\tmp\db\img\' + IntToStr(n) + '.jpg');
f.Free;
Thanks
I just used the following code to write a small (9K) JPEG into an ADS BLOB Field via an ADSTable, read it back into an ADSBlobStream and write it to disk in a separate location, and all worked fine. The new copy of the image opens fine in Windows Picture Viewer, and I don't see any extraneous characters in it with a hex viewer.
// Code to create the test table, done in ARC32
CREATE TABLE Test (ID Integer, Pic BLOB);
I dropped a TAdsSettings, a TAdsQuery, and two plain TButton components on a new blank VCL form. Here are the button OnClick handlers (obviously the filenames in both handlers are hard-coded and should be replaced by actual filenames on your machine):
procedure TForm1.Button1Click(Sender: TObject);
var
Blob: TAdsBlobStream;
Strm: TFileStream;
Tbl: TAdsTable;
begin
Tbl := TAdsTable.Create(nil);
try
Tbl.DatabaseName := AdsQuery1.DatabaseName;
Tbl.TableType := ttAdsCDX;
Tbl.TableName := 'Test.dbf';
Tbl.Open;
Tbl.Edit;
Blob := Tbl.CreateBlobStream(Tbl.FieldByName('Pic'), bmWrite) as TAdsBlobStream;
try
Strm := TFileStream.Create('E:\Test\Images\Big folder.jpg', fmOpenRead);
try
Blob.CopyFrom(Strm, Strm.Size);
finally
Strm.Free;
end;
finally
Blob.Free;
end;
finally
Tbl.Post;
Tbl.Close;
Tbl.Free;
AdsQuery1.Open;
end;
end;
procedure TForm1.Button2Click(Sender: TObject);
var
Blob: TAdsBlobStream;
Strm: TFileStream;
begin
AdsQuery1.SQL.Text := 'SELECT ID, pic FROM test';
AdsQuery1.Open;
Blob := AdsQuery1.CreateBlobStream(AdsQuery1.Fields[1], bmRead) as TAdsBlobStream;
try
Strm := TFileStream.Create('E:\TempFiles\BigFolder.jpg', fmCreate);
try
Strm.CopyFrom(Blob, Blob.Size);
finally
Strm.Free;
end;
finally
Blob.Free;
end;
end;
Run the app, and click Button1 to insert the image into the Test table. Then click Button2 to read it back out and write it to a new file on disk.
The initial #R_BLOB# text turned out to depend on the table type set in the TableType property of the TAdsQuery component; after setting it to ttAdsCDX, the export works.
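For reference, a minimal sketch of applying that setting in code before running the export (assuming TableType can be changed while the query is closed, as with the TAdsTable in the answer above):
adsQuery1.Close;
adsQuery1.TableType := ttAdsCDX; // match the table's real format so the BLOB is returned raw
adsQuery1.Open;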
Thanks all.