I need to store multiple objects (most of them TObject descendants / non-persistent) to a TMemoryStream, save the stream to disk and load it back. The objects need to be streamed one after the other. Some kind of universal container.
At the moment I put all properties/fields/variables of an object into a record and save the record to the stream. But I intend to use functions like WriteInteger, WriteString (see below), WriteBoolean, etc. to save/load data from the stream.
function StreamReadString(CONST MemStream: TMemoryStream): string;
procedure StreamWriteString(CONST MemStream: TMemoryStream; s: string);
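A minimal sketch of how such helpers might be implemented, using a length prefix so the next value can follow immediately in the stream (the names match the signatures above; this assumes Delphi 7's single-byte Char, but the SizeOf(Char) factor keeps it valid on Unicode versions too):

```pascal
function StreamReadString(CONST MemStream: TMemoryStream): string;
var
  Len: Integer;
begin
  { Read the length prefix first, then exactly that many characters }
  MemStream.ReadBuffer(Len, SizeOf(Len));
  SetLength(Result, Len);
  if Len > 0 then
    MemStream.ReadBuffer(Result[1], Len * SizeOf(Char));
end;

procedure StreamWriteString(CONST MemStream: TMemoryStream; s: string);
var
  Len: Integer;
begin
  Len := Length(s);
  MemStream.WriteBuffer(Len, SizeOf(Len));
  if Len > 0 then
    MemStream.WriteBuffer(s[1], Len * SizeOf(Char));
end;
```

Because each value carries its own length, any number of strings, integers, etc. can be streamed back to back and read in the same order.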
However, it seems that I need to rewrite a lot of code. One of many examples is TStringList.LoadFromStream, which will not work as-is and needs to be rewritten: it reads from the current position to the end of the stream, so the TStringList would have to be the last object in the stream.
Does anybody know of a library that provides basic functionality like this?
I am using Delphi 7 so RTTI is not that great.
See related post here
Btw, Delphi 7 also has RTTI support, otherwise your forms (.dfm) could not be loaded :-)
If you use published properties, RTTI will work for you "out of the box".
Otherwise you have to do it yourself with a
procedure DefineProperties(Filer: TFiler); override;
You can take a look at how it's implemented in:
procedure TDataModule.DefineProperties(Filer: TFiler);
These are the only ways for object serialization.
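A minimal DefineProperties override might look like the sketch below (TMyObject, its field and the pseudo-property name are made up for illustration):

```pascal
type
  TMyObject = class(TComponent)
  private
    FData: Integer;
    procedure ReadData(Reader: TReader);
    procedure WriteData(Writer: TWriter);
  protected
    procedure DefineProperties(Filer: TFiler); override;
  end;

procedure TMyObject.DefineProperties(Filer: TFiler);
begin
  inherited;
  { Register a pseudo-property 'Data' with its own read/write callbacks }
  Filer.DefineProperty('Data', ReadData, WriteData, True);
end;

procedure TMyObject.ReadData(Reader: TReader);
begin
  FData := Reader.ReadInteger;
end;

procedure TMyObject.WriteData(Writer: TWriter);
begin
  Writer.WriteInteger(FData);
end;
```

The Filer calls ReadData/WriteData whenever the object is streamed, so non-published data takes part in the normal streaming mechanism.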
But you could also try records: if you do not use arrays (strings are also arrays of Char) or object properties, you can save and load a record directly to memory (stream, file, etc.). I use this in my AsmProfiler to read and write many (small) results very fast: an array of records with some integer values can be saved and loaded with one Move/CopyMemory call!
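For example, an array of such a "flat" record (no strings, no dynamic arrays) can be written in one call; TResult here is a hypothetical record just for illustration:

```pascal
type
  TResult = record
    Id: Integer;
    Ticks: Integer;
  end;
  TResultArray = array of TResult;

procedure SaveResults(const Results: TResultArray; Stream: TStream);
var
  Count: Integer;
begin
  Count := Length(Results);
  Stream.WriteBuffer(Count, SizeOf(Count));
  if Count > 0 then
    { One contiguous write for the whole array - no per-field code needed }
    Stream.WriteBuffer(Results[0], Count * SizeOf(TResult));
end;

procedure LoadResults(var Results: TResultArray; Stream: TStream);
var
  Count: Integer;
begin
  Stream.ReadBuffer(Count, SizeOf(Count));
  SetLength(Results, Count);
  if Count > 0 then
    Stream.ReadBuffer(Results[0], Count * SizeOf(TResult));
end;
```

This only works because the record contains no managed types; as soon as a string or dynamic array field is added, the record holds pointers and must be serialized field by field.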
Which Delphi version? Delphi 2010 has new RTTI functionality, so you can use DeHL which has "Full generic serialization for all included types and collections".
Have you thought about using TReader and TWriter to fill your streams.
Why not use XML?
Write an XSD for the XML that defines the XML.
Generate a Delphi unit form that XSD using the XML Data Binding Wizard.
Put a bunch of your objects into that XML.
Save the XML to disk (or stream it to some other medium).
For more info on XML and the XML Data Binding Wizard see this answer.
Edit:
Just map your objects to the interfaces/objects generated from the XSD; or use the objects/interfaces that have been generated.
That is usually far easier than hooking into the Delphi streaming mechanism (by either writing TPersistent wrappers with published properties around your objects, going the DefineBinaryProperty way, or the TReader/TWriter/DefineProperty way).
--jeroen
Related
I have made a C++ program that uses ofstream to create 2 files on my disk: one is called details.obf and the other is records.txt. The details file has only 1 line inside (1 integer) and records.txt has a variable number of lines (they are all strings).
With the code below I can get the value inside the file details. It's pretty simple and I am using a MemoryStream.
m := TMemoryStream.Create;
try
try
m.LoadFromFile(TPath.Combine(TPath.GetHomePath, 'details.obf'));
m.Read(i, sizeOf(i));
//other stuff...
except
//...
end;
finally
m.Free;
end;
With the code below instead I am reading the content of the records file:
a := TStreamReader.Create('C:\Users\betom\Desktop\records.txt');
try
while not(a.EndOfStream) do
begin
Memo1.Lines.Add(a.ReadLine);
end;
finally
a.Free;
end;
In the second block of code I used a different class (TStreamReader); I wrote that code by looking at Embarcadero's documentation. I had to use while not (a.EndOfStream) do because the length of records.txt is unknown.
I have seen that TMemoryStream (and other classes) are all subclasses of TStream. Why can't I call something like while not (m.EndOfStream) do with m a TMemoryStream?
I cannot understand the difference between a TMemoryStream and a TStreamReader. From what I have understood, the latter can automatically read all the values in a given range while the former cannot.
Note: I have read in the docs that I can have a TStreamReader and a TStreamWriter, and both are fine when I need to create a file that contains some data. I just cannot understand what memory streams are used for if I get the same behavior with a TStreamReader.
TStreamReader is a general purpose class for reading text/character data from any stream. It does not support any other form of data in a stream and is not intended for use with any other form of data.
A stream itself might be a file on disk or data on the network or data in memory. Different stream classes exist to provide stream-access to data from those different sources.
TMemoryStream exists specifically to provide access to data in memory as a sequence of bytes, which may be binary data or text/character data or a mixture of both.
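The two roles combine naturally: a TStreamReader can wrap any TStream, including a TMemoryStream. A sketch (file name taken from the question):

```pascal
var
  m: TMemoryStream;
  r: TStreamReader;
begin
  m := TMemoryStream.Create;
  try
    m.LoadFromFile('records.txt');
    { The reader interprets the raw bytes of the stream as text }
    r := TStreamReader.Create(m);
    try
      while not r.EndOfStream do
        Memo1.Lines.Add(r.ReadLine);
    finally
      r.Free;
    end;
  finally
    m.Free;
  end;
end;
```

The stream supplies bytes from some source (memory, file, socket); the reader adds the text-oriented interpretation on top.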
To answer your actual question:
I have seen that MemoryStream (and other classes) are all subclasses
of TStream. Why I cannot call something like while not(m.EndOfStream) do
with m a TMemoryStream?
First, a correction. It is correct that TMemoryStream and some other stream-manipulating classes (e.g. TFileStream) inherit from TStream. That is, however, not the case with TStreamReader (and TStringReader). These inherit from TTextReader, which together with TTextWriter and its descendents TStreamWriter and TStringWriter mainly exists to provide familiar classes for .NET users.
Here's the hierarchy of some of the discussed classes:
TObject
TStream
TCustomMemoryStream
TMemoryStream
TBytesStream
TStringStream
THandleStream
TFileStream
TWinSocketStream
TOleStream
TTextReader
TStreamReader
TStringReader
TBinaryReader
The answer is that the EndOfStream property is declared in TStreamReader, in other words in a different branch of the hierarchy than TMemoryStream.
In TStream descendents you can use, e.g., the Position and Size properties to determine whether you are at the end of the stream.
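For example, this sketch reads a binary file byte by byte, using Position and Size in place of EndOfStream:

```pascal
var
  m: TMemoryStream;
  b: Byte;
begin
  m := TMemoryStream.Create;
  try
    m.LoadFromFile('details.obf');
    { Position < Size plays the role of 'not EndOfStream' for a TStream }
    while m.Position < m.Size do
      m.Read(b, SizeOf(b));
  finally
    m.Free;
  end;
end;
```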
I need to be able to pass the same set of structures (basically arrays of different records) over two different interfaces
The first (legacy) which is working requires a pointer to a record and the record size
The second, which I am attempting to develop, is type-safe and requires individual fields to be set using Get/Set methods for each field
Existing code uses records (probably around 100 or so) with memory management being handled in a 3rd party DLL (i.e. we pass the record pointer and size to it and it deals with memory management of new records).
My original thought was to bring the memory management into my app and then copy over the data on the API call. This would be easy enough with the old interface, as I just need access to SizeOf() and the pointer to the record structure held in my internal TList. The problem comes when writing the adapter for the new type-safe interface.
As these records rely on having a known size, there is heavy use of static array [0..n] of Char arrays; however, as soon as I try to access these via 2010-flavour RTTI I get error messages stating 'Insufficient RTTI information available to support this operation'. Standard Delphi strings work, but old short strings don't. Unfortunately, fixed string lengths are important for the old-style interface to work properly. I've had a look at 3rd-party solutions such as SuperObject and the streaming in MorMot, though they can't handle this out of the box, which doesn't give me much hope of a solution that doesn't need significant rework.
What I want to be able to do is something like the following (don't have access to my Delphi VM at the moment, so not perfect code, but hopefully you get the gist):
type
RTestRec = record
a : array [0..5] of char;
b : integer;
end;
// hopefully this would be handled by generic <T = record> or passing instance as a pointer
procedure PassToAPI(TypeInfo: (old or new RTTI info); instance: RTestRec)
var
Field: RTTIField;
begin
  for Field in TypeInfo.Fields do
  begin
    case Field.FieldType of
      ftArray: APICallArray(Field.FieldName, Field.Value);
      ftInteger: APICallInteger(Field.FieldName, Field.Value.AsInteger);
      ...
    end;
  end;
end;
Called as:
var
MyTestRec: RTestRec;
begin
MyTestRec.a := 'TEST';
MyTestRec.b := 5;
PassToAPI(TypeInfo(RTestRec), MyTestRec);
end;
Can the missing RTTI be forced by a compiler flag or similar (wishful thinking, I feel!)?
Can a mixture of old-style and new-style RTTI help?
Can I declare the arrays differently to give RTTI but still having the size constraints needed for old-style streaming?
Would moving from Records to Classes help? (I think I'd need to write my own streaming to an ArrayOfByte to handle the old interface)
Could a hacky solution using Attributes help? Maybe storing some of the missing RTTI information there? Feels like a bit of a long-term maintenance issue, though.
In the past, I have seen this work, but I never really understood how it should be done.
Assume we have a file of known data types, but unknown length, like a dynamic array of TSomething, where
type
TSomething = class
Name: String;
Var1: Integer;
Var2: boolean;
end;
The problem, though, is that this object type may be extended in the future, adding more variables (e.g. Var3: String).
Then, files saved with an older version will not contain the newest variables.
The File Read procedure should somehow recognize data in blocks, with an algorithm like:
procedure Read(Path: String)
begin
// Read Array Size
// Read TSomething --> where does this record end? May not contain Var3!
// --> how to know that the next data block I read is not a new object?
end;
I have seen this work with BlockRead and BlockWrite, and I assume each object should probably write its size before writing itself in the file, but I would appreciate an example (not necessarily code), to know that I am thinking towards the right direction.
Related readings I have found:
SO - Delphi 2010: How to save a whole record to a file?
Delphi Basics - BlockRead
SO - Reading/writing dynamic arrays of objects to a file - Delphi
SO - How Can I Save a Dynamic Array to a FileStream in Delphi?
In order to make this work, you need to write the element size to the file. Then when you read the file, you read that element length which allows you to read each entire element, even if your program does not know how to understand all of it.
In terms of matching up your record with the on-disk record that's easy enough if your record only contains simple types. In that scenario you can read from the file Min(ElementLength, YourRecordSize) bytes into your record.
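That approach might look like the sketch below for a flat record (TMyRec is hypothetical; Min comes from the Math unit):

```pascal
uses Math;

procedure ReadElement(Stream: TStream; var Rec: TMyRec);
var
  ElementLength, BytesToRead: Integer;
begin
  Stream.ReadBuffer(ElementLength, SizeOf(ElementLength));
  { Zero the record first so fields missing from older files stay empty }
  FillChar(Rec, SizeOf(Rec), 0);
  { Read only as much as this program's record can hold }
  BytesToRead := Min(ElementLength, SizeOf(Rec));
  Stream.ReadBuffer(Rec, BytesToRead);
  { Skip fields written by a newer version that we do not know about }
  if ElementLength > BytesToRead then
    Stream.Seek(ElementLength - BytesToRead, soFromCurrent);
end;
```

Older files simply leave the newer fields zeroed; newer files have their extra bytes skipped.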
But it does not look as though you actually have that scenario. Your record is in fact a class and so not suitable for memory copying. What's more, its first member is a string which is most definitely not a simple type.
Back in the day (say the 1970s), the techniques you described were how files were read. But these days programming has moved on. Saving structured data to files usually means using a more flexible and adaptable serialization format. You should be looking to using JSON, XML, YAML or similar for such tasks.
I'd say you need a method of versioning your file. That way you know what version of the record the file contains. Write it at the start of the file; on reading, read the version identifier first and then use the corresponding structure to read the rest.
If I understand you correctly, your main issue is what happens when TSomething changes. The most important thing is that you need to add version info to your file; this you really cannot avoid.
As for the actual storage, using SQLite would most likely solve all your problems, but depending on your situation it might be overkill.
Except in unexceptional circumstances, I wouldn't really worry about extending the class too much. If you add a version number to the beginning of the file, you can easily convert the file after the class has changed. All you need to do is implement your solution so that adding conversions is as simple as reasonable.
In order to read/write files I would prefer streams/XML/JSON (depending on the situation) instead of BlockRead/BlockWrite, as you don't have to implement a hack to store the version number.
In theory you could also reserve unused space for each record, so you could avoid recreating the entire file when the class changes, up to a point (until you run out of unused space). It may be helpful if TSomething changes often and files are big, but most likely I would not go that route.
This is how I would do it: Include a simple version number in the header. This can be any string, integer or whatever.
Reading and writing the file is very easy (I am using pseudocode):
procedure Read(MyFile: TFile);
var
  reader: IMyFileReader;
  versionInfo: TVersionInfo;
begin
  versionInfo := MyFile.ReadVersionInfo();
  reader := ReaderFactory.CreateFromVersion(versionInfo);
  reader.Read(MyFile);
end;
type
  ReaderFactory = class
  public
    class function CreateFromVersion(VersionInfo: TVersionInfo): IMyFileReader;
  end;
function ReaderFactory.CreateFromVersion(VersionInfo : TVersionInfo) : IMyFileReader;
begin
if VersionInfo = '0.9-Alpha' then
result := TVersion_0_9_Alpha_Reader.Create()
else if VersionInfo = '1.0' then
result := TVersion1_0_Reader.Create()
else ....
end;
This can easily be maintained and extended forever. You will never have to touch the Read-routine, but only add a new reader and enhance the factory. With a simple registration method and a TDictionary<TVersionInfo,TMyFileReaderClass>, you can even avoid having to modify the factory.
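That registration idea might be sketched like this, in the same pseudocode spirit (all names hypothetical; it assumes TMyFileReader implements IMyFileReader):

```pascal
uses SysUtils, Generics.Collections;

type
  TMyFileReaderClass = class of TMyFileReader;

var
  Readers: TDictionary<string, TMyFileReaderClass>;

procedure RegisterReader(const Version: string; ReaderClass: TMyFileReaderClass);
begin
  { Each reader unit calls this once, e.g. in its initialization section }
  Readers.Add(Version, ReaderClass);
end;

class function ReaderFactory.CreateFromVersion(const VersionInfo: string): IMyFileReader;
var
  ReaderClass: TMyFileReaderClass;
begin
  if not Readers.TryGetValue(VersionInfo, ReaderClass) then
    raise Exception.CreateFmt('Unsupported file version: %s', [VersionInfo]);
  Result := ReaderClass.Create;
end;
```

New versions then only require a new reader class plus one RegisterReader call; the factory itself never changes.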
I have a set of tables that were included in an Advantage Database data dictionary. The dictionary is no longer available, and the tables will not open.
I would like to free those tables using code (not the Advantage Data Architect).
The only reference I can find to this is a function listed in the help called ADSDDFreeTable.
The documentation for the function is at this link:
http://devzone.advantagedatabase.com/dz/WebHelp/Advantage11.1/index.html?ace_adsddfreetable.htm
but it does not offer a code sample, and I cannot understand how to use it.
Would anyone be kind enough to show a code sample of how this function is used (with variables, not literals, for file names, etc.)?
Thanks very much!
Ace.pas defines AdsDDFreeTable as
function AdsDDFreeTable( pucTableName: PAceChar;
pucPassword: PAceChar ):UNSIGNED32; {$IFDEF WIN32}stdcall;{$ENDIF}{$IFDEF LINUX}cdecl;{$ENDIF}
The same Ace.pas defines PAceChar:
type
PAceChar = PAnsiChar;
Therefore, the call to the function should be fairly straightforward:
var
TableName: AnsiString;
begin
  TableName := 'C:\Data\MyTable.adt';
if AdsDDFreeTable(PAnsiChar(TableName), nil) <> ADS_FREETABLEFAILED then
ShowMessage('Table removed from datadictionary')
else
// Call ADSGetLastError to retrieve reason for failure;
end;
In addition to @Ken's solution (+1), there is also a standalone command-line utility named freeadt.exe that will free ADT tables from their associated data dictionary. I believe it is installed with Advantage Data Architect.
If you run it from the command line with no parameters, it displays usage information. In general, though, you can give it a folder name (to process all the tables) or a specific file as a parameter.
Does anyone know of a TDataset descendant that works with Generics and RTTI, so that I can write code like this, and make use of data-aware components in the GUI? :
...
ds:TDataset<TPerson>;
...
procedure DoStuff;
begin
ds:=TDataset<TPerson>.create;
ds.add(TPerson.Create('A.','Hitler',77));
ds.add(TPerson.Create('O.','Bin Laden',88));
end;
This should be possible. The fielddefs can be created via RTTI because the exact type of the data is known. Values can also be automatically marshalled back and forth, so you can both view and edit data that's in a class or a record.
I hate having to write a lot of useless marshalling code, while the required information for that is available via RTTI already.
Or maybe somebody once wrote some sort of TEnumerable <-> TDataset adapter?
Does something like that exist, or should I start writing one?
...
The closest thing that I could find is an (excellent!) example by Marco Cantu, from Mastering Delphi 7, but the code itself doesn't make use of new language features like generics, the new RTTI system, or attributes, and it doesn't work with Unicode Delphi. TDataset has changed since D7, too.
The TAureliusDataSet included in TMS Aurelius comes very close to that.
Take a look at EverClassy Dataset from Inovativa at www.inovativa.com.br/public.
Another one is Snap Object Dataset: http://digilander.libero.it/snapobject/
DotNet4Delphi by A-Dato Scheduling Technology from the Netherlands may be good for you.
Quotes:
From Torry's Delphi
Hook up any collection to your data aware controls.
DotNet4Delphi implements many .Net collection classes, including
generic types like List<> and Dictionary<>. Different from their
Delphi counterpart is that our generic collections also implement the
non-generic interfaces (IList, IDictionary) allowing you to access
your collections in multiple ways. This opens the door to use any
collection as a data source for data aware controls which is exactly
what the (also included) TListDataset component provides.
It targets Delphi XE and XE2.
It's an open source initiative, Delphi rocks !!!
I have found a more relevant resource and can't help sharing it! So relevant that I think it deserves a separate post rather than a mere update in my first answer.
The Dduce library for Delphi XE2-XE6 makes use of TListDataSet<...> a generic dataset component that can be used to expose a generic list as a TDataSet.
The most relevant units pertaining to the implementation of the generic dataset are:
DDuce.Components.VirtualDataSet.pas (The original SO post is itself cited by the author within the source code as a reference among others!!!)
DDuce.Components.ListDataSet.pas
Class hierarchy:
TDataSet <= TCustomVirtualDataset <= TListDataset <= TListDataset<T>
Yes, it inherits lots of features... my only wish is to have at my disposal a version with lesser requirements (Delphi XE, without most of the other bells and whistles).
Look and feel: