Are Delphi strings immutable?

As far as I know, strings are immutable in Delphi. I kind of understand that means if you do:
string1 := 'Hello';
string1 := string1 + ' World';
the first string is destroyed and you get a reference to a new string, 'Hello World'.
But what happens if you have the same string in different places around your code?
I have a string hash assigned for identifying several variables, so for example a "change" is identified by a hash value of the properties of that change. That way it's easy for me to check two "changes" for equality.
Now, each hash is computed separately (not all the properties are taken into account, so that two separate instances can be considered equal even if they differ in some values).
The question is, how does Delphi handle those strings? If I compute two separate hashes that result in the same 10-byte string, what do I get? Two memory blocks of 10 bytes, or two references to the same memory block?
Clarification: A change is composed of some properties read from the database and is generated by an individual thread. The TChange class has a GetHash method that computes a hash based on some of the values (but not all), resulting in a string. Now, other threads receive the change and have to compare it to previously processed changes so that they don't process the same (logical) change twice. Hence the hash; and since they are separate instances, two different strings are computed. I'm trying to determine whether it would be a real improvement to change from a string to something like a 128-bit hash, or whether I'd just be wasting my time.
Edit: Version of Delphi is Delphi 7.0

Delphi strings are copy on write. If you modify a string (without using pointer tricks or similar techniques to fool the compiler), no other references to the same string will be affected.
Delphi strings are not interned. If you create the same string from two separate sections of code, they will not share the same backing store - the same data will be stored twice.
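A minimal console sketch (any reasonably recent Delphi, including Delphi 7; the Pointer casts are only there to inspect which buffer each variable references) showing both behaviours:

program CowDemo;

{$APPTYPE CONSOLE}

var
  A, B, C: string;
begin
  A := 'Hello';
  B := A;                           // B references the same buffer as A
  WriteLn(Pointer(A) = Pointer(B)); // TRUE: one backing store, two references

  B[1] := 'J';                      // writing triggers the copy
  WriteLn(A);                       // Hello - A is unaffected
  WriteLn(B);                       // Jello - B got its own copy

  A := 'Hello World';
  C := Copy(A, 1, 5) + ' World';    // same characters, built separately
  WriteLn(A = C);                   // TRUE: equal contents
  WriteLn(Pointer(A) = Pointer(C)); // FALSE: no interning, two memory blocks
end.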

Delphi strings are not immutable (try: string1[2] := 'a') but they are reference-counted and copy-on-write.
The consequences for your hashes are not clear, you'll have to detail how they are stored etc.
But a hash should only depend on the contents of a string, not on how it is stored. That makes the whole question moot. Unless you can explain it better.
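For illustration, a small sketch with a made-up hash function (not the asker's GetHash): the result depends only on the characters, so two separately computed strings with equal contents always hash to the same value, even though they occupy separate memory blocks.

program HashDemo;

{$APPTYPE CONSOLE}
{$OVERFLOWCHECKS OFF} // let the multiplication wrap around

// Made-up content-only hash for illustration (djb2-style).
function SimpleHash(const S: string): Cardinal;
var
  I: Integer;
begin
  Result := 5381;
  for I := 1 to Length(S) do
    Result := Result * 33 + Ord(S[I]);
end;

var
  A, B: string;
begin
  A := 'Hello World';
  B := Copy(A, 1, 5) + ' World';          // separate memory block, same contents
  WriteLn(Pointer(A) = Pointer(B));       // FALSE: two blocks
  WriteLn(SimpleHash(A) = SimpleHash(B)); // TRUE:  same hash
end.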

As others have said, Delphi strings are not generally immutable. Here are a few references on strings in Delphi.
http://blog.marcocantu.com/blog/delphi_super_duper_strings.html
http://conferences.codegear.com/he/article/32120
http://www.codexterity.com/delphistrings.htm

The Delphi version may be important to know. The good old Delphi RTL handles strings as copy-on-write, which basically means that a new instance is created when a shared string is modified. So yes, they are more or less immutable.

Related

How to use AnsiString to store binary data?

I have a simple question.
I want to use AnsiString as a container for binary data. I mostly load such data from TMemoryStream or TFileStream and I save it back from AnsiString after some processing. Works fine, haven't found a problem with that.
But from what I've seen, using it like that sparks debate about using Sysutils::TBytes instead. Why? Sysutils::TBytes has far fewer useful methods for manipulating the data stored inside than, for example, AnsiString. It is clearly a half-finished container compared to AnsiString.
Is the only problem I should care about conversion to a regular string, or is there some other reason why I should really use the less-than-adequate TBytes instead? I do not convert the AnsiString to other string types - that is what is quoted as a possible problem elsewhere.
An example of how I load data:
AnsiString data;
boost::scoped_ptr<TFileStream> fs(new TFileStream(FileName, fmOpenRead | fmShareDenyWrite));
data.SetLength(fs->Size);
fs->Read(data.c_str(), fs->Size);
An example how I save data:
// fs wants void * so I have to use data.data() instead of data.c_str() here
fs->Write(data.data(), data.Length());
So it should be safe to store binary data, correct?
I want to use AnsiString as a container for binary data.
One word - DON'T! It will bite you someday. Use a more appropriate container, such as TBytes, TMemoryStream, std::vector<byte>, etc.
Works fine, haven't found a problem with that.
Consider yourself lucky. From C++Builder 2009 onwards, AnsiString is codepage-aware, and it WILL cause data conversions if you are not VERY careful when passing AnsiString around. Sooner or later, you are likely to slip up and it will risk corrupting your binary data.
But from what I've seen, using it like that sparks debate about using Sysutils::TBytes instead. Why?
Because it is an actual raw binary container meant specifically for raw bytes.
Sysutils::TBytes has far fewer useful methods for manipulating the data stored inside than, for example, AnsiString.
You should not be manipulating binary data as text to begin with. And since you are using things like Boost and STL, you should consider using their binary containers instead. They have more functions available.
That being said, XE7 does introduce some new functions for manipulating Delphi-style dynamic arrays (like TBytes) including inserts, deletes, and concatenations:
String-Like Operations Supported on Dynamic Arrays
It does not look like those new functions made it into C++Builder's DynamicArray class (which TBytes is a typedef of), though.
It is clearly a half-finished container compared to AnsiString.
AnsiString is a container of text characters. Period. Always has been, always will be. People ABUSE it by taking advantage of the fact that sizeof(char)==sizeof(byte). That worked up to a point, but it has become dangerous in recent years to continue abusing it.
Is the only problem I should care about conversion to a regular string, or is there some other reason why I should really use the less-than-adequate TBytes instead?
That, and the fact that Embarcadero has been phasing out AnsiString since 2009. 8-bit strings are disabled in the mobile compilers; it is only a matter of time before the desktop compilers follow suit.
Why are you wanting to manipulate raw bytes as strings to begin with? Can you provide an example of something you can do with AnsiString that you cannot do with TBytes?
So it should be safe to store binary data, correct?
In your specific example, yes (and yes, you can use c_str() instead of data() when calling fs->Write()).
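For reference, here is a minimal Delphi-side sketch of the same load/save pattern using TBytes (LoadBytes and SaveBytes are made-up helper names; the same RTL classes are reachable from C++Builder, and the 2009+ RTL is assumed). No string type appears anywhere in the path, so no codepage conversion can ever touch the data.

uses
  SysUtils, Classes;

// Read a whole file into a TBytes.
function LoadBytes(const FileName: string): TBytes;
var
  FS: TFileStream;
begin
  FS := TFileStream.Create(FileName, fmOpenRead or fmShareDenyWrite);
  try
    SetLength(Result, FS.Size);
    if Length(Result) > 0 then
      FS.ReadBuffer(Result[0], Length(Result));
  finally
    FS.Free;
  end;
end;

// Write a TBytes back out to a file.
procedure SaveBytes(const FileName: string; const Data: TBytes);
var
  FS: TFileStream;
begin
  FS := TFileStream.Create(FileName, fmCreate);
  try
    if Length(Data) > 0 then
      FS.WriteBuffer(Data[0], Length(Data));
  finally
    FS.Free;
  end;
end;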

String indexing vs. dynamic array indexing in Delphi

In Delphi, why are AnsiStrings indexed from one and dynamic arrays indexed from zero? Is this a historical accident, to make AnsiStrings work more like ShortStrings, or is there some deeper logic at work?
One of the contributing factors that led to "Pascal" strings being 1 indexed instead of 0 indexed was that the length of the string was stored in the zeroth byte. Yes, that could have been hidden from the programmer's view by having the compiler internally add a constant offset to the string index expression (as was done in Delphi's long strings later) but in the beginning things were much simpler. Allocate a block of memory, store the length in byte zero, index the char data from byte 1. End of story.
As I recall UCSD Pascal was using this length-in-zero-byte convention long before Turbo Pascal came along.
As for why dynamic arrays are zero based, I don't recall any specific reason but I would guess it reflects the dynamic array's kinship to dynamically allocating a buffer and indexing off the buffer pointer. The array types that you would use to create array pointer types were zero based arrays. The first byte is found at buffer pointer + 0 offset. This is the C rationalization for zero based everything. There was no compelling reason to carry string's 1 based indexing pattern over to compiler managed arrays when string's 1 based indexing was already (and had always been) the exception rather than the norm.
It may well be that because the string type was the first array-like data type that everyone first encountered and possibly the most used data type across the board, there may be a perception of a bias towards 1 based indexing in the language. However, if you look closely I think you'll find arrays in Pascal (distinct from string) have never been inherently 1 based, especially when dynamically allocated.
The reason for the Delphi string tradition of 1-based strings is quite simple. The tradition comes from the implementation of old style Turbo Pascal strings. That data type stored the length of the string in the first byte of the variable, index 0. The string data began in the next byte, index 1.
You can still use that data type today. It's now called ShortString. As is immediately obvious from its implementation, there is a 255 character limit. This limit led to the introduction of huge strings, if I recall correctly, in Delphi 2. When huge strings were introduced the language designers chose to retain 1-based indexing to make it easier for developers to switch from short strings to huge strings.
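A quick demonstration of that layout, valid in any Delphi version that still has ShortString:

program ShortStringDemo;

{$APPTYPE CONSOLE}

var
  S: ShortString;
begin
  S := 'Hello';
  WriteLn(Ord(S[0])); // 5: the length byte lives at index 0
  WriteLn(Length(S)); // 5: the same value, via the RTL
  WriteLn(S[1]);      // H: character data starts at index 1
end.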
I guess Turbo Pascal didn't invent the idea of using element 0 for length. It's just that I'm too young to remember what came before then!
Dynamic arrays weren't bound by the past in the same way and had a free choice. I don't know why zero based was chosen. Perhaps because it fits more easily with the prevailing fashion on platform on which Delphi existed at that time, namely Windows. That's just a guess though. Danny Thorpe worked on the Delphi compiler at that time, and even he can't remember the rationale!
The Delphi language designers are currently moving towards zero based string indexing for huge strings. The initial steps in this direction can be seen in XE3 in the TStringHelper class which uses 0-based indexing. And also in the ZEROBASEDSTRINGS conditional which allows you to opt in to 0-based indexing. Expect the next generation Delphi compiler to use 0-based indexing only. The times they are changin'.
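A small sketch of the difference, assuming XE3 or later so that the string helper is available:

program IndexingDemo;

{$APPTYPE CONSOLE}

uses
  System.SysUtils; // brings the intrinsic TStringHelper into scope

var
  S: string;
begin
  S := 'Delphi';
  WriteLn(S[1]);       // D: classic 1-based indexing (desktop default)
  WriteLn(S.Chars[0]); // D: 0-based indexing via the TStringHelper Chars property
end.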
Historical accident.
Pascal strings and arrays traditionally start at 1.
C - and C-style strings and arrays - start at 0.
I don't know the rationale for "breaking with Pascal tradition" for Dynamic arrays, which also start at zero. But it makes sense, and I agree with it ...
IMHO...

id values of different variables in python 3

I am able to understand immutability in Python (it's surprisingly simple, too). Let's say I assign a number:
x = 42
print(id(x))
print(id(42))
On both counts, the value I get is
505494448
My question is, does the Python interpreter allot ids to all the numbers, letters, and True/False in memory before the environment loads? If it doesn't, how are the ids kept track of? Or am I looking at this the wrong way? Can someone explain it, please?
What you're seeing is an implementation detail (an internal optimization) called interning. This is a technique (used by implementations of a number of languages including Java and Lua) which aliases names or variables to be references to single object instances where that's possible or feasible.
You should not depend on this behavior. It's not part of the language's formal specification and there are no guarantees that separate literal references to a string or integer will be interned nor that a given set of operations (string or numeric) yielding a given object will be interned against otherwise identical objects.
I've heard that the CPython implementation does include a set of the first hundred or so integers as statically instantiated immutable objects. I suspect that other very high level language run-time libraries are likely to include similar optimizations: the first hundred integers are used very frequently by most non-trivial fragments of code.
In terms of how such things are implemented ... for strings and larger integers it would make sense for Python to maintain these as dictionaries. Thus any expression yielding an integer (and perhaps even floats) and strings (at least sufficiently short strings) would be hashed, looked up in the appropriate (internal) object dictionary, added if necessary and then returned as references to the resulting object.
You can do your own similar interning of any sort of custom object you like by wrapping the instantiation in your own calls to your own class static dictionary.
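The same idea transplanted to Delphi (the language used elsewhere on this page), assuming Delphi 2009+ for Generics.Collections; Pool and Intern are made-up names for a hand-rolled interning pool keyed by string contents:

program InternDemo;

{$APPTYPE CONSOLE}

uses
  Generics.Collections;

var
  Pool: TDictionary<string, string>;

// Return one shared instance per distinct string value.
function Intern(const S: string): string;
begin
  if not Pool.TryGetValue(S, Result) then
  begin
    Pool.Add(S, S);
    Result := S;
  end;
end;

var
  A, B: string;
begin
  Pool := TDictionary<string, string>.Create;
  try
    A := Intern('Hello ' + 'World');
    B := Intern(Copy('Hello World!', 1, 11)); // equal contents, separate buffer
    WriteLn(Pointer(A) = Pointer(B));         // TRUE: both refer to the pooled copy
  finally
    Pool.Free;
  end;
end.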

Delphi TStringList wrapper to implement on-the-fly compression

I have an application that stores many strings in a TStringList. The strings will be largely similar to one another, and it occurs to me that one could compress them on the fly - i.e. store a given string in terms of a mixture of unique text fragments plus references to previously stored fragments. String lists such as lists of fully-qualified paths and filenames should compress greatly.
Does anyone know of a TStringList descendant that implements this - i.e. provides read and write access to the uncompressed strings but stores them internally compressed, so that a TStringList.SaveToFile produces a compressed file?
While you could implement this by uncompressing the entire stringlist before each access and re-compressing it afterwards, it would be unnecessarily slow. I'm after something that is efficient for incremental operations and random "seeks" and reads.
TIA
Ross
I don't think there's any freely available implementation around for this (not that I know of anyway, although I've written at least 3 similar constructs in commercial code), so you'd have to roll your own.
The remark Marcelo made about adding items in order is very relevant, as I suppose you'll probably want to compress the data at addition time - having quick access to entries already similar to the one being added gives much better performance than having to look up a 'best fit entry' (needed for similarity-compression) over the entire set.
Another thing you might want to read up about, are 'ropes' - a conceptually different type than strings, which I already suggested to Marco Cantu a while back. At the cost of a next-pointer per 'twine' (for lack of a better word) you can concatenate parts of a string without keeping any duplicate data around. The main problem is how to retrieve the parts that can be combined into a new 'rope', representing your original string. Once that problem is solved, you can reconstruct the data as a string at any time, while still having compact storage.
If you don't want to go the 'rope' route, you could also try something called 'prefix reduction', which is a simple form of compression - just start out each string with an index of a previous string and the number of characters that should be treated as a prefix for the new string. Be aware that you should not recurse this too far back, or access speed will suffer greatly. In one simple implementation, I did a mod 16 on the index, to establish the entry at which prefix-reduction started, which gave me on average about 40% memory savings (this number is completely data-dependent of course).
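A rough sketch of that prefix-reduction storage format (the record layout and names are made up for illustration):

type
  // One stored entry: borrow PrefixLen characters from an earlier entry
  // and append Suffix. BaseIndex = -1 means the entry stands alone.
  TPrefixEntry = record
    BaseIndex: Integer;
    PrefixLen: Integer;
    Suffix: string;
  end;

// How many leading characters two strings share - used when adding a new
// string, so that only the differing tail needs to be stored.
function CommonPrefixLength(const A, B: string): Integer;
begin
  Result := 0;
  while (Result < Length(A)) and (Result < Length(B)) and
        (A[Result + 1] = B[Result + 1]) do
    Inc(Result);
end;

// Rebuild the full string for entry I by following the prefix chain.
function Expand(const Entries: array of TPrefixEntry; I: Integer): string;
begin
  if Entries[I].BaseIndex < 0 then
    Result := Entries[I].Suffix
  else
    Result := Copy(Expand(Entries, Entries[I].BaseIndex),
                   1, Entries[I].PrefixLen) + Entries[I].Suffix;
end;

To keep Expand cheap you would cap how far back BaseIndex may point (the mod 16 trick described above), or simply always reference the immediately preceding entry.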
You could try to wrap a Delphi or COM API around Judy arrays. The JudySL type would do the trick, and has a fairly simple interface.
EDIT: I assume you are storing unique strings and want to (or are happy to) store them in lexicographical order. If these constraints aren't acceptable, then Judy arrays are not for you. Mind you, any compression system will suffer if you don't sort your strings.
I suppose you expect general flexibility from the list (including delete operations); in that case I don't know of any out-of-the-box solution, but I'd suggest one of two approaches:
1. Split your strings into words and keep a separate, growing dictionary to reference the words, storing the lists of word indexes internally.
2. Implement something based on the zlib stream support available in Delphi, but operating on blocks that contain, for example, 10-100 strings each. In this case you still have to decompress/recompress a complete block, but the "price" you pay is lower.
I don't think you really want to compress TStrings items in memory, because it's terribly inefficient. I suggest you look at the TStream implementations in the ZLib unit. Just wrap the regular stream in a TDecompressionStream on load and a TCompressionStream on save (you can even emit a gzip header there).
Hint: you will want to override LoadFromStream/SaveToStream instead of LoadFromFile/SaveToFile.
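A minimal sketch of that hint as a Delphi 7-style unit (later versions move things to System.ZLib and add TEncoding overloads, so adjust accordingly). Note that TStrings.LoadFromStream asks the stream for its Size, which a TDecompressionStream cannot report, hence the intermediate memory stream:

unit CompressedStrings;

interface

uses
  Classes;

type
  // TStringList descendant that deflates on save and inflates on load.
  // SaveToFile/LoadFromFile pick this up automatically because they call
  // the stream methods internally.
  TCompressedStringList = class(TStringList)
  public
    procedure SaveToStream(Stream: TStream); override;
    procedure LoadFromStream(Stream: TStream); override;
  end;

implementation

uses
  ZLib;

procedure TCompressedStringList.SaveToStream(Stream: TStream);
var
  Z: TCompressionStream;
begin
  Z := TCompressionStream.Create(clDefault, Stream);
  try
    inherited SaveToStream(Z); // plain text in, deflated bytes out
  finally
    Z.Free;                    // flushes the remaining compressed data
  end;
end;

procedure TCompressedStringList.LoadFromStream(Stream: TStream);
var
  Z: TDecompressionStream;
  M: TMemoryStream;
  Buf: array[0..8191] of Byte;
  N: Integer;
begin
  Z := TDecompressionStream.Create(Stream);
  try
    M := TMemoryStream.Create;
    try
      // Inflate into a memory stream first, because the inherited method
      // needs a stream with a known Size.
      repeat
        N := Z.Read(Buf, SizeOf(Buf));
        if N > 0 then
          M.WriteBuffer(Buf, N);
      until N = 0;
      M.Position := 0;
      inherited LoadFromStream(M);
    finally
      M.Free;
    end;
  finally
    Z.Free;
  end;
end;

end.

With this in place, SaveToFile produces a zlib-compressed file; the whole list is still compressed and decompressed in one go, though, so it helps with storage size rather than with the incremental access asked about above.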

What datatype/structure to store file list info?

I have an application that searches for files on the computer (configurable path, type, etc.). Currently it adds information to a database as soon as a matching file is found. Rather than that, I want to hold the information in memory for further manipulation before inserting it into the database. The list may contain a lot of items. I consider performance an important factor. I may need to iterate through the items, so a structure that can be coded easily is another key issue. And how can I achieve PHP-style associative arrays for this job?
If you're using Delphi 2009, you can use a TDictionary. It takes two generic parameters. The first should be a string, for the filename, and the second would be whatever data type you're associating with. It also has three built-in enumerators, one for key-value pairs, one for keys only and one for values only, which makes iterating easy.
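A short sketch of that approach (Delphi 2009+; TFileInfo and the example path are made up for illustration):

program FileListDemo;

{$APPTYPE CONSOLE}

uses
  SysUtils, Generics.Collections;

type
  // Whatever per-file data you want to hold before the database insert.
  TFileInfo = record
    Size: Int64;
    Modified: TDateTime;
  end;

var
  Files: TDictionary<string, TFileInfo>;
  Info: TFileInfo;
  Pair: TPair<string, TFileInfo>;
begin
  Files := TDictionary<string, TFileInfo>.Create;
  try
    Info.Size := 1024;
    Info.Modified := Now;
    Files.AddOrSetValue('C:\data\report.txt', Info); // keyed by file name

    // Key/value enumerator; .Keys and .Values provide the other two.
    for Pair in Files do
      WriteLn(Pair.Key, ' ', Pair.Value.Size);
  finally
    Files.Free;
  end;
end.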
Another solution would be to use just a standard TStringList.
As long as it's sorted and its Duplicates setting is something other than dupAccept, you can use IndexOf or IndexOfName to find items in the list quickly.
It also has the Objects property, which allows you to attach object information to each name. Starting with D2009, TStringList has the OwnsObjects property, which allows you to delegate object cleanup to the TStringList. Prior to D2009 you have to handle that yourself.
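A sketch of that TStringList variant (TFileInfo is again a made-up example class; the manual cleanup in the finally block is what OwnsObjects would do for you in D2009+):

program SortedListDemo;

{$APPTYPE CONSOLE}

uses
  Classes;

type
  TFileInfo = class
  public
    Size: Int64;
  end;

var
  List: TStringList;
  Info: TFileInfo;
  I: Integer;
begin
  List := TStringList.Create;
  try
    List.Sorted := True;
    List.Duplicates := dupIgnore; // anything other than dupAccept

    Info := TFileInfo.Create;
    Info.Size := 1024;
    List.AddObject('C:\data\report.txt', Info);

    I := List.IndexOf('C:\data\report.txt'); // binary search when Sorted
    if I >= 0 then
      WriteLn(TFileInfo(List.Objects[I]).Size);
  finally
    for I := 0 to List.Count - 1 do // pre-D2009: free attached objects manually
      List.Objects[I].Free;
    List.Free;
  end;
end.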
Much of this will depend on how you are going to use the list and at what scale. If you are going to use it as a stack or queue, then a TList would work fine. If you need to search through the list for a specific item, then you will need something that allows faster retrieval. TDictionary (2009) or TStringList (pre-2009) would be the most likely choice.
Dynamic arrays are also a possibility, but if you use them you will want to minimize the use of SetLength, as each time it is called it may re-allocate memory. TList manages this for you, which is why I suggested using a TList. If you KNOW how many items you will deal with in advance, then use a dynamic array and set its length at the outset.
If you have more items than will fit in memory then your choices also change. At that point I would either use a database table, or a TFileStream to store the records to be processed, then seek to the beginning of the table/stream for processing.
Try using the AVL tree from http://sourceforge.net/projects/alcinoe/ as your associative array. It has an iterate method for fast iteration. You may need to derive from its base class and implement your own comparator, but it's easy to use.
Examples are included.
