I get an "Insufficient memory" error with a TClientDataset that has TStringField fields. I need string lengths anywhere from 0 to 8000, but I don't know what the length will be until I fill the TClientDataset.
So, can a TStringField be created with a dynamic Size?
P.S. The initial task is to copy TDBGrid to TClientDataset.
What kind of data are you storing in those fields to need up to 8000 characters?
Imagine you do a partial search on those fields using wildcards. I bet such a search would take ages and could possibly even crash your database server.
Besides, not all database tables support 8192-character StringFields.
http://docwiki.embarcadero.com/Libraries/XE6/en/Data.DB.TStringField
TStringField encapsulates the fundamental behavior common to fields
that contain string data. A value of a string field is physically
stored as a sequence of up to 8192 characters. However, some table
types may only support string fields of smaller dimensions.
So why don't you use TMemoFields instead, since they allow their text to have a dynamic size?
http://docwiki.embarcadero.com/Libraries/XE6/en/Data.DB.TMemoField
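A minimal sketch of that suggestion, assuming a TClientDataSet named CDS (the field name and the text variable are illustrative):
CDS.FieldDefs.Add('Notes', ftMemo, 0); // memo fields are blob-based and take no fixed Size
CDS.CreateDataSet;
CDS.Append;
CDS.FieldByName('Notes').AsString := SomeVeryLongText; // not capped at a declared field Size
CDS.Post;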
I have used the following solution: first calculate the maximum Sizes based on Length(aDBGrid.Columns[i].Field.DisplayText), then create TStringFields with the calculated Sizes.
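Here is a minimal sketch of that approach; the procedure name is illustrative, and it assumes DB, DBGrids and DBClient are in the uses clause:

procedure CopyGridToClientDataSet(aDBGrid: TDBGrid; aCDS: TClientDataSet);
var
  i, Len: Integer;
  Src: TDataSet;
begin
  Src := aDBGrid.DataSource.DataSet;
  Src.DisableControls;
  try
    // First pass: find the longest DisplayText in each column
    aCDS.FieldDefs.Clear;
    for i := 0 to aDBGrid.Columns.Count - 1 do
      aCDS.FieldDefs.Add(aDBGrid.Columns[i].FieldName, ftString, 1);
    Src.First;
    while not Src.Eof do
    begin
      for i := 0 to aDBGrid.Columns.Count - 1 do
      begin
        Len := Length(aDBGrid.Columns[i].Field.DisplayText);
        if Len > aCDS.FieldDefs[i].Size then
          aCDS.FieldDefs[i].Size := Len;
      end;
      Src.Next;
    end;
    aCDS.CreateDataSet;
    // Second pass: copy the values
    Src.First;
    while not Src.Eof do
    begin
      aCDS.Append;
      for i := 0 to aDBGrid.Columns.Count - 1 do
        aCDS.Fields[i].AsString := aDBGrid.Columns[i].Field.DisplayText;
      aCDS.Post;
      Src.Next;
    end;
  finally
    Src.EnableControls;
  end;
end;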
Is there a simple way to get the size of a record (in terms of the disk space it takes) with ActiveRecord (my DB is MySQL)?
I have found several answers (1, 2) offering a way to do it with SQL; I wonder if there is something built into ActiveRecord, like MyRecord.first.bytesize.
The size of the table would also do; I need an average row size.
The purpose of this is estimating disk space requirements for a database.
UPDATE
I have also found ObjectSpace.memsize_of
require 'objspace'
ObjectSpace.memsize_of(MyRecord.first)
Is the size of the ActiveRecord object equal to the size of the record in the database?
This seems to give the combined size of all the fields. I am casting to String all the fields that do not respond to size, such as timestamps.
record.attributes.keys.map { |attr_name| record.send(attr_name) }.inject(0) { |sum, attr| sum + (attr.respond_to?(:size) ? attr.size : attr.to_s.size) }
And here is the comparison of results of these methods for the same record:
combined size of all attributes (see above): 222
Marshal::dump(record).size: 2678
ObjectSpace.memsize_of(record): 128
An approximation of size can be found by using:
Marshal::dump(obj).size
Since you said it's for database sizing: the dump includes the class name and instance variable names, so it carries some overhead compared to the absolute record size.
You may want to look at these answers for getting the MySQL DB size: Database size calculation? and How to Get True Size of MySQL Database?
Based on the number of entities currently stored in the DB, and the size determined using the methods above, you can extrapolate for sizing.
Using Delphi XE, I have a JvCsvDataset component that is loading a CSV file which has 27 fields.
When the component tries to load the file I get the following error :
Too many fields, or too many long string fields in this record. You must increase the internal record size of the CsvDataSet.
When I try it with a CSV file that has only 24 fields, it works fine.
How do I increase the internal record size of the CsvDataSet?
I've tried to reach Warren Postma, who wrote the component, but have not heard back from him.
Either specify the length of your fields to stay under the default limit, or set the value of TextBufferSize to a bigger value before setting Active to True.
From the last answer on
http://issuetracker.delphi-jedi.org/view.php?id=4768
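In code, that would look something like this (a minimal sketch; the component name and buffer value are illustrative):
JvCsvDataSet1.Active := False;
JvCsvDataSet1.TextBufferSize := 8192; // enlarge the internal record buffer before opening
JvCsvDataSet1.Active := True;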
#pozs... this is NOT a duplicate of the one you indicated. That was the first place I looked. I couldn't care less about the difference between text and varchar. I'm asking about the physical space used within the medium, aka the server hard drive.
I know that hard drives are split into blocks of bytes, aka chunks, and that if less than the whole block is used, the remaining space is wasted. What I'm curious about is how much storage the text type itself uses. Can the space used be reduced, rather than just limiting the quantity of input? I could say text limit => 1, and it may still use thousands upon thousands of bytes... this is what I'm asking about.
(Image: a photo of hard-drive blocks, which is how I imagine the ActiveRecord text type using space.)
Here's the wiki on Block (data storage): http://en.wikipedia.org/wiki/Block_(data_storage) As you can see, it says "Block storage is normally abstracted by a file system or database management system (DBMS)". What it does NOT say is HOW it is abstracted.
According to Igor's blog, "To my surprise, they determined that the average I/O size of our Postgres databases was much higher than the 8KB block size, and up to 1MB." http://igorsf.wordpress.com/2010/11/01/things-to-check-when-configuring-new-postgres-storage-for-high-performance-and-availability/ While this is helpful to know, it doesn't tell me the default behaviour of ActiveRecord and PostgreSQL in handling blocks.
According to concernedoftunbridgewells, "The database will allocate space in a table or index in some given block size. In the case of Postgres this is 8K." https://dba.stackexchange.com/questions/15510/understanding-block-sizes/15514#15514?newreg=fc10593601be479b8ed697d1bbd108ed So if 8K is used as a block, how high or low do I set the text type limit to match and fit within one 8K block, given that it may use more than just one block?
I know that the PostgreSQL block size setting can be changed. So I would like clarity on how ActiveRecord and PostgreSQL block-size handling currently works. I will accept a good answer for that.
A page contains more than one item (assuming there is space, obviously). If your row is less than 8K, then other rows will be stored on the same page with it (I simplify slightly; Postgres stores large columns separately anyway). Limiting the max length of a column doesn't interact with this.
My reading of the details on character types is that strings under 126 characters incur 3 bytes less overhead, but this happens on a row-by-row basis, independently of what the maximum length is.
The PostgreSQL docs have details on the exact on-disk format and on how PostgreSQL deals with large columns.
IMO, the size taken by a text-type DB column mainly depends on the content stored in it. The limit setting inside ActiveRecord just does a validation before the content is saved into the DB column; it has no impact on the actual storage.
A TDBEdit control has a MaxLength property, but you don't need to do anything with it, because being a data-aware control, this is handled automatically.
A TDBComboBox control has no such property, and the max length is not handled automatically.
How should I control the maximum number of characters in a TDBComboBox when Style is set to csDropDown? Ideally this should be based upon the length of the underlying field.
I have a form where there are "standard" selections the user can choose, but they also need to be able to enter free-form instructions, so I need to use csDropDown. I've also noticed that TDBComboBox doesn't expose the MaxLength property that the standard TComboBox does, and just using the field size doesn't work, as you've noticed.
The other problem is that if the user types in too much content, TDBComboBox silently truncates it to the underlying column's size when updating it. The text still looks like it's all there until the data is posted (which is when the truncation occurs), but DBComboBox.Text is not updated to reflect the truncation. Closing the window and then reopening it reveals the loss of data.
I work around this by using an interposer class to access the protected MaxLength property of the TCustomComboBox it descends from, to set the maximum length properly for the size of the TStringField:
type
  THackCB = class(TCustomComboBox);

procedure TEditForm.FormShow(Sender: TObject);
begin
  THackCB(TheDBComboBx).MaxLength := DataMod.MyStringField.Size;
end;
I need to save the result of Template.Serialize in a blob field. How do I do that?
var
  s: string;
...
s := Templ.Serialize; // --> the size is less than 1632 bytes. Why?
This may depend on the field type in your database (not your FieldDefs). Which database system are you using, and what is the field type in that database table? Some database systems have several blob types that support different sizes.
What is the size (not length) of variable s before writing to the blob field? If that is less than 1632, then your problem is in serialization, not a problem writing to the blob field.
Which data access controls are you using? DBX, ADO, BDE? Post as much information as you can for a question like this.
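For reference, here is a minimal sketch of writing a string into a blob field through a stream. It assumes a dataset named DataSet with a blob field called 'Data' (both names are illustrative), and the comment marks where the byte size can differ from the character length in Unicode Delphi:

var
  Stream: TStream;
  Bytes: TBytes;
begin
  Bytes := TEncoding.UTF8.GetBytes(s); // byte size here, not Length(s) characters
  DataSet.Edit;
  Stream := DataSet.CreateBlobStream(DataSet.FieldByName('Data'), bmWrite);
  try
    if Length(Bytes) > 0 then
      Stream.WriteBuffer(Bytes[0], Length(Bytes));
  finally
    Stream.Free;
  end;
  DataSet.Post;
end;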