FirebirdSql case insensitive - firebird2.5

Is there a way to make a FirebirdSql database case insensitive, and if not the database then a complete table? I know about setting the CharSet to UTF8 and COLLATE UNICODE_CI on each column but wondered if there was a way to do the whole database.

You can configure the default character set (including collation) when you create the database:
CREATE {DATABASE | SCHEMA}
...
[DEFAULT CHARACTER SET charset [COLLATION collation]]
Current Firebird versions do not support altering the default character set using DDL; you would need to make a system table update if you want to do this for an existing database.
The default character set is applied on column creation if no specific character set was defined. If no default character set is specified, it is set to NONE (which basically means: any byte combination is valid).
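The effect of a case-insensitive collation, original case preserved on retrieval but ignored in comparisons, can be sketched outside Firebird. A minimal illustration using Python's sqlite3, with SQLite's built-in NOCASE collation standing in for Firebird's UNICODE_CI (the table and column names are invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# NOCASE plays the same role here as Firebird's UNICODE_CI:
# comparisons on the column ignore letter case.
conn.execute("CREATE TABLE scales (name TEXT COLLATE NOCASE)")
conn.execute("INSERT INTO scales VALUES ('Kitchen Scale')")

# The stored value keeps its original case...
row = conn.execute("SELECT name FROM scales").fetchone()
print(row[0])  # Kitchen Scale

# ...but equality checks are case insensitive.
hit = conn.execute(
    "SELECT COUNT(*) FROM scales WHERE name = 'kitchen scale'"
).fetchone()
print(hit[0])  # 1
```

Note that NOCASE only folds ASCII letters; for full Unicode case folding you need a collation like Firebird's UNICODE_CI.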


AvroData replaces nulls with schema default values

I'm using io.confluent.connect.avro.AvroData.fromConnectData to convert messages before serialization.
AvroData uses struct.get(field) to get values, which in turn replaces nulls with schema default values.
As I understand from the Avro documentation, default values should be used for schema compatibility, when the reader expects a field that is missing from the writer's schema (not from a particular message).
So my question is: is it correct to replace nulls with the schema default value? Or should I use another way to convert messages?
The misunderstanding is that the default value is not used to replace null values; it is used to populate the field when your data does not include the field at all. This is primarily used for schema evolution purposes. What you are trying to do (replace null values that come as part of your data with another value) is not possible through Avro schemas; you will need to deal with it in your program.
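The distinction can be sketched in plain Python. This is a hypothetical illustration of the Avro default-resolution semantics, not the Avro library API; `resolve` and `reader_fields` are invented names:

```python
# Hypothetical sketch of Avro reader-schema default resolution:
# a default applies only when a field is absent from the record entirely.
# An explicit null is a real value and must be passed through untouched.
def resolve(record, reader_fields):
    out = {}
    for name, default in reader_fields:
        if name in record:
            out[name] = record[name]   # keep the value, even if it is None
        else:
            out[name] = default        # field missing: apply the default
    return out

reader_fields = [("id", 0), ("comment", "n/a")]

print(resolve({"id": 1}, reader_fields))                   # default fills the missing field
print(resolve({"id": 1, "comment": None}, reader_fields))  # explicit null is preserved
```

Replacing nulls coming in as data, as the question describes, is the second case, which defaults were never meant to handle.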

SSDT- SQL DB DEPLOY Dacpac - on existing production SERVER failed

I am using an Azure Release pipeline to deploy to SQL. I have a DACPAC file to deploy, but I have (1) changed column datatypes, (2) deleted some columns on a table that already has data, and (3) added some FK columns. Because there is already production data, the deploy fails. What should be done in these scenarios?
The column [dbo].[TableSample][ColumnSample] is being dropped, data loss could occur.
The type for column Description in table [dbo].[Table2] is currently NVARCHAR (1024) NULL but is being changed to NVARCHAR (100) NOT NULL. Data loss could occur.
The type for column Id in table [dbo].[Table3] is currently UNIQUEIDENTIFIER NOT NULL but is being changed to INT NOT NULL.
There is no implicit or explicit conversion.
The column [sampleColumnId] on table [dbo].[Table4] must be added, but the column has no default value and does not allow NULL values. If the table contains data, the ALTER script will not work.
To avoid this issue you must either: add a default value to the column, mark it as allowing NULL values, or enable the generation of smart-defaults as a deployment option.
Your "data loss" warnings can likely be turned off by using the option to "allow data loss" in your publish options. It's just warning you that you are dropping columns or going to a smaller data length.
Your "Table3" change is simply not going to work while keeping the data. GUIDs will not fit in an INT column. You might need to look at dropping and re-creating the table, or renaming the current Id column to something else (OldId, maybe) and adding a new Id of type INT, probably with IDENTITY(1,1).
As for the last column: you are trying to add a NOT NULL column with no default to an existing table. Either allow NULLs or put a named default value on the column so the column can be added.
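That last failure can be reproduced outside SSDT. A sketch using Python's sqlite3 with SQLite standing in for SQL Server (SQLite rejects such a column outright, while SQL Server fails only once the table contains rows; the table and column names mirror the error messages above but the schema is invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Table4 (Id INTEGER PRIMARY KEY)")
conn.execute("INSERT INTO Table4 (Id) VALUES (1)")  # existing production row

# Adding a NOT NULL column with no default is rejected: existing rows
# would have no value to put in it.
try:
    conn.execute("ALTER TABLE Table4 ADD COLUMN sampleColumnId INTEGER NOT NULL")
except sqlite3.OperationalError as e:
    print("rejected:", e)

# With a default, existing rows get a value and the ALTER succeeds.
conn.execute(
    "ALTER TABLE Table4 ADD COLUMN sampleColumnId INTEGER NOT NULL DEFAULT 0"
)
print(conn.execute("SELECT sampleColumnId FROM Table4").fetchone()[0])  # 0
```

The "smart defaults" deployment option mentioned in the error does essentially the same thing on your behalf.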

In the Rails database should Strings be stored downcased or left as is?

I am storing street addresses and restaurant names in my database. Whenever I return the name or address, I want the original case to be untouched.
However in the future I plan to implement searching over the name and street address field, and I don't want the searching to be case sensitive.
How should I best store these fields?
There is no such thing as a "Rails database"; Rails simply uses whatever database you configure.
The desired effect can easily be achieved, for example in MySQL, by using a case-insensitive collation for your address table/column. utf8_general_ci is one such collation; in fact, all collations ending in _ci are case insensitive.
Storing data with such a collation lets you search the database without worrying about letter case while preserving the original format.
First of all, you need to decide whether the data is case sensitive from the user's perspective. Even if you are absolutely sure the data is case insensitive, you still don't need to store it downcased, because case-insensitive search is available to you even when the data is stored in mixed case.
Also, if you are using PostgreSQL and need case-insensitive search, you can create a functional index like this:
CREATE INDEX test1_lower_col1_idx ON test1 (lower(col1));
and after that you can search using WHERE lower(col1) = ... without PostgreSQL having to recalculate lower(col1) for every row.
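The same pattern works in SQLite, which also supports indexes on expressions, so it can be sketched with Python's sqlite3 (table and index names follow the PostgreSQL example above):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test1 (col1 TEXT)")
conn.execute("INSERT INTO test1 VALUES ('Main Street')")
# Expression index, same idea as the PostgreSQL functional index above:
# the index stores lower(col1), so the planner can use it directly.
conn.execute("CREATE INDEX test1_lower_col1_idx ON test1 (lower(col1))")

# Searching on lower(col1) finds the row regardless of the stored case,
# while the stored value keeps its original mixed case.
rows = conn.execute(
    "SELECT col1 FROM test1 WHERE lower(col1) = lower(?)", ("MAIN STREET",)
).fetchall()
print(rows)  # [('Main Street',)]
```

This is exactly the "store as-is, search case-insensitively" combination the question asks for.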

dbExpress/No key specified

I am working on a database program, using the dbExpress components (Delphi 7). The data is retrieved from the database via the following components: TSQLDataSet -> TDataSetProvider -> TClientDataSet -> TDatasource -> TDBEdit. Until now, the form has worked correctly. The query in the TSQLDataset is
select id, name, byteken, timeflag from scales where id = :p1
I added a large (2048) varchar field to the database table; when I add this field to the above query (and connect either a TDBMemo or a TDBRichEdit to the TDatasource), I receive the following message when I try to edit the value in the new text field:
Unable to find record. No key specified.
I get the same error when there is no TDBMemo on the form (but with the varchar field in the query). As soon as I remove the varchar field from the query, everything works properly again.
What could be the cause of this problem?
==== More information ====
I have now defined persistent fields in the form. The field which holds the key to the table has its provider flags set to [pfInUpdate,pfInWhere,pfInKey], whereas all the other fields have their flags as [pfInUpdate,pfInWhere]. This doesn't solve the problem.
The persistent fields were defined on the clientdataset. When I defined them on the TSQLDataSet, the error message about 'no key specified' does not occur. The program still puts out this error message (which I neglected to mention earlier):
EDatabase error: arithmetic exception, numeric overflow or string truncation
The large string field has the correct value in 'displaywidth' and 'size'.
==== Even more information ====
I rewrote the form to use non-data aware components. One query retrieves the data from the database (using exactly the same query string as I am using in the TSQLDataSet); the data is then transferred to the controls. After the user presses the OK button on the form, the data is passed back to the database via another query which performs an update or an insert. As this works correctly, I don't see what the problem is with the data aware components.
==== Yet another snippet of information ====
I found this question on Stack Overflow which seems to address a similar issue. I changed the query to be
select id, name, byteken, timeflag,
cast (constext as varchar (2048)) as fconstext
from scales
where id = :p1
and set the dbMemo's datafield to be 'fconstext'. After adding text to the dbMemo, the 'applyupdates' call now fails with the following message
column unknown 'fconstext'
despite the fact that there is a persistent field created with that name.
I don't know whether this helps or simply muddies the water.
==== More information, 23 April ====
I dropped the field from the database table, then added it back. The program as written works fine as long as the string being entered into the problematic data field is less than about 260 chars. I added ten characters at a time several times without problem until the string length was 256. Then I added some more characters (without counting), tried to save - and got the error. From this point on, trying to add even one more character causes the error message (which comes at the 'applyupdates' method of the clientdataset).
Originally, the field contained 832 characters, so there is not a hard limit to the number of characters which I can successfully store. But once the error message appears, it always appears, as if the database remembers that there is an error.
==== More information, 24 April ====
Once again, I dropped the field from the database then added it back; the character set is WIN1251, for reasons which are not clear to me now (I don't need Cyrillic characters). The maximum number of characters which I can enter using data-aware controls seems to be about 280, regardless of how the field itself is defined.
I have since moved to using non-data aware controls in the real program where this problem occurs, and I can assure you that this limit does not exists there. Thus I am fairly sure that the problem is not due to a mismatch in character size, as has been suggested. Don't forget that I am using Delphi 7, which does not have unicode strings. I think that there is a bug in one of the components, but as I'm using old versions, I imagine that the problem has been solved, but not in the versions which I use.
==== Hopefully final edit, 25/04/12 ====
Following mosquito's advice, I created a new database whose default character set is WIN1252 (UTF-8 did not appear as a choice, and anyway my programs are not Unicode). In this clean database I defined the one table, where the 'constext' string's character set was also defined as WIN1252. I ran the data-aware version of the problematic form and was able to enter text without problem (currently over 1700 characters).
It would seem, thus, that the problem was created by having one character set defined for the database and one for the field. I don't know how to check in retrospect what the default character set of the database was defined as, so I can't confirm this.
I now have the small problem of defining a new database (there are 50+ tables) and copying the data from the original database. As this database serves the customer's flagship product, I am somewhat wary of doing this....
Check the UpdateMode property of the provider. If it is set to upWhereChanged or upWhereKeyOnly, you need a key in the database table for updates to work properly.
Unable to find record. No key specified.
While designing, change
select id, name, byteken, timeflag from scales where id = :p1
to
select id, name, byteken, timeflag from scales where id = 245
using an existing id.
As for casts such as
cast (constext as varchar (2048))
note that if a column's definition is altered, existing CASTs to that column's type may become invalid.
Arithmetic exception, numeric overflow, or string truncation
String truncation
It happens when a concatenated string doesn't fit the underlying CHAR or VARCHAR datatype size. If the result goes into a table column, perhaps it's a valid error, or maybe you really need to increase the column size. The same goes for intermediate values stored in stored procedure or trigger variables.
Character transliteration failed
This happens when you have data in database stored in one character set, but the transliteration to required character set fails. There are various points where character set transliteration occurs. There is an automatic one:
Every piece of data you retrieve from database (via SELECT or otherwise) is transliterated from character set of database table's column to connection character set. If character sets are too different, there will be two translations: first from column charset to Unicode and then from Unicode to the connection charset.
Also, you can request transliteration manually by CASTing the column to another charset, example:
CAST(column_name AS varchar(100) character set WIN1251).
The reason that transliteration can fail is that simply some characters don't exist in certain character sets. For example, WIN1252 doesn't contain any Cyrillic characters, so if you use connection charset WIN1252 and try to SELECT from a column with Cyrillic characters, you may get such error.
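This failure mode can be reproduced with Python's standard codecs, where cp1251 and cp1252 correspond to the WIN1251 and WIN1252 character sets: cp1251 covers Cyrillic, cp1252 does not, so "transliterating" Cyrillic text to cp1252 has no valid target byte and fails.

```python
text = "Привет"  # Cyrillic sample text ("Hello")

# cp1251 (WIN1251) contains Cyrillic: each character maps to one byte.
encoded = text.encode("cp1251")
print(len(encoded))  # 6

# cp1252 (WIN1252) has no Cyrillic characters, so the conversion fails,
# just like Firebird's "character transliteration failed" error.
try:
    text.encode("cp1252")
except UnicodeEncodeError as e:
    print("transliteration failed:", e)
```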
These days it is best to use Unicode or UTF8 in your applications together with a UTF8 connection character set. And make sure you use at least Firebird 2.0, which has UTF8 support.
Wrong order of parameters when using DotNetFirebird
The order in which parameters are added to an FbCommand when using DotNetFirebird might cause the -303 exception with the hint "Arithmetic exception, numeric overflow, or string truncation". The order of the parameters has to match the order of the parameters in the stored procedure; otherwise the exception will be thrown. Example (.NET, C#, DotNetFirebird, using FirebirdSql.Data.FirebirdClient):
FbCommand CMD = new FbCommand("TBLTEXT_ADDTEXT", cnn);
CMD.Parameters.Add("TEXT1", FbDbType.VarChar, 600).Value = strText1;
CMD.Parameters.Add("TEXT2", FbDbType.VarChar, 600).Value = strText2;
CMD.CommandType = CommandType.StoredProcedure;
CMD.ExecuteNonQuery();
If the order of the parameters inside the procedure "TBLTEXT_ADDTEXT" differs from the order in which you're adding parameters to the FbCommand object, you'll receive the -303 error.
No'am Newman said: "But once the error message appears, it always appears, as if the database remembers that there is an error."
It doesn't remember; the database is damaged!
As long as you keep the database character set unchanged and keep experimenting with dropping and adding fields to a damaged table, the problem is hard to solve:
1. Create a new database for every new test (tip: create one and copy it x times).
2. The field was set with plain text, but Cyrillic characters from the original field are still stored; you cannot see them, but they are there.
3. Set the field to VARCHAR(8191) and the database PAGE_SIZE to 8192; the actual maximum VARCHAR length with UTF8 is 8191.
CREATE DATABASE statement:
CREATE DATABASE localhost:mybase
USER SYSDBA
PASSWORD masterkey
PAGE_SIZE 8192
DEFAULT CHARACTER SET UTF8;
SET NAMES ISO8859_1;
CREATE TABLE scales (
ID ...,
byteken VARCHAR(8191) COLLATE DE_DE,
....
Collations
There is no default collation. So you should define a collation for every field that is to be used for sorting (ORDER BY) or comparing (UPPER):
You can also specify the collation with the ORDER BY clause:
ORDER BY LASTNAME COLLATE FR_CA, FIRSTNAME COLLATE FR_CA
or with the WHERE clause:
WHERE LASTNAME COLLATE FR_CA = :lastnametosearch
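Per-query collation as shown above has direct analogues in other engines. A minimal sketch with Python's sqlite3, using SQLite's built-in NOCASE collation in place of FR_CA (SQLite ships no French collation; the table and data are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (lastname TEXT)")
conn.executemany("INSERT INTO people VALUES (?)",
                 [("apple",), ("Banana",), ("cherry",)])

# Default (binary) ordering sorts all uppercase letters before lowercase...
binary_order = [r[0] for r in conn.execute(
    "SELECT lastname FROM people ORDER BY lastname")]
print(binary_order)  # ['Banana', 'apple', 'cherry']

# ...while a collation given in the ORDER BY clause itself, as in the
# Firebird COLLATE example, sorts case-insensitively for this query only.
ci_order = [r[0] for r in conn.execute(
    "SELECT lastname FROM people ORDER BY lastname COLLATE NOCASE")]
print(ci_order)  # ['apple', 'Banana', 'cherry']
```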
Unicode
Firebird 2.0 and above: there is now a UTF8 character set that correctly handles Unicode strings in UTF-8 format. The Unicode collation algorithm has been implemented, so you can now use UPPER() and the new LOWER() function without needing to specify a collation.

ADO Database Table Boolean Column

I am having a bit of trouble with ADO. I have deployed a database application, which uses Access. With the release of different versions database tables have different fields, some added others deleted etc. What I can't get to work is how to add a BOOLEAN field in the database.
For upgrade purposes I use the standard SQL query component with SQL that looks like this:
ALTER TABLE XXX ADD COLUMN YY BOOLEAN
While this works for other data types, such as VARCHAR, INTEGER, DOUBLE etc., it does not with BOOLEAN. I suspect it's Access's fault with its YES/NO type for booleans, but who knows.
Also how can I add fields to a table using TADOTable?
Thanks in advance.
In Microsoft Access SQL, the BIT column directly corresponds to the YES/NO field. I have experienced odd behavior with it when converting this later to SQL Server, and my advice is to do the following:
Whenever you do a check against this field, remember the syntax should be (FIELD <> 0) for checking TRUE, and (FIELD = 0) for checking FALSE. SQL Server doesn't understand the concept of TRUE/FALSE; in Access the value returned is -1 or 0, while in SQL Server the values are 1 and 0.
In Access it will only render a check box if you also set the field to NOT NULL. If NULLs are allowed, it will display 0, -1, or empty.
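Why (FIELD <> 0) is safer than (FIELD = 1) can be sketched with Python's sqlite3, where both truthy encodings (Access's -1 and SQL Server's 1) can sit side by side in one column (the table is invented for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (active INTEGER)")
# Access represents TRUE as -1, SQL Server's BIT as 1: store one of each,
# plus a FALSE row.
conn.executemany("INSERT INTO t VALUES (?)", [(-1,), (1,), (0,)])

# (FIELD <> 0) treats both truthy encodings as TRUE...
truthy = conn.execute("SELECT COUNT(*) FROM t WHERE active <> 0").fetchone()[0]
print(truthy)  # 2

# ...whereas comparing against a literal 1 silently misses Access's -1.
exact = conn.execute("SELECT COUNT(*) FROM t WHERE active = 1").fetchone()[0]
print(exact)  # 1
```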
Not sure about Access, but SQL Server uses a bit type to handle boolean values.
Try BIT, not BOOLEAN
You can't do many operations on BIT (or your own custom type); using TINYINT(1) with 0/1 values is much better.
