using string as id - kotlin-multiplatform

I am building a KMM app that uses SQLDelight for the cache. I recently changed my database entities to use TEXT (String) for the id field instead of INTEGER, and now I am getting an error when inserting. I might just be missing some SQLDelight knowledge.
Here is my table:
CREATE TABLE sidework_Entity(
id TEXT NOT NULL PRIMARY KEY,
name TEXT NOT NULL,
employees TEXT NOT NULL,
todoToday INTEGER AS Boolean DEFAULT 0
);
Here is my insert method:
insertSidework:
INSERT OR REPLACE
INTO sidework_Entity(
id,
name,
employees,
todoToday
) VALUES (?,?,?,?);
Here is the error:
statement aborts at 5: [INSERT OR REPLACE
INTO sidework_Entity(
id,
name,
employees,
todoToday
) VALUES (?,?,?,?)] datatype mismatch
I think it is most likely the primary key I have set on the id field, or something of that sort, but the documentation is a bit short.

The solution to this problem is actually to run a migration. Deleting the app to clear the database cache is a valid workaround for testing purposes, but for an app in production it isn't the correct approach.
Documentation for running migrations
Stack Overflow answer with explanation
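A rough sketch of what such a migration could look like, assuming the old schema stored id as INTEGER and the schema version is bumped from 1 to 2 (the file name, old column type, and version numbers are assumptions). SQLite cannot change a column's type in place, so the migration recreates the table:
-- 1.sqm: applied when an existing database is upgraded from version 1 to 2
ALTER TABLE sidework_Entity RENAME TO sidework_Entity_old;
CREATE TABLE sidework_Entity(
id TEXT NOT NULL PRIMARY KEY,
name TEXT NOT NULL,
employees TEXT NOT NULL,
todoToday INTEGER DEFAULT 0
);
INSERT INTO sidework_Entity(id, name, employees, todoToday)
SELECT CAST(id AS TEXT), name, employees, todoToday FROM sidework_Entity_old;
DROP TABLE sidework_Entity_old;
Note that the Kotlin-facing type (INTEGER AS Boolean) stays declared in the .sq file; the .sqm file only needs plain SQL to bring an existing on-device database up to the new schema.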

Related

How to handle an autoincremented ID in HANA from a SAPUI5 application?

In my SAPUI5 application I have the following function, which takes the data and creates an entry in my HANA DB:
onCreateNewCustomer: function(){
    var oEntry = {};
    oEntry.NAME = this.byId("name").getValue();
    oEntry.CITY = this.byId("city").getValue();
    oEntry.PHONE = this.byId("phone").getValue();
    oEntry.ID = this.byId("id").getValue();
    // Post data to the server
    this.getOwnerComponent().getModel("CustomerModel").create("/Customer", oEntry, null);
    this.byId("createCustomer").close();
    //location.reload();
}
The create process works and my entries get saved. As a next step I wanted to set up my table in HANA in such a way that the ID of each entry is autoincremented, so the user does not have to enter it. I used the following statement to create my table:
create column table TABLE
(ID bigint not null primary key generated by default as IDENTITY,
FIRSTNAME nvarchar(30))
That worked and the table is created. The problem now is that if I use the code above without providing the ID, the following error is logged to the console:
The following problem occurred: HTTP request failed 400,Bad Request,The serialized resource has an missing value for member 'ID'.
The entry does not get saved in my DB. If I execute the following SQL statements in my HANA workbench without providing the ID, it works:
insert into TABLE (FIRSTNAME) values ('David');
insert into TABLE (FIRSTNAME) values ('Mike');
insert into TABLE (FIRSTNAME) values ('Bobby');
So I did some searching on Google but did not find a proper solution for this. My goal is that the entry gets saved and the ID is generated by my HANA DB, without providing it from my SAPUI5 application.
You are probably using the OData V2 XSODATA implementation, which does not support auto-increment. A possible solution is to use a database sequence: fetch the next value from it with a separate request and use that value in the OData create call.
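A minimal sketch of the database side of that approach (the sequence name is an example; the second statement is what a small service would run to hand the next ID to the UI5 application):
CREATE SEQUENCE "CUSTOMER_ID_SEQ" START WITH 1 INCREMENT BY 1;
SELECT "CUSTOMER_ID_SEQ".NEXTVAL AS "ID" FROM DUMMY;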
Did you try creating a new record after commenting out the line below?
oEntry.ID = this.byId("id").getValue();
With identity fields, if you provide the value you have to explicitly indicate that you are providing the column value. Otherwise, just omit the column name and value from the INSERT command.

Error inserting data with EF6 and Always encrypted

We are experiencing some issues with EF6 and the Always Encrypted feature.
I believe we need to set something up in the DbContext to tell it how to encrypt or decrypt columns, but I couldn't find a way to do this.
We already have an ADO access layer, and it works perfectly with encrypted fields. We would rather use EF instead of ADO.
The symptoms are:
With EF, we are able to query the data, and the decryption process works fine.
The insert process throws the error below:
Operand type clash: varchar is incompatible with varchar(8000) encrypted with (encryption_type = 'DETERMINISTIC', encryption_algorithm_name = 'AEAD_AES_256_CBC_HMAC_SHA_256', column_encryption_key_name = 'CEK_Auto1', column_encryption_key_database_name = 'Development_v2_qa') collation_name = 'SQL_Latin1_General_CP1_CI_AS'
A query with a WHERE clause on an encrypted field throws the same error.
Technologies used:
EF6 with POCO entities.
Azure Key Vault for storing the encryption/decryption master key.
Using an SSL certificate to authenticate against Key Vault
Connection string contains "Column Encryption Setting=enabled;"
Azure SQL Server
.NET Framework 4.6
ADO
We have some code which works fine with ADO; it works with every SqlConnection:
// Instantiate our custom AKV column master key provider.
// It uses the GetToken function as the callback function to authenticate to AKV
SqlColumnEncryptionAzureKeyVaultProvider akvprov = new SqlColumnEncryptionAzureKeyVaultProvider();
akvprov.KeyVaultClient = SecureConfigurationManager.KeyVaultClient;
// Register the instance of custom provider to SqlConnection
Dictionary<string, SqlColumnEncryptionKeyStoreProvider> providers = new Dictionary<string, SqlColumnEncryptionKeyStoreProvider>();
// "SqlColumnEncryptionAzureKeyVaultProvider.ProviderName" is the name of the provider. It must match the string we used when we created the column master key
providers.Add(SqlColumnEncryptionAzureKeyVaultProvider.ProviderName, akvprov);
SqlConnection.RegisterColumnEncryptionKeyStoreProviders(providers);
Yep, I just found the same problem; I needed to add
[Column(TypeName = "varchar(max)")]
in the POCO type, before the property, for it to work.
It would be nice if the error was a bit clearer (and nicer still if NVARCHAR actually worked).
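A minimal sketch of where that attribute goes (the entity and property names here are made up):
using System.ComponentModel.DataAnnotations.Schema;

public class Customer
{
    public int Id { get; set; }

    // Maps the encrypted column as varchar(max) so EF6 generates a compatible parameter type
    [Column(TypeName = "varchar(max)")]
    public string Ssn { get; set; }
}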
I'm working through the same issue. The problem is the data type mapping from C# to the database: with Entity Framework and Always Encrypted, the varchar lengths don't really matter, only varchar(max) or varchar(8000) work. I have all of Entity Framework working with Azure Key Vault for all the data types, same as you. The link below shows how to do the insert with inline SQL. I've only worked with Entity Framework and hope I never have to work in inline SQL, though I might have to if I can't find a way to shrink the database storage overhead needed for encryption, or I may look at something like Stretch DB, another feature of SQL Server 2016. Thanks Jakub Szymaszek and Microsoft.
I have conceded and made all of my data types varchar(max), and it works just fine. So string = varchar(max). The weird thing is that the encrypted value is nowhere near 8000 characters, but 8000 are probably allocated.
"something1" becomes this after encryption and insert:
0x0190F9D80C3F70890FB154F2123459506AD5BDA165333710D161ED80E42FCAFA882C66FF5B68E412B5F9EE11A9F308201D0AE2BD4032151398171FDBE2F3AEA20D
An interesting thing about varchar(max) is that the data is supposedly stored somewhere outside the table it is inserted into, with a pointer kept in the row, so varchar(max) may only take up the amount shown. (I'm a dev.)
The data type for my column and stored procedure variables:
[testVarChar] varchar COLLATE Latin1_General_BIN2 ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = [CEK_Auto1], ENCRYPTION_TYPE = Randomized, ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NOT NULL,
The data type of the parameter targeting the SSN column is set to an ANSI (non-Unicode) string, which maps to the char/varchar SQL Server data type. If the type of the parameter was set to a Unicode string (String), which maps to nchar/nvarchar, the query would fail, as Always Encrypted does not support conversions from encrypted nchar/nvarchar values to encrypted char/varchar values. See SQL Server Data Type Mappings for information about the data type mappings.
https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/develop-using-always-encrypted-with-net-framework-data-provider
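To illustrate the point about parameter types, here is a sketch of a plain ADO.NET insert against an encrypted char/varchar column (the table, column, and connection string are assumptions; the connection string must include "Column Encryption Setting=enabled"):
using System.Data;
using System.Data.SqlClient;

public static class PatientRepository
{
    public static void InsertSsn(string connectionString, string ssn)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            var cmd = new SqlCommand("INSERT INTO Patients (SSN) VALUES (@SSN)", connection);
            // SqlDbType.VarChar maps to the ANSI char/varchar SQL type; an NVarChar (Unicode)
            // parameter would fail against this encrypted char/varchar column.
            cmd.Parameters.Add("@SSN", SqlDbType.VarChar, 11).Value = ssn;
            cmd.ExecuteNonQuery();
        }
    }
}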

rails & postgreSQL - How can I update data with an existing primary key?

I want to update some data in the Lecture(id, name, etc.) table.
For example, there is a record (id: 1, name: "first") in Lecture.
When I type Lecture.create(id: 1, name: "newer"), I get:
PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "lectures_pkey"
Is there any way to update data?
Try this:
Lecture.find(1).update(name: "newer")
Find more information on update here: http://guides.rubyonrails.org/active_record_basics.html#update
The reason it didn't work is that the id is unique. When you used create, it was trying to create a NEW record with an id of 1, not edit the existing record.
PG::UniqueViolation: ERROR: duplicate key value violates unique
constraint "lectures_pkey"
id is the default primary key, which shouldn't be changed or duplicated. As you are inserting a row with an existing id, you get that error.
Instead, you need to do it like below:
lecture = Lecture.find(1)
lecture.update_attributes(name: "newer")
You can use
lec = Lecture.find(1)
lec.update_attributes(name: "newer")
You are getting the error PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "lectures_pkey" because you cannot create a record with an existing id. Id is the primary key of the table.
All of the current answers are correct if you don't mind loading the record and then updating it. If you don't need to actually load the record, and just want to update data in the database you can do
Lecture.where(id: 1).update_all(name: "newer")
This will skip all validations, etc., and just do a direct SQL update of the record with id == 1 in the database.

Updating a primary key value on grails gorm

I'm wondering if I can change the value of one member of a composite primary key in a Grails domain class. For example, given this domain:
class StudentHistory implements Serializable {
    String studentNumber
    String schoolYear
    Integer yearLevel
    String section
    Float average
    String status

    static mapping = {
        ...
        id composite: ["studentNumber", "schoolYear", "yearLevel", "section"]
        ...
    }
}
Let's say that in schoolYear: "2014-2015", a certain yearLevel: 1 student with studentNumber: "2011-488-MN-0" transferred from section: "1D" to section: "1N". To perform this record update, we do something like the following inside a service:
StudentHistory record = StudentHistory.find {
    eq("studentNumber", "2014-488-MN-0")
    eq("schoolYear", "2014-2015")
    eq("yearLevel", 1)
    eq("section", "1D")
}
record.setSection("1N")
record.save(flush: true, insert: false)
The problem is that the update to the primary key doesn't take effect, but when I try to update other non-primary fields such as average and status, it works fine (I ran SQL directly against the database to confirm). How can I update primary keys?
PS: Based on this design, I know some will suggest just creating another record and then fetching the record that was last entered, but what I am required to do is update that composite primary key instead.
PPS: Please don't suggest removing the old instance and creating a new one that copies the old details except for the section. I cannot do that, since many tables are connected to this table.
I believe it is good practice to avoid changing primary keys. A primary key is the unique identifier of an object, and changing it effectively means creating a new object. So if your composite primary key is mutable (or can mutate), you should use a surrogate key, i.e. an artificial primary key. At the same time you can create a unique constraint on the four fields that currently make up your primary key.
In your case it would be:
static mapping = {
    ...
}
static constraints = {
    studentNumber(unique: ["schoolYear", "yearLevel", "section"])
}
Hope it makes sense.

How can I create friendly URLs with MongoDB/Node.js?

For example suppose in designing a blog application I want something like
domain.com/post/729
Instead of
domain.com/post/4f89dca9f40090d974000001
Ruby has the following
https://github.com/hakanensari/mongoid-slug
Is there an equivalent in Node.js?
The id in MongoDB is actually a hexadecimal value. To look records up by a plain numerical value like 1, 2, 3, ..., you can use the following code, which converts that value into the appropriate hex ObjectID:
article_collection.db.json_serializer.ObjectID.createFromHexString(id)
where article_collection is your collection object
There are a few ways:
1. Assuming you are trying to provide a unique id to each blog post, why not overwrite the '_id' field of your documents in the blogs collection?
A sample document would be:
{ "_id": 122, "content": { "title": ..... } }
You will have to look out for a method to generate an autoincrement id though, which is pretty easy (see the sketch after these options).
This type of primary key is, however, not recommended.
http://www.mongodb.org/display/DOCS/How+to+Make+an+Auto+Incrementing+Field
2. Let the _id field remain as it is, and additionally store a key 'blogid' which is an integer. You will have to run ensureIndex on the 'blogid' field though, to make access by blogid fast. The storage overhead would be minor, as you will only be storing one more key name and an integer in your document.
A sample document would be:
{ "_id": xxxxxxxxxx, "blogid": 122, "content": { "title": ..... } }
There are a bunch of different projects on GitHub, like https://github.com/dodo/node-slug and https://github.com/stipsan/String.Slugify.js, but they focus on making valid URLs out of strings (usually the post subject or article title). I haven't seen any that take a random number and somehow produce a shorter random (?) and unique number.
Personally I just have a token field on my post object that contains a unique value which is shorter than using the DB id directly (and a tiny bit more secure). If you are using Mongoose, the token can be generated automatically by hooking the pre 'save' event on your Mongoose model.
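A rough sketch of that idea with Mongoose (the schema, field name, and token format are made up):
const mongoose = require('mongoose');
const crypto = require('crypto');

const postSchema = new mongoose.Schema({
  title: String,
  token: { type: String, unique: true }
});

// Generate a short, URL-friendly token once, before the first save
postSchema.pre('save', function (next) {
  if (!this.token) {
    this.token = crypto.randomBytes(4).toString('hex'); // e.g. "9f2b1c3a"
  }
  next();
});

const Post = mongoose.model('Post', postSchema);
// A post saved via new Post({ title: 'Hello' }).save() would then be reachable at /post/<token>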
