Have you ever tried to create an ASP.NET MVC project with a Firebird database? I tried it, and it is difficult.
My problem:
I have a working Firebird provider for Visual Studio 2010.
I have a correct database with everything needed for incrementing the IDs of the tables.
I have created an ASP.NET MVC 3 project with the database included as an EDMX file, with entities.
When I try to insert a record into a table, the following error occurs:
FirebirdSql.Data.Common.IscException: violation of PRIMARY or UNIQUE KEY constraint "PK_USERS" on table "USERS"
That means the ID of the record being created is not incremented.
I have stored procedures that are supposed to generate the new ID.
My question is:
How do I insert a record into a Firebird table from ASP.NET?
In Firebird you need to use triggers in combination with sequences (generators) if you want auto-increment-like behavior. Otherwise you need to make sure that you assign a unique ID yourself.
To create the sequence:
CREATE SEQUENCE mytable_id_sq;
To create a trigger for assigning a unique ID (on a table called mytable):
set term !! ;
CREATE TRIGGER T1_BI FOR mytable
ACTIVE BEFORE INSERT POSITION 0
AS
BEGIN
if (NEW.ID is NULL) then NEW.ID = NEXT VALUE FOR mytable_id_sq;
END!!
set term ; !!
This trigger will only assign a generated value if no ID is assigned in the INSERT statement.
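With a trigger like this in place, an insert from ASP.NET only needs to leave the ID out. Below is a minimal sketch using the Firebird ADO.NET provider directly; the connection string and the NAME column are illustrative, not taken from the question:
// Sketch only: relies on the BEFORE INSERT trigger above to fill ID.
// Uses FbConnection/FbCommand from FirebirdSql.Data.FirebirdClient;
// the connection string and the NAME column are illustrative.
var connectionString = "database=localhost:mydb.fdb;user=SYSDBA;password=masterkey";

using (var connection = new FbConnection(connectionString))
{
    connection.Open();
    using (var command = new FbCommand(
        "INSERT INTO mytable (NAME) VALUES (@name)", connection))
    {
        command.Parameters.AddWithValue("@name", "example");
        command.ExecuteNonQuery();   // ID is assigned by the trigger
    }
}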
See also:
section SEQUENCE (GENERATOR) in the Firebird 2.5 Language Reference and How to create an autoincrement column? (this link talks about generators, the old name of sequences in Firebird).
Your EDMX probably isn't generated properly. Either set the StoreGeneratedPattern manually or use http://blog.cincura.net/230841-generated-primary-key-in-entity-framework-model-from-firebird/ .
As an example:
I have two tables in Firebird:
TB_CUSTOMER
IDCUSTOMER (autoincrement generator)
CUSTOMERNAME
TB_PHONE
IDPHONE
IDCUSTOMER (foreign key from TB_CUSTOMER)
PHONE
I have a registration form developed in Delphi. The TB_PHONE table data is handled using a DBGrid. I cannot assign the value of the IDCUSTOMER field in TB_PHONE, because it has not yet been generated by the Firebird generator. How can I create the relationship between the tables? I want to implement this without first saving the TB_CUSTOMER table data. I'm using data modules with IBDAC.
Any suggestions?
Before the detail table can be inserted into, the PK index on the master table must already contain the proper master ID. That means some piece of code has to insert the master record before inserting the detail record. Where that piece of code lives is limited only by your imagination.
A few possible arrangements:
Insert the master row in your application, read the ID of that row, then insert the detail row using this ID.
Read the ID from the generator, then insert both rows (master first) using the obtained ID; see the sketch after this list.
Create a stored procedure that inserts both rows and returns the ID (implementing #1 or #2 server-side).
Use EXECUTE BLOCK, essentially an ad hoc anonymous SQL procedure. This is only available in Firebird 2.x and, apart from not taking up a name in the namespace, it is inferior to #3.
Add a BEFORE INSERT trigger on the detail table that looks up the ID in the master table and inserts a master row if it is not found. This slows down every insert (even when the master ID already exists, since that has to be checked), cannot fill any master columns other than the ID, and is potentially dangerous because it hides application logic problems. Still, it can be implemented (an ugly and dirty method).
Create a master-join-detail VIEW and add an INSERT trigger on it that propagates the new view row into both the master table and the detail table.
et cetera
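As a sketch of option #2: the original question is about Delphi with IBDAC, so this version, which uses the Firebird ADO.NET provider instead, only illustrates the flow. The table and column names come from the question; the generator name GEN_IDCUSTOMER and the connection string are assumptions, and IDPHONE is assumed to be filled by its own trigger.
// Option #2: reserve the master ID from the generator, then insert
// master and detail in one transaction (FbConnection/FbCommand from
// FirebirdSql.Data.FirebirdClient). Generator name and connection
// string are assumptions; table/column names are from the question.
var connectionString = "database=localhost:mydb.fdb;user=SYSDBA;password=masterkey";

using (var connection = new FbConnection(connectionString))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())
    {
        // 1. reserve the master ID from the generator
        var idCmd = new FbCommand(
            "SELECT GEN_ID(GEN_IDCUSTOMER, 1) FROM RDB$DATABASE",
            connection, transaction);
        var customerId = Convert.ToInt64(idCmd.ExecuteScalar());

        // 2. insert the master row with the reserved ID
        var masterCmd = new FbCommand(
            "INSERT INTO TB_CUSTOMER (IDCUSTOMER, CUSTOMERNAME) VALUES (@id, @name)",
            connection, transaction);
        masterCmd.Parameters.AddWithValue("@id", customerId);
        masterCmd.Parameters.AddWithValue("@name", "John");
        masterCmd.ExecuteNonQuery();

        // 3. insert the detail row(s) using the same ID
        var detailCmd = new FbCommand(
            "INSERT INTO TB_PHONE (IDCUSTOMER, PHONE) VALUES (@id, @phone)",
            connection, transaction);
        detailCmd.Parameters.AddWithValue("@id", customerId);
        detailCmd.Parameters.AddWithValue("@phone", "555-0100");
        detailCmd.ExecuteNonQuery();

        transaction.Commit();
    }
}
Note that if TB_CUSTOMER has a BEFORE INSERT trigger that unconditionally overwrites IDCUSTOMER from the generator, it should be changed to assign only when the incoming value is NULL (as in the trigger shown earlier), otherwise the explicitly supplied ID would be replaced.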
I want to implement this without first saving the TB_CUSTOMER table data
There's your problem. You need the primary key from the master table before you can save the detail. That's just the way it works. But if what you want is to make sure that the values get saved together, you can do that as a transaction. In Firebird, you can do it like this:
Begin a transaction. Exactly how you do that depends on which DB library you're using to access your Firebird database.
Run an INSERT INTO ... RETURNING statement to insert the row into your master table and retrieve the generated value as a single operation.
Use the generated PK value to fill in the FK value on your detail table.
Insert the detail row.
Commit the transaction.
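For illustration only (the question itself is Delphi/IBDAC), here is roughly what those five steps look like with the Firebird ADO.NET provider, where values from a RETURNING clause are typically read back through output parameters. It reuses the connectionString and the table/column names from the sketch above; everything else is an assumption.
// Same five steps with FbConnection/FbCommand; RETURNING IDCUSTOMER is
// read through an output parameter. Names other than the tables/columns
// from the question are assumptions.
using (var connection = new FbConnection(connectionString))
{
    connection.Open();
    using (var transaction = connection.BeginTransaction())            // 1. begin transaction
    {
        var insertMaster = new FbCommand(
            "INSERT INTO TB_CUSTOMER (CUSTOMERNAME) VALUES (@name) RETURNING IDCUSTOMER",
            connection, transaction);
        insertMaster.Parameters.AddWithValue("@name", "John");
        var idParam = insertMaster.Parameters.Add("@IDCUSTOMER", FbDbType.Integer);
        idParam.Direction = ParameterDirection.Output;
        insertMaster.ExecuteNonQuery();                                 // 2. insert master, get generated PK
        var customerId = Convert.ToInt32(idParam.Value);

        var insertDetail = new FbCommand(
            "INSERT INTO TB_PHONE (IDCUSTOMER, PHONE) VALUES (@id, @phone)",
            connection, transaction);
        insertDetail.Parameters.AddWithValue("@id", customerId);       // 3. use the PK as the FK value
        insertDetail.Parameters.AddWithValue("@phone", "555-0100");
        insertDetail.ExecuteNonQuery();                                 // 4. insert the detail row

        transaction.Commit();                                           // 5. commit
    }
}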
We have a SQL Server 2008 R2 database with several tables, and each table has a number of triggers. On one of the columns, which we'll call Person.Age, we have a default value, so that if I don't explicitly supply a value it defaults to "18".
create table PERSON
(
id int IDENTITY(1,1) PRIMARY KEY,
age char(2) DEFAULT '18',
Name char(40)
);
I am using Entity Framework 4.0 (and have also tried 5.0) and Visual Studio 2010 to load and select from the database. Whenever I insert into the table using the following code, a row is inserted, but the default value isn't applied:
var person = new Person
{
Name = "Peter"
};
using (var ctx = new MyEntities())
{
ctx.PERSON.AddObject(person);
ctx.SaveChanges();
}
This will result in a row with a Name of Peter, but the Age will be set to null - and not my default of 18.
When I refresh/load my EDMX file I can only seem to import simple tables and views, and there doesn't appear to be an option for importing the column defaults, although I would have thought this was done by default. Any ideas why the default values aren't being applied?
Also, I have triggers defined in SQL Server so that when a new row is inserted into PERSON, an additional table gets updated. Again, this works if I run the SQL directly against the database, but doesn't work if I execute it through Visual Studio using Entity Framework.
Thanks,
EF explicitly sets the columns to the values you passed. Since the Age of a newly created Person entity is null by default, EF sends a command that sets the column value to null. Set the default value in the constructor if you want to have the default value (otherwise the default value is null for reference properties and default(T) for value-type properties, e.g. int).
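A minimal sketch of that suggestion, assuming the EDMX-generated Person class is partial and does not already declare a constructor (the default EF code generation doesn't), so a partial class can set the default:
// Mirrors the database default in the entity itself, so EF inserts "18"
// instead of null when Age is not set explicitly.
public partial class Person
{
    public Person()
    {
        Age = "18";
    }
}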
The EF designer brings in all the columns from the database and creates a model with entities whose properties correspond to the tables and columns it reverse engineered. You can then go and tweak your model in the designer; for instance, you can remove properties you don't want.
I don't know what "does not work" means for you in the case of the triggers; it probably depends on your expectations. EF just sends a command to the database, so if you send the same SQL command that EF sends, it should "not work" in the same way. Having said that, EF is database agnostic and is not aware of DB magic like triggers. Also, the communication is one way only, from EF to the DB, so if you expect the database to notify EF about something, that will not work. There is no mechanism for doing this.
Using EF4 with SQL Server 2008.
The following code runs against a table with a PK defined as INT IDENTITY(1,1):
ctx.AddObject(GetEntitySetName(), newEntity);
ctx.SaveChanges();
The result when profiling SQL is the insert statement followed by a lookup against the table I'm inserting into:
SELECT ID FROM Table
WHERE ID = scope_identity() AND @@ROWCOUNT > 0
Is there a way to prevent Entity Framework from retrieving the generated identity value? I don't need the ID back in my .NET code, and under high-volume situations it seems like a wasted operation.
Alternatively, is there a way to tell EF to change the way it performs this operation? The strategy used goes against recommendations made by Microsoft in this defect report:
http://connect.microsoft.com/SQL/feedback/ViewFeedback.aspx?FeedbackID=328811
You cannot change this behavior when using database-generated keys (properties marked with StoreGeneratedPattern.Identity). EF needs the real key value for the inserted entity, so the only way to avoid the query is to not use database-generated keys at all and handle key generation yourself in the application.
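A minimal sketch of that alternative, assuming you change the key to a client-assigned value (for example a uniqueidentifier column with StoreGeneratedPattern set to None in the model); the entity and context names are illustrative, not from the question:
// Key is assigned in the application, so SaveChanges issues a plain INSERT
// with no follow-up SELECT of scope_identity(). Order and MyEntities are
// illustrative names.
var order = new Order
{
    ID = Guid.NewGuid(),      // client-generated key
    CustomerName = "Acme"
};

using (var ctx = new MyEntities())
{
    ctx.Orders.AddObject(order);
    ctx.SaveChanges();
}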
I have a (Firebird) DB. For most of my tables I have a trigger that fires before insert, which creates the primary key (PK) for me via a generator and also writes a Created Date value and a Created By value to the newly inserted record. I also have an update trigger that writes to an Updated Date field and an Updated By field.
e.g. (Client is a table in my DB):
create trigger t_client_id for client
active before insert
as begin
new.client_id = gen_id(gen_client_id, 1);
new.created = current_timestamp;
new.created_by = current_user;
new.lock_vn = 1;
end ^
create trigger t_client_update for client
active before update
as begin
new.updated = current_timestamp;
new.updated_by = current_user;
end ^
When I apply updates through my ClientDataSet (CDS), which is attached to a remote TDataSetProvider via a TDSProviderConnection, how can I "retrieve" these generated values? If I edit an existing record (which will in turn fire the t_client_update trigger), calling RefreshRecord will get the updated and updated_by fields. However, the documentation says to use that method cautiously, so that may not be the correct way to achieve this. I call it straight after I've called ApplyUpdates(-1).
The CDS I use contains only the one record I am attempting to edit. For a new record, the CDS is in dsInsert mode. Everything is written to the DB OK, so I just need to get this new data back out again. I have also tried using a CDS that contains ALL records in the table to see if it was any simpler, but it didn't make any difference, unsurprisingly. The reason I need this information is simply to show these values to the user in DB-aware controls. They are read-only.
I could call a Get on the record I guess when editing an existing record, using the PK, but that won't help for an Insert as I don't know what the new PK is.
Example of where I attempt to ApplyUpdates to my CDS (actDSSave is a TDataSetPost action)
dsState := actDSSave.DataSource.DataSet.State;
DoApplyUpdates(-1);
if dsState = dsEdit then
TClientDataSet(actDSSave.DataSource.DataSet).RefreshRecord;
I am using a TIBQuery for my dataset, attached to the remote DataSetProvider. The query's SQL is a simple select * from client where client_id = :client_id. I have tried associating this query with a TIBUpdateSQL as well as setting poAutoRefresh to true in the DataSetProvider.
So is it possible to obtain these trigger-generated values this way, or do I need to approach it differently? Another way I can think of is to create stored procedures that do CRUD against each table and use those instead (with appropriate in/out params to return this new data), but hopefully I don't have to go down that track. Hopefully I have provided sufficient info here to explain and replicate the issue.
Thanks
EDIT
I realise that in the above, DoApplyUpdates(-1) is my own method. Its implementation at the moment is simply:
FdatCommon.cdsClient.ApplyUpdates(MaxErrorCount);
FdatCommon is a TDataModule containing my CDS.
You simply can't get "generated" values without re-querying (RefreshRecord) the data after Post.
That's because the triggers run on the server side when you call ApplyUpdates, but TClientDataSet does not refresh the posted record by default. Other libraries, for example FIBPlus, have an option to do this automatically.
As for inserts, TIBDataSet has a GeneratorField property. Using it, the dataset queries and increments the generator value separately, before the insert, so you will have the PK value after post even on inserts. But avoid incrementing the generator again in the trigger.
MIDAS (TClientDataSet) is a great library, but its general/universal architecture loses DB-specific features (such as retrieving values from inserts) compared to libraries dedicated to a specific DBMS, such as FIBPlus. By the way, I have seen TpFIBClientDataSet; it works in conjunction with TpFIBDataSet.
I have the challenge of needing to audit data changes made by users of an MVC application.
Auditing creation and deletion of records is easy.
Updates are proving to be the problem.
I'm looking for a way to automate this, but the problem I have is that the application is using stored procedures to bring back EF "complex types".
These are then used to build a view model, and after postback, the controller receives a new view model built from the form values passed back from the view. Therefore the original values are no longer available.
Does anyone have any suggestions for a secure way to keep the original values so they can be compared with the updated values, so that changes can be stored?
(I appreciate I could go back to the database for these, but that is not efficient, and I would have to retain all the parameters to remake the same call, and find a way to automate that part of the process.)
Have you tried an audit trigger using the INSERTED and DELETED tables?
http://weblogs.asp.net/jgalloway/archive/2008/01/27/adding-simple-trigger-based-auditing-to-your-sql-server-database.aspx
OR
In your stored procedures for insert, delete and update, you can make use of FOR XML AUTO to get the XML for the record and add it to an audit table.
http://www.a2zdotnet.com/View.aspx?Id=71
UPDATE: A T-SQL example
BEGIN
-- these tables would be in your database
DECLARE @table TABLE(ID INT IDENTITY(1,1) PRIMARY KEY, STR VARCHAR(10), DT DATETIME)
DECLARE @audit_table TABLE(AuditXML XML, Type VARCHAR(10), Time DATETIME)
-- this is defined at the top of your stored procedure
DECLARE @temp_table TABLE(PK INT)
-- your stored procedure will add an OUTPUT to the temp table
INSERT INTO @table
OUTPUT inserted.ID INTO @temp_table
VALUES ('test1', GetDate()),
('test2', GetDate() + 2)
-- at the end of your stored procedure update your audit table
INSERT INTO @audit_table
VALUES(
(
SELECT *
FROM @table
WHERE ID IN (SELECT PK FROM @temp_table)
FOR XML AUTO
),
'INSERTION',
GETDATE()
)
-- your audit table will have the record data
SELECT * FROM @audit_table
END
In the example above you could make @temp_table a clone of @table (with all of the columns from @table) and use INSERTED.* INTO @temp_table in your OUTPUT clause; this would avoid having to reselect the records before getting the FOR XML AUTO. Another note: for stored procedures that do a DELETE, you would use DELETED.* instead of INSERTED.* in your OUTPUT clause.
If you're using SQL Server, I recommend that you look into Change Data Capture (CDC).
It's an out-of-the-box solution for auditing changes to the underlying tables of your application, and it's relatively straightforward to set up, so there is no need for a custom solution that you then have to maintain.
If you have any supporting applications for your site, they'll also be covered and it also has the benefit of auditing any changes made directly against the database, such as from a DBA running a script.
Since your ASP.NET application may be running under one particular account, you'll probably need to add additional tracking information to capture the user who made the change. Fortunately this is also relatively straightforward. The following Stack Overflow question covers an approach to this using the ObjectStateManager.
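The link itself isn't included above, but as a general illustration of what the ObjectStateManager exposes: for entities tracked by the context you can read the original and current values side by side before SaveChanges, which is also one way to capture the "before" values asked about earlier. A sketch only; the context, entity, property names and postedModel are illustrative:
// Sketch: load the entity into the context, apply the posted values, then
// ask the ObjectStateManager for original/current pairs before saving.
using (var ctx = new MyEntities())
{
    var person = ctx.PERSON.Single(p => p.id == postedModel.Id);
    person.Name = postedModel.Name;   // values from the posted view model

    foreach (var entry in ctx.ObjectStateManager
                             .GetObjectStateEntries(EntityState.Modified))
    {
        foreach (var propName in entry.GetModifiedProperties())
        {
            var original = entry.OriginalValues[propName];
            var current = entry.CurrentValues[propName];
            // write propName, original and current to your audit store here
        }
    }

    ctx.SaveChanges();
}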
I was looking for this myself, found this: check out Tracker for EF.