I have a (Firebird) DB. For most of my tables I have a trigger which fires before insert and creates the Primary Key (PK) for me via a generator, as well as writing a Created Date value and a Created By value to the newly inserted record. I also have an update trigger which writes to an Updated Date field and an Updated By field.
e.g. (Client is a table in my DB):
create trigger t_client_id for client
active before insert
as begin
  new.client_id = gen_id(gen_client_id, 1);
  new.created = current_timestamp;
  new.created_by = current_user;
  new.lock_vn = 1;
end ^
create trigger t_client_update for client
active before update
as begin
  new.updated = current_timestamp;
  new.updated_by = current_user;
end ^
When I apply updates through my ClientDataSet (CDS) - which is attached to a remote TDataSetProvider via a TDSProviderConnection - how can I "retrieve" these generated values? If I edit an existing record (which will in turn fire the t_client_update trigger), calling RefreshRecord will get the updated and updated_by fields. However, the documentation says to use that method cautiously, so that may not be the correct way to achieve this. I call it straight after I've called ApplyUpdates(-1).
The CDS I use contains only the one record I am attempting to edit. For a new record, the CDS is in dsInsert mode. Everything is written to the DB OK, so I just need to get this new data back out again. I have also tried using a CDS which contains ALL records in the table, to see if that was any simpler, but - unsurprisingly - it made no difference. The reason I need this information is simply to show these values to the user in DB-aware controls; they are read-only.
I could call a Get on the record when editing an existing one, I guess, using the PK, but that won't help for an insert, as I don't know what the new PK is.
Example of where I attempt to ApplyUpdates to my CDS (actDSSave is a TDataSetPost action):
dsState := actDSSave.DataSource.DataSet.State;
DoApplyUpdates(-1);
if dsState = dsEdit then
  TClientDataSet(actDSSave.DataSource.DataSet).RefreshRecord;
I am using a TIBQuery for my dataset, attached to the remote DataSetProvider. The query's SQL is a simple select * from client where client_id = :client_id. I have tried associating this query with a TIBUpdateSQL too, as well as setting poAutoRefresh to True in the DataSetProvider.
So is it possible to obtain these trigger-generated values this way, or do I need to approach it differently? Another way I can think of is to create stored procedures which do CRUD against each table and use those instead (with appropriate in/out params to return this new data), but hopefully I don't have to go down that track. Hopefully I have provided sufficient info here to explain and replicate the issue.
Thanks
EDIT
I realised that in the above, DoApplyUpdates(-1) is my own method. Its implementation at the moment is simply:
FdatCommon.cdsClient.ApplyUpdates(MaxErrorCount);
FdatCommon is a TDataModule containing my CDS.
You simply can't get "generated" values without re-querying (RefreshRecord) the data after Post.
That's because the triggers run on the server side when you call ApplyUpdates, but TClientDataSet does not refresh the posted record by default. Other libraries, FIBPlus for example, have an option to do this automatically.
As for inserts, TIBDataSet has a GeneratorField property. Using it, the dataset queries and increments the generator value separately, before the insert, so you will have the PK value after Post even on inserts. But then avoid applying the generator again in the trigger.
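A minimal sketch of that setup (written from memory of IBX, so verify the property and enum names against your version; the generator and field names follow the Client table above):

// Have IBX fetch gen_client_id on the client so that client_id is
// already known before ApplyUpdates runs (names from the question).
IBDataSet1.GeneratorField.Generator := 'gen_client_id';
IBDataSet1.GeneratorField.Field := 'client_id';
IBDataSet1.GeneratorField.ApplyOnEvent := gamOnNewRecord; // assign as soon as Insert is called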
MIDAS (TClientDataSet) is a great library, but its general/universal architecture loses DB-specific features (such as retrieving values from inserts) compared to libraries dedicated to a specific DBMS, such as FIBPlus. By the way, I have seen TpFIBClientDataSet; it works in conjunction with TpFIBDataSet.
Related
I need to call a vendor procedure that searches the database for possible matches. The input parameters are entered in a global temp table, then a procedure needs to be called that fills another global temp table with possible matches. Any thoughts on the best way to do this with APEX?
This is a vendor database; I really can't change anything. The vendor procedure requires that I load parameters into their GTT, run their procedure, then get the results from their result GTT. I'm new to APEX and just trying to figure out the best way to handle that: what type of APEX object do I use to load the parameters into the parameter GTT? How do I call the procedure when the parameter row is saved? What APEX object should I use to display the result GTT - a report, a grid?
Data in a global temporary table (GTT) is "private", i.e. it can only be accessed within the same transaction or the same session (the session scope would probably be your choice, so you'd create the GTT with ON COMMIT PRESERVE ROWS). As long as you do everything in the same session, that would work.
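For reference, a session-scoped GTT is declared like this (the table and column names here are just placeholders):

CREATE GLOBAL TEMPORARY TABLE gtt_search_params (
  param_name   VARCHAR2(30),
  param_value  VARCHAR2(100)
) ON COMMIT PRESERVE ROWS;

Rows inserted into it survive COMMIT, but they are visible only to the inserting session and disappear when that session ends.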
On the other hand, if there are several sessions involved, you're probably out of luck and will have to change the approach. The most obvious alternative is to use a normal table (not a global temporary one) or - if possible - APEX collections.
As an example:
I have two tables in Firebird:
TB_CUSTOMER
  IDCUSTOMER (autoincrement via generator)
  CUSTOMERNAME
TB_PHONE
  IDPHONE
  IDCUSTOMER (foreign key from TB_CUSTOMER)
  PHONE
I have a registration form developed in Delphi. The TB_PHONE table's data is handled using a DBGrid. I cannot assign the value of the IDCUSTOMER field in TB_PHONE, because it has not yet been generated by the Firebird generator. How can I make the relationship between the tables? I want to implement it without first saving the TB_CUSTOMER data. I'm using data modules with IBDAC.
Any suggestions?
Before the detail table can be inserted into, the PK index over the master table must be updated and contain the proper master ID. That means some piece of code has to insert the master record before inserting the detail record. Where that piece of code lives is limited only by your imagination.
A few arrangements include:
1. insert the master row in your application; read the ID of the new row; insert the detail row using this ID
2. read the ID from the generator, then insert both rows (master first) using the obtained ID (see the sketch after this list)
3. create a stored procedure inserting both rows and returning the ID (implementing #1 or #2 server-side)
4. use EXECUTE BLOCK - basically an ad hoc anonymous SQL procedure. That is only available in FB 2.x and, apart from not polluting the namespace, is inferior to #3
5. add a BEFORE INSERT trigger to the detail table, searching for the ID in the master and adding one if not found. This would slow down all insert operations (even when the master ID already exists - that still has to be checked), could not fill any master columns other than the ID, and is potentially dangerous because it hides application logic problems. But it can still be implemented (an ugly and dirty method)
6. create a master-join-detail VIEW and add an INSERT trigger for it, propagating the new view row into both the master table and the detail table
et cetera
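A sketch of option 2 in Firebird SQL (the generator name GEN_IDCUSTOMER is an assumption, since the question doesn't name it; note that if a before-insert trigger assigns the PK, it should only do so when the incoming value is null):

/* fetch the next ID client-side */
SELECT GEN_ID(GEN_IDCUSTOMER, 1) FROM RDB$DATABASE;

/* then insert both rows using the fetched value */
INSERT INTO TB_CUSTOMER (IDCUSTOMER, CUSTOMERNAME) VALUES (:ID, 'John Doe');
INSERT INTO TB_PHONE (IDPHONE, IDCUSTOMER, PHONE) VALUES (1, :ID, '555-1234');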
I want to implement it without first saving the table data TB_CUSTOMER
There's your problem. You need the primary key from the master table before you can save the detail. That's just the way it works. But if what you want is to make sure that the values get saved together, you can do that as a transaction. In Firebird, you can do it like this (a SQL sketch follows the steps):
1. Begin a transaction. Exactly how you do that depends on which DB library you're using to access your Firebird database.
2. Run an INSERT INTO ... RETURNING statement to insert the row into your master table and retrieve the generated value as a single operation.
3. Use the generated PK value to fill in the FK value on your detail table.
4. Insert the detail row.
5. Commit the transaction.
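Steps 2-4 would look roughly like this in Firebird SQL, using the question's table names (the IDPHONE value is just an example):

INSERT INTO TB_CUSTOMER (CUSTOMERNAME)
VALUES ('John Doe')
RETURNING IDCUSTOMER;  /* step 2: the generated PK comes back with the insert */

INSERT INTO TB_PHONE (IDPHONE, IDCUSTOMER, PHONE)
VALUES (1, :IDCUSTOMER, '555-1234');  /* steps 3-4: reuse the returned value */

Both statements run inside the transaction begun in step 1, so either both rows persist or neither does.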
I have a TADOCommand on my data module which is connected to an MSSQL stored procedure. The stored procedure is used to update a table.
In my code I call the TADOCommand in the BeforeUpdateRecord method of one of my TClientDataSets.
First I supply values to the TADOCommand parameters using the DeltaDS.FieldByName().NewValue of the TClientDataSet, then I call the Execute procedure. It works OK for the first update, but if I try to do a second update it generates "error changing varchar to datetime".
If I dynamically create the TADOCommand in the BeforeUpdateRecord method, i.e.
sp1_editcontract := TADOCommand.Create(nil);
try
  sp1_editcontract.CommandType := cmdStoredProc;
  sp1_editcontract.Connection := DMDBconn.DBConn;
  sp1_editcontract.CommandText := 'EditContract';
  sp1_editcontract.Parameters.Refresh;
  // assign parameter values
  sp1_editcontract.Execute;
finally
  sp1_editcontract.Free;
end;
it works without any errors. I think there is some problem with the parameter values when using the static TADOCommand on the data module.
Why do multiple updates generate an error when using the statically created TADOCommand, but not with the dynamically created one?
I'm going to assume you are referring to TDataSetProvider.BeforeUpdateRecord and not TClientDataSet.BeforeUpdateRecord.
It's kinda hard to say from the information you've provided (you don't indicate the data types or order of the arguments for the stored procedure). The error message is coming from the SQL Server engine. I would make sure the values being assigned to the parameters are always set in the correct order. Also try to identify which parameter is causing the error. If you can reliably reproduce it in your client code, you may try calling the stored procedure in SSMS, passing the same values that cause the error in the client application.
Once you identify the parameter, you can check that its data type is consistent between the TADOCommand, DataSetProvider and ClientDataSet. If it changes type along the way, that may be the cause of the error.
One last suggestion: make sure you set the Applied parameter (a var parameter of the BeforeUpdateRecord event) to True before exiting the handler. This prevents the dataset provider from trying to apply the update using dynamic SQL after you've already applied it yourself. If the data in the client dataset was populated by a TADOQuery, it may be attempting to update the tables directly.
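A sketch of such a handler (the provider and parameter names here are invented; the essential parts are assigning the parameters from DeltaDS and setting Applied):

procedure TDMDBconn.prvContractBeforeUpdateRecord(Sender: TObject;
  SourceDS: TDataSet; DeltaDS: TCustomClientDataSet;
  UpdateKind: TUpdateKind; var Applied: Boolean);
begin
  // assign every parameter, in declaration order, from the delta's new values
  sp1_editcontract.Parameters.ParamByName('@ContractID').Value :=
    DeltaDS.FieldByName('ContractID').NewValue;
  // ... assign the remaining parameters the same way ...
  sp1_editcontract.Execute;
  Applied := True; // stop the provider from generating its own UPDATE
end;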
I had a similar problem. In order to clear all the existing parameters of the TADOCommand before adding new ones, I used the following code:
while Command.Parameters.Count > 0 do
  Command.Parameters.Delete(0);
Parameters.Clear should have worked, but it didn't, so I decided to remove the parameters one by one. That fixed it for me.
I have the challenge of needing to audit data changes made by users of an MVC application.
Auditing creation and deletion of records is easy.
Updates are proving to be the problem.
I'm looking for a way to automate this, but the problem I have is that the application is using stored procedures to bring back EF "complex types".
These are then used to build a view model, and after postback, the controller receives a new view model built from the form values passed back from the view. Therefore the original values are no longer available.
Does anyone have any suggestions for a secure way to keep the original values so they can be compared with the updated values, so that changes can be stored?
(I appreciate that I could go back to the database for these, but that is not efficient, and I would have to retain all the parameters to remake the same call, and find a way to automate that part of the process.)
Have you tried an audit trigger using the INSERTED and DELETED tables?
http://weblogs.asp.net/jgalloway/archive/2008/01/27/adding-simple-trigger-based-auditing-to-your-sql-server-database.aspx
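A rough sketch of that approach (all table and column names here are invented for illustration):

CREATE TRIGGER tr_Orders_Audit ON Orders
AFTER UPDATE
AS
BEGIN
    INSERT INTO OrdersAudit (OrderID, OldStatus, NewStatus, ChangedAt)
    SELECT d.OrderID, d.Status, i.Status, GETDATE()
    FROM deleted d
    INNER JOIN inserted i ON i.OrderID = d.OrderID
END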
OR
In your stored procedures for insert, delete and update, you can make use of FOR XML AUTO to get the XML for the record and add it to an audit table.
http://www.a2zdotnet.com/View.aspx?Id=71
UPDATE: a T-SQL example
BEGIN
    -- these tables would be in your database
    DECLARE @table TABLE(ID INT IDENTITY(1,1) PRIMARY KEY, STR VARCHAR(10), DT DATETIME)
    DECLARE @audit_table TABLE(AuditXML XML, Type VARCHAR(10), Time DATETIME)

    -- this is defined at the top of your stored procedure
    DECLARE @temp_table TABLE(PK INT)

    -- your stored procedure will add an OUTPUT to the temp table
    INSERT INTO @table
    OUTPUT inserted.ID INTO @temp_table
    VALUES ('test1', GETDATE()),
           ('test2', GETDATE() + 2)

    -- at the end of your stored procedure, update your audit table
    INSERT INTO @audit_table
    VALUES(
        (
            SELECT *
            FROM @table AS t
            WHERE ID IN (SELECT PK FROM @temp_table)
            FOR XML AUTO
        ),
        'INSERTION',
        GETDATE()
    )

    -- your audit table will have the record data
    SELECT * FROM @audit_table
END
In the example above you could make @temp_table a clone of @table (give it all of the columns from @table) and in your OUTPUT clause use INSERTED.* INTO @temp_table; this would avoid having to reselect the records before getting the FOR XML AUTO. Another note: for stored procedures that DELETE, you would use DELETED.* instead of INSERTED.* in your OUTPUT clause.
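That variant would look like this, reusing the tables declared above (@temp_table2 is a new variable, since a batch can't redeclare @temp_table):

-- clone of @table's columns, so OUTPUT can capture whole rows
DECLARE @temp_table2 TABLE(ID INT, STR VARCHAR(10), DT DATETIME)

INSERT INTO @table
OUTPUT INSERTED.* INTO @temp_table2
VALUES ('test3', GETDATE())

-- no re-select against @table is needed before building the XML
INSERT INTO @audit_table
SELECT (SELECT * FROM @temp_table2 AS t FOR XML AUTO), 'INSERTION', GETDATE()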
If using SQL Server I recommend that you look into Change Data Capture (CDC).
It's an out of the box solution for auditing changes to the underlying tables of your application and it's relatively straightforward to set up, so there is no need for a custom solution that you then have to maintain.
If you have any supporting applications for your site, they'll also be covered and it also has the benefit of auditing any changes made directly against the database, such as from a DBA running a script.
Since your ASP.NET application may be running under one particular account, you'll probably need to add additional tracking information to capture the user who made the change. Fortunately this is also relatively straightforward. The following Stack Overflow question covers an approach to this using the ObjectStateManager.
I was looking for this myself and found this: check out Tracker for EF.
I'm coming from the Delphi world and I want to make a master/detail interface, like Order and Products.
I have already made actions to display the data using fields and a jqGrid. What I want to know is how to make it possible to add lines, edit them or remove them, but only apply the changes to the DB when the user confirms the changes in the master.
In Delphi I would use a TClientDataSet holding all the in-memory changes, and only after the confirmation would I execute them inside a transaction, like:
BEGIN TRANSACTION
  Master.Post
  FOREACH Line IN Lines
    Line.Post
COMMIT
So, in summary: I don't know how to keep the array of grid lines in memory, nor how to send them back to the server to commit.
Any help will be appreciated. Thanks in advance.
You'll need to keep track of the changes client-side, perhaps using some hidden fields and/or form fields in your grid. When a line that previously existed in the DB is deleted, you'll need to add its id to a field containing the lines to delete. Lines that are added need associated form fields containing their data. When the master is committed, you roll the whole set of fields up into a POST and send that back to the server.
Using LINQ to SQL, you'd create a data context, get the master object, then delete the related objects (from the hidden field of ids) that are marked for deletion, and create/add new related objects that didn't exist before, taking the values from the appropriate form fields. Then you'd call SubmitChanges, and all of the statements would be executed within a single transaction.