Wondering if someone can help me understand how a DB2 before-insert trigger behaves. I have a Grails app that inserts rows into a DB2 database. The table in question has a before-insert trigger that sets the last-update timestamp and user ID:
CREATE TRIGGER WTESTP.SCSMA11I
  NO CASCADE BEFORE INSERT ON WTESTP.SCSMA01T
  REFERENCING NEW AS NEWROW
  FOR EACH ROW MODE DB2SQL
BEGIN ATOMIC
  SET NEWROW.LST_UPDT_TMSP = CURRENT_TIMESTAMP;
  SET NEWROW.USER_ID = RTRIM(USER);
END;
In my Grails application I set all the values, including the user id:
flatAdjustmentInstance.setUserID("TS37813")
We use a generic application ID and password via JNDI to make the connection to the database. For auditing purposes I need to set the value of the user to whoever logged into the application. Is the only solution to remove the trigger entirely and just make sure the application always sets the value?
The DB2 USER special register contains the authorization ID of the current database connection. If an application needs to pass a different user ID to DB2, it can do so by calling the sqleseti() API function or the WLM_SET_CLIENT_INFO() stored procedure -- more info here. The trigger can then reference a different special register, CURRENT CLIENT_USERID.
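For example, here is a rough sketch of how the two pieces could fit together (this assumes DB2 for LUW, and reuses the hard-coded ID from the question purely for illustration):

-- from the application's connection, before the INSERT:
CALL SYSPROC.WLM_SET_CLIENT_INFO('TS37813', NULL, NULL, NULL, NULL);

-- the trigger then reads the client user ID instead of the connection's authorization ID:
CREATE TRIGGER WTESTP.SCSMA11I
  NO CASCADE BEFORE INSERT ON WTESTP.SCSMA01T
  REFERENCING NEW AS NEWROW
  FOR EACH ROW MODE DB2SQL
BEGIN ATOMIC
  SET NEWROW.LST_UPDT_TMSP = CURRENT_TIMESTAMP;
  SET NEWROW.USER_ID = RTRIM(CURRENT CLIENT_USERID);
END;

The Grails side would then need to issue the CALL on the connection it gets from the pool, once the logged-in user is known, rather than setting the column directly.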
I want to update the values email, firstname and lastname only if they are blank.
I need this so that if the user decides to change these in the settings, they are not overwritten every time the user logs in with Facebook.
Are there any solutions to check whether the fields are blank without a DataSnapshot? I'm trying to maximise efficiency.
Current code when the user signs in with Facebook:
Database structure for each user:
One way to do this is to use a Firebase transaction.
A transaction allows you to check the current value of a DB reference before you set/update it. Its main use case is preventing conflicting concurrent updates from multiple sources, but it can be used for this case as well - read and then write.
In the transaction block you get the value of the DB ref you're transacting on and can check whether the value is null (the 'create' case), then update it as required and return TransactionResult.success(withValue: newData).
If the object is already set, you simply abort the transaction with TransactionResult.abort() and no write to the DB is executed.
Another option, which doesn't require a read and then a write, is to set a Firebase Database rule on the relevant ref that will only allow a write if the previous value was null:
"refPath": {
".write": "data.val() == null && newDataval() != null"
}
Writing a second time to the DB for an existing ref will fail.
I'd go with the transaction - more expressive of the requirement in the client code.
In Firebase, the only way to check whether the current values of your fields in the database are empty is to fetch them before setting them.
Once you have fetched them, use this code to update a particular value:
ref.child("yourKey").child("yourKey").updateChildValues(["email": yourValue])
Hi and apologies in advance if the question has already been asked. I haven't been able to come across the answer.
I'm wondering if there is a table that holds a record of Oracle usernames that have executed a particular procedure or function.
I'm trying to create a procedure that can be called as a subprogram by another procedure. The procedure I'm looking to create will add a log entry every time the other procedure is executed. Example below:
User_Name = The Oracle user name of the person who executes the function.
Name = The name of the procedure or function.
LastCompileDT = The date/time the function or procedure was last compiled.
I'm a bit stuck on where to source the data from.
I've come across the ALL_SOURCE view, but it only gives me the owner of the procedure and not the executing user.
Any feedback would be greatly appreciated.
Thanks
There might be a couple of ways to do that. Maybe someone else can suggest a method of extracting all this data from one data dictionary view. However, my method would be like this:
User_Name: use the keyword USER. It returns the Oracle user that executed the procedure:
SELECT USER FROM DUAL;
However, if you are interested in the OS user who executed the procedure, you can use the following:
SELECT sys_context( 'userenv', 'os_user' ) FROM DUAL;
More on this here. To my knowledge, this can be fetched on the fly only, and it is not logged anywhere by default. So you need to run it when you call the procedure.
Procedure Name: &
LastCompileDT : can be fetched from the view USER_OBJECTS
SELECT OBJECT_NAME, LAST_DDL_TIME
FROM USER_OBJECTS
WHERE OBJECT_TYPE = 'PROCEDURE'
AND OBJECT_NAME = '<YOUR PROCEDURE NAME>';
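If it helps, here is a minimal sketch tying those pieces together into the logging subprogram you describe (the log table and procedure names are made up for illustration, and it sticks with USER_OBJECTS as above):

CREATE TABLE proc_execution_log (
  user_name       VARCHAR2(128),
  object_name     VARCHAR2(128),
  last_compile_dt DATE,
  executed_at     DATE
);

CREATE OR REPLACE PROCEDURE log_execution (p_proc_name IN VARCHAR2)
AS
BEGIN
  -- USER is the executing Oracle user; LAST_DDL_TIME serves as the last compile time
  INSERT INTO proc_execution_log (user_name, object_name, last_compile_dt, executed_at)
  SELECT USER, o.object_name, o.last_ddl_time, SYSDATE
  FROM   user_objects o
  WHERE  o.object_type = 'PROCEDURE'
  AND    o.object_name = UPPER(p_proc_name);
END log_execution;
/

Each audited procedure would then call log_execution('<ITS_OWN_NAME>') at the start of its body.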
Rather than rolling your own audit, you could use the built-in auditing provided.
See https://docs.oracle.com/cd/B19306_01/server.102/b14200/statements_4007.htm
--Create a test procedure as an example
CREATE PROCEDURE my_test_proc
AS
BEGIN
NULL;
END my_test_proc;
--Turning on auditing executions of the proc
AUDIT EXECUTE ON my_test_proc BY ACCESS WHENEVER SUCCESSFUL;
--Run the proc
EXEC my_test_proc;
--check audit history
SELECT *
FROM dba_common_audit_trail cat
WHERE cat.object_name = 'MY_TEST_PROC';
The dba_common_audit_trail view has the columns DB_USER and OBJECT_NAME for your User_Name/Name.
For the last compiled time see Hawk's answer, or if you want to see a history of last DDL times you can add this to the audit:
--Turn on auditing of creating procs
AUDIT CREATE PROCEDURE BY ACCESS;
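To narrow the SELECT * down to just the fields you asked about, a query along these lines should work (assuming the standard EXTENDED_TIMESTAMP column on the same view):

SELECT cat.db_user, cat.object_name, cat.extended_timestamp
FROM dba_common_audit_trail cat
WHERE cat.object_name = 'MY_TEST_PROC'
ORDER BY cat.extended_timestamp DESC;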
I've not been able to find an answer to this anywhere. I'm using Delphi XE7 with TClientDataSet, DataSnap & SQL Server. I need to insert a record, apply updates and then refresh that record so I can get the Id and assign it to my object. It seems like a pretty basic requirement, but it is proving to be a royal pain.
I've found the obvious stuff on EDN, SO and Dr Bob:
http://edn.embarcadero.com/article/20847
DataSnap and the autoinc field
http://www.drbob42.com/examines/examinC0.htm
However, these seem to focus on a "Refresh" of the TClientDataSet, which re-fetches the entire table/query. Whilst this does actually resolve the Id field itself (good!), it also moves the cursor off the record which was just inserted, so I'm not able to get the Id and assign it to my object. Also, for performance over HTTP I don't really want to re-fetch the entire table every time a record is inserted; if there are 10,000 records this will consume too much bandwidth and be ridiculously slow!
Consider the following code:
function TRepository<I>.Insert(const AEntity: I): I;
begin
  FDataSet.DisableControls;
  try
    FDataSet.Insert;
    AssignEntityToDataSet(AEntity); // SETS ALL THE RELEVANT FIELDS
    FDataSet.Post;
    FDataSet.ApplyUpdates(-1);
    FDataSet.Refresh; // <--- I tried RefreshRecord here but it cannot resolve the record
    AEntity.Id := FDataSet.FieldByName('Id').AsInteger; // <----- THIS NOW POINTS TO WRONG ROW
  finally
    FDataSet.EnableControls;
  end;
end;
Does anyone know how to achieve this? I need to be able to refresh and stay on the current record, otherwise I do not know the Id of the record just created and the GUI cannot stay focused on the current record.
Hopefully something obvious I'm missing.
Cheers.
Rick.
Assuming you can get your hands on the new ID inside the AfterUpdateRecord event of your provider, your event handler may then look like this (the current record of DeltaDS is the one just inserted into SourceDS):
if (UpdateKind = ukInsert) then begin
  DeltaDS.FindField('Id').NewValue := <TheNewID>;
end;
Make sure to have the poPropogateChanges option set in the provider. This will transfer the changed Id field back to the ClientDataSet.
Now you can get rid of the FDataSet.Refresh call.
SQL Server does allow you to get the last identity it generated in several ways - there's no need to "refresh" the record/query, which means re-issuing a SELECT and can generate undesirable side effects. You can use SELECT SCOPE_IDENTITY() or use an OUTPUT clause. If the Delphi database driver supports it, TField.AutoGenerateValue should accomplish that task automatically (see http://docwiki.embarcadero.com/Libraries/XE7/en/Data.DB.TField.AutoGenerateValue)
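For example, either approach returns the new identity without re-reading the table (table and column names below are only illustrative):

-- 1) ask for the identity generated in the current scope, right after the insert:
INSERT INTO FlatAdjustment (Description) VALUES ('test');
SELECT SCOPE_IDENTITY() AS Id;

-- 2) or have the INSERT itself return it via an OUTPUT clause:
INSERT INTO FlatAdjustment (Description)
OUTPUT INSERTED.Id
VALUES ('test');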
Otherwise you have to put that new data into your delta (see Raabe's answer - this has to be done on the DataSnap server, which actually talks to the database) after reading it, so it's sent back to the client. You also need to set TField.ProviderFlags properly to ensure data are applied correctly (see http://docwiki.embarcadero.com/RADStudio/XE7/en/Influencing_How_Updates_Are_Applied); usually you don't want those fields to appear in an UPDATE.
I have clients-->|cascade rule|-->orders_table-->|cascade rule|-->order_details
In my order_details I have an after-delete trigger that increments the quantity in my product table:
CREATE OR ALTER TRIGGER TABLEAU_DETAIL_VENTES_AD0 FOR TABLEAU_DETAIL_VENTES
ACTIVE AFTER DELETE POSITION 0
AS
declare variable qte numeric_15_2;
begin
  select qte_article from tableau_articles
  where id_article = old.id_article
  into :qte;

  qte = :qte + old.qte;

  update tableau_articles
  set qte_article = :qte
  where id_article = old.id_article;
end
If I delete a client, then all orders depending on it will be deleted,
and so will their order_details rows, and so on.
The problem is that the order_details after-delete trigger will then be fired and increment the product quantity, which I don't want to happen.
My question: is there any way to know whether the trigger was fired by the cascade rule or by a SQL delete statement coming from the application?
I want to achieve something like:
If the trigger was triggered by the cascade rule, then disable all triggers. Thanks in advance for your help.
You can try wrapping your delete code in a stored procedure that uses EXECUTE STATEMENT to deactivate and reactivate the triggers:
CREATE PROCEDURE DeleteClient(
  ID INTEGER)
AS
begin
  execute statement 'alter trigger TABLEAU_DETAIL_VENTES_AD0 inactive;';
  /*
    Your Delete statement here
  */
  execute statement 'alter trigger TABLEAU_DETAIL_VENTES_AD0 active;';
END^
I ended up using context variables. On my clients table I added an after-delete trigger that sets a flag using rdb$set_context:
SET TERM ^ ;
CREATE OR ALTER TRIGGER TABLEAU_CLIENTS_AD0 FOR TABLEAU_CLIENTS
ACTIVE AFTER DELETE POSITION 0
AS
declare variable id integer;
begin
execute statement 'select rdb$set_context(''USER_SESSION'', ''myvar'', 100) from rdb$database' into :id;
end
^
SET TERM ; ^
In the detail orders trigger I check my flag with rdb$get_context and skip the trigger if the flag exists with the associated value:
select rdb$get_context('USER_SESSION', 'myvar') from rdb$database into :i;
if (i = 100) then exit;
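Putting the fragments together, the detail trigger from the question ends up looking roughly like this (a sketch; the quantity update is collapsed into a single statement):

SET TERM ^ ;
CREATE OR ALTER TRIGGER TABLEAU_DETAIL_VENTES_AD0 FOR TABLEAU_DETAIL_VENTES
ACTIVE AFTER DELETE POSITION 0
AS
declare variable i integer;
begin
  select rdb$get_context('USER_SESSION', 'myvar') from rdb$database into :i;
  if (i = 100) then exit;

  update tableau_articles
  set qte_article = qte_article + old.qte
  where id_article = old.id_article;
end
^
SET TERM ; ^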
You can't determine that, but you can determine whether your foreign key is still valid. Since Firebird cascaded deletes are sequential (rows that are referenced by a foreign key are deleted first), you can check whether your old.id_article is still valid before updating the record.
I'm not sure you would achieve what you want like that. What if you just delete an order and its items? Wouldn't you want to increment quantities in that case?
Anyway... I wouldn't deactivate triggers from within triggers. That is bad design.
Use some sort of variable... update a flag in a support table. From within the client delete trigger you can set this variable. Then in the order_items delete trigger you can check it to see if you need to update quantities.
Another, better option is to analyze the situation and determine why and when you actually want to update quantities. If you are deleting an old order which has already been fulfilled and delivered, you probably wouldn't want to. If you are canceling a new order, you probably would. So maybe updating the quantities depends more on the state of the order (or some other variable) than simply on the fact that you are deleting an order_items row.
Ok, so you say orders cannot be deleted, except when deleting the client. Then maybe you should mark the client or its orders with a flag that states the client is being deleted. In the order_items delete trigger you update article quantities only if the client is not being deleted.
I'm building a messaging application. I update the badge count in the database via a sqlite trigger whenever any operation like insert/delete/read message happens.
Currently, though the value update in the DB happens asynchronously, I have no way to get notified in my application when the value changes, and so I am polling periodically.
Is there some way to setup an observer on a database value/publish some notification when a given value changes?
I know that I can do this easily by first updating the badge count in an in-memory property and then persisting the changes to the DB; but I am not very inclined to do this, since there are too many entry points for this value to change, and I don't want to add a SET property everywhere.
One possible option would be to define a trigger that is only fired when this specific value in the database is updated. The trigger should then make a call to a user-defined function you create in your app. You use the sqlite3_create_function function to add your own function to SQLite. Your trigger would look something like:
CREATE TRIGGER some_trigger_name
AFTER UPDATE OF some_column ON some_table
FOR EACH ROW BEGIN
  SELECT my_custom_function();
END;
If needed, you can pass 1 or more arguments to your function.
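For example, a sketch along these lines (the trigger, table, and column names are placeholders, and my_custom_function stands for whatever name you registered from the app via sqlite3_create_function):

CREATE TRIGGER badge_count_changed
AFTER UPDATE OF badge_count ON conversations
FOR EACH ROW
WHEN NEW.badge_count <> OLD.badge_count
BEGIN
  SELECT my_custom_function(NEW.badge_count, OLD.badge_count);
END;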
Though this might not be an option for you, Core Data does this well.