So, here is the situation -
I insert an item into the database by calling AddtoObject() and then call SaveChanges().
Then I call a stored procedure to update the just-inserted record.
Then I call SaveChanges() again.
When I query the database it has the correct updated value, but the Entity Framework context does not have the updated values the first time. Whenever I refresh the page it picks up the value, but on that first load it never gets the updated values.
So, has anyone faced a similar issue? What am I doing wrong here?
The problem is that EF does not know what your stored procedure is doing; how could it? That work is done on the SQL Server. So after your stored procedure executes, you need to ask EF to update that instance (and any related ones) by issuing a Refresh() call:
context.Refresh(RefreshMode.StoreWins, myObject);
The StoreWins tells the framework to overwrite values in the instance with values from the database.
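For context, here is a minimal sketch of the whole flow, assuming an ObjectContext-based model with an Orders entity set and a hypothetical UpdateOrderTotals stored procedure (none of these names come from the question):

using (var context = new MyEntities())
{
    var order = new Order { Quantity = 3 };
    context.Orders.AddObject(order);   // or the generated AddToOrders(order)
    context.SaveChanges();

    // The stored procedure changes the row on the server; EF cannot see that.
    context.ExecuteStoreCommand("EXEC UpdateOrderTotals @p0", order.OrderId);

    // Overwrite the cached values with the current store values.
    context.Refresh(RefreshMode.StoreWins, order);
}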
This question is now a curiosity, more than anything. Dates will be the end of me.
Using EF 6.
I am storing a date and in the same http request, pulling the object back out of the database.
When I look at the SQL which EF sends, the milliseconds of the returned date are the same as those stored in the db (expected behaviour).
BUT, when EF deserializes that into the object graph in memory, the milliseconds are different.
So, I save '2018-10-16 21:46:22.293'
SQL retrieves '2018-10-16 21:46:22.293'
EF deserializes it to '2018-10-16 21:46:22.294'!
I created a workaround by hitting the db with a raw ADO.NET query that gets the exact date ('2018-10-16 21:46:22.293').
Even weirder, if I use a fresh DbContext and grab the whole object with that, the date is fine, i.e. '2018-10-16 21:46:22.293'.
So it is only when I use the same DbContext that saved the data to retrieve it that the date gets rounded (or something).
Has anyone seen this weird behaviour? Is there a better fix than either raw SQL (ADO.NET) or a fresh DbContext?
Cheers
You can bypass the EF cache by appending .AsNoTracking() to your retrieval query.
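For example, a minimal sketch assuming an EF 6 DbContext with a Jobs set (the names are illustrative, not from the question):

using System.Data.Entity; // AsNoTracking() lives here in EF 6

// Read past the identity map so the values are materialized from the
// database rather than served from the instance the context is tracking.
var saved = db.Jobs.AsNoTracking().Single(j => j.Id == jobId);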
I'm using Entity Framework (DbContext with database first) with MVC. When the user saves from a form, I have a condition in the controller that sends the entity to the update or insert method depending on some internal flag of mine.
When sending the entity to the update method, I flag it as modified using context.Entry(myEntity).State = EntityState.Modified;, I call SaveChanges() and everything works well.
When sending the entity to the insert method, I flag it as added using context.Entry(myEntity).State = EntityState.Added;, but when calling SaveChanges() I receive an error about 2 fields that are required...
The problem is that those 2 fields are not empty; they contain valid data just before saving... I have even tried to force new values into those 2 fields just before saving, but I get the same error.
It may be useful to mention that I'm using Devart DotConnect For PostgreSQL as the db provider.
Any idea how to debug this problem?
EDIT:
Here is the error:
Validation failed for one or more entities. See 'EntityValidationErrors' property for more details.
When looking for this EntityValidationErrors I receive the 2 following specific errors:
The flg_actif field is required
The user_creation field is required
As mentioned before, those fields are filled with data just before saving, so I don't understand what is happening.
I'm using EF v4.0.30319 (system.data.entity=> v4.0 and EntityFramework=> v4.4)
EDIT2:
Just to clarify a little bit more: the entity I'm trying to insert already exists in the database. The form shows the data of this database row. When saving, I decide whether to update the row (this works well), but sometimes I need to insert the edited row as a new record instead of updating it, to keep a history of the changes in the database.
Could you verify whether the EntityKey property is set or null on the items you are trying to save?
If it already has a key, the context is already aware of the item, and you should use Attach instead of setting the state to added manually.
EDIT: To summarise the point from below. It looks like what you are doing is inserting a new copy of a row already associated with a context. That is almost certainly your problem. Try creating a fresh object based on your original row (i.e. copy the variable values or use a copy constructor), then add that new object.
Additionally, you should not need to set the state manually on a newly added object. You are trying to force the state here because the context doesn't see that item as a new one.
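A minimal sketch of that copy-then-add approach, assuming a hypothetical Contract entity (only the two field names come from the validation errors above; everything else is illustrative):

// Build a detached copy of the row; copy the values, not the key.
var copy = new Contract
{
    flg_actif     = original.flg_actif,
    user_creation = original.user_creation
    // ...copy the remaining columns, but not the primary key...
};
context.Contracts.Add(copy);  // a brand-new object, so its state becomes Added
context.SaveChanges();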
I have a (Firebird) DB. For most of my tables I have a trigger which fires before insert and creates the primary key (PK) for me via a generator, as well as writing a Created Date value and a Created By value to the newly inserted record. I also have an update trigger which writes to an Updated Date field and an Updated By field.
e.g. (Client is a table in my DB):
create trigger t_client_id for client
active before insert
as begin
new.client_id = gen_id(gen_client_id, 1);
new.created = current_timestamp;
new.created_by = current_user;
new.lock_vn = 1;
end ^
create trigger t_client_update for client
active before update
as begin
new.updated = current_timestamp;
new.updated_by = current_user;
end ^
When I apply updates through my ClientDataSet (CDS), which is attached to a remote TDataSetProvider via a TDSProviderConnection, how can I "retrieve" these generated values? If I edit an existing record (which will in turn fire the t_client_update trigger), calling RefreshRecord will get the updated and updated_by fields. However, the documentation says to use that method cautiously, so it may not be the correct way to achieve this. I call it straight after I've called ApplyUpdates(-1).
The CDS I use contains only the one record I am attempting to edit. For a new record, the CDS is in dsInsert mode. Everything is written to the DB OK, so I just need to get this new data back out again. I have also tried using a CDS which contains ALL records in the table to see if it was any simpler, but unsurprisingly it made no difference. The reason I need this information is simply to show these values to the user in DB-aware controls; they are read-only.
I could call a Get on the record when editing an existing one, using the PK, but that won't help for an insert as I don't know what the new PK is.
Example of where I attempt to ApplyUpdates on my CDS (actDSSave is a TDataSetPost action):
dsState := actDSSave.DataSource.DataSet.State; // remember the state before posting
DoApplyUpdates(-1);
if dsState = dsEdit then
  // re-fetch the row so the update trigger's changes become visible
  TClientDataSet(actDSSave.DataSource.DataSet).RefreshRecord;
I am using a TIBQuery for my dataset, attached to the remote DataSetProvider. The query's SQL is a simple select * from client where client_id = :client_id. I have tried associating this query with a TIBUpdateSQL too, as well as setting poAutoRefresh to True on the DataSetProvider.
So is it possible to obtain these trigger-generated values this way, or do I need to approach it differently? Another way I can think of is to create stored procedures which do CRUD against each table and use those instead (with appropriate in/out params to return the new data), but hopefully I don't have to go down that track. Hopefully I have provided sufficient info here to explain and replicate the issue.
Thanks
EDIT
Realised that in the above, DoApplyUpdates(-1) is my own method. Its implementation at the moment is simply:
FdatCommon.cdsClient.ApplyUpdates(MaxErrorCount);
FdatCommon is a TDataModule containing my CDS.
You simply can't get "generated" values without re-querying the data (RefreshRecord) after Post.
That's because the triggers run on the server side when you call ApplyUpdates, but TClientDataSet does not refresh the posted record by default. Other libraries, FIBPlus for example, have an option to do it automatically.
As for inserts, TIBDataSet has a GeneratorField property. Using it, the dataset queries and increments the generator value separately, before the insert, so you will have PK values after Post even on inserts. But then avoid applying the generator again in the trigger.
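One sketch of that guard, reworking the insert trigger from the question so it only calls the generator when the client has not already assigned the PK (the NULL check is my assumption):

create trigger t_client_id for client
active before insert
as begin
  /* only generate a key when the client did not supply one */
  if (new.client_id is null) then
    new.client_id = gen_id(gen_client_id, 1);
  new.created = current_timestamp;
  new.created_by = current_user;
  new.lock_vn = 1;
end ^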
MIDAS (TClientDataSet) is a great library, but its general/universal architecture loses DB-specific features (such as retrieving values from inserts) compared to libraries dedicated to a specific DBMS, such as FIBPlus. By the way, I have seen TpFIBClientDataSet; it works in conjunction with TpFIBDataSet.
I have a TADOCommand on my data module which is connected to an MSSQL stored proc. The stored proc is used to update a table.
In my code I call the TADOCommand in the BeforeUpdateRecord method of one of my TClientDataSets.
First I supply values to the TADOCommand parameters using the DeltaDS.FieldByName().NewValue of the TClientDataSet, then I call the Execute procedure. It works OK for the first update, but if I try to do a second update it generates "error changing varchar to datetime".
If I dynamically create the TADOCommand in the BeforeUpdateRecord method, i.e.
sp1_editcontract := TADOCommand.Create(nil);
try
  sp1_editcontract.CommandType := cmdStoredProc;
  sp1_editcontract.Connection := DMDBconn.DBConn;
  sp1_editcontract.CommandText := 'EditContract';
  sp1_editcontract.Parameters.Refresh; // re-read the parameter definitions from the server
  // assign parameter values
  sp1_editcontract.Execute;
finally
  sp1_editcontract.Free;
end;
it works without any errors. I think there is some problem with the parameter values when using the static TADOCommand on the data module.
Why does a repeated update generate an error with the statically created TADOCommand but not with the dynamically created one?
I'm going to assume you are referring to TDatasetProvider.BeforeUpdateRecord and not TClientDataSet.BeforeUpdateRecord.
It's kinda hard to say from the information you've provided (you don't indicate the data types or the order of the arguments of the stored procedure). The error message is coming from the SQL Server engine. I would make sure the values being assigned to the parameters are always set in the correct order. Also try to identify which parameter is causing the error. If you can reliably reproduce it in your client code, you may try calling the stored procedure in SSMS, passing the same values that are causing the error in the client application.
Once you identify the parameter, you can check that its data type is consistent between the ADOCommand, DataSetProvider and ClientDataSet. If it changes type along the way, that may be the cause of the error.
One last suggestion: make sure you set the event's Applied parameter to True before exiting the BeforeUpdateRecord handler. This prevents the dataset provider from trying to apply the update with dynamic SQL after you've already applied it yourself. If the data in the client dataset was populated by a TADOQuery, it may be attempting to update the tables directly.
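A minimal sketch of the handler shape this implies; the data module method, parameter names and fields are illustrative, not from the question:

procedure TDMDBconn.prvContractBeforeUpdateRecord(Sender: TObject;
  SourceDS: TDataSet; DeltaDS: TCustomClientDataSet;
  UpdateKind: TUpdateKind; var Applied: Boolean);
begin
  // assign every parameter explicitly, every time, in the declared order
  sp1_editcontract.Parameters.ParamByName('@contract_id').Value :=
    DeltaDS.FieldByName('contract_id').NewValue;
  sp1_editcontract.Parameters.ParamByName('@start_date').Value :=
    DeltaDS.FieldByName('start_date').NewValue;
  sp1_editcontract.Execute;
  Applied := True; // stop the provider from generating its own dynamic SQL
end;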
I had a similar problem. In order to clear all the existing parameters of the ADOCommand before adding new ones, I used the following code:
while Command.Parameters.Count > 0 do
  Command.Parameters.Delete(0);
Parameters.Clear should have worked, but it didn't, so I decided to remove the parameters one by one. That fixed it for me.
In LINQ to Entities, I map the result set of a stored procedure to an entity.
Within the stored procedure, I execute some update statements and return the result set by running a SELECT query; that result set is mapped to the entity.
The database rows get updated correctly, but the entities returned do not reflect the changes. Instead, the data from before the update is returned.
Any suggestions?
Thank you.
Abe
Actually, it turns out the DataContext.Refresh method solved my problem: http://msdn.microsoft.com/en-us/library/system.data.linq.datacontext.refresh.aspx
Here's my code:
db.Refresh(System.Data.Objects.RefreshMode.StoreWins, affectedProjectTasks);
Thanks Marc for pointing me in the right direction!
Abe
Are the entities in question already cached in the context (i.e. have you queried them already)?
If so, the identity manager will always give you back the original object (rather than creating a new object with the same identity in the same context). Hence, for data that has already been read (by other queries), only the identity/primary-key field(s) of the incoming rows are considered; the cached instance keeps its existing values.
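A minimal sketch of that behaviour, assuming an ObjectContext-style model with a hypothetical UpdateProjectTasks function import (names are illustrative):

// The first query puts the entity into the identity map.
var task = context.ProjectTasks.First(t => t.Id == 42);

// The function import updates rows server-side and returns them, but each
// returned row is matched to the cached instance by its key, so the tracked
// object keeps its stale non-key values.
var rows = context.UpdateProjectTasks(42).ToList();

// StoreWins overwrites the cached values with what is now in the database.
context.Refresh(RefreshMode.StoreWins, task);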