I am accessing my database through the ADO.NET Entity Framework in an MVC 3 application.
I am updating my database through a stored procedure.
But the changes are not reflected at run time. I mean to say I am able to see the changes only after restarting the application.
What is the reason for this problem and how can I avoid it?
I am using the repository pattern, so my repository code looks like this.
There is one function which saves changes:
public void SaveNewAnswer(AnswerViewModel answer, string user)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.usp_SaveNewAnswer(answer.QuestionId, answer.AnswerName, answer.AnswerText, answer.AnswerOrder, answer.Status, user);
    _entities.SaveChanges();
}
Data retrieval code:
public IEnumerable GetMultipleChoiceQuestions(string questionId)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.AcceptAllChanges();
    _entities.SaveChanges();
    return _entities.usp_GetMultipleChoiceQuestions(Int32.Parse(questionId));
}
But the changes are not reflected until I close the browser session and run it again.
Please help!
Thank you in advance.
Are you calling context.SaveChanges() on your Entities (DbContext/ObjectContext) object? Are you using a transaction that you haven't committed?
If you have an uncommitted transaction in your sproc, you can try creating your own entity transaction and seeing if committing your transaction will commit the nested transaction as well. The problem is that calling SaveChanges() automatically begins and commits a transaction, so this may not be any different than that.
I would also call _entities.AcceptAllChanges() in your save operation.
public void SaveNewAnswer(AnswerViewModel answer, string user)
{
    SurveyAdminDBEntities _entities = new SurveyAdminDBEntities();
    _entities.Connection.Open();
    System.Data.Common.DbTransaction tran = _entities.Connection.BeginTransaction();
    try
    {
        _entities.usp_SaveNewAnswer(answer.QuestionId, answer.AnswerName, answer.AnswerText, answer.AnswerOrder, answer.Status, user);
        _entities.SaveChanges(); // automatically uses the open transaction instead of a new one
        tran.Commit();
    }
    catch
    {
        tran.Rollback();
    }
    finally
    {
        if (_entities.Connection.State == System.Data.ConnectionState.Open)
            _entities.Connection.Close();
        _entities.AcceptAllChanges();
    }
}
Is your stored procedure doing an explicit commit? Work done inside a database transaction is visible to that session, but not to any other session until it is committed.
When you pull data out of your database into your context that data is kept in memory, separate from the actual database itself.
You will see the changes if you create a new context object instance and load the data from the database with it.
It's good practice not to use the same instance of your context object, but to create them on an as-needed basis for individual transactions with the database. In your case, if you're updating via function imports instead of the context.SaveChanges() method, then you need to refresh your context with the updated data after you commit those changes.
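For example, a minimal sketch of per-call contexts, reusing the SurveyAdminDBEntities type and function imports from the question (adjust the names to your own model; ToList() needs a using System.Linq directive):
public void SaveNewAnswer(AnswerViewModel answer, string user)
{
    using (var writeContext = new SurveyAdminDBEntities())
    {
        // The stored procedure performs the write as soon as the function import is called;
        // SaveChanges is only needed for tracked entity changes.
        writeContext.usp_SaveNewAnswer(answer.QuestionId, answer.AnswerName, answer.AnswerText, answer.AnswerOrder, answer.Status, user);
    }
}

public IEnumerable GetMultipleChoiceQuestions(string questionId)
{
    using (var readContext = new SurveyAdminDBEntities())
    {
        // A brand-new context has an empty cache, so this reads the current database state.
        // ToList() materializes the results before the context is disposed.
        return readContext.usp_GetMultipleChoiceQuestions(Int32.Parse(questionId)).ToList();
    }
}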
Add this to your connection string (assuming SQL Server 2005):
transaction binding=Explicit Unbind;
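For example, in an app.config/web.config Entity Framework connection string it sits inside the inner provider connection string (the server, database and metadata names below are placeholders for your own):
<add name="SurveyAdminDBEntities"
     connectionString="metadata=res://*/Model.csdl|res://*/Model.ssdl|res://*/Model.msl;provider=System.Data.SqlClient;provider connection string=&quot;Data Source=.;Initial Catalog=SurveyAdminDB;Integrated Security=True;Transaction Binding=Explicit Unbind&quot;"
     providerName="System.Data.EntityClient" />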
If the data is no longer available after a session reset, then the problem is indeed with a transaction; if the data is available after the reset, then your problem is something different and we'll likely need more details.
I tried writing a service with an update(User) function using SDN 4.0.0.
The function should check whether there is a User in the database with the same id, and if so, overwrite that user with the new one.
Having a UserRepository which extends GraphRepository<User>, I wrote the following code:
User updateUser(User user) {
    if (userRepository.findOne(user.getId()) != null) {
        user = userRepository.save(user);
        return user;
    } else {
        // Exception handling here
    }
}
I now have the problem that each user I update stays the way it was in the database, because the moment findOne(id) is called, all attributes of the user object get overwritten with the user as it is in the database.
I already fixed the problem by adding an existsById(Long id) function to the repository, annotated with the query "Match (n:User) where ID(n)={0}".
However, I'm still interested in why SDN overwrites an object that has the same id as an object I tried to get. I'm assuming there are references involved, but I can't really see the advantage of it.
This is by design: when you load an entity from the database, it is the most recent version in the graph, so it overwrites any unsaved changes.
If you change the order of operations (load first; if it exists, then modify and save) you should be fine.
I am facing a problem while using the factory model in MVC.
When I update and then try to display the data from the same table, the update is performed in the database, but the updated data is not fetched from the database.
I feel that it is fetching the data from the entities and displaying that.
I used ModelState.Clear(), OutputCache, etc., but none of it worked.
Code used:
For the update:
public virtual void Update(TObject TObject)
{
    var entry = Context.Entry(TObject);
    DbSet.Attach(TObject);
    entry.State = EntityState.Modified;
}
Calling the Update method in my service and saving changes:
Registry.RepositoryFactory.GetUsersRepository().Update(userobj);
Registry.Context.SaveChanges();
Fetching data after save:
Select:
public virtual IQueryable<TObject> All()
{
    return DbSet.AsQueryable();
}
I am able to update the database, but when I try to retrieve the data immediately from the same table it is not hitting the database; I think it is fetching the data from the cache.
Any pointers are welcome.
Thanks in advance,
Girish.
I have followed the link provided by Damon; the Refresh does happen, but it takes a few seconds (2 or 3), and the page has to load immediately.
The solution that worked for me is that, while fetching the data from the repository, I get the entity set using Set, and then I use the Refresh method before fetching the data.
Code used:
DbSet<TObject> set = ((DbContext)Context).Set<TObject>();
((IObjectContextAdapter)Context).ObjectContext.Refresh(System.Data.Objects.RefreshMode.StoreWins, set);
return DbSet as IQueryable<TObject>;
I'm going to guess that you are re-using the same EF Context object for both the Update and the Select. If the Context is not disposed of between these events you will end up with a stale context and data will be returned from the EF cache.
Make sure you are disposing of the EF context between calls, the best practice being to wrap it in a using statement. An alternative to this is to call Refresh() on the context (see this question). You'll still need to dispose of the context at some point, because otherwise it will continue to grow and your application will get slower and slower.
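For example, something along these lines, where MyDbContext and its Users set are placeholders for whatever your factory normally builds:
// Hypothetical context and entity names - substitute your own types.
public void UpdateUser(User userobj)
{
    using (var context = new MyDbContext())
    {
        context.Entry(userobj).State = EntityState.Modified; // same idea as the Update method above
        context.SaveChanges();
    } // context disposed here, so nothing stays cached between calls
}

public List<User> GetUsers()
{
    using (var context = new MyDbContext())
    {
        // A fresh context has an empty cache, so this query hits the database.
        return context.Users.ToList();
    }
}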
I've answered a similar question here.
I have a model called a DeviceAccount. It is a join table that allows me to create many-to-many relationships.
I have a function that creates a new DeviceAccount by handing it an account & a device to join. See here:
var createDeviceAccount = function (account, device) {
    var initialValues = {
        account: account,
        device: device
    };
    return manager.createEntity(entityNames.deviceAccount, initialValues);
};
I have a function to delete a DeviceAccount. See here:
var deleteDeviceAccount = function (account, device) {
    var baseQuery = entityQuery.from('DeviceAccounts');
    var p1 = new breeze.Predicate('device', 'eq', device);
    var p2 = new breeze.Predicate('account', 'eq', account);
    var modQuery = baseQuery.where(p1.and(p2));
    var results = manager.executeQueryLocally(modQuery);
    results[0].entityAspect.setDeleted();
};
If I locally create, remove, create, and remove the same device/account pair, there is no problem.
If I take a device/account pair that exists on the server, I can remove it fine, but when I add it again I receive the following error:
Uncaught Error: This key is already attached:
DeviceAccount:#Test.Models-5:::5
If I follow this in more depth, I can see that removing a local device changes the entityState to 'Detached', and if I remove a device that also exists on the server its entityState gets changed to 'Deleted'. I can't follow much further than this, and I was hoping someone could explain why this is happening.
Just to be clear, deleting an entity via entityAspect.setDeleted causes its entityState to be set to "Deleted". This action marks the entity for deletion on the next save and also removes it from any navigation collections on the client. The entity is still being tracked by the EntityManager after this operation.
In contrast, detaching an entity via entityAspect.setDetached removes it from the entityManager cache completely. This also removes the entity from any navigation collections on the client, but will have NO effect on the server during an EntityManager.saveChanges call, because the EntityManager no longer "knows" about the entity. Think of "detaching" as telling the EntityManager to completely forget about an entity, as if it had never been queried in the first place.
"Deleting" an entity followed by "re-adding" the same entity is problematic because this would cause the EntityManager to have two incarnations of the same entity; a deleted version and an added version. Therefore the EntityManager throws the exception that you are seeing.
I think what you want to do is delete and add a "new" clone entity with a different id.
Hope this makes sense!
The reason this happens is that Breeze is keeping track of that entity until you have fully removed it from the server to keep you from creating a new entity with the same ID, which of course will throw a server exception since you can't do that.
If you call saveChanges() on your entityManager before you try to recreate it, then Breeze will go out to the server, remove the entity from the DB, return the promise, and completely detach the entity from the local cache since it no longer exists on the server.
You could set the entityState to detached manually, but then if you try to saveChanges and that ID already exists on the server it will throw an error.
Best Option
Pass the entity into the saveChanges method in an array -
results[0].entityAspect.setDeleted();
manager.saveChanges([results[0]]).then(saveSucceeded);

function saveSucceeded() {
    console.log('Entity removed from server');
}
Now, after saveSucceeded has completed, you can create a new entity with that ID.
I re-posted this question as I think it is a bit vague. New Post
I am currently using a Windows Service that is on a 2 minute timer. I am using EF code first with a repository pattern for data access. I am using Ninject to inject my dependencies. I have the following bindings in my NinjectDependencyResolver class:
ConnectionStringSettings connectionStringSettings = ConfigurationManager.ConnectionStrings["Database"];

Bind<IDatabaseFactory>().To<DatabaseFactory>()
    .InSingletonScope()
    .WithConstructorArgument("connectionString", connectionStringSettings.Name);

Bind<IUnitOfWork>().To<UnitOfWork>().InSingletonScope();
Bind<IMyRepository>().To<MyRepository>().InSingletonScope();
When my service runs every 2 minutes I do something similar to this:
foreach (var row in rows)
{
    var existing = myRepository.GetById(row.Id);
    if (existing == null)
    {
        existing = new Row();
        myRepository.Add(existing);
        unitOfWork.Commit();
    }
}
I am starting to see an error in my logs that says:
The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
Is it correct to use InSingletonScope when using Ninject in a Windows Service? I believe I tried using different scopes like InTransientScope, but I could only get InSingletonScope to work with data access. Does the error message have anything to do with scope, or is it unrelated?
Assuming that the service is not the only process that operates on the database, you shouldn't use Singleton. What happens in this case is that you are reusing a DbContext that has cached entities which are out of date.
The better way is to treat each timer execution of the service like a web/WCF request and create a new job processor for each request.
var processor = factory.CreateRowsProcessor();
processor.ProcessRows(rows);

public class RowsProcessor
{
    public RowsProcessor(UoW uow, ...)
    {
        ...
    }

    public void ProcessRows(Rows[] rows)
    {
        foreach (var row in rows)
        {
            var existing = myRepository.GetById(row.Id);
            if (existing == null)
            {
                existing = new Row();
                myRepository.Add(existing);
                unitOfWork.Commit();
            }
        }
    }
}
Depending on the problem, it might even be better to put the loop outside and have a new processor for each single row.
Read http://www.planetgeek.ch/2011/12/31/ninject-extensions-factory-introduction/ for more information about factories. Also have a look at the InCallScope of the named scope extension if you need to inject the UoW into multiple classes. http://www.planetgeek.ch/2010/12/08/how-to-use-the-additional-ninject-scopes-of-namedscope/
InSingletonScope will create a singleton context, i.e. one context for the whole lifetime of your service. That is a very bad solution. Because the context holds all objects from all previous timer events, its memory consumption grows, and you can get errors like the one you are receiving at the moment (the error may be unrelated to your singleton context, but most likely it is not). The exception says that you have two different objects with the same key identifier tracked by the context - that is not allowed.
Instead of using a singleton UoW, repository and context, use a singleton factory and in each timer event request fresh instances from the factory. Dispose of the context at the end of the timer event processing.
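For example, a rough sketch of the idea (the factory interface, the IDisposable unit of work and the repository construction here are assumptions for illustration, not part of your posted code):
// Hypothetical factory abstraction; bind this as a singleton instead of the UoW/repository/context.
public interface IUnitOfWorkFactory
{
    IUnitOfWork Create();
}

// Timer callback: build everything fresh for this tick and dispose it afterwards.
private void ProcessTick(IEnumerable<Row> rows)
{
    using (IUnitOfWork unitOfWork = unitOfWorkFactory.Create()) // assumes IUnitOfWork is IDisposable
    {
        var myRepository = new MyRepository(unitOfWork); // or resolve it via the factory as well
        foreach (var row in rows)
        {
            if (myRepository.GetById(row.Id) == null)
            {
                myRepository.Add(new Row());
                unitOfWork.Commit();
            }
        }
    } // the underlying DbContext is disposed here, so the next tick starts with an empty cache
}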
I'm writing an app where we may be switching out the repository later (currently Entity Framework) to use either Amazon or Windows Azure storage.
I have a service method that disables a user by ID; all it does is set a property to true and set the DisabledDate. Should I call the repository, get that user, set the properties in the service, and then call the save function in the repository? If I do this, that's two database calls; should I worry about that? And what if the user is updating their profile at the same time the admin is calling the disable method, and the user's save call to the repository still holds false for the IsDisabled property? Wouldn't that set the user back to being enabled if it runs right after the disable method?
What is the best way to solve this problem? How do I update data in a highly concurrent system?
CustomerRepository:
// Would be called from a more specific method in the Service Layer - e.g. DisableUser
public void Update(Customer c)
{
    var stub = new Customer { Id = c.Id };   // create "stub"
    ctx.Customers.Attach(stub);              // attach "stub" to graph
    ctx.ApplyCurrentValues("Customers", c);  // override scalar values of "stub"
    ctx.SaveChanges();                       // save changes - 1 call to DB. leave this out if you're using UoW
}
That should serve as a general-purpose "UPDATE" method in your repository. It should only be used when the entity exists.
That is just an example - in reality you should/could be using generics, checking for the existence of the entity in the graph before attaching, etc.
But that will get you on the right track.
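For instance, a generic variant along those lines might look like this (a sketch only; IHasId is a hypothetical interface exposing the key, and the graph check is simplified):
// EF 4 ObjectContext style, matching the snippet above; entitySetName must match your model.
public void Update<TEntity>(TEntity entity, string entitySetName)
    where TEntity : class, IHasId, new()
{
    var stub = new TEntity { Id = entity.Id };   // key-only stub
    var key = ctx.CreateEntityKey(entitySetName, stub);

    // Only attach the stub if nothing with this key is already tracked in the graph.
    ObjectStateEntry entry;
    if (!ctx.ObjectStateManager.TryGetObjectStateEntry(key, out entry))
    {
        ctx.CreateObjectSet<TEntity>(entitySetName).Attach(stub);
    }

    ctx.ApplyCurrentValues(entitySetName, entity);   // override scalar values
    ctx.SaveChanges();                               // leave this out if you're using UoW
}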
As long as you know the id of the entity you want to save you should be able to do it by attaching the entity to the context first like so:
var c = new Customer();
c.Id = someId;
context.AttachTo("Customer", c);
c.PropertyToChange = "propertyValue";
context.SaveChanges();
Whether this approach is recommended or not, I'm not so sure as I'm not overly familiar with EF, but this will allow you to issue the update command without having to first load the entity.