Is NonShared DbContext in MVC a bad practice? - asp.net-mvc

It is an MVC application with Entity Framework Code First for the ORM and MEF as the IoC.
If I mark the DbContext with PartCreationPolicy.Shared there is an error saying the object already exists in the container every time I try to perform an edit.
But what if I simply mark the DbContext with PartCreationPolicy.NonShared so it gets created for every request?
Is there a terrible performance impact for that?
Update
Here is the code for save:
Provider IRepository<Provider>.Put(Provider item)
{
    if (item.Id == Guid.Empty)
    {
        item.Id = Guid.NewGuid();
        this.Providers.Add(item);
    }
    else
    {
        this.Entry<Provider>(item).State = EntityState.Modified;
    }
    return item;
}
And this is the error I get when the context is Shared:
An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.

You should definitely use PartCreationPolicy.NonShared. Everything you can read about context lifecycle management, whether it's LINQ to SQL, Entity Framework, or NHibernate (sessions), agrees on one thing: the context should be short-lived. An easy rule of thumb is: use it for one unit of work, which means create a context, do stuff, call SaveChanges once, dispose. Most of the time this rule works well for me.
A shared (or singleton) context is what actually hurts performance, because the context gets bloated over time: the change tracker has to track more and more objects, relationship fixup gets slower, and you will find yourself refreshing (re-loading) entities time and again.
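For illustration only, a per-unit-of-work context looks roughly like this (MyDbContext is a stand-in for your own code-first context, not taken from your code):

// Minimal sketch, assuming a code-first MyDbContext with a Providers set.
// One short-lived context per unit of work: create, do stuff, SaveChanges once, dispose.
public Provider Put(Provider item)
{
    using (var context = new MyDbContext())
    {
        if (item.Id == Guid.Empty)
        {
            item.Id = Guid.NewGuid();
            context.Providers.Add(item);
        }
        else
        {
            context.Entry(item).State = EntityState.Modified;
        }
        context.SaveChanges();
        return item;
    }
}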

Related

DBContext (entity framework) and pre-loaded entities

I use code first in a web application where I have a form to upload text files and import the data into my database.
Each file may have up to 20,000+ records to import.
To speed things up I preload some entities so as not to ask the DbContext every time. Then when I create an object for insert, I do for example:
myNewObject.Category = preloadedCategories.First(p => p.Code == code);
I have read some articles on the web because EF is extremely slow on batch inserts, so what I do is:
first use Configuration.AutoDetectChangesEnabled = false;
then every 1000 records I dispose the context and create a new one.
BUT! Since the preloaded entities were loaded from a context that was disposed, after making a new DbContext I have a problem with preloadedCategories.First(p => p.Code == code). When I call SaveChanges(), EF tries to also save the preloadedCategories.First(p => p.Code == code) object and fails.
So how can I achieve this? I don't want to ask the DbContext every time to load some (non-changing) objects. Is it possible?
Thanks
When dealing with a large number of records in EF, a few things will help:
As @janhartmann states, use .AsNoTracking()
As you stated, use Configuration.AutoDetectChangesEnabled = false, which will require the next point
Use context.Entry(category).State = EntityState.Modified to attach a disconnected entity to a context and mark it as modified
Also check that preloadedCategories is no longer an IQueryable and that the data really is local, not being lazy-loaded from the database.
If there are no changes to your Category object and you just want to link your myNewObject to an existing category, you have two options (sketched below):
Set the foreign key on myNewObject instead of the navigation property
Use context.Entry(myNewObject).State = EntityState.Added instead of context.Products.Add(myNewObject) to avoid adding the entire graph of navigation properties
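For illustration only (the CategoryId/Id property names below are assumptions, they are not shown in your code), the two options look roughly like this:

// Option 1: set the foreign key instead of the navigation property, so the
// preloaded Category never has to be attached to the new context.
myNewObject.CategoryId = preloadedCategories.First(p => p.Code == code).Id;

// Option 2: mark only the new object as Added; related entities that are
// already attached keep their current state instead of being re-inserted.
context.Entry(myNewObject).State = EntityState.Added;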
Good luck

What is the difference between these two ways to refresh the DbContext?

I am using EF 4.4 and I would like to update many entities, but another user may have modified many of the entities that the first user is modifying, so I get a concurrency exception. Another case is that the first user tries to add many new records while another user has added some of them in the meantime, so I get an exception that some of the records already exist (unique constraint violation).
I would like the first user to be able to finish his operation and add only the records that do not exist yet (add all his entities except the ones that were added by the second user).
To do that, I need to refresh the entities in my dbContext, and I see that there are at least two options.
First, in the catch block where I capture the update exception, I can do:
ex.Entries.Single().Reload();
The second option is:
myContext.Entry<MyTable>(instance).Reload();
I guess that the second option only refreshes the entity that I pass as a parameter, but if the problem is that I need to refresh many entities, how can I do that?
What does the first option, Single().Reload(), really do?
When you do
ex.Entries.Single().Reload();
you are sure that the offending entity is refreshed. What it does is take the one and only (Single) entity from DbUpdateConcurrencyException.Entries that could not be saved to the database (in the case of a concurrency exception this is always exactly one).
When you do
myContext.Entry(instance).Reload();
you are not sure that you refresh the right entity, unless you know that only one entity had changes before SaveChanges was called. If you save an entity with child entities, any one of them can cause a concurrency problem.
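If more than one entity can be in conflict, a minimal sketch (assuming you catch the DbUpdateConcurrencyException around SaveChanges) is to reload every entry the exception reports:

try
{
    myContext.SaveChanges();
}
catch (DbUpdateConcurrencyException ex)
{
    // Refresh every entity that could not be saved with the current database values
    foreach (var entry in ex.Entries)
    {
        entry.Reload();
    }
}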
In EF 6.x (6.1.3), the code below will let you refresh all the tracked entities, the way you asked in your question:
try
{
    var listOfRefreshedObj = db.ChangeTracker.Entries().Select(x => x.Entity).ToList();
    var objContext = ((IObjectContextAdapter)your_db_context).ObjectContext;
    objContext.Refresh(System.Data.Entity.Core.Objects.RefreshMode.ClientWins, listOfRefreshedObj);
    await db.Entry(<yourentity>).ReloadAsync();
    return Content(HttpStatusCode.<code>, "<outputmessage>");
}
catch (Exception e)
{
    return Content(HttpStatusCode.<code>, "<exception>");
}
Explanation:
Query the entries in the ChangeTracker and store them in a list:
var listOfRefreshedObj = db.ChangeTracker.Entries().Select(x => x.Entity).ToList();
Next, refresh the context. In some cases (the row was removed, etc.) this will throw an exception, which you can catch. RefreshMode.ClientWins tells EF to keep the client values and treat them as modified, so they win when the next update occurs. In some cases you might want to prompt the users with the changes and let them decide; see the RefreshMode enumeration and the ObjectContext.Refresh method example.
objContext.Refresh(System.Data.Entity.Core.Objects.RefreshMode.ClientWins, listOfRefreshedObj);
You're probably doing this whole thing after you receive a DbUpdateConcurrencyException anyway!

EF - generic "AddOrUpdate" method suddenly breaks

I am using Entity Framework 4 (database-first approach) in my ASP.NET 4.0 Webforms app.
What I'm basically doing is fetching the entity to be edited from my ObjectContext, and displaying the fields the user should enter data into (or modify existing data) on a web form.
When time comes to store the data back, I'm reading out the values from the web form, building up a new Entity instance, and then I have a generic method called AddOrUpdate that detects whether this is a new entity (so it needs to insert it), or if it's an existing one (so it needs to update the existing data).
My method uses the EntityKey to check whether the object context already knows about this object - very similar to what Cesar de la Torre of Microsoft shows in his blog post:
public static void AddOrUpdate(ObjectContext context, EntityObject objectDetached)
{
    if (objectDetached.EntityState == EntityState.Detached)
    {
        object currentEntityInDb = null;
        if (context.TryGetObjectByKey(objectDetached.EntityKey, out currentEntityInDb))
        {
            // attach and update the existing entity
        }
        else
        {
            // insert new entity into entity set
            context.AddObject(objectDetached.EntityKey.EntitySetName, objectDetached);
        }
    }
}
This worked just fine - for the longest time. But today, suddenly, out of the blue, I keep getting exceptions like this on the context.TryGetObjectByKey statement:
System.InvalidOperationException: Object mapping could not be found for Type with identity 'MyEntityType'
I cannot remember having changed anything in this core code at all - and the entity type is defined, the ID value that's stored in the EntityKey does indeed exist in the database... everything should be fine - but it keeps failing on me...
What on earth happened here??
I did find a few blog and forum posts on the topic, but none could really enlighten me or help me fix the issue. I must have messed up something - bad - but I really cannot see the forest for the trees - any hints?
Generally this sort of issue happens when EF can't find the assembly that contains the type. Without seeing the full exception it is difficult to figure out exactly, but your recent changes and the way you are using EF seem to be the cause.
EF usually picks the type directly from the type itself when it has to access it via an ObjectSet on the context. In other cases, where the type is not available from the context of the call, it looks at the calling assembly and any DLLs referenced by the calling assembly. If it can't find it, it throws the error message you are seeing.
You can use the LoadFromAssembly method on the MetadataWorkspace of the context:
ObjectContext.MetadataWorkspace.LoadFromAssembly(assembly);
This way EF will know where to look for your types.
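A minimal sketch, using the 'MyEntityType' name from your error message as a stand-in for one of your mapped entity types:

// Tell the metadata workspace which assembly contains the mapped entity types,
// so TryGetObjectByKey can resolve the mapping for MyEntityType.
context.MetadataWorkspace.LoadFromAssembly(typeof(MyEntityType).Assembly);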

InSingletonScope using Ninject and a Windows Service

I re-posted this question as I think it is a bit vague. New Post
I am currently using a Windows Service that is on a 2 minute timer. I am using EF code first with a repository pattern for data access. I am using Ninject to inject my dependencies. I have the following bindings in my NinjectDependencyResolver class:
ConnectionStringSettings connectionStringSettings = ConfigurationManager.ConnectionStrings["Database"];
Bind<IDatabaseFactory>().To<DatabaseFactory>()
.InSingletonScope()
.WithConstructorArgument("connectionString", connectionStringSettings.Name);
Bind<IUnitOfWork>().To<UnitOfWork>().InSingletonScope();
Bind<IMyRepository>().To<MyRepository>().InSingletonScope();
When my service runs every 2 minutes I do some thing similar to this:
foreach (var row in rows)
{
    var existing = myRepository.GetById(row.Id);
    if (existing == null)
    {
        existing = new Row();
        myRepository.Add(existing);
        unitOfWork.Commit();
    }
}
I am starting to see an error in my logs that says:
The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
Is it correct to use InSingletonScope when using Ninject in a Windows Service? I believe I tried using different scopes like InTransientScope, but I could only get InSingletonScope to work with data access. Does the error message have anything to do with scope, or is it unrelated?
Assuming that the service is not the only process that operates on the database, you shouldn't use a singleton. What happens in this case is that you are reusing a DbContext that has cached entities which are out of date.
The better way is to treat each timer execution of the service much like a web/WCF request and create a new job processor for the request.
var processor = factory.CreateRowsProcessor();
processor.ProcessRows(rows);
public class RowsProcessor
{
    public RowsProcessor(UoW uow, ....)
    {
        ...
    }

    public void ProcessRows(Rows[] rows)
    {
        foreach (var row in rows)
        {
            var existing = myRepository.GetById(row.Id);
            if (existing == null)
            {
                existing = new Row();
                myRepository.Add(existing);
                unitOfWork.Commit();
            }
        }
    }
}
Depending on the problem, it might be even better to put the loop outside and use a new processor for each single row.
Read http://www.planetgeek.ch/2011/12/31/ninject-extensions-factory-introduction/ for more information about factories. Also have a look at the InCallScope of the named scope extension if you need to inject the UoW into multiple classes. http://www.planetgeek.ch/2010/12/08/how-to-use-the-additional-ninject-scopes-of-namedscope/
InSingletonScope will create a singleton context = one context for the whole lifetime of your service. That is a very bad solution: because the context holds all objects from all previous timer events, its memory consumption grows, and you can get errors like the one you are receiving at the moment (the error could be unrelated to your singleton context, but most likely it is not). The exception says that you have two different objects with the same key identifier tracked by the context, which is not allowed.
Instead of using a singleton UoW, repository and context, use a singleton factory, and in each timer event request fresh instances from the factory. Dispose the context at the end of the timer event processing.
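A rough sketch of that idea (IUnitOfWorkFactory and its members are illustrative, not from your code; with Ninject you would typically let the factory extension generate such a factory):

public class TimerJob
{
    private readonly IUnitOfWorkFactory _factory;   // bound InSingletonScope

    public TimerJob(IUnitOfWorkFactory factory)
    {
        _factory = factory;
    }

    public void OnTimerElapsed(Row[] rows)
    {
        // Fresh UoW (and therefore a fresh DbContext) for this timer event only
        using (var unitOfWork = _factory.Create())
        {
            var myRepository = unitOfWork.GetRepository<Row>();
            foreach (var row in rows)
            {
                var existing = myRepository.GetById(row.Id);
                if (existing == null)
                {
                    existing = new Row();
                    myRepository.Add(existing);
                    unitOfWork.Commit();
                }
            }
        }   // context is disposed here, so nothing is cached across events
    }
}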

System.InvalidOperationException when trying to iteratively add objects using EF 4

This question is very similar to this one. However, the resolutions to that question either:
Do not seem to apply, or
Are somewhat suspect, and don't seem like a good approach to resolving the problem.
Basically, I'm iterating over a generic list of objects, and inserting them. Using MVC 2, EF 4 with the default code generation.
foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;
    car.RequirementId = r.RequirementId;
    _carRepo.Add(car); // Save new record
}
And the Repository.Add() method:
public class BaseRepository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private TxRPEntities txDB;
    private ObjectSet<TEntity> _objectSet;

    public void Add(TEntity entity)
    {
        SetUpdateParams(entity);
        _objectSet.AddObject(entity);
        txDB.SaveChanges();
    }
}
I should note that I've been successfully using the Add() method throughout my code for single inserts; this is the first time I've tried to use it to iteratively insert a group of objects.
The error:
System.InvalidOperationException: The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
As stated in the prior question, the EntityKey is set to True, StoreGeneratedPattern = Identity. The actual table that is being inserted into is a relationship table, in that it is comprised of an identity field and two foreign key fields. The error always occurs on the second insert, regardless of whether that specific entity has been inserted before or not, and I can confirm that the values are always different, no key conflicts as far as the database is concerned. My suspicion is that it has something to do with the temporary entitykey that gets set prior to the actual insert, but I don't know how to confirm that, nor do I know how to resolve it.
My gut feeling is that the solution in the prior question, to set the SaveOptions to None, would not be the best solution. (See prior discussion here)
I've had this issue with my repository using a loop as well and thought that it might be caused by some weird race-like condition. What I've done is refactor out a UnitOfWork class, so that the repository.Add() method strictly adds to the object set but does not save the context. The repository is then only responsible for the collection itself, and every operation on that collection happens within the scope of the unit of work.
The issue there is that in a loop you run out of memory damn fast with EF4, so you do need to save the changes periodically; I just don't save after every Add.
public class BaseRepository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private TxRPEntities txDB;
    private ObjectSet<TEntity> _objectSet;

    public void Add(TEntity entity)
    {
        SetUpdateParams(entity);
        _objectSet.AddObject(entity);
    }

    public void Save()
    {
        txDB.SaveChanges();
    }
}
Then you can do something like
foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;
    car.RequirementId = r.RequirementId;
    _carRepo.Add(car); // Queue the new record (no SaveChanges here)
    if (some number limiting condition if you have thousands)
        _carRepo.Save(); // Save periodically and clear memory
}
_carRepo.Save();
Note: I don't really like this solution, but I hunted around to try to find why things break in a loop when they work elsewhere, and that's the best I came up with.
We have had some odd collision issues if the entity is not added to the context directly after being created (before doing any assignments). The only time I've noticed the issue is when adding objects in a loop.
Try adding the newed up entity to the context, do the assignments, then save the context. Also, you don't need to save the context each time you add a new entity unless you absolutely need the primary key.
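For illustration, and assuming the refactored repository above where Add does not call SaveChanges, that order of operations would look roughly like this:

foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    _carRepo.Add(car);   // add to the context right after construction
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;   // then assign
    car.RequirementId = r.RequirementId;
}
_carRepo.Save();         // save once at the end (or periodically for large batches)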
