System.InvalidOperationException when trying to iteratively add objects using EF 4

This question is very similar to this one. However, the resolutions to that question:
Do not seem to apply, or
Are somewhat suspect, and don't seem like a good approach to resolving the problem.
Basically, I'm iterating over a generic list of objects, and inserting them. Using MVC 2, EF 4 with the default code generation.
foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;
    car.RequirementId = r.RequirementId;
    _carRepo.Add(car); // Save new record
}
And the Repository.Add() method:
public class BaseRepository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private TxRPEntities txDB;
    private ObjectSet<TEntity> _objectSet;

    public void Add(TEntity entity)
    {
        SetUpdateParams(entity);
        _objectSet.AddObject(entity);
        txDB.SaveChanges();
    }
}
I should note that I've been successfully using the Add() method throughout my code for single inserts; this is the first time I've tried to use it to iteratively insert a group of objects.
The error:
System.InvalidOperationException: The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
As stated in the prior question, the EntityKey is set to True and StoreGeneratedPattern = Identity. The actual table being inserted into is a relationship table: it is comprised of an identity field and two foreign key fields. The error always occurs on the second insert, regardless of whether that specific entity has been inserted before, and I can confirm that the values are always different; there are no key conflicts as far as the database is concerned. My suspicion is that it has something to do with the temporary EntityKey that gets set prior to the actual insert, but I don't know how to confirm that, nor do I know how to resolve it.
My gut feeling is that the solution in the prior question, to set the SaveOptions to None, would not be the best solution. (See prior discussion here)

I've had this issue with my repository using a loop as well, and thought that it might be caused by some weird race-like condition. What I've done is refactor out a UnitOfWork class, so that the Repository.Add() method strictly adds the entity to the object set but does not save. Thus, the repository is only responsible for the collection itself, and every operation on that collection happens in the scope of the unit of work.
The catch is that in a loop you run out of memory damn fast with EF4, so you do need to save the changes periodically; I just don't save after every Add().
public class BaseRepository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private TxRPEntities txDB;
    private ObjectSet<TEntity> _objectSet;

    public void Add(TEntity entity)
    {
        SetUpdateParams(entity);
        _objectSet.AddObject(entity);
    }

    public void Save()
    {
        txDB.SaveChanges();
    }
}
Then you can do something like this, saving in batches (the batch size of 100 here is arbitrary):
int count = 0;
foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;
    car.RequirementId = r.RequirementId;
    _carRepo.Add(car); // Queue the new record
    if (++count % 100 == 0) // Some number-limiting condition if you have thousands
        _carRepo.Save();    // Save periodically to clear memory
}
_carRepo.Save(); // Save whatever is left
Note: I don't really like this solution, but I hunted around to try to find why things break in a loop when they work elsewhere, and that's the best I came up with.

We have had some odd collision issues if the entity is not added to the context directly after being created (before doing any assignments). The only time I've noticed the issue is when adding objects in a loop.
Try adding the newed-up entity to the context, then doing the assignments, then saving the context, as in the sketch below. Also, you don't need to save the context each time you add a new entity unless you absolutely need the primary key.
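A minimal sketch of that ordering, reusing the names from the question's code (this is an illustration of the suggestion, not code from the original answer):
foreach (Requirement r in requirements)
{
    var car = new CustomerAgreementRequirement();
    _carRepo.Add(car); // Attach to the context first, while the key is still temporary
    car.CustomerAgreementId = viewModel.Agreement.CustomerAgreementId;
    car.RequirementId = r.RequirementId;
}
_carRepo.Save(); // A single save at the end, since the primary keys aren't needed mid-loop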

Related

How to correctly use Data Annotations to select which Items should be returned by the Web API?

I'm trying to specify a subset of data to be returned from a database query by Web API 2.
In particular, for this query, I first turn lazy loading on:
db.Configuration.LazyLoadingEnabled = true;
This is because there are potentially infinite levels of children. For example:
Parent: {"name":"Jon","children":[{"name":"Dave","children":["name":"Ed"...
Each person in the above sequence can also have a biography. In the database, the books also have related tables for, let's say, authors, reviewers etc.
As far as I know, I can add data annotations to the model to specify which fields to return:
[Key]: specifies the key, which will be returned
[DataMember]: specifies a property which will be returned
[JsonIgnore] / [IgnoreDataMember]: specify a property which will not be returned
[JsonObject(IsReference = true)]: specifies that the object is referenced from another object, and therefore related objects should not be loaded
I'm struggling to load the related biographies. The id for the biography is returned, but the biography objects are null. On the parent object, I have annotated both the nullable int and the virtual object reference to the biography with [DataMember]. On the biography object, I have then specified the id with [Key] and the name with [DataMember], and all other properties with [JsonIgnore] and [IgnoreDataMember]. However, the biographies are not being loaded. The db query returns the items loaded, but they are then being nulled by Web API, I assume because of some circular reference in the chain somewhere.
There are about 50 tables linked in some way; do I need to go through them all and add data annotations to every one, even if I have used an ignore annotation to break the chain? Hoping for a simple solution, but any solution appreciated!
It seems to work fine that [DataMember] will load a HashSet of related data, which gets instantiated in the constructor, but related database objects (which are not instantiated in the constructor) do not get loaded.
It turns out that the per-query statement doesn't actually turn lazy loading on:
db.Configuration.LazyLoadingEnabled = true;
The related items would only be returned if I went into debug mode and loaded the related data by hovering over the object (which seems strange); basically, the context was staying in lazy loading = false mode.
My solution has been to turn lazy loading on globally and to use data annotations as described above to avoid circular references.
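For what it's worth, a minimal sketch of turning lazy loading on globally in the context's constructor; the context and entity names here are illustrative, not from the original question:
using System.Data.Entity;

public class LibraryContext : DbContext
{
    public LibraryContext()
    {
        // Enable lazy loading for every query issued through this context,
        // rather than toggling it per query.
        Configuration.LazyLoadingEnabled = true;
        Configuration.ProxyCreationEnabled = true; // Lazy loading needs proxy creation
    }

    public DbSet<Person> People { get; set; }
    public DbSet<Biography> Biographies { get; set; }
}

public class Person
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int? BiographyId { get; set; }
    public virtual Biography Biography { get; set; } // virtual enables the lazy-load proxy
}

public class Biography
{
    public int Id { get; set; }
    public string Name { get; set; }
}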

Is NonShared DbContext in MVC a bad practice?

It is an MVC application with Entity Framework Code First for the ORM and MEF as the IoC.
If I mark the DbContext with PartCreationPolicy.Shared there is an error saying the object already exists in the container every time I try to perform an edit.
But what if I simply mark the DbContext with PartCreationPolicy.NonShared so it gets created for every request?
Is there a terrible performance impact for that?
Update
Here is the code for save:
Provider IRepository<Provider>.Put(Provider item)
{
    if (item.Id == Guid.Empty)
    {
        item.Id = Guid.NewGuid();
        this.Providers.Add(item);
    }
    else
    {
        this.Entry<Provider>(item).State = EntityState.Modified;
    }
    return item;
}
And this is the error when using Shared:
An object with the same key already exists in the ObjectStateManager. The ObjectStateManager cannot track multiple objects with the same key.
You should definitely use PartCreationPolicy.NonShared. Everything you can read about context lifecycle management, whether it's LINQ to SQL, Entity Framework, or NHibernate (sessions), agrees on one thing: the context should be short-lived. An easy rule of thumb is: use it for one unit of work, which means create a context, do stuff, call SaveChanges once, and dispose. Most of the time this rule works well for me.
A shared (or singleton) context is the pattern that hits performance, because the context gets bloated over time. The change tracker needs to track more and more objects, relationship fixup will get slower. And you will find yourself refreshing (re-loading) entities time and again.
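As a sketch, assuming MEF attribute-based registration (the AppDbContext name is hypothetical), a NonShared part gets composed fresh for each importer, so each request gets its own short-lived context:
using System;
using System.ComponentModel.Composition;
using System.Data.Entity;

[Export]
[PartCreationPolicy(CreationPolicy.NonShared)] // A new instance per import, never shared
public class AppDbContext : DbContext
{
    public DbSet<Provider> Providers { get; set; }
}

public class Provider
{
    public Guid Id { get; set; }
}

Each controller or repository that imports AppDbContext then follows the rule above: do its work, call SaveChanges once, and let the context go at the end of the request.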

EF - generic "AddOrUpdate" method suddenly breaks

I am using Entity Framework 4 (database-first approach) in my ASP.NET 4.0 Webforms app.
What I'm basically doing is fetching the entity to be edited from my ObjectContext, and displaying the fields the user should enter data into (or modify existing data) on a web form.
When time comes to store the data back, I'm reading out the values from the web form, building up a new Entity instance, and then I have a generic method called AddOrUpdate that detects whether this is a new entity (so it needs to insert it), or if it's an existing one (so it needs to update the existing data).
My method uses the EntityKey and checks to see whether the object context already knows about this object, very similar to what Cesar de la Torre of Microsoft shows in his blog post:
public static void AddOrUpdate(ObjectContext context, EntityObject objectDetached)
{
    if (objectDetached.EntityState == EntityState.Detached)
    {
        object currentEntityInDb = null;
        if (context.TryGetObjectByKey(objectDetached.EntityKey, out currentEntityInDb))
        {
            // Attach and update the existing entity
        }
        else
        {
            // Insert new entity into the entity set
            context.AddObject(objectDetached.EntityKey.EntitySetName, objectDetached);
        }
    }
}
This worked just fine - for the longest time. But today, suddenly, out of the blue, I keep getting exceptions like this on the context.TryGetObjectByKey statement:
System.InvalidOperationException: Object mapping could not be found for Type with identity 'MyEntityType'
I cannot remember having changed anything in this core code at all - and the entity type is defined, the ID value that's stored in the EntityKey does indeed exist in the database... everything should be fine - but it keeps failing on me...
What on earth happened here??
I did find a few blog and forum posts on the topic, but none could really enlighten me or help me fix the issue. I must have messed up something - bad - but I really cannot see the forest for the trees - any hints?
Generally this sort of issue happens when EF can't find the assembly that contains the type. Without seeing the full exception it is difficult to figure out exactly, but your recent changes and the way you are using EF seem to be the cause.
EF usually picks the type directly from the type itself when it has to access it using an ObjectSet on the context. In other cases, where the type is not available from the context of the call, it looks at the calling assembly and any DLLs referenced by the calling assembly. If it can't find the type there, it throws this error message.
You can use the LoadFromAssembly method on the MetadataWorkspace of the context:
ObjectContext.MetadataWorkspace.LoadFromAssembly(assembly);
This way EF will know where to look for your types.
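For example (a sketch, with MyEntityType standing in for the failing type named in the exception message):
// Register the assembly that defines the entity with the context's
// metadata workspace so EF can resolve the type by identity.
context.MetadataWorkspace.LoadFromAssembly(typeof(MyEntityType).Assembly);
Calling this once, before the first TryGetObjectByKey, should be enough.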

InSingletonScope using Ninject and a Windows Service

I re-posted this question, as I think it is a bit vague: New Post.
I am currently using a Windows Service that is on a 2 minute timer. I am using EF code first with a repository pattern for data access. I am using Ninject to inject my dependencies. I have the following bindings in my NinjectDependencyResolver class:
ConnectionStringSettings connectionStringSettings = ConfigurationManager.ConnectionStrings["Database"];

Bind<IDatabaseFactory>().To<DatabaseFactory>()
    .InSingletonScope()
    .WithConstructorArgument("connectionString", connectionStringSettings.Name);
Bind<IUnitOfWork>().To<UnitOfWork>().InSingletonScope();
Bind<IMyRepository>().To<MyRepository>().InSingletonScope();
When my service runs every 2 minutes, I do something similar to this:
foreach (var row in rows)
{
    var existing = myRepository.GetById(row.Id);
    if (existing == null)
    {
        existing = new Row();
        myRepository.Add(existing);
        unitOfWork.Commit();
    }
}
I am starting to see an error in my logs that says:
The changes to the database were committed successfully, but an error occurred while updating the object context. The ObjectContext might be in an inconsistent state. Inner exception message: AcceptChanges cannot continue because the object's key values conflict with another object in the ObjectStateManager. Make sure that the key values are unique before calling AcceptChanges.
Is it correct to use InSingletonScope when using Ninject in a Windows service? I believe I tried using different scopes like InTransientScope, but I could only get InSingletonScope to work with data access. Does the error message have anything to do with scope, or is it unrelated?
Assuming that the service is not the only process that operates on the database, you shouldn't use singleton scope. What happens in this case is that you are reusing a DbContext that has cached entities which are out of date.
The better way is to treat each timer execution of the service the same way you would treat a web/WCF request, and create a new job processor for the request.
var processor = factory.CreateRowsProcessor();
processor.ProcessRows(rows);

public class RowsProcessor
{
    public RowsProcessor(UoW uow, ....)
    {
        ...
    }

    public void ProcessRows(Rows[] rows)
    {
        foreach (var row in rows)
        {
            var existing = myRepository.GetById(row.Id);
            if (existing == null)
            {
                existing = new Row();
                myRepository.Add(existing);
                unitOfWork.Commit();
            }
        }
    }
}
Depending on the problem, it might even be better to put the loop outside and have a new processor for each single row.
Read http://www.planetgeek.ch/2011/12/31/ninject-extensions-factory-introduction/ for more information about factories. Also have a look at the InCallScope of the named scope extension if you need to inject the UoW into multiple classes. http://www.planetgeek.ch/2010/12/08/how-to-use-the-additional-ninject-scopes-of-namedscope/
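A rough sketch of those bindings with the factory extension (the IRowsProcessorFactory interface is hypothetical and would be backed by Ninject.Extensions.Factory's ToFactory binding):
// No singletons for the data-access parts; transient is the default scope.
Bind<IDatabaseFactory>().To<DatabaseFactory>()
    .WithConstructorArgument("connectionString", connectionStringSettings.Name);
Bind<IUnitOfWork>().To<UnitOfWork>();
Bind<IMyRepository>().To<MyRepository>();
Bind<IRowsProcessorFactory>().ToFactory(); // From Ninject.Extensions.Factory

// Each timer tick gets its own processor and, through it, fresh dependencies.
var processor = rowsProcessorFactory.CreateRowsProcessor();
processor.ProcessRows(rows);

public interface IRowsProcessorFactory
{
    RowsProcessor CreateRowsProcessor();
}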
InSingletonScope will create a singleton context: one context for the whole lifetime of your service. That is a very bad solution. Because the context holds all objects from all previous timer events, its memory consumption grows, and you open yourself up to errors like the one you are receiving at the moment (the error may really be unrelated to your singleton context, but most likely it is not). The exception says that you have two different objects with the same key identifier tracked by the context, which is not allowed.
Instead of using a singleton unit of work, repository, and context, use a singleton factory, and in each timer event request fresh instances from the factory. Dispose of the context at the end of the timer event processing.
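A minimal sketch of that second suggestion; contextFactory and its Create method are hypothetical stand-ins for whatever singleton factory you register:
// The factory is the only singleton; everything it creates lives for one tick.
private void OnTimerElapsed(Row[] rows)
{
    using (var context = contextFactory.Create()) // Fresh context per timer event
    {
        var repository = new MyRepository(context);
        var unitOfWork = new UnitOfWork(context);
        foreach (var row in rows)
        {
            if (repository.GetById(row.Id) == null)
            {
                repository.Add(new Row());
                unitOfWork.Commit();
            }
        }
    } // Disposing here stops the change tracker from growing across ticks
}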

CreateDbCommandDefinition fires twice during method PUT through WCF Data Services

We are trying to develop our own EF provider for our legacy APIs. We managed to get the "GET/POST" operations working successfully.
However, for the "PUT/MERGE" operations, the method CreateDbCommandDefinition (of the DbProviderServices implementation) fires twice: once with a DbQueryCommandTree and again with a DbUpdateCommandTree.
I understand that it needs to fetch the entity prior to updating it (for change tracking, I guess). In our case, I don't need the entity to be fetched prior to the update; I simply want to call our legacy APIs with the entity sent for update. How can we ask it strictly not to do the DbQueryCommandTree work (and do only the DbUpdateCommandTree work) when working with "PUT/MERGE" operations?
The client code looks something like the one below:
public void CustomerUpdateTest()
{
    try
    {
        Ctxt.MergeOption = MergeOption.NoTracking;
        var oNewCus = new Customer()
        {
            MasterCustomerId = "1001",
            SubCustomerId = "0",
            FirstName = "abc",
            LastName = "123"
        };
        Ctxt.AttachTo("Customers", oNewCus);
        Ctxt.UpdateObject(oNewCus);
        //Ctxt.SaveChanges();
        Ctxt.SaveChanges(SaveChangesOptions.ReplaceOnUpdate);
    }
    catch (Exception ex)
    {
        Assert.Fail(ex.Message);
    }
}
You will have to write your own IDataServiceUpdateProvider to make this happen. For EF, the built-in EF update provider does two queries: one to get the entity which needs to be modified and one for the actual modification. We are planning to make this provider public in our next release, so folks can derive from it and just override one or more methods. But for now, you will have to implement the interface yourself.
For PUT/MERGE requests, WCF Data Services calls IDataServiceUpdateProvider.GetResource to get the entity to update. In your implementation of this method, you can return a token that represents the object that needs to be modified (you will have to visit the expression tree that gets passed to this method to find out the entity set and the key value of the entity in question).
In SaveChanges, you can push the update based on the token. That way you can avoid one round trip to the database.
Hope this helps.
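A hedged skeleton of that shape; the UpdateToken class and the legacy-API call are invented for illustration, and the stubbed members would all need real implementations in a working provider (IDataServiceUpdateProvider extends IUpdatable, from System.Data.Services.Providers and System.Data.Services respectively):
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;
using System.Data.Services.Providers;

// Illustrative token: remembers which entity to update and its new values.
public class UpdateToken
{
    public Expression Source; // Query expression carrying the entity set and key
    public Dictionary<string, object> Values = new Dictionary<string, object>();
}

public class LegacyUpdateProvider : IDataServiceUpdateProvider
{
    private readonly List<UpdateToken> pending = new List<UpdateToken>();

    public object GetResource(IQueryable query, string fullTypeName)
    {
        // Do NOT execute the query; keep its expression tree so the entity
        // set and key can be extracted later. This is what avoids the read.
        var token = new UpdateToken { Source = query.Expression };
        pending.Add(token);
        return token;
    }

    public void SetValue(object targetResource, string propertyName, object propertyValue)
    {
        // Buffer each property assignment on the token.
        ((UpdateToken)targetResource).Values[propertyName] = propertyValue;
    }

    public object ResolveResource(object resource)
    {
        return resource; // The token stands in for the real entity
    }

    public void SaveChanges()
    {
        foreach (var token in pending)
        {
            // Hypothetical legacy call: push the buffered values in one shot.
            // LegacyApi.Update(token);
        }
        pending.Clear();
    }

    public void ClearChanges() { pending.Clear(); }

    // Remaining members handle inserts, deletes, and links; stubbed here
    // to keep the sketch short.
    public object CreateResource(string containerName, string fullTypeName) { throw new NotImplementedException(); }
    public object ResetResource(object resource) { throw new NotImplementedException(); }
    public object GetValue(object targetResource, string propertyName) { throw new NotImplementedException(); }
    public void SetReference(object targetResource, string propertyName, object propertyValue) { throw new NotImplementedException(); }
    public void AddReferenceToCollection(object targetResource, string propertyName, object resourceToBeAdded) { throw new NotImplementedException(); }
    public void RemoveReferenceFromCollection(object targetResource, string propertyName, object resourceToBeRemoved) { throw new NotImplementedException(); }
    public void DeleteResource(object targetResource) { throw new NotImplementedException(); }
    public void SetConcurrencyValues(object resourceCookie, bool? checkForEquality, IEnumerable<KeyValuePair<string, object>> concurrencyValues) { throw new NotImplementedException(); }
}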