How many DbContext subclasses should I have, in relation to my models? - asp.net-mvc

I'm learning ASP.NET MVC and I have some questions that the tutorials I've read so far haven't explored in enough depth. I've tried searching, but I didn't see any questions asking this. Still, please forgive me if I have missed an existing one.
If I have a single ASP.NET MVC application that has a number of models (some of which related and some unrelated with each other), how many DbContext subclasses should I create, if I want to use one connection string and one database globally for my application?
One context for every model?
One context for every group of related models?
One context for all the models?
If the answer is one of the first two, then is there anything I should have in mind to make sure that only one database is created for the whole application? I ask because, when debugging locally in Visual Studio, it looks to me like it's creating as many databases as there are contexts. That's why I find myself using the third option, but I'd like to know if it's a correct practice or if I'm making some kind of mistake that will come back and bite me later.

@jrummell is only partially correct. Entity Framework will create one database per DbContext type if you leave it to its own devices. Using the concept of "bounded contexts" that @NeilThompson mentioned from Julie Lerman, all you're doing is essentially telling each context to use the same database. Julie's method uses a generic pattern so that each DbContext that implements it ends up on the same database, but you could do it manually for each one, which would look like:
public class MyContext : DbContext
{
    public MyContext()
        : base("name=DatabaseConnectionStringNameHere")
    {
        // Point this context at the shared connection string and turn off
        // database initialization so it never tries to create its own database.
        Database.SetInitializer<MyContext>(null);
    }
}
In other words, Julie's method just sets up a base class that each of your contexts can inherit from that handles this piece automatically.
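For illustration, a minimal sketch of such a generic base class might look like the following (the connection string name and the ShopContext/Product names are placeholders, not part of the original answer):
using System.Data.Entity;

public class BaseContext<TContext> : DbContext
    where TContext : DbContext
{
    static BaseContext()
    {
        // Never let any bounded context create or migrate the database.
        Database.SetInitializer<TContext>(null);
    }

    protected BaseContext()
        : base("name=DatabaseConnectionStringNameHere")
    {
    }
}

// Each specialized context then just inherits from it:
public class ShopContext : BaseContext<ShopContext>
{
    public DbSet<Product> Products { get; set; }
}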
This does two things: 1) it tells your context to use a specific database (i.e., the same as every other context) and 2) it tells your context to disable database initialization. This last part is important because these contexts are now essentially treated as database-first. In other words, you now have no context that can actually cause a database to be created, or to signal that a migration needs to occur. As a result, you actually need another "master" context that will have every single entity in your application in it. You don't have to use this context for anything other than creating migrations and updating your database, though. For your code, you can use your more specialized contexts.
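A rough sketch of such a migrations-only "master" context, with illustrative entity names:
// Used only for creating migrations and updating the database; application
// code keeps using the specialized contexts instead.
public class MasterContext : DbContext
{
    public MasterContext()
        : base("name=DatabaseConnectionStringNameHere")
    {
    }

    public DbSet<Cat> Cats { get; set; }
    public DbSet<Product> Products { get; set; }
    public DbSet<SystemUser> SystemUsers { get; set; }
    // ... every other entity in the application
}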
The other thing to keep in mind with specialized contexts is that each instantiation of each context represents a unique state, even if they share entities. For example, a Cat entity from one context is not the same thing as a Cat entity from a second context, even if they share the same primary key. You will get an error if you retrieve the Cat from the first context, update it, and then try to save it via the second context. That example is a bit contrived, since you're not likely to have the same entity explicitly in two different contexts, but when you get into foreign key relationships and such it's far more common to run into this problem. Even if you don't explicitly declare a DbSet for a related entity, if an entity in the context depends on it, EF will implicitly create a DbSet for it. All this is to say that if you use specialized contexts, you need to ensure that they are truly specialized and that there is zero crossover at any level of related items.

I use what Julie Lerman calls the Bounded Context
The SystemUsers code might have nothing to do with Products - so I might have a System DbContext and a Shop DbContext (for example).
Life is easier with a single context in a small app, but for larger application it helps to break the contexts up.

Typically, you should have one DbContext per database. But if you have separate, unrelated groups of models, it would make sense to have separate DbContext implementations.
it looks to me like it's creating as many databases as there are contexts.
That's correct, Entity Framework will create one database per DbContext type.

Related

Breeze Creating Complex Entities with Lookup Tables

I am new to Angular and Breeze. I am trying to create a new group with customer relationships at the same time. My DB structure has 3 tables that are affected: Groups, UserToGroups and CustomersToGroups. The entity manager has the tables and references registered. My question is: do I need to first create the group before I can add the records to the lookup tables? If a GroupId does not exist, it should not be possible to create a UserToGroups record or a CustomersToGroups record. Are Breeze and the entity manager handling all of this? Can I do this all in one shot? How?
controller.js
function addCustAndGroup(CustId, grpId, GroupCategory, GroupDesc) {
    var newGroup = app.dataservice.manager.createEntity("Group");
    newGroup.GroupId(CustId);
    newGroup.GroupCategory(GroupCategory);
    newGroup.GroupDescription(GroupDesc);
    manager.StudnetsToGroup.push(studentToGroup);
    manager.UserToGroup.push(userToGroup);
    manager.saveChanges();
};
I want to point out something that you're doing that I find worrisome: your dataservice exposes the manager to its caller. A key reason for having a dataservice is to encapsulate the EntityManager ... to hide it from all callers. If you're going to expose the manager to callers, why bother with the dataservice at all?
I don't have a lot of absolute rules but you've broken one of the few, namely "never expose the EntityManager from the dataservice." I have never had to relax that rule. I promise you that you will regret breaking it.
I think what you want to do is have your dataservice expose a createGroup method. All of the work of creating the group should be tucked inside the dataservice.
Now that I'm done ranting, I'm trying to make sense of your actual question and your code. I'm sorry but I'm kind of lost. I'm stumped by manager.UserToGroup.push because (a) the EntityManager doesn't have a UserToGroup member and (b) there is no push method in the Breeze API ... certainly nothing like that for adding/attaching entities to the EntityManager cache.
I don't know which of your "tables" are "lookup tables". I'm kind of confused by the words "table" and "record" anyway as you seem really to be talking about entity type and entity instances, not database objects. I get the parallelism but it's worth maintaining a clear distinction between entities and the database structures to which they are mapped.
A final confession of my confusion: your question subject refers to "Complex Entities". Breeze has a notion called "Complex Type". I don't think that's what you mean. I think you mean that you believe your Entity is complicated. Is that right?
I sense that none of this is getting closer to an answer to your question ... which I cannot quite fathom. It seems to be about whether you have to create one type of entity before a different type. But I can't be sure.
I think you'll have to clarify your question before you get a proper answer.

Code First - I feel I am missing the point

So to put things into perspective:
Until now I've had some experience working with Entity Framework EDMX models. The process (layer-wise) boils down to:
creating the database and having a layer of objects generated in the EDMX file
having a layer of business objects (I am not sure if "business" is the right word) that are initialized from the EDMX objects (and only the business objects are used inside my application)
in the MVC site, having a layer of view model objects wherever needed (whenever generating views directly from my business objects would not work)
I am under the impression that with Code First the database-layer objects (previously the EDMX objects) and the business-layer objects are one and the same, practically skipping a layer. That is what a colleague with much more experience than me told me.
The problem is that I don't think this is quite applicable. Limitations of Code First, like requiring scalar keys, having navigational properties inside objects, being unable to map private fields to the DB without workarounds and being somewhat forced to have auto-properties everywhere, make me feel that I am messing up my business objects.
Quick Example:
I have the entities of:
Team : TeamId and Description (Backend, FrontEnd etc)
Group : GroupId and Description (Developers, TeamLeaders, PMs, etc)
Status: an object that has a Team and a Group, something that is not to be kept inside the DB
User: UserId, FirstName, LastName
Employee: a user which also is part of a Team and a Group (inside it has a User property and a Status property)
The Users table will be denormalized in the sense that it will contain all the info my User class has, but it will also have a TeamId and a GroupId, which will be populated only if the User is also an Employee. So basically Employee and User map to the same table.
This leads to some weird workarounds. For instance, in the Employee class I need to expose UserId as a primary key rather than getting it through the User property's UserId (because the key needs to be scalar), which is ugly. Also, even though I have a Status property inside the Employee class, I still need two additional properties (TeamId and GroupId, taken from the Status property's Team and Group) in order to have scalar foreign keys to Groups and Teams, which again is cumbersome and feels messy. Adding a virtual IList inside the Group/Team class for navigational purposes also seems unneeded (though from what I've seen this can be skipped). And having only auto-properties on the objects feels wrong; I want private fields instantiated by the constructor and only getters for them.
Am I not getting it? Do I still need the additional layer even while using Code First? Is my model messed up from the beginning? Sorry I cannot provide you with the code, but I am not at home at the moment, and the code block seems very dodgy to use :D
I tend to work within the repository and service pattern(s). In my experience, I have always found it easier to solidify what my application is going to be using (models) and how they will be stored (structure). I lean towards building up the database with all the linking and FKs, building an EF model (EDMX), and then adding a Service/Repository layer on top of that. This way your application only ever references that Service/Repository layer, and if your EDMX breaks or you have to change the way something calls it, you only have to fix it in one spot. Recently I have been mixing the IRepository pattern with a Service class, and it meshes really well and is easy to use (a rough sketch follows below). Hope this clarified some things for you, best of luck!
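A minimal sketch of that kind of layering, just to illustrate the shape of it; the interface and class names here are only examples, not a prescribed API, and Employee is borrowed from the question above:
using System.Linq;

// Thin repository over the EF model; the service layer composes repositories
// and is the only thing the MVC controllers talk to.
public interface IRepository<T> where T : class
{
    IQueryable<T> GetAll();
    T GetById(int id);
    void Add(T entity);
    void Remove(T entity);
}

public class EmployeeService
{
    private readonly IRepository<Employee> _employees;

    public EmployeeService(IRepository<Employee> employees)
    {
        _employees = employees;
    }

    public Employee GetEmployee(int id)
    {
        // Any mapping from EF entities to business/view objects happens here,
        // so a change to the EDMX only has to be fixed in this one place.
        return _employees.GetById(id);
    }
}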

Asp.net MVC Architecture

I'm coming to the end of my first MVC project, and I'm not overly happy with how I constructed my Model objects and I'm looking for some ideas on how to improve them.
I use repositories for each DB table with Get, Save, Delete etc methods.
The repositories use Linq2Sql for the DB access.
I do mapping from the Linq2Sql objects to MVC Model objects, in the main, these are very much 1 to 1 mappings.
My problem is, I don't think my MVC model objects were granular enough, and I am probably passing more data back and forth than needed.
For example, I have a User table. An admin can edit a users details as can the user themselves, so I reckon I should really have a "AdminUserModel" and "UserModel" objects, where "AdminUserModel" has a greater set of values (IsEnabled for example).
So my bigger question is really, what kind of architectures are people using out there in the wild, in order to map many similar, related Model objects down through the layers to the DB?
Any sample architecture solutions anyone can suggest beyond NerdDinner?
thanks in advance!
In the case of your user model, you should use inheritance instead of 2 separate models. That way you can reuse the code that was created for User in the models that inherit from it.
The type of model you use depends completely on what you want to do with it. A good idea might be to take a look at patterns and try to get the ones working that are needed for your situation...
I usually implement inheritance in my models.
I usually have a base class, Entity, which will have Id, DateCreated, Valid and any other fields that are shared between entities (PublishStatus, Locked, etc.).
If need be you can create other base classes inheriting from Entity: person entity, product entity, etc.
This way you can have a generic repository base, constrained to Entity or IEntity. I find that most entities' CRUD functions don't need much more behaviour than that provided by the generic base (perhaps you will need to add a few additional get methods for some types).
In your case, AdminUser could inherit from User
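A rough sketch of that idea, with illustrative property names only (IsEnabled comes from the question above):
using System;

// Shared base for all model entities.
public abstract class Entity
{
    public int Id { get; set; }
    public DateTime DateCreated { get; set; }
    public bool Valid { get; set; }
}

public class UserModel : Entity
{
    public string UserName { get; set; }
    public string Email { get; set; }
}

// The admin-facing model only adds the extra fields an admin may edit.
public class AdminUserModel : UserModel
{
    public bool IsEnabled { get; set; }
}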

Update relationships when saving changes of EF4 POCO objects

Entity Framework 4, POCO objects and ASP.Net MVC2. I have a many to many relationship, lets say between BlogPost and Tag entities. This means that in my T4 generated POCO BlogPost class I have:
public virtual ICollection<Tag> Tags {
// getter and setter with the magic FixupCollection
}
private ICollection<Tag> _tags;
I ask for a BlogPost and the related Tags from an instance of the ObjectContext and send it to another layer (View in the MVC application). Later I get back the updated BlogPost with changed properties and changed relationships. For example it had tags "A" "B" and "C", and the new tags are "C" and "D". In my particular example there are no new Tags and the properties of the Tags never change, so the only thing which should be saved is the changed relationships. Now I need to save this in another ObjectContext. (Update: Now I tried to do in the same context instance and also failed.)
The problem: I can't make it save the relationships properly. I tried everything I found:
Controller.UpdateModel and Controller.TryUpdateModel don't work.
Getting the old BlogPost from the context then modifying the collection doesn't work. (with different methods from the next point)
This probably would work, but I hope this is just a workaround, not the solution :(.
Tried Attach/Add/ChangeObjectState functions for BlogPost and/or Tags in every possible combinations. Failed.
This looks like what I need, but it doesn't work (I tried to fix it, but can't for my problem).
Tried ChangeState/Add/Attach/... the relationship objects of the context. Failed.
"Doesn't work" means in most cases that I worked on the given "solution" until it produces no errors and saves at least the properties of BlogPost. What happens with the relationships varies: usually Tags are added again to the Tag table with new PKs and the saved BlogPost references those and not the original ones. Of course the returned Tags have PKs, and before the save/update methods I check the PKs and they are equal to the ones in the database so probably EF thinks that they are new objects and those PKs are the temp ones.
A problem I know about, and which might make it impossible to find an automated, simple solution: when a POCO object's collection is changed, that should happen through the above-mentioned virtual collection property, because then the FixupCollection trick will update the reverse references on the other end of the many-to-many relationship. However, when a View "returns" an updated BlogPost object, that doesn't happen. This means that maybe there is no simple solution to my problem, but that would make me very sad and I would hate the EF4-POCO-MVC triumph :(. It would also mean that EF can't do this in the MVC environment whichever EF4 object types are used :(. I think the snapshot-based change tracking should find out that the changed BlogPost has relationships to Tags with existing PKs.
Btw: I think the same problem happens with one-to-many relations (google and my colleague say so). I will give it a try at home, but even if that works that doesn't help me in my six many-to-many relationships in my app :(.
Let's try it this way:
Attach BlogPost to context. After attaching object to context the state of the object, all related objects and all relations is set to Unchanged.
Use context.ObjectStateManager.ChangeObjectState to set your BlogPost to Modified
Iterate through Tag collection
Use context.ObjectStateManager.ChangeRelationshipState to set state for relation between current Tag and BlogPost.
SaveChanges
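A hedged sketch of those steps in code, assuming an EF4 ObjectContext called context with a BlogPosts set and a navigation property named "Tags" (adjust the names to your model):
// Step 1: attach - the object, related objects and relations start as Unchanged.
context.BlogPosts.Attach(blogPost);

// Step 2: mark the post itself as Modified.
context.ObjectStateManager.ChangeObjectState(blogPost, EntityState.Modified);

// Steps 3-4: set the state of each BlogPost-Tag relationship explicitly
// (Added for new links; use EntityState.Deleted for links to remove).
foreach (var tag in blogPost.Tags)
{
    context.ObjectStateManager.ChangeRelationshipState(
        blogPost, tag, "Tags", EntityState.Added);
}

// Step 5: persist.
context.SaveChanges();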
Edit:
I guess one of my comments gave you false hope that EF will do the merge for you. I played a lot with this problem and my conclusion is that EF will not do this for you. I think you have also found my question on MSDN. In reality there are plenty of such questions on the Internet. The problem is that it is not clearly stated how to deal with this scenario. So let's have a look at the problem:
Problem background
EF needs to track changes on entities so that the persistence layer knows which records have to be updated, inserted or deleted. The problem is that it is the ObjectContext's responsibility to track changes, and the ObjectContext is able to track changes only for attached entities. Entities which are created outside the ObjectContext are not tracked at all.
Problem description
Based on the above description we can clearly state that EF is more suitable for connected scenarios where the entity is always attached to a context - typical for a WinForms application. Web applications require a disconnected scenario where the context is closed after request processing and the entity content is passed as the HTTP response to the client. The next HTTP request provides modified content of the entity, which has to be recreated, attached to a new context and persisted. Recreation usually happens outside of the context scope (layered architecture with persistence ignorance).
Solution
So how to deal with such disconnected scenario? When using POCO classes we have 3 ways to deal with change tracking:
Snapshot - requires same context = useless for disconnected scenario
Dynamic tracking proxies - requires same context = useless for disconnected scenario
Manual synchronization.
Manual synchronization of a single entity is an easy task. You just need to attach the entity and call AddObject for inserting, DeleteObject for deleting, or set its state in the ObjectStateManager to Modified for updating. The real pain comes when you have to deal with an object graph instead of a single entity. This pain is even worse when you have to deal with independent associations (those that don't use a Foreign Key property) and many-to-many relations. In that case you have to manually synchronize not only each entity in the object graph but also each relation in the object graph.
Manual synchronization is the solution proposed by the MSDN documentation. Attaching and Detaching Objects says:
Objects are attached to the object context in an Unchanged state. If you need to change the state of an object or the relationship because you know that your object was modified in detached state, use one of the following methods.
The mentioned methods are ChangeObjectState and ChangeRelationshipState of the ObjectStateManager = manual change tracking. A similar proposal is in another MSDN documentation article; Defining and Managing Relationships says:
If you are working with disconnected objects you must manually manage the synchronization.
Moreover, there is a blog post related to EF v1 which criticises exactly this behaviour of EF.
Reason for solution
EF has many "helpful" operations and settings like Refresh, Load, ApplyCurrentValues, ApplyOriginalValues, MergeOption, etc. But from my investigation, all these features work only on a single entity and affect only scalar properties (= not navigation properties and relations). I'd rather not test these methods with complex types nested in an entity.
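For example, ApplyCurrentValues only copies scalar values onto an already-tracked entity with the same key; roughly like this (assuming an ObjectSet named BlogPosts and a detached post with a matching Id):
// Make sure an entity with the same key is being tracked by the context.
var tracked = context.BlogPosts.Single(p => p.Id == detachedPost.Id);

// Copies scalar properties from the detached instance onto the tracked one;
// navigation properties and relationships are left untouched.
context.BlogPosts.ApplyCurrentValues(detachedPost);
context.SaveChanges();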
Other proposed solution
Instead of real merge functionality the EF team provides something called Self Tracking Entities (STE), which doesn't solve the problem. First of all, STEs only work if the same instance is used for the whole processing. In a web application that is not the case unless you store the instance in view state or session. Because of that I'm very unhappy with using EF and I'm going to check out the features of NHibernate. A first observation suggests that NHibernate may have such functionality.
Conclusion
I will wrap up these observations with a single link to another related question on the MSDN forum. Check Zeeshan Hirani's answer. He is the author of Entity Framework 4.0 Recipes. If he says that automatic merging of object graphs is not supported, I believe him.
But still there is possibility that I'm completely wrong and some automatic merge functionality exists in EF.
Edit 2:
As you can see, this was already added to MS Connect as a suggestion in 2007. MS closed it as something to be done in the next version, but actually nothing has been done to close this gap except STE.
I have a solution to the problem that was described above by Ladislav. I have created an extension method for the DbContext which will automatically perform the add/update/delete's based on a diff of the provided graph and persisted graph.
At present, using Entity Framework you would need to perform the updates of the child entities manually: check whether each one is new and add it, check whether it was updated and edit it, check whether it was removed and delete it from the database. Once you have to do this for a few different aggregates in a large system you start to realize there must be a better, more generic way.
Please take a look and see if it can help http://refactorthis.wordpress.com/2012/12/11/introducing-graphdiff-for-entity-framework-code-first-allowing-automated-updates-of-a-graph-of-detached-entities/
You can go straight to the code here https://github.com/refactorthis/GraphDiff
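Usage is roughly along these lines (reconstructed from the blog post above, so the exact mapping method names may differ slightly; BlogContext and the Tags collection are just placeholders):
using (var db = new BlogContext())
{
    // Tags is an associated collection: only the links to existing Tag rows
    // should be added/removed, not the Tag entities themselves.
    db.UpdateGraph(blogPost, map => map.AssociatedCollection(p => p.Tags));
    db.SaveChanges();
}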
I know it's late for the OP but since this is a very common issue I posted this in case it serves someone else.
I've been toying around with this issue and I think I've got a fairly simple solution. What I do is:
Save main object (Blogs for example) by setting its state to Modified.
Query the database for the updated object including the collections I need to update.
Query the entities I want my collection to include and convert them with .ToList().
Update the main object's collection(s) to the List I got from step 3.
SaveChanges();
In the following example, "dataobj" and "_categories" are the parameters received by my controller: "dataobj" is my main object, and "_categories" is an IEnumerable containing the IDs of the categories the user selected in the view.
// Step 1: save the main object by marking it as Modified.
db.Entry(dataobj).State = EntityState.Modified;
db.SaveChanges();
// Step 2: re-query the updated object, including the collection to update.
dataobj = db.ServiceTypes.Include(x => x.Categories).Single(x => x.Id == dataobj.Id);
// Step 3: load the entities the collection should contain.
var it = _categories != null ? db.Categories.Where(x => _categories.Contains(x.Id)).ToList() : null;
// Step 4: replace the collection and save again.
dataobj.Categories = it;
db.SaveChanges();
It even works for multiple relations
The Entity Framework team is aware that this is a usability issue and plans to address it post-EF6.
From the Entity Framework team:
This is a usability issue that we are aware of and is something we have been thinking about and plan to do more work on post-EF6. I have created this work item to track the issue: http://entityframework.codeplex.com/workitem/864 The work item also contains a link to the user voice item for this--I encourage you to vote for it if you have not done so already.
If this impacts you, vote for the feature at
http://entityframework.codeplex.com/workitem/864
All of the answers were great at explaining the problem, but none of them really solved it for me.
I found that if I didn't use the relationship in the parent entity but just added and removed the child entities everything worked just fine.
Sorry for the VB but that is what the project I am working in is written in.
The parent entity "Report" has a one to many relationship to "ReportRole" and has the property "ReportRoles". The new roles are passed in by a comma separated string from an Ajax call.
The first line will remove all the child entities, and if I used "report.ReportRoles.Remove(f)" instead of the "db.ReportRoles.Remove(f)" I would get the error.
' Remove all existing child roles via the context set; using
' report.ReportRoles.Remove(f) here is what produced the error.
report.ReportRoles.ToList.ForEach(Function(f) db.ReportRoles.Remove(f))
Dim newRoles = If(String.IsNullOrEmpty(model.RolesString), New String() {}, model.RolesString.Split(","))
' Re-add the roles passed in from the Ajax call as new child entities.
newRoles.ToList.ForEach(Function(f) db.ReportRoles.Add(New ReportRole With {.ReportId = report.Id, .AspNetRoleId = f}))

Is it good to use a static EF object context in an MVC application for better perf?

Let's start with this basic scenario:
I have a bunch of Tables that are essentially rarely changed Enums (e.g. GeoLocations, Category, etc.) I want to load these into my EF ObjectContext so that I can assign them to entities that reference them as FK. These objects are also used to populate all sorts of dropdown controls. Pretty standard scenarios so far.
Since a new controller is created for each page request in MVC, a new entity context is created and these "enum" objects are loaded repeatedly. I thought about using a static context object across all instances of controllers (or repository object).
But will this require too much locking and therefore actually worsen perf?
Alternatively, I'm thinking of using a static context only for read-only tables. But since entities that reference them must be in the same context anyway, this isn't any different from the above.
I also don't want to get into the business of attaching/detaching these enum objects. Since I believe once I attach a static enum object to an entity, I can't attach it again to another entity??
Please help, I'm quite new to EF + MVC, so am wondering what is the best approach.
Personally, I never have any static context stuff. For me, when I call the database (CRUD) I use that context for that single transaction/unit of work.
So in this case, what you're suggesting is that you wish to retrieve some data from the database .. and this data is .. more or less .. read-only and static.
Lookup data is a great example of this.
So your Categories never change. Your GeoLocations never change, also.
I would not worry about this concept at the database/persistence level, but at the application level. So, just forget that this data is static/read-only etc. and just get it. Then, when you're in your application (i.e. an ASP.NET MVC controller method or in the global.asax code) THEN you should cache this ... at the UI layer.
If you're doing a nice n-tiered MVC app, which contains
UI layer
Services / Business Logic Layer
Persistence / Database data layer
Then I would cache this in the Middle Tier .. which is called by the UI Layer (ie. the MVC Controller Action .. eg. public void Index())
I think it's important to know how to separate your concerns .. and the database stuff should be just that -> CRUD'ish stuff and some unique stored procs when required. Don't worry about caching data, etc. Keep this layer as light and as simple as possible.
Then, your middle Tier (if it exists) or your top tier should worry about what to do with this data -> in this case, cache it because it's very static.
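A minimal sketch of that kind of middle-tier lookup cache, assuming a context named MyContext with a Categories set (the names are illustrative):
using System;
using System.Collections.Generic;
using System.Linq;

public static class LookupCache
{
    // Loaded once, lazily and thread-safely; the context used for the load is
    // disposed immediately, so no EF objects are kept alive across requests.
    private static readonly Lazy<List<Category>> _categories =
        new Lazy<List<Category>>(() =>
        {
            using (var db = new MyContext())
            {
                return db.Categories.ToList();
            }
        });

    public static IList<Category> Categories
    {
        get { return _categories.Value; }
    }
}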
I've implemented something similar using Linq2SQL by retrieving these 'lookup tables' as lists on app startup and storing them in ASP's caching mechanism. By using the ASP cache, I don't have to worry about threading/locking etc. Not sure why you'd need to attach them to a context, something like that could easily be retrieved if necessary via the table PK id.
I believe this is as much a question of what to cache as how. When you are dealing with EF, you can quickly run into problems when you try to persist EF objects across different contexts and attempt to detach/attach those objects. If you are using your own POCO objects with custom T4 templates then this isn't an issue, but if you are using vanilla EF then you will want to create POCO objects for your cache.
For most simple lookup items (i.e. a numeric primary key and a string description), you can use a Dictionary. If you have multiple fields you need to pass back and forth to the UI, you can build a more complete object model. Since these will be POCO objects they can then be persisted pretty much anywhere and any way you like. I recommend keeping the caching logic outside of your MVC application so that you can easily mock the caching activity for testing. If you have multiple lists you need to cache, you can put them all in one container class that looks something like this:
public class MyCacheContainer
{
    public Dictionary<int, string> GeoLocations { get; set; }
    public List<Category> Categories { get; set; }
}
The next question is do you really need these objects in your entity model at all. Chances are all you really need are the primary keys (i.e. you create a dropdown list using the keys and values from the dictionary and just post the ID). Therefore you could potentially handle all of the lookups to the textual description in the construction of your view models. That could look something like this:
MyEntityObject item = Context.MyEntityObjects.FirstOrDefault(i => i.Id == id);
MyCacheContainer cache = CacheFactory.GetCache();
MyViewModel model = new MyViewModel { Item = item, GeoLocationDescription = cache.GeoLocations[item.GeoLocationId] };
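As a side note, the dropdown itself can be built straight from the cached dictionary with the MVC SelectList, for example (property names here are hypothetical):
// Key becomes the option value, Value becomes the display text.
var geoLocationList = new SelectList(cache.GeoLocations, "Key", "Value", item.GeoLocationId);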
If you absolutely must have those objects in your context (i.e. if there are referential entities that tie 2 or more other tables together), you can pass that cache container into your data access layer so it can do the proper lookups.
As for assigning "valid" entities, in .Net 4 you can just set the foreign key properties and don't have to actually attach an object (technically you can do this in 3.5, but it requires magic strings to set the keys). If you are using 3.5, you might just try something like this:
myItem.Category = Context.Categories.FirstOrDefault(c => c.id == id);
While this isn't the most elegant solution and does require an extra roundtrip to the DB to get a category you don't really need, it works. Doing a single record lookup based on a primary key should not really be that big of a hit especially if the table is small like the type of lookup data you are talking about.
If you are stuck with 3.5 and don't want to make that extra round trip and you want to go the magic string route, just make sure you use some type of static resource and/or code generator for your magic strings so you don't fat-finger them. There are many examples here that show how to assign a new EntityKey to a reference without going to the DB, so I won't go into that in this question.
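For comparison, the .NET 4 foreign key association approach mentioned above needs no lookup at all (assuming the entity exposes the FK as a scalar CategoryId property; the property name is an assumption):
// Setting the FK property is enough for EF to save the relationship,
// without attaching or fetching a Category instance.
myItem.CategoryId = id;
Context.SaveChanges();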
