Entity Framework ObjectContext sharing - pros and cons - entity-framework-4

In my project I use Entity Framework 4.0 as the ORM to persist data in SQL Server.
My project is a ribbon form application with a grid view and a navigation tree in the main form, with a ribbon panel on top. The app basically acts as a CRUD UI with very little business logic.
This being my first time with EF, I developed the project by creating and holding an ObjectContext instance in the orchestrating form (the main form, the one the user sees as the application) as a member variable, and binding a query to the grid view.
For various events such as ribbon button clicks and grid view row clicks, I open another Windows form. In that form I create another ObjectContext and store it in a member variable of that form class.
I had read through a few blogs and questions like:
How to decide on a lifetime for your ObjectContext
Entity Framework and ObjectContext n-tier architecture, etc.
One set of authors suggests sharing the ObjectContext, while others suggest keeping it short-lived and not shared.
I reached this state of confusion because I am now in a situation where the changes I make to the ObjectContext in one of the child forms are not reflected in the parent form that showed it. I attempted to refresh, but still nothing useful. Just as an experiment, I shared the ObjectContext that I first created in the outermost parent class through constructor injection, and my problem of change reflection was solved.
It is a huge amount of work for me to convert all my child forms to share the ObjectContext, but I am ready to do it if it is worth it. I am just not sure what lurking problems sharing it might bring.
I may opt for a static ObjectContext instance, as I am not using this for the web and not planning for multi-threaded scenarios. If required, I can promote it to a singleton.
My Questions:
To share or not to share ObjectContext for my situation?
If not to share, how can I solve my present problem of updating one ObjectContext with the changes made in the other?
If to share - which would be the better way? Static, singleton or something else?
The details of the project and environment are as below:
Winforms
C#
VS 2012
EF 4.0, model created with the database-first approach.
I am posting this after searching and reading through many questions and blog posts. The more I read, the more confusing it becomes :) Please bear with me if I am leaving you to assume something in order to answer. I will try to update the question if such clarifications are asked for in the comments.

Your Questions
To share or not to share ObjectContext for my situation?
Do not share your context. The Entity Framework context should follow a Unit of Work pattern: it should be as short-lived as possible without unnecessarily creating and destroying too many contexts. This usually translates to treating individual "operations" in your app as units of work. For a web app/API this might be per HttpWebRequest; in your case you might do it per logical data operation (one for each of your implemented pieces of business logic).
For example:
LoadBusinessObjects() would create a context, load your list of data plus any related data you want, then dispose of the context (see the sketch after this list).
CreateBusinessObject() would create a context, create an instance of some entity, populate it with data, attach it to a collection in the context, save changes and then dispose of the context.
UpdateBusinessObject() would read some object from the context, update it, save changes, and dispose of the context.
DeleteBusinessObject() would find a business object in the context, remove it from the collection in the context, save changes and dispose of the context.
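As a minimal sketch of the first of those operations, reusing the hypothetical MyEFContext and MyCodeFirstEntityDbSet names from the example further down, the key point is simply that the context lives only for the duration of one operation:
public static List<MyCodeFirstEntity> LoadBusinessObjects()
{
    // The context exists only for this one operation.
    using (var context = new MyEFContext())
    {
        // ToList() materializes the results so they can still be used
        // (data-bound, displayed, etc.) after the context is disposed.
        return context.MyCodeFirstEntityDbSet.ToList();
    }
}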
If not to share, how can I solve my present problem of updating one ObjectContext with the changes made in the other?
This is a job for a pub/sub architecture. This can be as simple as a few static event handlers on your objects for each operation you implemented above. Then in your code for each business operation, you fire the corresponding events.
If to share - which would be the better way? Static, singleton or something else?
Neither; don't do it. A static or singleton EF context will continue to grow in memory footprint as the context's state manager keeps accumulating cached objects (both attached and not attached) for every single interaction you do in your application. The context is not designed to work like this.
In addition to resource usage, the EF context is not thread safe. For example, what if you wanted to allow one of your editor forms to save some changes at the same time as the tree list is loading new data? With one static instance you had better make sure this all runs on the UI thread or is synchronized with a semaphore (yuck, and yuck - bad practices).
Example
Here's an example using C# and the code-first approach as per your post. Note that I'm not addressing things like data concurrency or threading, to keep the example short. Also, in a real application this concept is implemented with generics and reflection so that ALL of our models have basic events on them for Creating, Updating and Deleting.
public class MyCodeFirstEntityChangedArgs : EventArgs
{
    /// <summary>
    /// The primary key of the entity being changed.
    /// </summary>
    public int Id { get; set; }

    /// <summary>
    /// You probably want to make this an ENUM for Added/Modified/Removed
    /// </summary>
    public string ChangeReason { get; set; }
}

public class MyCodeFirstEntity
{
    public int Id { get; set; }
    public string SomeProperty { get; set; }

    /// <summary>
    /// Occurs when an instance of this entity model has been changed.
    /// </summary>
    public static event EventHandler<MyCodeFirstEntityChangedArgs> EntityChanged;

    // A static event can only be raised from inside the class that declares it,
    // so other classes (like the business logic below) go through this helper.
    public static void OnEntityChanged(object sender, MyCodeFirstEntityChangedArgs args)
    {
        var handler = EntityChanged;   // copy; null if nobody has subscribed yet
        if (handler != null)
        {
            handler(sender, args);
        }
    }
}
public class MyBusinessLogic
{
    public static void UpdateMyCodeFirstEntity(int entityId, MyCodeFirstEntity newEntityData)
    {
        using (var context = new MyEFContext())
        {
            // Find the existing record in the database
            var existingRecord = context.MyCodeFirstEntityDbSet.Find(entityId);

            // Copy over some changes (in real life we have a
            // generic reflection based object copying method)
            existingRecord.SomeProperty = newEntityData.SomeProperty;

            // Save our changes via EF
            context.SaveChanges();

            // Fire our event handler so that other UI components
            // subscribed to this event know to refresh/update their views.
            // ----
            // NOTE: If SaveChanges() threw an exception, you won't get here.
            MyCodeFirstEntity.OnEntityChanged(null, new MyCodeFirstEntityChangedArgs
            {
                Id = existingRecord.Id,
                ChangeReason = "Updated"
            });
        }
    }
}
Now you can attach event handlers to your model from anywhere (it's a static event handler) like this:
MyCodeFirstEntity.EntityChanged += new EventHandler<MyCodeFirstEntityChangedArgs>(MyCodeFirstEntity_LocalEventHandler);
and then have a handler in each view that will refresh local UI views whenever this event is fired:
static void MyCodeFirstEntity_LocalEventHandler(object sender, MyCodeFirstEntityChangedArgs e)
{
    // Something somewhere changed a record! I better refresh some local UI view.
}
Now every UI component you have can subscribe to the events that matter to it. If you have a tree list and then some editor form, the tree list will subscribe for any changes so it can add/update/remove a node (or, the easy way, just refresh the whole tree list).
Updates Between Applications
If you want to go a step further and even link separate instances of your app in a connected environment, you can implement a pub/sub eventing system over the network using something like WebSync - a Comet implementation for the Microsoft technology stack. WebSync has everything built in to separate events into logical "channels" for each entity/event you want to subscribe to or publish to. And yes, I work for the software company that makes WebSync - they're paying for my time as I write this. :-)
But if you don't want to pay for a commercial implementation, you could write your own TCP socket client/server that distributes notifications for the above events when an entity changes. Then, when the subscribing app gets the notification over the network, it can fire its local event handlers the same way, which will cause local views to refresh. You can't do this with a poorly architected static instance of your data context (you'd be bound to only ever having one instance of your app running). With some good setup early on, you can easily tack on a distributed pub/sub system later that works across multiple instances of native apps and web apps all at the same time. That gets very powerful.

Related

How can I access lookup data via an injected service from within an entity?

My application is using DDD with .NET Core and EF Core. I have some business rules that run within an entity that need to check dates against a cached list of company holiday dates. The company holidays are loaded from the db and cached by an application service that is configured with our DI container so it can be injected into our controllers, etc.
I cannot determine how, or whether it's the right/best approach, to get the service injected into the entity so it can grab those dates when running business rules. I did find this answer, which appears to demonstrate one way to do it, but I wanted to see if there were any other options, because that approach has a bit of a code smell to me at first glance (adding a property to the DbContext and grabbing the service off the constructor-injected context).
Are there any other ways to accomplish something like this?
ORM classes are very rarely your domain objects. If you can start with your domain and seamlessly map to an ORM without the need for infrastructure-specific alterations or attributes, then that is fine; otherwise you need to split your domain objects from your ORM objects.
You should not inject any services or repositories into aggregates. Aggregates should focus on the command/transactional side of the solution, work with pre-loaded state, and avoid requesting additional state through any handed-in mechanisms. The state should be obtained and handed to the aggregate.
In your specific scenario I would suggest loading your BusinessCalendar and then handing it to your aggregate when performing some function, e.g.:
public class TheAggregate
{
    public bool AttemptRegistration(BusinessCalendar calendar)
    {
        if (!calendar.IsWorkingDay(DateTime.Now))
        {
            return false;
        }

        // ... registration code

        return true;
    }

    // or perhaps...

    public void Register(DateTime registrationDate, BusinessCalendar calendar)
    {
        if (!calendar.IsWorkingDay(registrationDate))
        {
            throw new InvalidOperationException();
        }

        // ... registration code
    }
}
Another take on this is to have your domain ignore this bit and place the burden on the calling code. That way, if you ask your domain to do something, it will do so, since, perhaps, a registration on a non-working day (in my trivial example) may be permitted in some circumstances. In these cases the application layer is responsible for checking the calendar for "normal" registration, or for overriding the default behaviour in special circumstances (see the sketch below). This is the same approach one would take for authorisation: the application layer is responsible for authorisation, and the domain should not care about it. If you can call the domain code, then you have been authorised to do so.
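A rough sketch of that second option, with the calendar check lifted into the application layer. The service, provider and repository names here are illustrative rather than anything from the original post, and in this variant the aggregate's Register method no longer takes a calendar at all:
public class RegistrationApplicationService
{
    private readonly ICalendarProvider _calendarProvider;    // e.g. your cached-holidays service
    private readonly ITheAggregateRepository _repository;

    public RegistrationApplicationService(ICalendarProvider calendarProvider,
                                          ITheAggregateRepository repository)
    {
        _calendarProvider = calendarProvider;
        _repository = repository;
    }

    public void Register(Guid id, DateTime registrationDate)
    {
        // The application layer owns the calendar check; the domain stays unaware of it.
        BusinessCalendar calendar = _calendarProvider.GetCalendar();
        if (!calendar.IsWorkingDay(registrationDate))
        {
            throw new InvalidOperationException("Registration is only allowed on working days.");
        }

        TheAggregate aggregate = _repository.Get(id);
        aggregate.Register(registrationDate);    // domain method without the calendar parameter
        _repository.Save(aggregate);
    }
}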

Where do models belong in my solution?

This is the first time I'm doing a web project using MVC, and I'm trying to base it on three layers: MVC, DAL (Data Access Layer) and BLL (Business Logic Layer).
I'm also trying to use repositories and I'm doing it code-first.
Anyway, I've searched a lot on the web, but if you've got a good reference for me I'll be glad to see it.
My current project looks like this:
And here are my questions:
Where are the Product and User classes that represent tables supposed to be? It seems that I need to use them in the BLL, and I don't really need them in the DAL except for the PASContext.
Where do I initiate the PASContext? In all of the examples I've seen on the internet, no one made a constructor in the repository that takes zero arguments, which means the context is not created within the repository (and I've read some reasons why, e.g. so that all repositories use one context).
If I try to initiate the PASContext in the ProductsBLL, the compiler says it doesn't recognize it and that I'm missing a reference (although I've added the needed reference, and the name PASContext is marked in blue as if VS did recognize it).
PASContext is the class that inherits from DbContext.
Here is some code to demonstrate:
public class ProductsBLL
{
    private EFRepository<Product> productsRepository;
    private List<Product> products;

    public ProductsBLL()
    {
        PASContext context = new PASContext();
        productsRepository = new EFRepository<Product>(context);
        //LoadData();
    }
}
About the view models: if I want, for example, to present a list of products to the client, do I need to create a ProductViewModel, get the data from ProductsBLL (which has a list of Product), convert it to a list of ProductViewModel and then send that to the controller?
In addition, in the ProductController, do I only instantiate ProductsBLL? I don't instantiate any repository or context, right?
If someone could show me some project that uses repositories and a three-tier architecture, takes data from the database, transfers it to the BLL and from there to the MVC layer, and uses a ViewModel to show it to the client, that would be great.
Question 1
Where are the Product and User classes that represent tables supposed to be?
I would have these in a project that can be referenced by all other projects. That is, all other projects can depend on the project with the models. In the case of onion architecture, the models belong in the core, which is at the center of the solution.
In your case I'd put them in the BLL.
Question 2
Where do I initiate the PASContext?
You don't normally see this done directly because it's very common to use Dependency Injection, or DI (What is dependency injection?).
This means that you don't need to instantiate a DbContext directly; you let the DI container do it for you. In my MVC applications the context has a PerWebRequest lifestyle.
PerWebRequest Lifestyle:
Instance of a component will be shared in scope of a single web request. The instance will be created the first time it's requested in scope of the web request.
A context is created when the request is made, used throughout the request (so all repositories gain the benefits of first-level caching), and then disposed of when the request is completed. All of this is managed by the DI container.
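As a rough sketch of the shape this takes with constructor injection - PASContext, EFRepository and Product are from the question, while the GetAll() method and the exact wiring are illustrative, and the registration syntax depends on which DI container you pick:
public class EFRepository<T> where T : class
{
    private readonly PASContext context;

    // The DI container supplies the per-request PASContext instance;
    // the repository never news one up itself.
    public EFRepository(PASContext context)
    {
        this.context = context;
    }

    public IQueryable<T> GetAll()
    {
        return context.Set<T>();
    }
}

public class ProductsBLL
{
    private readonly EFRepository<Product> productsRepository;

    // The BLL asks for the repository; the container builds the whole chain.
    public ProductsBLL(EFRepository<Product> productsRepository)
    {
        this.productsRepository = productsRepository;
    }
}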
Question 3
do I need to create a ProductViewModel [...] ?
You generally give only one view-model to a view. That view-model should be its own object containing everything the view needs in order to display itself. You suggest creating multiple view-model objects and passing them to the view. My concern with that approach is: what happens when you want to display more information in that view? Say you want to display a single DateTime value to the user. Now you want to display one of something, but you're passing many objects to the view.
Instead, separate things out. Create a single view-model and pass that to the view. If part of the view needs to display many of something, have the view call it as a child action or partial so that the view isn't doing too much.
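For example, a single view-model for the product list page might look roughly like this; ProductIndexViewModel and its members are illustrative names, and productsBLL is assumed to be injected as in the earlier sketch:
public class ProductIndexViewModel
{
    // Everything this one view needs, in a single object.
    public List<ProductViewModel> Products { get; set; }
    public DateTime LastUpdated { get; set; }
}

public class ProductController : Controller
{
    private readonly ProductsBLL productsBLL;

    public ProductController(ProductsBLL productsBLL)
    {
        this.productsBLL = productsBLL;
    }

    public ActionResult Index()
    {
        var model = new ProductIndexViewModel
        {
            // Assumes the BLL exposes a method returning view-model-friendly data.
            Products = productsBLL.GetProductViewModels(),
            LastUpdated = DateTime.Now
        };
        return View(model);
    }
}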
Conclusion
If someone could show me some project that uses [...] three-tier architecture
I'm not sure about three-tier architecture. Here are some example projects that use a variety of solution architectures:
MVCForum
EFMVC
Official MVC samples
There is no single correct approach - just good and bad approaches.
I would start here. You'll get a lot more information on broad topics like this from prepared resources.
http://www.asp.net/mvc/tutorials/getting-started-with-aspnet-mvc3/cs/intro-to-aspnet-mvc-3.
Update
In-depth tutorials
http://www.codeproject.com/Articles/70061/Architecture-Guide-ASP-NET-MVC-Framework-N-tier-En
http://www.asp.net/mvc/tutorials/hands-on-labs/aspnet-mvc-4-models-and-data-access
http://www.asp.net/mvc/tutorials/getting-started-with-ef-5-using-mvc-4
http://www.codedigest.com/Articles/ASPNET/187_How_to_create_a_website_using_3_tier_architecture.aspx
http://www.mvcsharp.org/Basics_of_MVC_and_MVP/Default.aspx
Example Projects
http://prodinner.codeplex.com/
http://www.nopcommerce.com/
http://www.mvcsharp.org/Getting_started_with_MVCSharp/Default.aspx

Using DbContext on background thread

My MVC4 app uses code-first Entity Framework 5.0. I want to access my SQL Server data from a timer thread. Is there any reason why I can't instantiate, use, and dispose an instance of the same derived DbContext class that I also use on the main ASP.NET worker thread? (Yes, I use the same using() pattern to instantiate, use, and dispose the object on the main thread.)
A little problem context: My website has a WebsiteEnabled field in a table in the database. Currently, I incur a database fetch for each GET request to read that value. I want to change the code to read the value once every 15 seconds on a background thread, and store the value in a static variable that the GET request can read. I know that you run into problems if you try to instantiate multiple instances of the same DbContext on the same thread; I'm not sure if the same restrictions apply to instances of the same DbContext on different threads.
We use a background thread as well, to check for emails and do cleanups every so often, in one of our larger MVC applications. As long as you create a new context (and dispose of it) on the background thread and don't try to use the one from your main application thread, you will be fine. The DbContext is not thread safe, meaning you cannot safely share it across multiple threads. That does not mean you cannot have multiple threads, each with its own copy of the DbContext. The only caution is to beware of concurrency issues (trying to update the same row at the same time).
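For instance, a minimal sketch of that for your WebsiteEnabled scenario, assuming a hypothetical MyDbContext with a Settings set exposing a WebsiteEnabled column; a System.Threading.Timer fires every 15 seconds and each tick uses its own short-lived context:
public static class WebsiteStatusCache
{
    // Read by GET requests on the ASP.NET worker threads.
    public static volatile bool WebsiteEnabled = true;

    private static Timer _timer;    // System.Threading.Timer

    public static void Start()
    {
        _timer = new Timer(_ =>
        {
            try
            {
                // A brand new, short-lived context per tick - never the one
                // used on the request threads.
                using (var context = new MyDbContext())
                {
                    WebsiteEnabled = context.Settings
                                            .Select(s => s.WebsiteEnabled)
                                            .First();
                }
            }
            catch (Exception)
            {
                // Log and keep the previous value; an unhandled exception on a
                // timer thread would take the whole process down.
            }
        }, null, TimeSpan.Zero, TimeSpan.FromSeconds(15));
    }
}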
Statics and EF are a recipe for a mess. Under ASP.NET there is one app pool and many threads. Store statics if you must, but not the context: always make sure each thread gets its own context.
But given your problem, there is a simple out-of-the-box solution I would use: output caching on the GET method of the controller that should serve the cached values. You can cache per ID, for a specific period of time. Worth checking out - let IIS and ASP.NET do the work for you. :-)
[OutputCache(Duration = int.MaxValue, VaryByParam = "id", Location = OutputCacheLocation.ServerAndClient)]
public ActionResult Get(string id)
{
    // the value that can be cached is collected with a NEW CONTEXT!
    using (var context = new MyDbContext())    // MyDbContext/Settings are placeholders for your own types
        return Content(context.Settings.Find(id).WebsiteEnabled.ToString());
}

ASP.NET MVC -> WCF -> NHibernate, how to efficiently update entity with data from viewmodel?

A week back, I had an ASP.NET MVC application that called into a logical POCO service layer to perform business logic against entities. One approach I commonly used was to use AutoMapper to map a populated view model onto an entity and call update on the entity (pseudo-code below).
MyEntity myEntity = myService.GetEntity(param);
Mapper.CreateMap<MyEntityVM, MyEntity>();
Mapper.Map(myEntityVM, myEntity);
this.myService.UpdateEntity(myEntity);
The update call would take an instance of the entity and, through a repository, call NHibernate's Update method on the entity.
Well, I recently changed my logical service layer into WCF web services. I've noticed that the link NHibernate maintains with an entity is now lost when the entity is sent from the service layer to my application. When I try to operate against the entity in the update method, things are in NHibernate's session that shouldn't be and vice versa - it fails, complaining about nulls on child identifiers and such.
So my question...
What can I do to efficiently take input from my populated viewmodel and ultimately end up modifying the object through NHibernate?
Is there a quick fix that I can apply with NHibernate?
Should I take a different approach in conveying the changes from the application to the service layer?
EDIT:
The best approach I can think of right now is to create a new entity and map from the view model to the new entity (including the identifier). I would pass that to the service layer, where it would retrieve the entity using the repository, map the changes using AutoMapper, and call the repository's update method. I will be mapping twice, but it might work (although I'll have to exclude a bunch of properties/children in the second mapping).
No quick fix. You've run into the change tracking over the wire issue. AFAIK NHibernate has no native way to handle this.
These may help:
https://forum.hibernate.org/viewtopic.php?f=25&t=989106
http://lunaverse.wordpress.com/2007/05/09/remoting-using-wcf-and-nhibernate/
In a nutshell, your two options are to adjust your service to send state-change information over the wire in a form NHibernate can read, or to load the objects, apply the changes, and then save them in your service layer.
Don't be afraid of doing a select before an update inside your service. This is good practice anyway to prevent concurrency issues.
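The second option (load, apply, save inside the service) looks roughly like this; MyEntity, the DTO property names and the session factory field are placeholders rather than anything from the original post:
public void UpdateEntity(MyEntityDto dto)
{
    using (ISession session = _sessionFactory.OpenSession())
    using (ITransaction tx = session.BeginTransaction())
    {
        // Select first: load the persistent instance into this session...
        MyEntity existing = session.Get<MyEntity>(dto.Id);

        // ...apply the changes carried over the wire by the DTO/view model...
        existing.Name = dto.Name;

        // ...and commit. NHibernate's dirty tracking flushes the UPDATE for us.
        tx.Commit();
    }
}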
I don't know if this is the best approach, but I wanted to pass along information on a quick fix with NHibernate.
From NHibernate.xml...
<member name="M:NHibernate.ISession.SaveOrUpdateCopy(System.Object)">
<summary>
Copy the state of the given object onto the persistent object with the same
identifier. If there is no persistent instance currently associated with
the session, it will be loaded. Return the persistent instance. If the
given instance is unsaved or does not exist in the database, save it and
return it as a newly persistent instance. Otherwise, the given instance
does not become associated with the session.
</summary>
<param name="obj">a transient instance with state to be copied</param>
<returns>an updated persistent instance</returns>
</member>
It's working although I haven't had time to examine the database calls to see if it's doing exactly what I expect it to do.
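For reference, the call itself is essentially a one-liner in the update path. The names other than SaveOrUpdateCopy are placeholders, and note that in later NHibernate versions this method was deprecated in favour of ISession.Merge:
using (ISession session = _sessionFactory.OpenSession())
using (ITransaction tx = session.BeginTransaction())
{
    // Copies the detached entity's state onto the persistent instance with the
    // same identifier (loading it if necessary) and returns the persistent instance.
    MyEntity persistent = (MyEntity)session.SaveOrUpdateCopy(detachedEntityFromWcf);
    tx.Commit();
}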

Is it good to use a static EF object context in an MVC application for better perf?

Let's start with this basic scenario:
I have a bunch of tables that are essentially rarely-changed enums (e.g. GeoLocations, Category, etc.). I want to load these into my EF ObjectContext so that I can assign them to entities that reference them as FKs. These objects are also used to populate all sorts of dropdown controls. Pretty standard scenarios so far.
Since a new controller is created for each page request in MVC, a new entity context is created and these "enum" objects are loaded repeatedly. I thought about using a static context object across all instances of controllers (or repository object).
But will this require too much locking and therefore actually worsen perf?
Alternatively, I'm thinking of using a static context only for read-only tables. But since entities that reference them must be in the same context anyway, this isn't any different from the above.
I also don't want to get into the business of attaching/detaching these enum objects, since I believe that once I attach a static enum object to an entity, I can't attach it again to another entity.
Please help, I'm quite new to EF + MVC, so am wondering what is the best approach.
Personally, I never have any static context stuff. For me, when I call the database (CRUD), I use one context for that single transaction/unit of work.
So in this case, what you're suggesting is that you wish to retrieve some data from the database ... and this data is, more or less, read-only and doesn't change / is static.
Lookup data is a great example of this.
So your Categories never change. Your GeoLocations never change, also.
I would not worry about this concept at the database/persistence level, but at the application level. So just forget that this data is static/read-only etc. and just get it. Then, when you're in your application (i.e. an ASP.NET MVC controller method or the global.asax code), THAT is where you should cache it ... at the UI layer.
If you're doing a nice n-tiered MVC app, which contains
UI layer
Services / Business Logic Layer
Persistence / Database data layer
Then I would cache this in the Middle Tier .. which is called by the UI Layer (ie. the MVC Controller Action .. eg. public void Index())
I think it's important to know how to separate your concerns ... the database stuff should just be that -> CRUD'ish stuff and some unique stored procs when required. Don't worry about caching data, etc. Keep this layer as light and as simple as possible.
Then, your middle Tier (if it exists) or your top tier should worry about what to do with this data -> in this case, cache it because it's very static.
I've implemented something similar using Linq2SQL by retrieving these 'lookup tables' as lists on app startup and storing them in ASP.NET's caching mechanism. By using the ASP.NET cache, I don't have to worry about threading/locking etc. I'm not sure why you'd need to attach them to a context; something like that can easily be retrieved if necessary via the table's PK id.
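A rough illustration of that caching approach, written here against an EF-style context rather than the Linq2SQL DataContext the answer mentions; MyDbContext, the Categories set and the cache key are placeholder names:
public static List<Category> GetCategories()
{
    var categories = HttpRuntime.Cache["LookupCategories"] as List<Category>;
    if (categories == null)
    {
        // Load once with a short-lived context, then keep the detached objects cached.
        using (var context = new MyDbContext())
        {
            categories = context.Categories.ToList();
        }

        // The ASP.NET cache handles locking and expiry; here the list is kept for an hour.
        HttpRuntime.Cache.Insert("LookupCategories", categories, null,
            DateTime.UtcNow.AddHours(1), Cache.NoSlidingExpiration);
    }
    return categories;
}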
I believe this is as much a question of what to cache as of how. When you are dealing with EF, you can quickly run into problems if you try to persist EF objects across different contexts and attempt to detach/attach those objects. If you are using your own POCO objects with custom T4 templates then this isn't an issue, but if you are using vanilla EF then you will want to create POCO objects for your cache.
For most simple lookup items (i.e. a numeric primary key and a string description), you can use a Dictionary<int, string>. If you have multiple fields you need to pass back and forth with the UI, then you can build a more complete object model. Since these will be POCO objects, they can be persisted pretty much anywhere and any way you like. I recommend keeping the caching logic outside of your MVC application so that you can easily mock the caching activity for testing. If you have multiple lists you need to cache, you can put them all in one container class that looks something like this:
public class MyCacheContainer
{
    public Dictionary<int, string> GeoLocations { get; set; }
    public List<Category> Categories { get; set; }
}
The next question is: do you really need these objects in your entity model at all? Chances are all you really need are the primary keys (i.e. you create a dropdown list using the keys and values from the dictionary and just post the ID). Therefore you could potentially handle all of the lookups of the textual descriptions in the construction of your view models. That could look something like this:
MyEntityObject item = Context.MyEntityObjects.FirstOrDefault(i => i.Id == id);
MyCacheContainer cache = CacheFactory.GetCache();
MyViewModel model = new MyViewModel { Item = item, GeoLocationDescription = cache.GeoLocations[item.GeoLocationId] };
If you absolutely must have those objects in your context (i.e. if there are referential entities that tie 2 or more other tables together), you can pass that cache container into your data access layer so it can do the proper lookups.
As for assigning "valid" entities: in .NET 4 you can just set the foreign key properties and don't have to actually attach an object (technically you can do this in 3.5, but it requires magic strings to set the keys). If you are using 3.5, you might just try something like this:
myItem.Category = Context.Categories.FirstOrDefault(c => c.id == id);
While this isn't the most elegant solution and does require an extra round trip to the DB to get a category you don't really need, it works. Doing a single-record lookup based on a primary key should not really be that big of a hit, especially if the table is as small as the type of lookup data you are talking about.
If you are stuck with 3.5, don't want to make that extra round trip, and want to go the magic-string route, just make sure you use some type of static resource and/or code generator for your magic strings so you don't fat-finger them. There are many examples here that show how to assign a new EntityKey to a reference without going to the DB, so I won't go into that in this question.
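For contrast, the .NET 4 foreign-key-property approach mentioned above is just an assignment, assuming the model exposes the FK as a scalar property (CategoryId here is a placeholder name):
// No round trip and no attach: set the scalar FK property and save.
myItem.CategoryId = selectedCategoryId;
Context.SaveChanges();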
