We use ASP.NET MVC + Autofac + EF6.
The DbContext is wrapped by a UnitOfWork, which we create for each HTTP request via Autofac.
We also open a transaction for the whole HTTP request in the UnitOfWork constructor.
The problem is that not all HTTP requests need to be wrapped in a transaction; some of them don't hit the database at all.
We'd like to delay opening the transaction until the first actual database request.
Any ideas how this can be done?
We could override SaveChanges and open the transaction right before saving, but then select queries would not run inside the transaction.
One more problem: we use global filters from EF Plus for soft-deletable entities. They work well, but initializing the filters for a context is rather slow, so we'd like to delay that until the first actual database request too.
The problem is that your UnitOfWork is injected into the controller regardless of which action is called, so its constructor always runs even if you never use it. One solution could be Autofac's lazy injection: with Lazy<IUnitOfWork>, the UnitOfWork constructor is called only when the instance is actually needed.
public class SomeController : Controller {
//..
private readonly Lazy<IUnitOfWork> _unitOfWork;
private IUnitOfWork UnitOfWork => _unitOfWork.Value;
public SomeController(
//..
Lazy<IUnitOfWork> unitOfWork
)
{
//..
_unitOfWork = unitOfWork;
}
//..
public ActionResult ActionNeedsTransaction()
{
//use UnitOfWork
UnitOfWork.SaveChanges();
return Json(value);
}
}
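For this to work, the UnitOfWork only needs its usual per-request registration; Autofac resolves Lazy<T> automatically through its implicit relationship types. A minimal sketch of such a registration (the concrete type names are assumptions based on the question, and it presumes the Autofac MVC integration for per-request lifetimes):
var builder = new ContainerBuilder();

// Registered per HTTP request as before. Because Autofac understands Lazy<T>
// out of the box, the UnitOfWork constructor (and the transaction it opens)
// only runs when _unitOfWork.Value is first accessed inside an action.
builder.RegisterType<UnitOfWork>()
       .As<IUnitOfWork>()
       .InstancePerRequest();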
I am using EF Core 1.0 (previously known as EF7) and ASP.NET Core 1.0 (previously known as ASP.NET 5) for a RESTful API.
I'd like to have a unit of work scoped to the HTTP request, so that when responding to the request either ALL the changes made to the DbContext are saved to the database, or none are (if there was an exception, for example).
In the past I used Web API 2 with NHibernate for this purpose, with an action filter that begins the transaction on action executing and, on action executed, ends the transaction and closes the session. This was the way recommended at http://isbn.directory/book/9781484201107
However, now I am using ASP.NET Core (with ASP.NET Core MVC, although this should not be relevant) and Entity Framework which, as I understand it, already implements a unit of work.
I think having a middleware plugged into the ASP.NET pipeline (before MVC) would be the right way to do things. So a request would go:
PIPELINE ASP.NET: MyUnitOfWorkMiddleware ==> MVC Controller ==> Repository ==> MVC Controller ==> MyUnitOfWorkMiddleware
I was thinking of having this middleware save the DbContext changes if no exception happened, so that in my repository implementations I don't even need to do dbcontext.SaveChanges() and everything would be like a centralized transaction. In pseudocode I guess it would be something like:
class MyUnitOfWorkMiddleware
{
    private readonly RequestDelegate _next;

    public MyUnitOfWorkMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    // 1 - get an instance of DbContext for this request (injected per request)
    public async Task Invoke(HttpContext httpContext, DbContext dbContext)
    {
        try
        {
            // 2 - await the next item in the pipeline (MVC controller, repositories)
            await _next(httpContext);
            // 3 - persist everything that changed during the request
            await dbContext.SaveChangesAsync();
        }
        catch (Exception)
        {
            // 2.1 - roll back changes (simply by not saving the context)
            // 2.2 - return an HTTP error response
            httpContext.Response.StatusCode = 500;
        }
    }
}
Does this make sense? Does anybody have any example of something similar? I can't find any good practice or recommendation around this.
Also, with this approach, at the MVC controller level I would not have access to any resource ID created by the database when POSTing a new resource, because the ID is not generated until the DbContext changes are saved (later in the pipeline, in my middleware, AFTER the controller has finished executing). What if I needed the newly created ID of a resource in my controller?
Any advice would be greatly appreciated!
UPDATE 1: I found a problem with my approach of using middleware to achieve this, because the DbContext instance in the middleware is not the same as the one used during MVC (and repository) execution. See the question Entity Framework Core 1.0 DbContext not scoped to http request
UPDATE 2: I haven't yet found a good solution. Basically these are my options so far:
Save the changes to the DB as soon as possible, i.e. in the repository implementation itself. The problem with this approach is that a single HTTP request may want to use several repositories (e.g. save something in the database and then upload a blob to cloud storage), and in order to have a unit of work I would have to implement a repository that deals with more than one entity, or even more than one persistence mechanism (DB and blob storage), which defeats the whole purpose.
Implement an action filter that wraps the whole action execution in a DB transaction. At the end of the controller's action execution, if there were no exceptions I commit the changes to the DB, otherwise I roll back and discard the context. The problem here is that my controller's action may need a generated entity Id in order to return it to the HTTP client (e.g. for a POST /api/cars I would like to return 201 Created with a Location header that identifies the new resource at /api/cars/123, but the Id 123 would not be available yet because the entity has not been saved to the DB and its Id is still a temporary 0). Example in a controller's action for a POST request:
return CreatedAtRoute("GetCarById", new { carId= carSummaryCreated.Id }, carSummaryCreated); //carSummaryCreated.Id would be 0 until the changes are saved in DB
How can I wrap the whole controller action in a DB transaction and at the same time have any database-generated Id available to return in the HTTP response from the controller? Or is there an elegant way to overwrite the HTTP response and set the Id at the action-filter level once the DB changes have been committed?
UPDATE 3: As per nathanaldensr's comment, I could get the best of both worlds (wrapping my controller's action execution in a DB transaction / UoW and also knowing the Id of the newly created resource even before the DB commits the changes) by using code-generated Guids instead of relying on the database to generate them.
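For illustration, a minimal sketch of what a client-generated key could look like (the Car entity and its members are hypothetical, not taken from the original code):
public class Car
{
    public Car()
    {
        // Generated in code, so the Id is known before SaveChanges() runs
        // and can be returned in the 201 response straight away.
        Id = Guid.NewGuid();
    }

    public Guid Id { get; private set; }
    public string Model { get; set; }
}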
As per Entity Framework Core 1.0 DbContext not scoped to http request
I could not use middleware to achieve this, because the DbContext instance injected into the middleware is not the same as the DbContext used during MVC execution (in my controllers and repositories).
I had to go with a similar approach: saving the DbContext changes after the controller's action execution, using a global filter.
There is no official documentation about filters in MVC 6 yet, so if anybody is interested in this solution, below is the filter and the way I register it globally so that it runs for every controller action.
public class UnitOfWorkFilter : ActionFilterAttribute
{
private readonly MyDbContext _dbContext;
private readonly ILogger _logger;
public UnitOfWorkFilter(MyDbContext dbContext, ILoggerFactory loggerFactory)
{
_dbContext = dbContext;
_logger = loggerFactory.CreateLogger<UnitOfWorkFilter>();
}
public override async Task OnActionExecutionAsync(ActionExecutingContext executingContext, ActionExecutionDelegate next)
{
var executedContext = await next.Invoke(); //to wait until the controller's action finalizes in case there was an error
if (executedContext.Exception == null)
{
_logger.LogInformation("Saving changes for unit of work");
await _dbContext.SaveChangesAsync();
}
else
{
_logger.LogInformation("Avoid to save changes for unit of work due an exception");
}
}
}
and the filter gets plugged into my MVC at Startup.cs when configuring MVC.
public void ConfigureServices(IServiceCollection services)
{
//..
//Entity Framework 7
services.AddEntityFramework()
.AddSqlServer()
.AddDbContext<SpeediCargoDbContext>(options => {
options.UseSqlServer(Configuration["Data:DefaultConnection:ConnectionString"]);
});
//MVC 6
services.AddMvc(setup =>
{
setup.Filters.AddService(typeof(UnitOfWorkFilter));
});
//..
}
This still leaves a question (see UPDATE 2 on my question): what if I want my controller to respond to an HTTP POST with a 201 Created and a Location header that includes the Id of the entity created in the DB? When the controller's action finishes executing, the changes have not yet been committed, so the Id of the new entity is still 0 until the action filter saves the changes and the database generates a value.
I am also facing the same issue and am not sure which approach to follow.
One approach that I used is as follows:
public class UnitOfWorkFilter : ActionFilterAttribute
{
private readonly AppDbContext _dbContext;
public UnitOfWorkFilter(AppDbContext dbContext)
{
_dbContext = dbContext;
}
public override void OnActionExecuted(ActionExecutedContext context)
{
if (!context.HttpContext.Request.Method.Equals("Post", StringComparison.OrdinalIgnoreCase))
return;
if (context.Exception == null && context.ModelState.IsValid)
{
_dbContext.Database.CommitTransaction();
}
else
{
_dbContext.Database.RollbackTransaction();
}
}
public override void OnActionExecuting(ActionExecutingContext context)
{
if (!context.HttpContext.Request.Method.Equals("Post", StringComparison.OrdinalIgnoreCase))
return;
_dbContext.Database.BeginTransaction();
}
}
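Since this filter also receives the DbContext through its constructor, it presumably has to be registered with the container and added as a global service filter, along the lines of the earlier answer (a sketch, not verified against the poster's actual setup):
services.AddScoped<UnitOfWorkFilter>();

services.AddMvc(options =>
{
    // resolved from DI so the request-scoped AppDbContext can be injected
    options.Filters.AddService(typeof(UnitOfWorkFilter));
});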
My advice: use dbContext.SaveChanges() in the controller, as demonstrated in examples all over the web. What you want to do sounds quite fancy and could backfire, as you guessed at the end of your post. And IMO, it doesn't make sense.
Regarding your second question/task:
....when responding to the HTTP request either ALL the changes made to the DbContext will be saved onto the database, or none will be saved (if there was some exception, for example).
I think you need something like "transaction per request". It is just an idea; I haven't tested it at all. I just put the code together in this sample middleware:
public class TransactionPerRequestMiddleware
{
    private readonly RequestDelegate next_;

    public TransactionPerRequestMiddleware(RequestDelegate next)
    {
        next_ = next;
    }

    public async Task Invoke(HttpContext context, DbContext dbContext)
    {
        // wrap in using so the transaction is disposed even if the pipeline throws
        using (var transaction = dbContext.Database.BeginTransaction(
            System.Data.IsolationLevel.ReadCommitted))
        {
            await next_.Invoke(context);

            if (context.Response.StatusCode == 200)
            {
                transaction.Commit();
            }
            else
            {
                transaction.Rollback();
            }
        }
    }
}
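If you go this route, the middleware would be registered early in the pipeline, before MVC, so that the transaction wraps controller execution. A sketch of that wiring:
public void Configure(IApplicationBuilder app)
{
    // must come before MVC so the transaction surrounds the controller action
    app.UseMiddleware<TransactionPerRequestMiddleware>();
    app.UseMvc();
}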
Good luck
I've recently started using IQueryable, inspired by http://www.codethinked.com/keep-your-iqueryable-in-check. Until now I've been doing this in my repos:
public IEnumerable<POCO> GetById(int id)
{
using (var ctx = new DbContext())
{
var query = from ...;
return query.ToList();
}
}
Now I'm doing this instead:
public IPageable<POCO> GetById(int id)
{
var ctx = new DbContext();
var query = from ...;
return new Pageable(query);
}
But I'm wondering if this is the best way to handle new DbContext().
Is it better to make the DbContext a class member?
public class Repo
{
private DbContext _ctx = new DbContext();
}
Or even to inject it?
public class Repo
{
private DbContext _ctx;
public Repo(DbContext ctx)
{
_ctx = ctx;
}
}
What are the pros and cons of:
a new DbContext in each method.
a new DbContext per object (class member).
injecting DbContext.
I'm using Ninject, so I can use .InRequestScope() (if that affects the answer).
A couple of other questions:
Should my repo implement IDisposable if the DbContext is kept as a class member?
Is there an even better way to handle disposal of the DbContext than the above?
I would always go with injecting the DbContext, with InRequestScope. It gives you all the benefits of dependency injection. Ninject will also dispose the DbContext at the end of the request cycle, since DbContext implements IDisposable. See this thread.
If you use DI, your other two questions become irrelevant.
Entity Framework loves caching. If you are constantly changing your application and reloading it in your browser, you'll probably notice that the first time you reload it, it takes a couple of seconds to load, but after that, pages are almost instantaneous. This is because MVC and EF are caching common queries that are repeatedly used, making your app faster to use after that initial load time.
Because of this, it is not a huge concern where you create your DbContext. Yes, creating anything takes time. However, EF will recognize these queries and load them quickly, even if you have just created a new instance of your context.
On a side note, if your application doesn't run a large number of queries, using blocks around your DbContext would be considered ideal (as they handle the dispose for you), but again, the runtime and memory differences would be negligible.
What is the appropriate lifecycle scope for a repository and the EF context when using Entity Framework 4 with Ninject in an MVC 3 application?
I've been using the default, InTransientScope, but I am questioning whether it should be InRequestScope.
public class MyController: Controller
{
private readonly IMyRepo _repo;
public MyController(IMyRepo repo)
{
_repo = repo;
}
public ActionResult Index()
{
var results = _repo.GetStuff();
return View(results);
}
}
Ninject Module:
public class MyServices : NinjectModule
{
public override void Load()
{
Bind<IMyRepo>().To<MyRepo>();
Bind<MyContext>().ToSelf();
}
}
MyRepo:
public class MyRepo: IMyRepo
{
private readonly MyContext _context;
public MyRepo(MyContext context)
{
_context = context;
}
public IEnumerable GetStuff()
{
return _context.Entity;//query stuff
}
}
Your repositories can be transient-scoped; however, I would bind the context in request scope. That way all of your repository instances share the same context, and you can reap the caching and transactional benefits of an ORM.
The way your code currently works, a new context is created every time one is requested. So if your controller first uses a repository and then calls another module that in turn uses a repository, each of those repositories gets a different instance of the context. In effect you are using your ORM simply as a connection manager and SQL generator.
This can also have unintended consequences. Imagine code like the following:
public ActionResult MyAction(int id)
{
var entity = _repository.Get<Entity>(id);
entity.Prop = "Processing";
_module.DoStuff(id);
return View(entity);
}
If the DoStuff method eventually calls _repository.Get<Entity>(id) again, you will have two different copies of your entity that are out of sync.
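Concretely, with Ninject the module from the question could be changed along these lines (a sketch; it assumes the Ninject.Web.Common package, which provides InRequestScope):
public class MyServices : NinjectModule
{
    public override void Load()
    {
        // repositories can stay transient
        Bind<IMyRepo>().To<MyRepo>();

        // one context per HTTP request, shared by every repository in that request
        Bind<MyContext>().ToSelf().InRequestScope();
    }
}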
This depends on a couple of factors.
Do you care about transactions at all? If not, then transient scope is OK for you.
Do you care about transactions and think one transaction per web request is OK for you? Then use request (web) scope.
Are you OK with objects being "cached" in EF's context, so you don't get a full database refresh if you request the same object twice? Request scope has this side effect.
Where should I call Commit() on my UnitOfWork in an ASP.NET MVC app, while still keeping my controllers unit testable?
Do I use an HttpModule? Create a base controller and use OnActionExecuted? Or Global.asax's Application_EndRequest()?
Your controller should look something like this:
[HttpPost]
public ActionResult SubmitOrder(Order o)
{
try
{
repository.Add(o);
unitOfWork.Commit();
}
catch (YourCustomExceptionClass exc)
{
ModelState.AddModelError("", exc.ToString());
}
return View();
}
unitOfWork should be declared at the controller-level as:
IUnitOfWork unitOfWork;
And injected into the constructor of the controller, preferably with DI, scoped per HTTP request.
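A minimal sketch of that wiring (OrdersController and IRepository<Order> are placeholder names, not from the original answer):
public class OrdersController : Controller
{
    private readonly IRepository<Order> repository; // hypothetical repository abstraction
    private readonly IUnitOfWork unitOfWork;

    // both dependencies are supplied by the DI container, once per HTTP request
    public OrdersController(IRepository<Order> repository, IUnitOfWork unitOfWork)
    {
        this.repository = repository;
        this.unitOfWork = unitOfWork;
    }
}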
When you think about it, a unit of work in the context of a web application is usually an HTTP request.
An HTTP request is directed to only one action method to perform the work. Of course, you have the PRG pattern (redirect to an HttpGet action afterwards), but there should be only one [HttpPost] action call per HTTP request.
Therefore it makes sense to commit the UoW at the action method level.
You should have two implementations of IUnitOfWork:
EntityFrameworkUnitOfWork : IUnitOfWork
InMemoryUnitOfWork : IUnitOfWork
So when unit testing - just inject InMemoryUnitOfWork (which commits changes into a static List<T>, for example)
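A rough sketch of what the in-memory implementation could look like (the shape of IUnitOfWork here is an assumption; the real interface may expose more than Commit()):
public interface IUnitOfWork
{
    void Commit();
}

public class InMemoryUnitOfWork : IUnitOfWork
{
    // unit tests can inspect this list after the action under test has run
    public static readonly List<object> Committed = new List<object>();

    // changes registered during the "request" but not yet committed
    public readonly List<object> Pending = new List<object>();

    public void Commit()
    {
        Committed.AddRange(Pending);
        Pending.Clear();
    }
}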
It sounds like your UI should send the commit call to the domain controller, which should then pass the call on to the relevant parties in the domain layer.
I have a property on my BaseController called DataContext that holds my LINQ to SQL data context (or fake context for testing). When using a parameterless constructor (in other words, when a request to ASP.NET MVC is made), a new instance of my LINQ to SQL data context is assigned to the property:
public class BaseController : Controller {
public IDataContextWrapper DataContext { get; set; }
public BaseController() : this(new DataContextWrapper<MyDataContext>()) { }
public BaseController(IDataContextWrapper context) {
DataContext = context;
}
}
Also in my BaseController, I set some global ViewData items:
protected override void OnActionExecuting(ActionExecutingContext filterContext) {
ViewData["Example"] = DataContext.Table<Example>().Count();
base.OnActionExecuting(filterContext);
}
This is working fine for almost every action. The only one that doesn't work is the Logout action on my AccountController:
public ActionResult Logout() {
FormsAuth.SignOut();
return RedirectToAction("Login");
}
This raises a NullReferenceException during BaseController.OnActionExecuting. When executing that particular action, the DataContext property is null.
Why would this only occur on one action?
Note: IDataContextWrapper and DataContextWrapper simply wraps the existing functionality of the LINQ to SQL DataContext object so that it can be replaced with a fake context in unit tests. It doesn't do any disposing on its own, but leaves it up to the underlying DataContext, so I'm pretty certain that's not the problem.
To follow up on my comment, check out this link and, more specifically, the Microsoft documentation here, which states:
In general, a DataContext instance is designed to last for one "unit of work" however your application defines that term. A DataContext is lightweight and is not expensive to create. A typical LINQ to SQL application creates DataContext instances at method scope or as a member of short-lived classes that represent a logical set of related database operations.
Microsoft did a terrible job explaining this, and frankly, explaining how to use LINQ in an n-tier environment in the first place. In my particular case, I had one (static) DataContext implemented via the singleton pattern, which I am guessing is what you have done as well (as it is the most logical design, IMHO). This, however, is extremely NOT the way to do things. In my case, the fix was actually pretty easy: changing my GetDataContext() call to return a new DataContext every time, instead of returning the static instance. This, you will find, creates a whole new crop of problems. None of them are insurmountable once you figure them out, but they are definitely a pain.
If you have such a setup (singleton accessors for your DataContext), change it and see if that fixes your problem.
Regardless, do not use a global DataContext, and do not persist a DataContext when dealing with an n-tier architecture.
Even if this doesn't solve your particular problem, I highly suggest you re-architect your solution so that DataContexts have a unit-of-work lifespan; if it hasn't bitten you already, it will.
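For illustration, a hedged sketch of the change described above (GetDataContext and MyDataContext are placeholder names):
public static class DataContextFactory
{
    // Problematic: one shared, static DataContext for the whole application.
    // private static readonly MyDataContext shared = new MyDataContext();
    // public static MyDataContext GetDataContext() { return shared; }

    // Unit-of-work friendly: a fresh, short-lived DataContext per call,
    // created at method scope and disposed by the caller when the work is done.
    public static MyDataContext GetDataContext()
    {
        return new MyDataContext();
    }
}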
For reasons that I don't quite understand, when a new AccountController is created for the Logout action, ASP.NET MVC is using the second constructor with a null parameter (could be a bug?). I changed the class to create a new default DataContext when the parameter is null:
public class BaseController : Controller {
public IDataContextWrapper DataContext { get; set; }
public BaseController() : this(null) { }
public BaseController(IDataContextWrapper context) {
DataContext = context ?? new DataContextWrapper<MyDataContext>();
}
}
Now it works.
It strikes me as strange that ASP.NET MVC used the default constructor in some cases, and an overload in others, though. Can anyone shed some light on this?