Okay to share DbContext per thread?

From One DbContext per web request... why?
My understanding is that a DbContext instance should not be shared across concurrent web requests, so definitely not across threads.
But how about sharing it across non-concurrent web requests?
Due to thread agility (What is the meaning of thread-agility in ASP.Net?), am I right that a thread can handle more than one web request before it dies?
If so, is it safe to dependency inject a DbContext instance for each thread?
Reason for this is I'm using Unity, which does not include per-request lifetime option.
From MVC, EF - DataContext singleton instance Per-Web-Request in Unity, I think I could use a custom LifetimeManager; I'm just wondering if it is safe and sufficient to use PerThreadLifetimeManager.

is it safe to dependency inject a DbContext instance for each thread?
It depends. If your idea is to have one DbContext per web request, and the consistency of your application depends on it, having one DbContext per thread is a bad idea, since a single web request can still get multiple DbContext instances. And since ASP.NET pools threads, instances that are cached per thread will live for the duration of the entire application, which is very bad for a DbContext (as explained here).
On the other hand, you might be able to come up with a caching scheme that ensures that a single DbContext is used for a single web request and is returned to a pool when the request is finished, so other web requests could pick it up. This is basically how connection pooling in .NET works. But since DbContext instances cache data, that data becomes stale very quickly, so even if you are able to come up with a thread-safe solution, your system still behaves inconsistently: at seemingly random moments old data is shown to the user, while in a following request new data is shown.
I think it is possible to clear the DbContext's cache at the beginning of a web request, but that would basically be the same as creating a new DbContext for that request, just with much worse performance.
I'm just wondering if it is safe and sufficient to use PerThreadLifetimeManager.
No, it isn't safe because of the reasons described above.
But it is actually quite easy to register a DbContext on a per-web request basis:
container.RegisterType<MyApplicationEntities>(new InjectionFactory(c => {
    // Cache one context per request in HttpContext.Items.
    var context = (MyApplicationEntities)HttpContext.Current.Items["__dbcontext"];
    if (context == null) {
        context = new MyApplicationEntities();
        HttpContext.Current.Items["__dbcontext"] = context;
    }
    return context;
}));
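One detail the snippet above leaves open is disposal: the context cached in HttpContext.Items is never disposed. A minimal sketch (assuming the same "__dbcontext" key and a standard Global.asax.cs) would dispose it when the request ends:
// Global.asax.cs -- dispose the per-request DbContext, if one was created.
protected void Application_EndRequest(object sender, EventArgs e) {
    var context = HttpContext.Current.Items["__dbcontext"] as MyApplicationEntities;
    if (context != null) {
        context.Dispose();
    }
}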

If you use multiple DbContexts per request you can end up with a multi-threaded application with a high probability of losing data integrity. One web request can be viewed as one transaction; but with PerThreadLifetimeManager you would have multiple distinct transactions which are unrelated even though they probably should not be. For example, posting a form with a lot of data can end up saving to multiple database tables, and with 2 or more independent contexts it can happen that one insert succeeds while another fails, leaving you with inconsistent data.
The other important thing is that the ASP.NET infrastructure uses thread pooling, so every thread will be reused after the request has finished, and if something went wrong in one request it can affect another one. That is why it is not recommended to use any thread-local storage (statics tied to the thread) in a thread-pool environment: we cannot control the threads' lifetime.

Related

Unity container in ASP.NET MVC

I have a simple question. I'm new to Microsoft's UnityContainer. I'm writing an ASP.NET MVC application with Unity for DI.
Do I have a different CONTAINER for each user connected to my web app, or is the CONTAINER the same for all users?
So if I manage the lifetime of an object with ContainerControlledLifetimeManager, does that mean the object stays the same only within one user's session?
I hope you understand.
Thanks,
Christian
Lifetime refers to the life of the object created by the DI process. Per request means each request gets its own object. If the object depends on the current user, query-string values on that request, or the values/presence of request headers, a per-request lifetime is appropriate. If you have settings that vary based on the location of your service, for example values saved from web.config, then the container is most likely created in Global.asax and these objects can live as long as the container lives.
A concrete example:
You have a service as part of your site and you are migrating to vNext of that service. Users can opt in by clicking a link that includes a parameter like &myService=vNext to see the new behavior. Your factory method uses the value of this parameter to select vNow or vNext for each request.
Here's some pseudo code to get you started:
container.RegisterInstance<IProductFactory>("enterprise", new EnterpriseProductFactory());
container.RegisterInstance<IProductFactory>("retail", new RetailProductFactory());
container.RegisterVersionedServiceFactory<IProductFactorySettings, IProductFactory>();
In this example RegisterVersionedServiceFactory is an extension method that does nothing but decide which of the IProductFactory instances to use for the current request. The factory provides the current instance (there are only two for the life of the service) to use for this request (thousands per second).
This pattern is what makes a very large site you probably used recently both very stable and very flexible. New versions of services are rolled out using this exact same pattern to help keep the site very stable.
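For illustration only, here is a hypothetical sketch of what such a RegisterVersionedServiceFactory extension method might look like with Unity. The "vNext" query-string convention and the mapping of the named registrations are assumptions, not part of the original answer, and the settings type parameter is unused in this simplified sketch:
// Hypothetical sketch: the default TFactory registration just picks one of the
// named registrations per request, based on the opt-in query-string parameter.
public static class VersionedServiceFactoryExtensions {
    public static void RegisterVersionedServiceFactory<TSettings, TFactory>(this IUnityContainer container) {
        container.RegisterType<TFactory>(new InjectionFactory(c => {
            var useVNext = HttpContext.Current.Request.QueryString["myService"] == "vNext";
            // "enterprise" plays the role of vNow and "retail" of vNext here, purely for the sketch.
            return c.Resolve<TFactory>(useVNext ? "retail" : "enterprise");
        }));
    }
}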

Create orchard session

I need to manipulate data in a separate thread in Orchard CMS.
The problem is that when the request ends, the session and services are disposed.
What is the best way to create a db session, and how can I work with data after the request finishes?
EDIT:
I am trying something like this:
var builder = new ContainerBuilder();
builder.RegisterGeneric(typeof(Repository<>)).As(typeof(IRepository<>)).InstancePerLifetimeScope();
//builder.RegisterInstance(_shellSettings = new ShellSettings { Name = ShellSettings.DefaultName });
builder.RegisterType<TransactionManager>().As<ITransactionManager>().InstancePerLifetimeScope();
builder.RegisterType<SessionFactoryHolder>().As<ISessionFactoryHolder>().InstancePerLifetimeScope();
But I don't know exactly what to register; it throws an error when resolving the repository.
Spawning threads on a web server is bad; it reduces the server's ability to serve many requests simultaneously. You should consider offloading your task to some other process, like a Windows service, communicating through MSMQ for example.
Otherwise, consider letting your task instantiate and dispose of the services and sessions it needs itself, instead of using those bound to the request life-cycle. You may need to set up a dedicated dependency resolver for this, letting the task explicitly control the lifetime of the objects it requests from the dependency resolver.
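For the second option, a rough sketch with plain Autofac (leaving Orchard's shell machinery aside, assuming container is the built Autofac container and MyRecord is a placeholder record type) would have the background task own its lifetime scope instead of borrowing the request's:
// The task creates, uses and disposes its own scope, so nothing depends on the request life-cycle.
Task.Factory.StartNew(() => {
    using (var scope = container.BeginLifetimeScope()) {
        var repository = scope.Resolve<IRepository<MyRecord>>();
        // ... use the repository here; everything resolved from this scope
        // is disposed when the scope is disposed, not when the request ends.
    }
});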

Can I Use Async to Perform Fire-and-Forget NHibernate Saves in an ASP.NET MVC Application?

I have an ASP.NET MVC application that uses NHibernate to persist data into a SQL Server database.
There are cases where I want to save an entry into a database (initially triggered by a call into an action method on a controller) but there's no need to block the caller.
Is it "safe" to try to implement a fire-and-forget mechanism into the database that will put the database call into a Task and then invoke it on the background so control can return immediately to the caller? (OR accomplish the same thing with BackgroundWorker or the "async/await" keywords) I need a solution where NHibernate will not get tripped up by ASP.NET trying to clean up its ISession, which is per-request. I'm using Autofac for lifetime management on the session. I assume that the database operation would have a slightly longer lifetime than the web request itself, and I'm not sure how smoothly that would work.
It is not safe to do this; I have a blog post on the subject. The problem is that when you have no requests in progress, it is possible that your entire AppDomain can be torn down. Also, consider what would happen if the database insert failed for some reason. If you return early, there's no way to notify the client of the error.
A reliable solution must store the data in some kind of persistent place before returning success to the caller. This can be directly in the database, or in a queue of some kind (to be later processed by an independent worker).
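A minimal sketch of that approach with NHibernate, assuming a hypothetical PendingNotification entity that is mapped like any other, an EntryModel view model, and the per-request ISession injected as _session: the request persists the work item in its normal session and returns, and an independent worker (a Windows service, for example) later picks up the unprocessed rows:
// In the controller action: record the work durably, then return to the caller immediately.
public ActionResult Save(EntryModel model) {
    using (var tx = _session.BeginTransaction()) {
        _session.Save(new PendingNotification {   // hypothetical entity
            Payload = model.Payload,
            CreatedUtc = DateTime.UtcNow,
            Processed = false
        });
        tx.Commit();
    }
    // The independent worker queries for Processed == false and handles those rows.
    return Json(new { queued = true });
}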

Design considerations for heavy loaded asp.net MVC + Web API application and asynchronous message bus

I'm planning to build a quite large application (large in terms of concurrent users / number of requests, not in terms of features).
Basically, I'll have a service somewhere that is waiting for commands, executes them, and acknowledges the completion later. This service will use a service bus to communicate, making the execution eventual, before an acknowledgement message is issued.
The consumers of this service can be any kind of application (WPF, SL, ...) but my main (and first) client will be an ASP.NET MVC application + Web API (.NET 4.5) or MVC only (.NET 4.0) with Ajax controller actions.
The web application will rely on Ajax calls to keep the application user-friendly and responsive.
I'm quite new to such a fully asynchronous architecture, and I have some questions to avoid future headaches:
My Web API calls can take some amount of time. How should I design the API properly to support long-running operations (some kind of async?)? I've read about the new async keyword, but for the sake of knowledge, I'd like to understand what's behind it.
My calls to the service will consist of publishing a message and waiting for the ack message. If I wrap this in a single method, how should I write it? Should I "block" until the ack is received (I suppose I shouldn't)? Should I return a Task object and let the consumer decide?
I'm also wondering if SignalR can help me. With SignalR, I think I can issue commands in a true "fire and forget" fashion and route the ack message up to the client.
Am I completely out of subject, and should I take another approach?
In terms of implementation details / frameworks, I think I'll use:
RabbitMQ as the messaging system
MassTransit to abstract the messaging system
ASP.NET MVC 4 to build the UI
Web API to isolate command issuing out of the UI controllers, and to allow other kinds of clients to issue commands
My Web API calls can take some amount of time. How should I design the API properly to support long-running operations (some kind of async?)?
I'm not 100% sure where you're going. You ask questions about Async but also mention message queuing, by throwing in RabbitMQ and MassTransit. Message queuing is asynchronous by default.
You also mention executing commands. If you're referring to CQRS, you separate commands and queries. But what I'm not 100% sure about is what you're referring to when mentioning "long running processes".
When you query data, the data should already be present, preferably in a shape that fits the question at hand.
When you query data, no long-running process should be started.
When you execute commands, a long-running process can be started. But that's why you should use message queuing. Specify a task to start the long-running process, create a message for it, throw it onto the queue, and forget about it altogether. Some other process in the background will pick it up.
When the command is executed, the long-running process can be started.
When the command is executed, a database can be updated with data.
This data can be used by the API if someone requests data.
When using this model, it doesn't matter that the long-running process might take up to 10 minutes to complete. I won't go into detail on actually having a single thread take up to 10 minutes to complete, including locks on the database, but I hope you get the point. Your API will be free almost instantly after throwing a message onto the queue. No need for async there.
My calls to the service will consist of publishing a message and waiting for the ack message.
I don't get this. The .NET Framework and your queuing platform take care of this for you. Why would you wait on an ack?
In MassTransit
Bus.Instance.Publish(new YourMessage{Text = "Hi"});
In NServiceBus
Bus.Publish(new YourMessage{Text = "Hi"});
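And on the receiving side, some background process (a Windows service, for example) subscribes and handles the message whenever it arrives. A sketch using the MassTransit 2.x consumer API, with a placeholder handler body:
// Hosted in a separate process; MassTransit delivers YourMessage instances to it.
public class YourMessageConsumer : Consumes<YourMessage>.All {
    public void Consume(YourMessage message) {
        // the long-running work happens here, off the web server
    }
}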
I'm also wondering if SignalR can help me.
I should think so! Because of the asynchronous nature of messaging, the user has to 'wait' for updates. If you can provide this data by 'pushing' updates via SignalR to the user, all the better.
Am I completely out of subject, and should I take another approach?
Perhaps, I'm still not sure where you're going.
Perhaps read up on the following resources.
Resources:
http://www.udidahan.com/2013/04/28/queries-patterns-and-search-food-for-thought/
http://www.udidahan.com/2011/10/02/why-you-should-be-using-cqrs-almost-everywhere%E2%80%A6/
http://www.udidahan.com/2011/04/22/when-to-avoid-cqrs/
http://www.udidahan.com/2012/12/10/service-oriented-api-implementations/
http://bloggingabout.net/blogs/dennis/archive/2012/04/25/what-is-messaging.aspx
http://bloggingabout.net/blogs/dennis/archive/2013/07/30/partitioning-data-through-events.aspx
http://bloggingabout.net/blogs/dennis/archive/2013/01/04/databases-and-coupling.aspx
My Web API calls can take some amount of time. How should I design the API properly to support long-running operations (some kind of async?)? I've read about the new async keyword, but for the sake of knowledge, I'd like to understand what's behind it.
Regarding Async, I saw this link being recommended on another question on stackoverflow:
http://msdn.microsoft.com/en-us/library/ee728598(v=vs.100).aspx
It says that when a request is made to an ASP.NET application, a thread from a limited thread pool is assigned to process the request.
An asynchronous controller action releases the thread back to the thread pool so that it is ready to accept additional requests. Within the action, the operation that needs to be executed asynchronously is started, and its result is handed to a callback controller action.
The asynchronous controller action is named using Async as the suffix and the callback action has a Completed suffix.
public void NewsAsync(string city) {}
public ActionResult NewsCompleted(string[] headlines) {}
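Filled out a little, the documented pattern looks roughly like this (NewsService.GetHeadlinesAsync is a placeholder for whatever non-blocking call you actually make):
public class PortalController : AsyncController {
    public void NewsAsync(string city) {
        AsyncManager.OutstandingOperations.Increment();
        // start the non-blocking operation; the callback runs when it completes
        NewsService.GetHeadlinesAsync(city, headlines => {
            AsyncManager.Parameters["headlines"] = headlines;   // matches the NewsCompleted parameter name
            AsyncManager.OutstandingOperations.Decrement();      // tells MVC the action can complete
        });
    }

    public ActionResult NewsCompleted(string[] headlines) {
        return View("News", headlines);
    }
}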
Regarding when to use Async:
In general, use asynchronous pipelines when the following conditions are true:
The operations are network-bound or I/O-bound instead of CPU-bound.
Testing shows that the blocking operations are a bottleneck in site performance and that IIS can service more requests by using asynchronous action methods for these blocking calls.
Parallelism is more important than simplicity of code.
You want to provide a mechanism that lets users cancel a long-running request.
I think developing your service using ASP.NET MVC with Web API, and using async controllers where needed, would be a good approach to building a highly available web service.
Using a message based service framework like ServiceStack looks good too:
http://www.servicestack.net/
Additional resources:
http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
http://www.codethinked.com/net-40-and-systemthreadingtasks
http://dotnet.dzone.com/news/net-zone-evolution
http://www.aaronstannard.com/post/2011/01/06/asynchonrous-controllers-ASPNET-mvc.aspx
http://channel9.msdn.com/Events/TechDays/Techdays-2012-the-Netherlands/2287
http://www.dotnetcurry.com/ShowArticle.aspx?ID=948 // also shows setup of performance tests
http://www.asp.net/mvc/tutorials/mvc-4/using-asynchronous-methods-in-aspnet-mvc-4
http://visualstudiomagazine.com/articles/2013/07/23/async-actions-in-aspnet-mvc-4.aspx
http://hanselminutes.com/327/everything-net-programmers-know-about-asynchronous-programming-is-wrong
http://www.hanselman.com/blog/TheMagicOfUsingAsynchronousMethodsInASPNET45PlusAnImportantGotcha.aspx

Asynchronous Asp.Net MVC controller methods?

I will build an ASP.NET MVC 3 web page.
View: The view (web page) invokes about five Ajax (jQuery) calls against controller methods that return JsonResult, and renders the results on the web page.
Controller: The controller methods read a SQL Server 2008 database using EF4. Two of the SQL statements may take half a minute to execute depending on the server load.
I wish the users could at least see the content returned from the quick controller/database calls as soon as possible. The page will not have a lot of users (maybe up to 15). Will the long-running controller method calls block others if they are not asynchronous? Or is it irrelevant as long as the thread pool is big enough to handle the peak requests of the users?
From the user's view, loading the initial web page is synchronous, i.e. he has to wait until the server delivers the page. The Ajax requests however look asynchronous to him because he can already see part of the page.
From the server's view, everything is synchronous. There is an HTTP request that needs to be processed and the answer is either HTML, JSON or whatever. The client will wait until it receives the answer. And several requests can be processed in parallel.
So unless you implement some special locking (either on the web server or in the database) that blocks some of the requests, nothing will be blocked.
The proposed approach seems just fine to me.
Update:
There's one thing I forgot: ASP.NET contains a locking mechanism to synchronize access to the session data that can get in the way if you have several concurrent requests from the same user. Have a look at the SessionState attribute for a way to work around that problem.
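For example, assuming the controller serving those concurrent Ajax calls only reads session data (the controller name is a placeholder):
// Opting out of the exclusive session lock lets requests from the same user run in parallel;
// use SessionStateBehavior.Disabled if the controller never touches the session at all.
[SessionState(SessionStateBehavior.ReadOnly)]
public class DashboardController : Controller {
    // ... the JsonResult actions invoked via Ajax ...
}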
Update 2:
And for asynchronous behavior from the user's point of view, there's no need to use the AsyncController class. It was built for something else, which is not relevant in your case since you only have 15 users.
Will the long run controller method calls block others if they are not asynchronous?
The first important thing to note is that all those controller actions should not have write access to the Session. If they write to the session, whether they are sync or async, they will always execute sequentially and never in parallel. That's due to the fact that ASP.NET Session is not thread-safe, and if multiple requests from the same session arrive they will be queued. If you are only reading from the Session it is OK.
Now, the slow controller actions will not block the fast controller actions. No matter whether they are synchronous or not. For long controller actions it could make sense to make them asynchronous only if you are using the asynchronous ADO.NET methods to access the database and thus benefit from the I/O Completion Ports. They allow you to not consume any threads during I/O operations such as database access. If you use standard blocking calls to the database you get no benefit from async actions.
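To make the distinction concrete, here is a sketch of truly asynchronous data access using async/await and ExecuteReaderAsync. Note that this needs .NET 4.5 / MVC 4 rather than the MVC 3 setup in the question, and the connection string and query are placeholders:
// using System.Data.SqlClient; using System.Threading.Tasks; (assumed)
public class ReportsController : Controller {
    private readonly string connectionString = "...";   // placeholder

    public async Task<ActionResult> SlowReport() {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Headline FROM News", connection)) {   // placeholder query
            await connection.OpenAsync();
            using (var reader = await command.ExecuteReaderAsync()) {
                var rows = new List<string>();
                while (await reader.ReadAsync()) {
                    rows.Add(reader.GetString(0));
                }
                // no request thread is blocked while SQL Server does the work
                return Json(rows, JsonRequestBehavior.AllowGet);
            }
        }
    }
}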
I would recommend the following article to get a deeper understanding of when asynchronous actions can be beneficial.
