How to perform work after the request in MVC - asp.net-mvc

I have a long-running, CPU-bound task that I want to initiate from a link in my MVC application. When I click the link, I want the server to create a GUID identifying the job, return that GUID to the client, and perform the job after returning.
I set this up using ThreadPool.QueueUserWorkItem, but I've read this can be problematic in MVC. Is there a better option for this case? Is there a different approach I should be using?

In my experience it is better to perform long-running CPU tasks not in the ASP.NET application itself but in a separate application. For example, you can create a separate Windows service to process the tasks. To exchange data you can use, for example, a message queue, a database (probably the easiest way), or a web service.
This approach has the following advantages:
1) Integrity of the background job. IIS can be configured to restart worker processes periodically; if your background job is running inside the web process at that moment, it will be interrupted, which could be undesirable.
2) Easier load balancing. For example, you can move the processing service to a separate server, which frees the web server and can provide a better end-user experience.
Take a look at this example to see how it can be implemented with Azure.
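For the database option mentioned above, here is a minimal sketch of the web side of the handoff. The Jobs table and its columns are assumptions for illustration, not an established schema; the Windows service would poll the same table for 'Pending' rows, do the work, and update the status.
using System;
using System.Data.SqlClient;

public class JobQueue
{
    private readonly string _connectionString;

    public JobQueue(string connectionString)
    {
        _connectionString = connectionString;
    }

    // Insert a row for the Windows service to pick up, and return the job's
    // GUID so the client can poll for status or results later.
    public Guid Enqueue(string args)
    {
        var jobId = Guid.NewGuid();
        using (var connection = new SqlConnection(_connectionString))
        using (var command = new SqlCommand(
            "INSERT INTO Jobs (Id, Args, Status) VALUES (@id, @args, 'Pending')",
            connection))
        {
            command.Parameters.AddWithValue("@id", jobId);
            command.Parameters.AddWithValue("@args", args);
            connection.Open();
            command.ExecuteNonQuery();
        }
        return jobId;
    }
}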

You can do fire-and-forget by creating an asynchronous task without awaiting it, and it will run successfully most of the time, but due to IIS application lifecycle management those tasks may be abruptly cut off.
You can register an IRegisteredObject with the hosting environment, so that ASP.NET will let that object know when the application domain is being shut down.
Please take a look at this article:
http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx/
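A minimal sketch of the IRegisteredObject pattern that article covers. The JobRunner name and the locking scheme are illustrative, not the article's exact code:
using System;
using System.Web.Hosting;

public class JobRunner : IRegisteredObject
{
    private readonly object _lock = new object();
    private bool _shuttingDown;

    public JobRunner()
    {
        // Tell ASP.NET about this object so it gets notified before the
        // application domain is torn down.
        HostingEnvironment.RegisterObject(this);
    }

    public void Run(Action job)
    {
        lock (_lock)
        {
            if (_shuttingDown)
                return; // refuse new work once shutdown has started
            job();
        }
    }

    // Called by ASP.NET during app domain shutdown: first with
    // immediate = false, then (roughly 30 seconds later) with true.
    public void Stop(bool immediate)
    {
        lock (_lock)
        {
            _shuttingDown = true;
        }
        HostingEnvironment.UnregisterObject(this);
    }
}
Because Stop acquires the same lock the running job holds, shutdown waits for in-flight work to finish before the object is unregistered.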

Related

In asp.net-mvc, what is the correct way to do expensive operations without impacting other users?

I asked this question about 5 years ago about how to "offload" expensive operations that the user doesn't need to wait for (such as auditing, etc.) so they get a response on the front end more quickly.
I now have a related but different question. In my asp.net-mvc app I have built some reporting pages where you can generate Excel reports (I am using EPPlus) and PowerPoint reports (I am using Aspose.Slides). Here is an example controller action:
public ActionResult GenerateExcelReport(FilterParams args)
{
    byte[] results = GenerateLargeExcelReportThatTake30Seconds(args);
    return File(results, @"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", "MyReport.xlsx");
}
The functionality works great, but I am trying to figure out whether these expensive operations (some reports can take up to 30 seconds to return) are impacting other users. In the previous question I had an expensive operation that the user DIDN'T have to wait for, but in this case the user does have to wait, as it's a synchronous activity (click Generate Report, and the expectation is that the user gets the report when it's finished).
In this case, I don't care that the main user has to wait 30 seconds; I just want to make sure I am not negatively impacting other users because of this expensive operation, the file generation, etc.
Is there any best practice here in asp.net-mvc for this use case?
You can try a combination of Hangfire and SignalR. Use Hangfire to kick off a background job and relinquish the HTTP request, and once report generation is complete, use SignalR to push a notification to the client.
SignalR notification from server to client
An alternate option is to implement a polling mechanism on the client side.
Send an ajax call to enqueue a Hangfire job that generates the report.
Then start polling a status API with another ajax call, and retrieve the report as soon as it is ready. I prefer SignalR over polling.
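A minimal sketch of the Hangfire + SignalR route (SignalR 2.x style). ReportHub, SaveReport, and the ReportJobs class are assumptions; GenerateLargeExcelReportThatTake30Seconds stands in for the method from the question:
using System;
using System.Web.Mvc;
using Hangfire;
using Microsoft.AspNet.SignalR;

public class ReportHub : Hub { } // clients subscribe and handle "reportReady"

public class ReportController : Controller
{
    public ActionResult GenerateExcelReport(FilterParams args)
    {
        var reportId = Guid.NewGuid();
        var userName = User.Identity.Name; // captured now, serialized with the job

        // Enqueue and return immediately; Hangfire persists the job and runs
        // it on a background worker, so the HTTP request is released at once.
        BackgroundJob.Enqueue(() => ReportJobs.Generate(reportId, args, userName));

        return Json(new { reportId }, JsonRequestBehavior.AllowGet);
    }
}

public static class ReportJobs
{
    public static void Generate(Guid reportId, FilterParams args, string userName)
    {
        byte[] file = GenerateLargeExcelReportThatTake30Seconds(args);
        SaveReport(reportId, file); // hypothetical: persist under the GUID

        // Push a notification to the originating user; the client-side
        // handler can then fetch the file from a download action.
        GlobalHost.ConnectionManager.GetHubContext<ReportHub>()
            .Clients.User(userName).reportReady(reportId);
    }

    private static byte[] GenerateLargeExcelReportThatTake30Seconds(FilterParams args)
    {
        return new byte[0]; // stand-in for the question's EPPlus code
    }

    private static void SaveReport(Guid id, byte[] file) { /* ... */ }
}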
If report processing is impacting performance on the web server, offload it to another server. You can use messaging (ActiveMQ, RabbitMQ, or some other framework of your choice) or a REST API call to kick off report generation on the other server, then use messaging or a REST call again to notify the web server when generation is complete, and finally SignalR to notify the client. This keeps the web server more responsive.
UPDATE
Regarding your question
Is there any best practice here in asp.net-mvc for this use case
You have to monitor your application over time, on both the client side and the server side. There are a few tools you can rely on for this, such as New Relic and AppDynamics. I have used New Relic, and it has features to track issues in the client browser as well as on the server side (the products are "New Relic Browser" and "New Relic Server"). I am sure other tools capture similar information.
Analyze the metrics over time, and if you see any anomalies, take appropriate action. If you observe server-side CPU and memory spikes, try capturing client-side metrics around the same timeframe. If on the client side you notice timeout issues or connection errors, that means your application's users are unable to connect while the server is doing the heavy lifting. Next, try to identify the server-side bottlenecks. If there is not enough room to performance-tune the code, go through a server capacity planning exercise and figure out how to scale your hardware further, or move the background jobs off the web servers to reduce load. Capturing metrics with these tools alone may not be enough; you may have to instrument your application (log capturing) to gather the additional metrics needed to properly monitor application health.
Here you can find some information from Microsoft about capacity planning for .NET applications.
-Vinod.
These are all great ideas on how to move work out of the request/response cycle, but I think #leora simply wants to know whether a long-running request will adversely impact other users of an ASP.NET application.
The answer is no. ASP.NET is multi-threaded, and each request is handled by a separate worker thread.
In general it can be considered good practice to run long-running tasks in the background and give the user some kind of notification when the job is done. As you probably know, web request execution time is limited by default (the httpRuntime executionTimeout setting, 110 seconds out of the box), so if your long-running task could exceed this, you have no choice but to run it in some other thread/process. If you are using .NET 4.5.2, you can use HostingEnvironment.QueueBackgroundWorkItem to run long-running tasks in the background, and use SignalR to notify the user when the task has finished executing. If you are generating a file, you can store it on the server with some unique ID and send the user a link to download it. You can delete the file later (with some Windows service, for example).
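A minimal sketch of that approach. The StartReport action name and the SaveToDisk helper are assumptions, and the report method stands in for the one from the question:
using System;
using System.Web.Hosting;
using System.Web.Mvc;

public class ReportController : Controller
{
    public ActionResult StartReport(FilterParams args)
    {
        var reportId = Guid.NewGuid();

        // QueueBackgroundWorkItem (new in .NET 4.5.2) tracks the work item,
        // delaying app domain shutdown for up to ~30 seconds so queued work
        // can finish; the token signals if shutdown begins anyway.
        HostingEnvironment.QueueBackgroundWorkItem(cancellationToken =>
        {
            if (cancellationToken.IsCancellationRequested)
                return;
            byte[] file = GenerateLargeExcelReportThatTake30Seconds(args);
            SaveToDisk(reportId, file); // hypothetical: store under the unique ID
            // ...then notify the user via SignalR with a download link...
        });

        // Return the ID immediately so the client can poll or await a push.
        return Json(new { reportId }, JsonRequestBehavior.AllowGet);
    }

    private static byte[] GenerateLargeExcelReportThatTake30Seconds(FilterParams args)
    {
        return new byte[0]; // stand-in for the question's report code
    }

    private static void SaveToDisk(Guid id, byte[] file) { /* ... */ }
}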
As mentioned by others, there are more advanced background task runners such as Hangfire, Quartz.NET, and others, but the general concept is the same: run the task in the background and notify the user when it is done. Here is a nice article about the different options for running background tasks.
You need to use C#'s async and await.
From your question I figured that you are just concerned that a request may take more resources than it should, rather than with scalability. If that's the case, make your controller actions async, along with all the operations you call, as long as they involve calls that block threads. For example, if your requests go over the wire or perform I/O, they block a thread unless you use async (technically you do the blocking, since you wait for the response before continuing). With async, those threads become available while awaiting the response, so they can potentially serve other users' requests.
I assumed you are not wondering how to scale the requests. If you are, let me know, and I can provide details on that as well (too much to write unless it's needed).
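A sketch of the async shape only; GenerateLargeExcelReportAsync is a hypothetical awaitable version of the question's method. Note that async helps when the work is I/O-bound: purely CPU-bound report generation occupies a thread either way.
using System.Threading.Tasks;
using System.Web.Mvc;

public class ReportController : Controller
{
    public async Task<ActionResult> GenerateExcelReport(FilterParams args)
    {
        // While this await is pending on I/O (database, web service), the
        // request thread returns to the pool and can serve other users.
        byte[] results = await GenerateLargeExcelReportAsync(args);
        return File(results,
            "application/vnd.openxmlformats-officedocument.spreadsheetml.sheet",
            "MyReport.xlsx");
    }

    private static Task<byte[]> GenerateLargeExcelReportAsync(FilterParams args)
    {
        return Task.FromResult(new byte[0]); // stand-in for awaitable report code
    }
}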
I believe a tool/library such as Hangfire is what you're looking for. First, it allows you to run a task on a background thread (in the same application/process). Combined with techniques such as SignalR, it also allows for real-time front-end notification.
However, something I set up after using Hangfire for nearly a year was splitting our job processing (and implementation) onto another server, using this documentation. I use an internal ASP.NET MVC application to process the jobs on a different server. The only performance bottleneck, then, is if both servers use the same data store (e.g. database). If you're locking the database, the only way around it is to minimize locking of that resource, regardless of the methodology you use.
I use interfaces to trigger jobs, stored in a common library:
public interface IMyJob
{
    MyJobResult Execute( MyJobSettings settings );
}
And, the trigger, found in the front-end application:
//tell the job to run
var settings = new MyJobSettings();
_backgroundJobClient.Enqueue<IMyJob>( c => c.Execute( settings ) );
Then, on my background server, I write the implementation (and hook it into the Autofac IoC container I'm using):
public class MyJob : IMyJob
{
    public MyJobResult Execute( MyJobSettings settings )
    {
        //do stuff here, then report the outcome
        return new MyJobResult();
    }
}
I haven't experimented much with getting SignalR to work across the two servers, as I haven't run into that specific use case yet, but I imagine it's theoretically possible.
You need to monitor your application's users to know whether others are being affected, e.g. by recording response times.
If you find that this is affecting other users, you need to run the task in another process, potentially on another machine. You can use the library Hangfire to achieve this.
Using the answer linked below, you can declare a Task with low priority:
lowering priority of Task.Factory.StartNew thread
public ActionResult GenerateExcelReport(FilterParams args)
{
    byte[] result = null;
    // PriorityScheduler is the custom TaskScheduler from the linked answer;
    // CancellationToken.None replaces the original null, which would not compile.
    Task.Factory.StartNew(() =>
    {
        result = GenerateLargeExcelReportThatTake30Seconds(args);
    }, CancellationToken.None, TaskCreationOptions.None, PriorityScheduler.BelowNormal)
    .Wait();
    return File(result, @"application/vnd.openxmlformats-officedocument.spreadsheetml.sheet", "MyReport.xlsx");
}
Queue the jobs in a table, and have a background process poll that table to decide which Very Large Job needs to run next. Your web client would then poll the server to determine when the job is complete (potentially by checking a flag in the database, but there are other methods). This guarantees that you won't have more than one (or however many you decide is appropriate) of these expensive processes running at a time.
Hangfire and SignalR can help you here, but a queueing mechanism is really what's necessary to avoid major disruption when, say, five users request this same process at the same time. The approaches mentioned that fire off new threads or background processes don't appear to provide any mechanism for capping processor/memory consumption, so other users can still be disrupted by too many resources being consumed.
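A minimal sketch of such a poller, run in a Windows service or console app rather than in the web process. The Jobs table, the ReportJob type, and the helper bodies are all assumptions:
using System;
using System.Threading;

public class ReportJob
{
    public Guid Id;
    public string Args;
}

public class ReportJobWorker
{
    private volatile bool _stopRequested;

    public void Run()
    {
        while (!_stopRequested)
        {
            ReportJob job = GetOldestPendingJob();
            if (job == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(10)); // nothing queued; poll again shortly
                continue;
            }

            MarkRunning(job.Id);   // hypothetical status update
            byte[] file = GenerateReport(job.Args);
            SaveToDisk(job.Id, file);
            MarkComplete(job.Id);  // the web client's polling call sees this flag flip
        }
    }

    public void Stop()
    {
        _stopRequested = true;
    }

    private ReportJob GetOldestPendingJob() { return null; /* SELECT TOP 1 ... WHERE Status = 'Pending' */ }
    private void MarkRunning(Guid id) { /* UPDATE Jobs SET Status = 'Running' ... */ }
    private void MarkComplete(Guid id) { /* UPDATE Jobs SET Status = 'Complete' ... */ }
    private byte[] GenerateReport(string args) { return new byte[0]; /* the expensive work */ }
    private void SaveToDisk(Guid id, byte[] file) { /* ... */ }
}
Because a single worker drains the queue one job at a time, at most one report is ever being generated, no matter how many users click Generate at once.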

Scheduled Job equivalent functionality in MVC

I have a requirement in my MVC app.
I had an export-to-Excel feature that takes 3 minutes (the user clicks an export button and waits).
This export downloads an Excel file that has multiple worksheets, produced after applying certain rules to the data.
These rules are data manipulations plus applying colors to the cells of certain columns.
In order to avoid the wait, I was asked to develop code within the MVC app that can run like a scheduled job.
This job has to export the Excel file to a dedicated folder within the network at the scheduled time (once daily).
I was also asked to develop a web page within the app that has links to download these files.
Questions here (any help would be appreciated):
I have chosen Quartz.NET to implement this requirement. It is an open-source library (to my limited knowledge) that can
schedule a job (a class developed in .NET). Is it the right choice, or would there be any implications in the future?
Is it really necessary to develop job-like code, or can some other way of coding address this?
I'm not very familiar with Quartz.net, but I do know that trying to run background/scheduled tasks from within the same process as the MVC application can be problematic.
Ref 1: http://haacked.com/archive/2011/10/16/the-dangers-of-implementing-recurring-background-tasks-in-asp-net.aspx/
Ref 2: http://www.hanselman.com/blog/HowToRunBackgroundTasksInASPNET.aspx
Essentially, you can't guarantee that the process will complete correctly, due to how IIS handles app pools (which is where your MVC process runs, assuming you're hosting on IIS).
You mention running a scheduled task within your MVC app. Again, this is incorrect. Why can't you just add a console app project to the solution and drive the code from there, then put it on the server and use the Windows Task Scheduler?
In terms of background tasks, the "correct" way to do this is to send a command from your MVC app to some sort of message queue, which can then ensure that the command doesn't get dropped. I've used RabbitMQ in the past (a message-broker middleware). Perhaps this is the aim of Quartz.NET.
This setup typically involves another app (for me, usually a console app run on the server) that receives the command message from the queue and runs in its own process, entirely separate from MVC and thus from the issues inherent in IIS app pools and background tasks.
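A minimal sketch of the receiving console app using RabbitMQ.Client; the queue name and message handling are assumptions:
using System;
using System.Linq;
using System.Text;
using RabbitMQ.Client;
using RabbitMQ.Client.Events;

class Program
{
    static void Main()
    {
        var factory = new ConnectionFactory { HostName = "localhost" };
        using (var connection = factory.CreateConnection())
        using (var channel = connection.CreateModel())
        {
            // durable: survives broker restarts, so commands don't get dropped
            channel.QueueDeclare("report-jobs", durable: true, exclusive: false,
                                 autoDelete: false, arguments: null);

            var consumer = new EventingBasicConsumer(channel);
            consumer.Received += (sender, ea) =>
            {
                string command = Encoding.UTF8.GetString(ea.Body.ToArray());
                // ...run the long task here, in this process, outside IIS...
                channel.BasicAck(ea.DeliveryTag, multiple: false); // ack only after success
            };
            channel.BasicConsume("report-jobs", autoAck: false, consumer: consumer);

            Console.ReadLine(); // keep the process alive while consuming
        }
    }
}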
It's a lot of work, really... one would think it'd be easier, but that's the surefire way to do it while maintaining the integrity of the task to be run.

Where to initiate and manage background operations in Asp.Net MVC

The first operation will carry out several calculations and update the same tables that users also access. These processes don't depend on any individual request/state and will always be running.
Should I put the first operation in a separate application/machine?
The second operation acts like a manager across all requests and will run continuously.
How do I initiate and maintain the second operation? Do I start it from an admin request, or can I initiate it automatically at a global level?
This post (https://blog.stackoverflow.com/2008/07/easy-background-tasks-in-aspnet/) explains how to implement scheduled/background tasks in ASP.NET MVC. Otherwise, you can use Windows services or WCF services. You can use database tables to synchronize the background jobs with requests.
You don't need a separate application/machine, but it depends on your requirements, your architecture (single server or farm), and your performance goals.

How to implement background processing for ASP.Net MVC website in a shared hosting environment?

I am developing my first web application using ASP.Net MVC, and I am in a situation where I would like a background service to process status notifications outside of the application, not unlike the reputation/badge system on stackoverflow.
What is the best way to handle something like this? Is it even possible in a shared-hosting environment like GoDaddy, which I am using?
I don't need to communicate with the background worker directly, since I will be adding notification records to a database table with a column set to an "unprocessed" state. The worker will then just scan the table on a regular schedule and process whatever is ready.
Thanks for your advice.
Have you tried Quartz.NET? I think it may fit your needs.
Also take a look at this Simulate a Windows Service using ASP.NET to run scheduled jobs article.
It explains a nice way to schedule operations with no outside dependency.
The idea is to use cache expiration to control the schedule. I've implemented it successfully on a project that required regular temp-file cleaning. That cleaning is a bit heavy, so we moved it into a scheduled job (using the ASP.NET cache) to avoid having to deploy a scheduled task or a custom program.
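A minimal sketch of that cache-expiration trick, assuming a hypothetical CleanTempFiles routine; Register() would be called once from Application_Start:
using System;
using System.Web;
using System.Web.Caching;

public static class CacheScheduler
{
    private const string Key = "temp-file-cleanup";
    private static readonly TimeSpan Interval = TimeSpan.FromMinutes(15);

    public static void Register()
    {
        // The cached value is a dummy; the real payload is the removal
        // callback, which fires when the item expires.
        HttpRuntime.Cache.Insert(Key, DateTime.UtcNow, null,
            DateTime.UtcNow.Add(Interval), Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, OnRemoved);
    }

    private static void OnRemoved(string key, object value, CacheItemRemovedReason reason)
    {
        CleanTempFiles(); // hypothetical: the actual scheduled work
        Register();       // re-insert the item to schedule the next run
    }

    private static void CleanTempFiles()
    {
        // ...delete old temp files here...
    }
}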
To find out whether GoDaddy will support a separate service, you need to ask them.
However, there are a number of creative ways you can "get around" this issue on shared hosting.
Have a secure page whose purpose is to execute your background work; a scheduled task on a machine under your control could then call this page at set intervals.
Use a variation of the background worker thread answer from #safi: your background worker thread could check whether another is already processing and stop, so that only one instance runs at a time.
If only one background task is enough for you, then use WebBackgrounder.
And this is the article with the detailed explanation.

windows service vs scheduled task

What are the cons and pros of windows services vs scheduled tasks for running a program repeatedly (e.g. every two minutes)?
Update:
It's nearly four years since my original answer, and it is very out of date. Since TopShelf came along, Windows service development has become easy. Now you just need to figure out how to support failover...
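For reference, a minimal TopShelf sketch; the Worker class, its Start/Stop methods, and the service names are assumptions:
using Topshelf;

public class Worker
{
    public void Start() { /* kick off the timer or worker loop */ }
    public void Stop()  { /* signal the loop to finish */ }
}

class Program
{
    static void Main()
    {
        // TopShelf turns this console app into an installable Windows
        // service (MyWorker.exe install / start from an elevated prompt)
        // while still letting it run directly from the console for debugging.
        HostFactory.Run(host =>
        {
            host.Service<Worker>(svc =>
            {
                svc.ConstructUsing(name => new Worker());
                svc.WhenStarted(w => w.Start());
                svc.WhenStopped(w => w.Stop());
            });
            host.RunAsLocalSystem();
            host.SetServiceName("MyWorker");
            host.SetDescription("Runs the recurring job");
        });
    }
}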
Original Answer:
I'm really not a fan of Windows Scheduler. The user's password must be provided, as #moodforall points out above, which is fun when someone changes that user's password.
The other major annoyance with Windows Scheduler is that it runs interactively and not as a background process. When 15 MS-DOS windows pop up every 20 minutes during an RDP session, you'll kick yourself that you didn't install them as Windows services instead.
Whatever you choose, I certainly recommend you separate your processing code into a different component from the console app or Windows service. Then you have a choice: either call the worker process from a console application and hook it into Windows Scheduler, or use a Windows service.
You'll find that scheduling a Windows service isn't fun. A fairly common scenario is a long-running process that you want to run periodically. But if you are processing a queue, you really don't want two instances of the same worker processing the same queue, so you need to manage the timer to make sure that if your long-running process has run longer than the assigned timer interval, it doesn't kick off again until the existing run has finished.
After you have written all of that, you think: why didn't I just use Thread.Sleep? That lets the current thread keep running until it has finished, and then the pause interval kicks in; the thread goes to sleep and kicks off again after the required time. Neat!
Then you read all the advice on the internet, with lots of experts telling you how it is really bad programming practice:
http://msmvps.com/blogs/peterritchie/archive/2007/04/26/thread-sleep-is-a-sign-of-a-poorly-designed-program.aspx
So you'll scratch your head and think to yourself, WTF, Undo Pending Checkouts -> Yes, I'm sure -> Undo all today's work..... damn, damn, damn....
However, I do like this pattern, even if everyone thinks it is crap:
OnStart method for the single-thread approach.
protected override void OnStart(string[] args)
{
    // Create the worker thread; this will invoke the WorkerFunction
    // when we start it.
    // Since we use a separate worker thread, the main service
    // thread returns quickly, telling Windows that the service has started.
    ThreadStart st = new ThreadStart(WorkerFunction);
    workerThread = new Thread(st);
    // set flag to indicate the worker thread is active
    serviceStarted = true;
    // start the thread
    workerThread.Start();
}
The code instantiates a separate thread and attaches our worker
function to it. Then it starts the thread and lets the OnStart event
complete, so that Windows doesn't think the service is hung.
Worker method for the single-thread approach.
/// <summary>
/// This function does all the work.
/// Once it is done with its tasks, it sleeps for some time;
/// it repeats this until the service is stopped.
/// </summary>
private void WorkerFunction()
{
    // loop until the "serviceStarted" flag goes false
    while (serviceStarted)
    {
        // do something
        // (exception handling omitted here for simplicity)
        EventLog.WriteEntry("Service working",
            System.Diagnostics.EventLogEntryType.Information);
        // yield
        if (serviceStarted)
        {
            Thread.Sleep(new TimeSpan(0, interval, 0));
        }
    }
    // the thread ends when this method returns; no explicit abort is needed
}
OnStop method for the single-thread approach.
protected override void OnStop()
{
    // flag to tell the worker process to stop
    serviceStarted = false;
    // give it a little time to finish any pending work
    workerThread.Join(new TimeSpan(0, 2, 0));
}
Source: http://tutorials.csharp-online.net/Creating_a_.NET_Windows_Service%E2%80%94Alternative_1%3a_Use_a_Separate_Thread (Dead Link)
I've been running lots of Windows Services like this for years and it works for me. I still haven't seen a recommended pattern that people agree on. Just do what works for you.
There's some misinformation here. Windows Scheduler is perfectly capable of running tasks in the background without windows popping up and with no password required. Run them under the NT AUTHORITY\SYSTEM account, using this schtasks switch:
/ru SYSTEM
But yes, for accessing network resources, the best practice is a service account with a separate non-expiring password policy.
EDIT
Depending on your OS and the requirements of the task itself, you may be able to use accounts less privileged than LocalSystem with the /ru option.
From the fine manual,
/RU username
A value that specifies the user context under which the task runs.
For the system account, valid values are "", "NT AUTHORITY\SYSTEM", or "SYSTEM".
For Task Scheduler 2.0 tasks, "NT AUTHORITY\LOCALSERVICE", and
"NT AUTHORITY\NETWORKSERVICE" are also valid values.
Task Scheduler 2.0 is available from Vista and Server 2008.
In XP and Server 2003, system is the only option.
In .NET development, I normally start off by developing a console application, which runs with all logging output going to the console window. However, this is only a console application when it is run with the command argument /console. When it is run without this parameter, it acts as a Windows service, which stays running on my own custom-coded scheduling timer.
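A minimal sketch of that dual-mode entry point; MyService and its StartWork/StopWork methods are assumptions standing in for the real service logic:
using System;
using System.Linq;
using System.ServiceProcess;

public class MyService : ServiceBase
{
    protected override void OnStart(string[] args) { StartWork(); }
    protected override void OnStop() { StopWork(); }
    public void StartWork() { /* start the custom scheduling timer */ }
    public void StopWork()  { /* stop it */ }
}

static class Program
{
    static void Main(string[] args)
    {
        if (args.Contains("/console", StringComparer.OrdinalIgnoreCase))
        {
            // interactive mode: run the same logic with console logging
            var service = new MyService();
            service.StartWork();
            Console.WriteLine("Running. Press Enter to stop.");
            Console.ReadLine();
            service.StopWork();
        }
        else
        {
            // service mode: hand control to the Windows service runtime
            ServiceBase.Run(new MyService());
        }
    }
}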
Windows services, to my mind, are normally used to manage other applications rather than being long-running applications themselves. Or... they are continuously running heavyweight applications like SQL Server, BizTalk, RPC connections, IIS (even though IIS technically offloads work to other processes).
Personally, I favour scheduled tasks over Windows services for repetitive maintenance tasks and applications such as file copying/synchronisation, bulk email sending, deletion or archiving of files, and data correction (when other workarounds are not available).
For one project I was involved in developing 8 or 9 Windows services, but these sit around in memory, idle, eating 20 MB or more of memory per instance. Scheduled tasks do their business and release the memory immediately.
What's the overhead of starting and quitting the app? Every two minutes is pretty often. A service would probably let the system run more smoothly than executing your application so frequently.
Both solutions can run the program when the user isn't logged in, so there's no difference there. Writing a service is somewhat more involved than a regular desktop app, though; you may need a separate GUI client that communicates with the service app via TCP/IP, named pipes, etc.
From a user's point of view, I wonder which is easier to control. Both services and scheduled tasks are pretty much out of reach for most non-technical users; i.e., they won't even realize they exist and can be configured/stopped/rescheduled and so on.
The word 'serv'ice shares something in common with 'serv'er. It is expected to always be running, and 'serv'e. A task is a task.
Role play: if I'm another operating system, application, or device and I call a service, I expect it to be running and I expect a response. If I (OS, app, device) just need to execute an isolated task, then I will execute a task; but if I expect to communicate, possibly two-way, I want a service. This comes down to the most effective way for two things to communicate, versus a single thing that wants to execute a single task.
Then there's the scheduling aspect: if you want something to run at a specific time, schedule it. If you don't know when you're going to need it, or need it "on the fly", use a service.
My response is more philosophical in nature, because this is very similar to how humans interact and work with one another. The more we understand the art of communication, and the more these "entities" understand their roles, the easier this decision becomes.
All philosophy aside, when you are "rapidly prototyping", as my IT department often does, you do whatever you have to in order to make ends meet. Once the prototyping and proof-of-concept stuff is out of the way, usually in the early planning and discovery, you have to decide what's more reliable for long-term sustainability.
OK, so in conclusion, it's highly dependent on a lot of factors, but hopefully this has provided insight instead of confusion.
A Windows service doesn't need to have anyone logged in, and Windows has facilities for stopping, starting, and logging the service results.
A scheduled task doesn't require you to learn how to write a Windows service.
It's easier to set up and lock down Windows services with the correct permissions.
Services are also more "visible", meaning that everyone (i.e. techs) knows where to look for them.
This is an old question, but I would like to share what I have faced.
Recently I was given a requirement to capture a screenshot of a radar (from a meteorological website) and save it on the server every 10 minutes.
This required me to use WebBrowser.
I usually write Windows services, so I decided to make this one a service too, but it kept crashing.
This is what I saw in Event Viewer:
Faulting module path: C:\Windows\system32\MSHTML.dll
Since the task was urgent and I had very little time to research and experiment, I decided to use a simple console application triggered as a scheduled task, and it executed smoothly.
I really liked the article by Jon Galloway recommended in the accepted answer by Mark Ransom.
Recently, passwords on the servers were changed without my being informed, and all the scheduled tasks failed to execute since they could not log on.
So the people claiming in the article's comments that this is a problem have a point. I think Windows services can face the same problem (please correct me if I am wrong; I am just a newbie).
Also, regarding the claim that with Task Scheduler a window or console window pops up:
I have never faced that. It may pop up, but it is at least very fleeting.
Why not provide both?
In the past I've put the 'core' bits in a library and wrapped a call to Whatever.GoGoGo() in both a service and a console app.
With something you're firing off every two minutes, the odds are decent it's not doing much (e.g. just a "ping"-type function). The wrappers shouldn't need to contain much more than a single method call and some logging.
Generally, the core message is, and should be, that the code itself must be executable from each and every trigger/client, so switching from one approach to the other should not be rocket science.
In the past we used Windows services more or less exclusively, but since more and more of our customers are switching to Azure step by step, and the swap from a console app (deployed as a scheduled task) to a WebJob in Azure is much easier than from a Windows service, we focus on scheduled tasks for now. If we run into limitations, we just spin the Windows service project back up and call the same logic from there (as long as customers are working on-prem). :)
BR,
y
Windows services demand more patience until they're done: they are a bit harder to debug and install, and they are faceless.
If you need a task that must run every second, minute, or hour,
you should choose a Windows service.
A scheduled task is quickly developed and has a face.
If you need a daily or weekly task, you can use a scheduled task.
