I need to send the response content to the client before the request has finished processing.
How can I do that?
public ActionResult UpdateRecord()
{
    Response.Write("OK");
    Response.End();
    // I need to complete the request here and
    // perform some action in the work thread
    Thread.Sleep(2000);
    return Content("Something user never sees");
}
You cannot send content back to the client in the same request after the request has ended. It's just not possible.
It sounds to me like what you actually want is asynchronous request processing: you start the work in one request, then you continue to poll the server (refresh the page) periodically until the work is done.
This is typically accomplished using a background service of some kind, such as a Message Queue, or a Web Service that triggers a Windows Service to process something. In a pinch, you can use a Scheduled Task, or even the Cache removal callback trick.
Using a background thread generally won't work reliably, because threads started inside an ASP.NET worker process can be torn down whenever the application pool recycles, without warning.
Another possibility, if you don't want to poll, is to use a push technology like SignalR to notify the page when the job has finished.
EDIT:
Well, as I said, you can't do it like that. You have to push that work into some kind of background process; you can't just spin off a thread, because it can be torn down as soon as the worker process recycles.
There are many questions (and answers) here on SO about how to do this, and they all boil down to what I mentioned above: scheduled tasks, a Windows Service to do the processing, the Cache removal callback trick, etc. You could also use a one-way call to a WCF service, or any number of other approaches.
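As a minimal sketch of that hand-off idea (this assumes an MSMQ private queue read by a separate Windows Service; the queue path, the payload, and the JobStatusStore lookup are placeholders invented for this example, not a specific library's API):
using System;
using System.Messaging;   // requires a reference to System.Messaging.dll
using System.Web.Mvc;
public class RecordsController : Controller
{
    // Placeholder queue path; a Windows Service reads from this queue
    // and does the actual long-running work.
    private const string QueuePath = @".\Private$\UpdateRecordQueue";
    public ActionResult UpdateRecord()
    {
        var jobId = Guid.NewGuid().ToString();
        // Hand the work off and return immediately; nothing long-running
        // happens inside this request.
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Send(jobId);
        }
        return Content("OK " + jobId);
    }
    // Polled by the client (e.g. via AJAX) until the service marks the job done.
    public ActionResult Status(string jobId)
    {
        // JobStatusStore is a hypothetical lookup (database table, cache, etc.)
        // that the Windows Service updates when it finishes.
        var done = JobStatusStore.IsComplete(jobId);
        return Json(new { done }, JsonRequestBehavior.AllowGet);
    }
}
The Windows Service on the other end of the queue is where the Thread.Sleep(2000)-style work actually belongs; it records completion somewhere the Status() action can see it.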
Related
I have a Rails API which handles requests from clients. Clients use that API to perform analysis of their data. A client POSTs the data to the API, and the API checks whether that data has been analyzed before. If so, the API just responds with the analysis result. If the data hasn't been analyzed before, the API:
Tells the client that analysis has started.
Establishes a connection with the analyzing microservice.
Performs an asynchronous (or deferred, or whatever it should be called) request to the analyzing microservice and waits for the response. The analysis takes a long time, so neither the API nor the microservice should be blocked while it runs.
When the response from the analyzing microservice comes back, the API hands it to the client.
The main issue for me is to set things up in such a way that the client receives a message like "Your data has been sent for analysis" right after making the request, and then receives the result once the analysis is done.
The question is: what approach should I use in this case? Async responses, deferred responses, something else? And what known solutions could help me with that? Any gems?
I'm new to this stuff, so I'm sorry if these are dumb questions.
With HTTP you can only have one response per request. To send multiple responses, i.e. "work in progress" and then later the "result", you would need to use a different protocol, e.g. WebSockets.
Since HTTP is so common, I'd stick with it in combination with background jobs. There are a couple of options which spring to mind.
Polling: The API kicks off a background job (to call the microservice) and responds to the client with a URL which the client can poll periodically for the result. That URL would respond with some kind of "work in progress" status until the result is actually ready. The URL would need to include some kind of id so the API can look up the background job.
The API would potentially have two URLs: /api/jobs/new and /api/jobs/<ID>. In Rails, they would map to a controller's new and show actions.
Webhooks: Have the client include a URL of its own in the request. Once the result is available, have the background job hit the given URL with the result.
Either way, if using HTTP, you will not be able to handle the whole thing within a single request/response; you will have to use some kind of background processing (so the request to the microservice happens in a different process). You could look at Sidekiq, for example.
Here is an example for polling:
URL: example.com/api/jobs/new
web app receives the client request
generates a unique id for the request, e.g. SecureRandom.uuid
starts a background job (Sidekiq), passing in the uuid and any other parameters needed
responds with a URL such as example.com/api/jobs/UUID
--
background job
sends a request to the microservice API and waits for the response
saves the result to the database together with the uuid
--
URL: example.com/api/jobs/UUID
look up the UUID in the database; if not found, respond that the job is "in progress"; if found, return the result from the database
It depends on what kind of API you use; I assume your clients interact via HTTP.
If you want to build an asynchronous API over HTTP, the first thing you should do is accept the request, create a job, hand it off to be processed in the background, and return immediately.
For the client to get the response, you have two options:
Implement a status endpoint where clients can periodically poll the status of the job
Implement a callback via webhooks, i.e. the client provides a URL which you then call once you're done.
A good starting point for background processing is the sidekiq gem, or more generally ActiveJob, which ships with Rails.
I have a scenario where my Rails controller action has to make an API request to a backend business-logic server, which does a lot of computation and returns the result.
I'm thinking of showing a loading page to the user, making the call asynchronously using Faye or some other option, and redirecting the user when the call is complete.
But even if I make the call asynchronously, the HTTP request still needs to wait for the server to return the processed data, which takes around 20 seconds.
I would like to know the best way to make such calls in Rails.
I faced a similar situation; below is the route that I took:
When the controller action is triggered
a. I fired off an 'async' request to the API using a worker (I used Sidekiq)
b. Loaded an AJAX spinner gif on top of a modal
The worker handling the API request runs on another thread, makes the call synchronously, and waits for the result from the API
When the processing is done, the worker fires off a notification via Faye, which removes the modal and populates the data.
Return an HTTP response with status 202 Accepted (for requests that take long to process) and start making AJAX requests (to a URL, e.g. /jobs/1) to check the status of the background job. Once your job has finished, update its status so that your JavaScript (AJAX) can handle the result of that background job.
I'm building an API using Rails where requests come in and they need to be executed by a cluster of workers running on a different server (these workers call remote APIs and parse the data, etc...). I'm going to be using Sidekiq or Resque to handle the queueing/processing of that.
My issue is the client needs to wait while this is happening and the controller needs to return the response to the client once it's complete. How would I handle this in the controller? We're using a redis backend, so I was thinking something along the lines of subscribing to a pub/sub channel and waiting for the worker to publish a status message. The controller would wait for a set time period and then return a 'check back later' response to the client if it doesn't receive a message in time. What would be the best way to implement that, or is there a better solution?
Do not make your clients wait! There are a lot of issues if you make the controller block for a long running job:
Other programs may assume the request timed out (proxies, browsers, scripts, etc.)
It makes your API endpoints become a source for denial of service
It requires you to put more engineering work into web servers (since a Rails process can't handle another web request while it's handling the blocking call)
Part of the reason for using Sidekiq or Resque is to avoid controllers that do heavy lifting during the HTTP request.
Instead, background jobs should report their status to the database, and the web server should query the database and return the latest status to the client.
If clients need more immediate feedback, you can:
make clients constantly poll
POST a request to the client (if the API consumer is another web server)
use another protocol mechanism (e.g. WebSockets).
I'm planning to build a quite large application (large in terms of concurrent users / number of requests, not in terms of features).
Basically, I'll have a service somewhere that waits for commands, executes them, and acknowledges their completion later. This service will use a service bus to communicate, making the execution eventual, with an acknowledgement message issued afterwards.
The consumers of this service can be any kind of application (WPF, SL, ...), but my main (and first) client will be an ASP.NET MVC application + Web API (.NET 4.5), or MVC only (.NET 4.0) with AJAX controller actions.
The web application will rely on AJAX calls to stay user-friendly and responsive.
I'm quite new to such a fully asynchronous architecture, and I have some questions to avoid future headaches:
My Web API calls can take some amount of time. How should I properly design the API to support long-running operations (some kind of async?)? I've read about the new async keyword, but for the sake of knowledge, I'd like to understand what's behind it.
My calls to the service will consist of publishing a message and waiting for the ack message. If I wrap this in a single method, how should I write this method? Should I "block" until the ack is received (I suppose I shouldn't)? Should I return a Task object and let the consumer decide?
I'm also wondering if SignalR can help me. With SignalR, I think I can use real "fire and forget" command issuing, and route the ack message back up to the client.
Am I completely off track, and should I take another approach?
In terms of implementation details / frameworks, I think I'll use:
RabbitMQ as the messaging system
MassTransit to abstract the messaging system
ASP.NET MVC 4 to build the UI
Web API to isolate command issuing from the UI controllers, and to allow other kinds of clients to issue commands
My Web API calls can take some amount of time. How should I properly design the API to support long-running operations (some kind of async?)?
I'm not 100% sure where you're going. You ask questions about Async but also mention message queuing, by throwing in RabbitMQ and MassTransit. Message queuing is asynchronous by default.
You also mention executing commands. If you're referring to CQRS, you separate commands and queries. But what I'm not 100% sure about is what you're referring to when you mention "long running processes".
When you query data, the data should already be present, preferably in the shape needed for the question at hand.
When you query data, no long-running process should be started
When you execute commands, a long-running process can be started. But that's why you should use message queuing. Specify a task to start the long-running process, create a message for it, throw it onto the queue, and forget about it altogether. Some other process in the background will pick it up.
When the command is executed, the long-running process can be started.
When the command is executed, a database can be updated with data
This data can be used by the API if someone requests data
When using this model, it doesn't matter that the long-running process might take up to 10 minutes to complete. I won't go into detail on actually having a single thread take up to 10 minutes to complete, including locks on the database, but I hope you get the point. Your API will be free almost instantly after throwing a message onto the queue. No need for async there.
My calls to the service will consist of publishing a message and waiting for the ack message.
I don't get this. The .NET Framework and your queuing platform take care of this for you. Why would you wait on an ack?
In MassTransit
Bus.Instance.Publish(new YourMessage{Text = "Hi"});
In NServiceBus
Bus.Publish(new YourMessage{Text = "Hi"});
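For completeness, a rough sketch of what the consuming side of that published message might look like, using the older MassTransit Consumes<T>.All style that the Bus.Instance API belongs to; LongRunningProcess and ResultStore are hypothetical placeholders, not part of any library:
public class YourMessage
{
    public string Text { get; set; }
}
// Runs in the background service subscribed to the bus, so the web request
// that published the message has long since returned.
public class YourMessageConsumer : Consumes<YourMessage>.All
{
    public void Consume(YourMessage message)
    {
        // Hypothetical placeholders for the actual work and persistence;
        // the API can later answer queries from whatever ResultStore writes.
        var result = LongRunningProcess.Run(message.Text);
        ResultStore.Save(message.Text, result);
    }
}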
I'm also wondering if SignalR can help me.
I should think so! Because of the asynchronous nature of messaging, the user has to 'wait' for updates. If you can provide this data by 'pushing' updates via SignalR to the user, all the better.
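As a rough sketch of that push (assuming SignalR 2.x; JobHub and the jobCompleted client callback are names made up for this example), the background consumer could notify connected browsers when the work finishes:
using Microsoft.AspNet.SignalR;
// Clients connect to this hub from the browser; it needs no server methods
// for a simple one-way "job finished" push.
public class JobHub : Hub
{
}
public static class JobNotifier
{
    // Call this from the message consumer once the long-running work is done.
    public static void NotifyCompleted(string jobId)
    {
        var context = GlobalHost.ConnectionManager.GetHubContext<JobHub>();
        context.Clients.All.jobCompleted(jobId);   // invokes the JS-side callback
    }
}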
Am I completely off track, and should I take another approach?
Perhaps. I'm still not sure where you're going.
Perhaps read up on the following resources.
Resources:
http://www.udidahan.com/2013/04/28/queries-patterns-and-search-food-for-thought/
http://www.udidahan.com/2011/10/02/why-you-should-be-using-cqrs-almost-everywhere%E2%80%A6/
http://www.udidahan.com/2011/04/22/when-to-avoid-cqrs/
http://www.udidahan.com/2012/12/10/service-oriented-api-implementations/
http://bloggingabout.net/blogs/dennis/archive/2012/04/25/what-is-messaging.aspx
http://bloggingabout.net/blogs/dennis/archive/2013/07/30/partitioning-data-through-events.aspx
http://bloggingabout.net/blogs/dennis/archive/2013/01/04/databases-and-coupling.aspx
My Web API calls can take some amount of time. How should I properly design the API to support long-running operations (some kind of async?)? I've read about the new async keyword, but for the sake of knowledge, I'd like to understand what's behind it.
Regarding async, I saw this link recommended in another question on Stack Overflow:
http://msdn.microsoft.com/en-us/library/ee728598(v=vs.100).aspx
It says that when a request is made to an ASP.NET application, a thread from a limited thread pool is assigned to process the request.
An asynchronous controller action releases that thread back to the thread pool so that it is ready to accept additional requests. Within the action, the operation that needs to run asynchronously is started, and its result is handed to a callback controller action.
The asynchronous controller action is named with an Async suffix, and the callback action has a Completed suffix.
public void NewsAsync(string city) {}
public ActionResult NewsCompleted(string[] headlines) {}
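Filled out a little, the pattern looks roughly like the sketch below; NewsService.GetHeadlinesAsync is a hypothetical call returning Task<string[]>, standing in for whatever network-bound operation you have:
using System.Threading.Tasks;
using System.Web.Mvc;
public class NewsController : AsyncController
{
    public void NewsAsync(string city)
    {
        AsyncManager.OutstandingOperations.Increment();
        // Hypothetical network-bound call returning Task<string[]>.
        NewsService.GetHeadlinesAsync(city).ContinueWith(task =>
        {
            // The parameter name must match the NewsCompleted argument name.
            AsyncManager.Parameters["headlines"] = task.Result;
            AsyncManager.OutstandingOperations.Decrement();
        });
    }
    public ActionResult NewsCompleted(string[] headlines)
    {
        return View("News", headlines);
    }
}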
Regarding when to use Async:
In general, use asynchronous pipelines when the following conditions are true:
The operations are network-bound or I/O-bound instead of CPU-bound.
Testing shows that the blocking operations are a bottleneck in site performance and that IIS can service more requests by using asynchronous action methods for these blocking calls.
Parallelism is more important than simplicity of code.
You want to provide a mechanism that lets users cancel a long-running request.
I think developing your service using ASP.NET MVC with Web API, and using async controllers where needed, would be a good approach to building a highly available web service.
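With MVC 4 on .NET 4.5 the same idea is usually expressed with async/await instead of the Async/Completed pair. A minimal sketch (the downstream URL is just a placeholder for the slow remote call):
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Mvc;
public class NewsController : Controller
{
    public async Task<ActionResult> News(string city)
    {
        using (var client = new HttpClient())
        {
            // The request thread goes back to the pool while this await is pending.
            var headlines = await client.GetStringAsync(
                "http://headlines.example.com/api/news?city=" + city);
            return Content(headlines);
        }
    }
}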
Using a message based service framework like ServiceStack looks good too:
http://www.servicestack.net/
Additional resources:
http://msdn.microsoft.com/en-us/magazine/cc163725.aspx
http://www.codethinked.com/net-40-and-systemthreadingtasks
http://dotnet.dzone.com/news/net-zone-evolution
http://www.aaronstannard.com/post/2011/01/06/asynchonrous-controllers-ASPNET-mvc.aspx
http://channel9.msdn.com/Events/TechDays/Techdays-2012-the-Netherlands/2287
http://www.dotnetcurry.com/ShowArticle.aspx?ID=948 // also shows setup of performance tests
http://www.asp.net/mvc/tutorials/mvc-4/using-asynchronous-methods-in-aspnet-mvc-4
http://visualstudiomagazine.com/articles/2013/07/23/async-actions-in-aspnet-mvc-4.aspx
http://hanselminutes.com/327/everything-net-programmers-know-about-asynchronous-programming-is-wrong
http://www.hanselman.com/blog/TheMagicOfUsingAsynchronousMethodsInASPNET45PlusAnImportantGotcha.aspx
I have an action that executes a "possibly" long-running task. Possibly, because it makes a request to a remote server, and because of network latency it can block the user interface and introduce a small delay for the user.
My question is not about "how to run long tasks in the background", but how to push a notification to the user. My idea was that the user clicks the button, a task is fired off in the background, the web interface is unblocked, and the user can do whatever he wants; when the task is done, he receives a flash message. I can do it with AJAX, polling the server against a specific action that gives me the status of my task, for example, but is there any pattern to do it event-based? Kudos for answers with proof of concept or prototypes.
No proof of concept here, but you could use something like spawn or delayed_job to fire off your Rails task and unblock the interface, and then communicate back to the client with node.js or something similar. Depending on what you want to do, however, long-polling may be more practical than setting up more server software.