jquery .ajax request blocked by long running .ajax request - asp.net-mvc

I am trying to use jQuery's .ajax functionality to make a progress bar.
A request is submitted via .ajax, which starts a long-running process. Once submitted, another .ajax request is made on an interval to check the progress of this process. A progress meter is then updated using this information.
However, the progress .ajax call only returns once the long-running process has completed. It's as if it's being blocked by the initial request.
The weird thing is that this process works fine on dev but is failing on the deployment server. I am running on IIS using ASP.NET MVC.
Update: Apparently it is browser related, because it works fine in IE 7 but not in IE 8. This is strange because IE 8 allows up to 6 connections per domain on broadband, whereas IE 7 only allows 2.
Update 2: I think it's a local issue, because it appears to work fine on another IE 8 machine.

The server will only run one page at a time from each user. When you send the requests to get the progress status, they will be queued.
The solution is to make the page that returns the status sessionless, using EnableSessionState="false" in the @ Page directive. That way it's not associated with any user's session, so the request isn't queued.
This of course means that you can't use session state to communicate the progress from the thread running the process to the thread returning the status. You have to use a different way of keeping track of running processes, and send some identifier along with the requests that get the status so that you know which user it came from.
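In ASP.NET MVC specifically (as opposed to a Web Forms page), one way to get the same effect is to opt the status controller out of session state. The following is only a sketch under that assumption; ProgressController, the jobId parameter, and the static JobProgress store are made-up names for illustration:

    using System.Collections.Concurrent;
    using System.Web.Mvc;
    using System.Web.SessionState;

    // Sketch only: a sessionless controller that reports progress for a job id.
    // Because session state is disabled, these requests are not serialized
    // behind the long-running request from the same user.
    [SessionState(SessionStateBehavior.Disabled)]
    public class ProgressController : Controller
    {
        // Hypothetical shared store, updated by the long-running process.
        public static readonly ConcurrentDictionary<string, int> JobProgress =
            new ConcurrentDictionary<string, int>();

        public ActionResult Status(string jobId)
        {
            int percent;
            JobProgress.TryGetValue(jobId, out percent);
            return Json(new { jobId, percent }, JsonRequestBehavior.AllowGet);
        }
    }

The client would then pass the same jobId it received when starting the process on every polling call.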

Some browsers (in particular, IE) only allow two requests to the same domain at the same time. If there are any other requests happening at the same time, you might run into this limitation. One way around it is to have a few different aliases for the domain (some sites use "www1.example.com", "www2.example.com", etc.).
You should be able to use Firebug or Fiddler to determine how many requests are in progress, etc.

Create an asynchronous handler (IHttpAsyncHandler) for your second ajax request.
Pass any parameters it needs via the .ashx query string, because the HttpContext won't have what you need; you will barely have access to the Application object.
Behind the scenes ASP.NET will give you a thread from the CLR thread pool, not the application pool, so you'll get an extra performance gain with IHttpAsyncHandler.
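A rough sketch of such a handler follows; ProgressStore and the jobId query-string parameter are hypothetical, and the delegate BeginInvoke/EndInvoke pair is just one way to hand the work to a CLR thread-pool thread:

    using System;
    using System.Web;

    // Sketch of an asynchronous progress handler (e.g. Progress.ashx).
    // Generic handlers are sessionless by default, so these requests are not
    // queued behind the long-running request from the same user.
    public class ProgressHandler : IHttpAsyncHandler
    {
        private Action<HttpContext> _work;

        public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object state)
        {
            _work = WriteProgress;
            // BeginInvoke queues the work on a CLR thread-pool thread and returns
            // a framework-provided IAsyncResult to hand back to ASP.NET.
            return _work.BeginInvoke(context, cb, state);
        }

        public void EndProcessRequest(IAsyncResult result)
        {
            _work.EndInvoke(result);
        }

        private static void WriteProgress(HttpContext context)
        {
            string jobId = context.Request.QueryString["jobId"];  // identifier sent by the client
            int percent = ProgressStore.GetPercent(jobId);        // hypothetical progress lookup
            context.Response.ContentType = "application/json";
            context.Response.Write("{\"percent\":" + percent + "}");
        }

        public void ProcessRequest(HttpContext context) { throw new NotSupportedException(); }

        public bool IsReusable { get { return false; } }
    }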

Related

Is there a way to create a connection timeout to activate a service worker?

I'm using Electron, which is based on Chromium, to create an offline desktop application.
The application uses a remote site, and we are using a service worker to offline parts of the site. Everything is working great, except for a certain situation that I call the "airplane wifi situation".
Using Charles, I have restricted download bandwidth to 100 bytes/s. The connection is sent through webview.loadURL, which eventually calls LoadURLWithParams in Chromium. The problem is that it does not fail and then activate the service worker, the way having no connection at all would. Once the request is sent, it waits forever for the response.
My question is, how do I timeout the request after a certain amount of time and load everything from the service worker as if the user was truly offline?
An alternative to writing this yourself is to use the sw-toolbox library, which provides routing and runtime caching strategies for service workers, along with some built in options for helping with these sorts of advanced use cases. In particular, you'd want to use the networkTimeoutSeconds parameter to configure the amount of time to wait for a response from the network before you fall back to a previously cached response.
You can use it like the following:
    toolbox.router.get(
      new RegExp('my-api\\.com'),
      toolbox.networkFirst,
      { networkTimeoutSeconds: 10 }
    );
That configures a route matching GET requests whose URLs contain my-api.com and applies a network-first strategy that automatically falls back to the previously cached response after 10 seconds.

How to update a web page from requests made by another client (in rails)?

Here is my need:
I have to display some information on a web page.
The web browser is actually on the same machine (localhost).
I want the data to be updated dynamically at the server's initiative.
Since HTTP is a request/response protocol, I know that to get this behavior the connection between the server and the client (which is local here) has to be kept open in some way (WebSockets, Server-Sent Events, etc.).
Yes, "realtime" is quite fashionable nowadays and there are many frameworks out there to do this (Meteor, etc.).
And indeed, it seems that Rails supports this functionality too (Server-Sent Events in Rails 4 and ActionCable, built on WebSockets, in Rails 5).
So achieving this functionality would not be a big deal, I guess...
Nevertheless, what I really want is to trigger an update of the web page (displayed locally here) from a request made by another client.
This picture will explain it better:
At the beginning, the browser connects to the (local) server (green arrows).
I guess that a thread is executed where all the session data (instance variables) are stored.
In order to use some "realtime" mechanisms, the connection remains open and therefore the thread Y is not terminated. (I guess this is how it works)
A second user connects (blue arrows) to the server (it may or may not be the same web page) and performs some action (e.g. posting a form).
Here the response to that external client does not matter. Just an HTTP OK response is fine. But a confirmation web page could also be returned.
In any case, thread X (and/or the connection) has no particular reason to be kept alive.
Ok, here is my question (BTW thank you for reading me thus far).
How can I echo this new data to the local web browser?
I see 2 different ways to do this:
Path A: Before terminating, thread X passes the data (its instance variables) to thread Y, which still has its connection open. Thus the server is able to update the web browser.
Path B: Before terminating, thread X sends a request (I mean a response, since it is the server) directly to the web browser using a particular socket.
Which mechanisms should I use in either method to achieve this functionality?
For method A, how can I exchange data between threads ?
For method B, how can I use an already opened socket ?
But which of these two methods (or another one) is actually the best way to do that?
Again, thank you for reading this far, and sorry for my bad English.
I hope I've been clear enough to expose my need.
You are overthinking this. There is no need to think of such low-level mechanisms as threads and sockets. Most (all?) pub-sub live-update tools (ActionCable, faye, etc.) operate in terms of "channels" and "events".
So, your flow will look like this:
Client A (web browser) makes a request to your server and subscribes to events from channel "client-a-events" (or something).
Client B (the other browser) makes a request to your server with instructions to post an event to channel "client-a-events".
Pub-sub library does its magic.
Client A gets an update and updates the UI accordingly.
Check out this intro guide: Action Cable Overview.

How to find out if a previous request is running in asp.net MVC?

I have an Android application that sends requests to an ASP.NET website and receives the responses.
An ASP.NET MVC controller receives the request, starts the Android emulator on the server, does something, and sends the response.
The problem is that when two simultaneous requests arrive, I want to either queue the second request or find out whether the previous request is still running and, if so, wait for a specified time before starting its work (running the emulator).
The second solution is simpler, so I want to know if there's a way to tell whether a previous request is running in ASP.NET.
Thanks all.
You could use a static boolean which you set when the process starts and clear when the process stops.
Keep in mind that you need to check and set it in a thread-safe way, e.g. by using "lock".
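A minimal sketch of that idea (the controller and action names are placeholders, and a 409 response is just one way to signal "busy" back to the Android client):

    using System.Web.Mvc;

    public class EmulatorController : Controller
    {
        private static readonly object _sync = new object();
        private static bool _isRunning;

        public ActionResult Run()
        {
            lock (_sync)
            {
                if (_isRunning)
                    // A previous request is still running; the client can retry later.
                    return new HttpStatusCodeResult(409, "Previous request still running");
                _isRunning = true;
            }
            try
            {
                // ... start the emulator and do the long-running work here ...
                return Content("done");
            }
            finally
            {
                lock (_sync) { _isRunning = false; }
            }
        }
    }

Note that a static flag is only visible within a single worker process, so this assumes the site is not running in a web garden or web farm.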

ASP.NET MVC Async Controller vs Server Push (COMET/Reverse Ajax)

I'm building an ASP.NET MVC site in which the clients (browsers) can make API calls that take up to 30 minutes (or more) to process. Obviously I couldn't use normal MVC controllers to do this, as a few such requests would block all my IIS worker threads, leaving other, faster calls blocked.
I've looked at the following two options :
ASP.NET MVC's Asynchronous controllers
PokeIn library, which allows server push via Reverse Ajax (long-held HTTP requests, for older browsers) or WebSockets (from the HTML5 specification, for newer browsers)
Both of them seem like feasible options.
Option 1 seems easiest for me to implement. With asynchronous controllers, my IIS worker threads wouldn't be blocked, allowing my other, faster API calls to go through seamlessly. However, from the Async Controller documentation, I gather that it spawns another non-IIS thread which would be blocked, waiting for my long-running (~30 min) process to complete. I've read that "If you block or sleep in a controller no matter whether it is async or not async it is very bad."
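(For reference, here is a rough sketch of what a truly non-blocking action could look like if the slow operation can be awaited, assuming .NET 4.5 / MVC 4 or later; SlowApiService is a made-up name. If the operation can only block a thread, an async controller merely moves the blocking to another thread, which is exactly the concern quoted above.)

    using System.Threading.Tasks;
    using System.Web.Mvc;

    public class ReportsController : Controller
    {
        // Sketch only: while the awaited call is in flight, no thread is held,
        // so other requests are not starved.
        public async Task<ActionResult> LongRunning()
        {
            // SlowApiService is a hypothetical awaitable wrapper around the slow operation.
            var result = await SlowApiService.RunAsync();
            return Json(result, JsonRequestBehavior.AllowGet);
        }
    }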
In Option 2, if my clients are using newer browsers that support WebSockets, this would perhaps be the most performant, as I would not need any blocking thread on the server side. When the client triggers a slow API call I'd raise an event, on the completion of which (say ~30 minutes later) I'd raise another event to update all my clients' browsers with the updated content.
However, with the PokeIn library, if some of my clients do not have WebSocket-supporting browsers (older ones), I'm not sure if they'd be hogging one of my IIS worker threads.
Is Option 2 overkill for my requirement?
In Option 1, is it bad to have my async controller wait on the slow process?
Another disadvantage of Option 1 is that if the user refreshes the page before the request completes, he'd no longer get the update for the job once it completes!
Any ideas, suggestions are welcome.
Thanks
PokeIn uses the same in-memory/thread pools to push messages for WebSocket and Ajax connections, since it has an internal WebSocket server. The delivery time certainly differs between Ajax and WebSocket, but whichever method/option you pick, you will have that difference. Besides, you probably already know this, but PokeIn falls back to Comet Ajax in case a client doesn't support WebSockets, so you don't have to deal with it.
Hope this answers your question for option 2.

Authenticated user and multiple requests (IIS7 MVC3)

This is one of those questions that maybe should go to Server Fault, but then maybe there is a code-level solution.
Anyway, here is the question. I have a regular MVC 3 application which requires user login to access (it uses the Authorize attribute on most of the actions). I also have a Silverlight object within the application that makes HTTP GET calls to a controller action which returns an image (in fact, a map tile). This particular controller action has no Authorize attribute and is therefore public.
The Silverlight component runs slowly or just blocks, because the MVC application can apparently process only ONE request at a time, as confirmed by Firebug. This means that the map tiles can only be served one after the other. Moreover, regular (non-map-related) requests are queued too, and everything times out after a while.
So to make a test, I set up another website with the same document root and instructed the Silverlight component to read tiles from there. Now tiles ARE requested concurrently and it runs smoothly.
So, is there any way to resolve this situation and use one site only?
If you are using Session in that server action, that would explain why requests are queued. Because Session is not thread-safe, ASP.NET serializes all requests from the same session and executes them sequentially.
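If the tile action doesn't need to write to the session, one way around this in MVC 3 is to relax the controller's session-state behavior. A minimal sketch, with TileController and TileStore as placeholder names:

    using System.Web.Mvc;
    using System.Web.SessionState;

    // Sketch: with read-only session state ASP.NET no longer takes the exclusive
    // session lock for these requests, so tile requests can run concurrently.
    [SessionState(SessionStateBehavior.ReadOnly)]
    public class TileController : Controller
    {
        public ActionResult Tile(int x, int y, int z)
        {
            byte[] png = TileStore.Render(x, y, z);  // hypothetical tile rendering
            return File(png, "image/png");
        }
    }

SessionStateBehavior.Disabled goes one step further and detaches the controller from the session entirely.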
