Authenticated user and multiple requests (IIS7 MVC3) - asp.net-mvc

This is one of those questions that maybe should go to Server Fault, but then maybe there is a code-level solution.
Anyway, here is the question. I have a regular MVC3 application which requires user login to access (it uses the Authorize attribute on most of the actions). I also have a Silverlight object within the application that makes HTTP GET calls to a controller action which returns an image (in fact, a map tile). This particular controller action has no Authorize attribute, and is therefore public.
The Silverlight component runs slowly or just blocks, because the MVC application can apparently process only ONE request at a time, as confirmed by Firebug. This means that the map tiles can only be served one after the other. Moreover, regular (non-map-related) requests are enqueued too, and everything times out after a while.
So to make a test, I set up another website with the same document root, and I instructed the Silverlight component to read tiles from there. Now tiles ARE requested concurrently and it runs smoothly.
So, is there any way to resolve this situation and use one site only?

If you are using Session in the controller action, that would explain why requests are queued. Because the Session is not thread safe, ASP.NET serializes all requests from the same session and executes them sequentially.
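In ASP.NET MVC 3 you can work around this by opting the tile-serving controller out of session state with the SessionState attribute. Here is a minimal sketch (TileController, RenderTile and the tile file path are hypothetical names, not taken from the question):

using System.Web.Mvc;
using System.Web.SessionState;

// With session state disabled, requests to this controller are not
// serialized per session, so tiles can be served concurrently.
[SessionState(SessionStateBehavior.Disabled)]
public class TileController : Controller
{
    public FileResult RenderTile(int x, int y, int zoom)
    {
        // Placeholder: load or render the actual tile bytes here.
        byte[] png = System.IO.File.ReadAllBytes(Server.MapPath("~/App_Data/placeholder-tile.png"));
        return File(png, "image/png");
    }
}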

IIS instances sharing data

I don't know if this is common or something, but I wanted to check. I am building a site on an IIS7 server and am coming across a weird problem. Whenever I have two clients accessing the site, it seems they are sharing info. Here is an example: when one client does a search for a particular item, the other client goes to the search page and sees the first client's search results. I am using a global class to store this information in my code-behind.
So here is my question: my understanding of servers was that if two clients accessed the site, they would be running against different instances of it, meaning that even if I have a global class in my code, it would be as if two separate machines were running it. Am I wrong in this understanding?
Also, are there settings in IIS that I need to change for this to work?
In ASP.NET, you can use Session variables, which are per-user values stored in server memory and keyed by a session token. You can store HTML form data in the Session so that another page on your site can read it.
The syntax in your MVC controller action to store a value in the Session would be:
Session["MyFormData"] = someObject;
http://msdn.microsoft.com/en-us/library/ms178581.aspx
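To illustrate the difference, here is a minimal sketch (SearchState, SearchController and RunSearch are hypothetical names): a static/global class is shared by every request in the application, while Session is scoped to a single user.

using System.Collections.Generic;
using System.Web.Mvc;

public static class SearchState
{
    // Shared by ALL users of the application -- this is why one client
    // can see another client's search results.
    public static List<string> LastResults;
}

public class SearchController : Controller
{
    [HttpPost]
    public ActionResult Search(string query)
    {
        List<string> results = RunSearch(query);

        // Per-user storage: each session gets its own copy.
        Session["LastResults"] = results;

        return RedirectToAction("Results");
    }

    public ActionResult Results()
    {
        var results = Session["LastResults"] as List<string>;
        return View(results);
    }

    // Placeholder for the real search.
    private static List<string> RunSearch(string query)
    {
        return new List<string> { query + " result 1", query + " result 2" };
    }
}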

Capture outgoing HTTP request from Controller / Service

So I have the following scenario (it's a Grails 2.1 app):
I have a Controller that can be accessed via //localhost:8080/myController
This controller in turn executes a call to another URL opening a connection using new URL("https://my.other.url").openConnection()
I want to capture the request so I can log the information
I have a Filter present in my web.xml already which does the job well for controllers mapped in my app. But as soon as a request is fired to an external URL, I don't get anything.
I understand that my filter will only be invoked for URLs inside my app, and that depends on my filter mapping, which is fine.
I'm struggling to see how a solution inside the app is actually viable. I'm thinking of using a mixed approach with the DevOps team to capture such outgoing calls from the container and then log them into a separate file.
I guess my questions are:
Is there a way to do it inside the app itself?
Is the approach I'm planning a sensible one?
Cheers!
Any reason why you don't want to use http-builder? There's a Grails plugin for it, and it makes remote XML calls much easier than handling the plumbing yourself. At the bottom of the linked page they describe how you can enable request logging via the log4j configuration.

Asynchronous Asp.Net MVC controller methods?

I am going to build an ASP.NET MVC 3 web page.
View: The view (web page) invokes about five Ajax (jQuery) calls against controller methods that return JsonResult, and renders the results on the web page.
Controller: The controller methods read a SQL Server 2008 database using EF4. Two of the SQL statements may take half a minute to execute, depending on the server load.
I want the users to at least see the content returned from the quick controller/database calls as soon as possible. The page will not have a lot of users (maybe up to 15). Will the long-running controller method calls block others if they are not asynchronous? Or is it irrelevant as long as the thread pool is big enough to handle the peak requests of the users?
From the user's view, loading the initial web page is synchronous, i.e. he has to wait until the server delivers the page. The Ajax requests however look asynchronous to him because he can already see part of the page.
From the server's view, everything is synchronous. There is an HTTP request that needs to be processed and the answer is either HTML, JSON or whatever. The client will wait until it receives the answer. And several requests can be processed in parallel.
So unless you implement some special locking (either on the web server or in the database) that blocks some of the requests, nothing will be blocked.
The proposed approach seems just fine to me.
Update:
There's one thing I forgot: ASP.NET contains a locking mechanism to synchronize access to the session data, which can get in the way if you have several concurrent requests from the same user. Have a look at the SessionState attribute for a way to work around that problem.
Update 2:
And for asynchronous behavior from the user's point of view, there's no need to use the AsyncController class. They were built for something else, which is not relevant in your case since you only have 15 users.
Will the long-running controller method calls block others if they are not asynchronous?
The first important thing to note is that none of those controller actions should have write access to the Session. If they write to the Session, whether they are sync or async, they will always execute sequentially and never in parallel. That's due to the fact that ASP.NET Session is not thread safe, so multiple requests from the same session are queued. If you are only reading from the Session it is OK.
Now, the slow controller actions will not block the fast controller actions, no matter whether they are synchronous or not. For long controller actions it could make sense to make them asynchronous only if you are using the asynchronous ADO.NET methods to access the database and thus benefit from I/O completion ports, which allow you to avoid consuming any threads during I/O operations such as database access. If you use standard blocking calls to the database, you get no benefit from async actions.
I would recommend the following article for a deeper understanding of when asynchronous actions can be beneficial.
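As a rough illustration of that last point, here is a minimal sketch of an MVC 3 AsyncController action that uses the asynchronous ADO.NET API, so no worker thread is held while the query runs (the controller name, the query and the connection string are hypothetical, not taken from the question):

using System.Data.SqlClient;
using System.Web.Mvc;

public class ReportsController : AsyncController
{
    // Async ADO.NET requires "Asynchronous Processing=true" in the connection string (pre-.NET 4.5).
    private const string ConnectionString =
        "Data Source=.;Initial Catalog=MyDb;Integrated Security=True;Asynchronous Processing=true";

    public void SlowReportAsync()
    {
        AsyncManager.OutstandingOperations.Increment();

        var connection = new SqlConnection(ConnectionString);
        var command = new SqlCommand("SELECT COUNT(*) FROM BigTable", connection);
        connection.Open();

        command.BeginExecuteReader(asyncResult =>
        {
            try
            {
                using (var reader = command.EndExecuteReader(asyncResult))
                {
                    reader.Read();
                    AsyncManager.Parameters["count"] = reader.GetInt32(0);
                }
            }
            finally
            {
                connection.Dispose();
                AsyncManager.OutstandingOperations.Decrement();
            }
        }, null);
    }

    // Invoked by MVC once the outstanding operation count reaches zero.
    public JsonResult SlowReportCompleted(int count)
    {
        return Json(new { count }, JsonRequestBehavior.AllowGet);
    }
}

If the same action used a blocking SqlCommand call instead, a request thread would still be held for the full half minute, which is exactly why blocking database calls gain nothing from async actions.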

How to pass data from a web page to an application?

Trying to figure out a way to pass some data/fields from a web page back into my application. This needs to work on Windows/Linux/Mac, so I can't use a DLL or ActiveX. Any ideas?
Here's the flow:
1. The application gathers some data and then sends it via POST to a web page that is either embedded in the app or popped up in a new IE window.
2. The web page performs some services and then needs to relay the results back to the application.
The only way I can think of to do this is writing the results locally from the page in a cookie or something like that, and having the application monitor for a specific file in that folder.
Alternatively, make a web service that the application hits after passing control to the page; when the page is done, the web service will return the data. This sounds like it might have some performance drawbacks.
Can anyone suggest any better solutions for this?
Thanks
My suggestion:
Break the processing logic out of the Web Page into a separate assembly. You can then create a Web Service that handles all of the processing without needing to pass control over to a page.
Your application can then call the Web Service directly and then serialize the results and work with the data quite easily.
Update
Since the page is supplied by a third party, you obviously can't break anything out. The next best thing would be to handle the entire web request internally in your application (rather than popping up a new window).
With this method, you can get the raw HTTP response (and page markup) and work with it directly. You can then parse the Response stream and gather the required data from it.
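For example (a minimal sketch, assuming for illustration that the application is a .NET client; the URL and form fields are hypothetical), the application can POST the data itself and read the page's markup from the response:

using System;
using System.IO;
using System.Net;
using System.Text;

class PageClient
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("https://thirdparty.example.com/process");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";

        byte[] body = Encoding.UTF8.GetBytes("field1=value1&field2=value2");
        request.ContentLength = body.Length;
        using (Stream requestStream = request.GetRequestStream())
        {
            requestStream.Write(body, 0, body.Length);
        }

        // The raw markup returned by the page; parse it to extract the results.
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string html = reader.ReadToEnd();
            Console.WriteLine(html);
        }
    }
}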
While performing the HTTP request you should be able to retrieve the text returned by the page. For instance, if your HTTP POST were to hit a Java servlet, the doPost() method would be fired and you would perform your actions there; you could then use the PrintWriter obtained from the Response object (PrintWriter out = response.getWriter();) to write text back to the calling application. I'm not sure if this helps?
The fact that the web page is hosted by a third party and they need to be doing the processing on their servers is important to this question.
I like your idea of having the app call a web service after it passes the data to the third-party web page. You can always call the web service asynchronously if you're worried about blocking your application while waiting for results from this web service.
Another option is for your application to implement an XML-RPC server that can be called from the web page using PHP, Python or whatever you use to build the website.
A REST server will do the job also...

jquery .ajax request blocked by long running .ajax request

I am trying to use jQuery's .ajax functionality to make a progress bar.
A request is submitted via .ajax, which starts a long-running process. Once submitted, another .ajax request is called on an interval to check the progress of this process. A progress meter is then updated using this information.
However, the progress .ajax call only returns once the long-running process has completed. It's like it's being blocked by the initial request.
The weird thing is this process works fine on dev but is failing on the deployment server. I am running on IIS using ASP.Net MVC.
Update: Apparently it is browser-related, because it is working fine on IE 7 but not on IE 8. This is strange because IE 8 allows up to 6 connections on broadband, whereas IE 7 only allows 2 requests per domain.
Update 2: I think it's a local issue because it appears to be working fine on another IE 8 machine.
The server will only run one page at a time from each user. When you send the requests to get the progress status, they will be queued.
The solution is to make the page that returns the status sessionless, using EnableSessionState="false" in the @ Page directive. That way it's not associated with any user, so the request isn't queued.
This of course means that you can't use session state to communicate the progress state from the thread running the process to the thread getting the status. You have to use a different way of keeping track of running processes, and send some identifier along with the requests that get the status so that you know which user it came from.
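Since the question mentions ASP.NET MVC, the MVC 3 equivalent of a sessionless page is the SessionState attribute. Here is a minimal sketch of that idea (ProgressStore, ProgressController and the jobId parameter are hypothetical names): the status endpoint never touches the Session, and progress is read from a shared dictionary keyed by an identifier the client sends with each poll.

using System;
using System.Collections.Concurrent;
using System.Web.Mvc;
using System.Web.SessionState;

public static class ProgressStore
{
    // jobId -> percent complete; safe for concurrent reads and writes.
    public static readonly ConcurrentDictionary<Guid, int> Jobs =
        new ConcurrentDictionary<Guid, int>();
}

// Not tied to the user's session, so requests to it are never queued.
[SessionState(SessionStateBehavior.Disabled)]
public class ProgressController : Controller
{
    // Polled by the client: GET /Progress/Status?jobId=...
    public JsonResult Status(Guid jobId)
    {
        int percent;
        ProgressStore.Jobs.TryGetValue(jobId, out percent);
        return Json(new { percent }, JsonRequestBehavior.AllowGet);
    }
}

The long-running action would generate the jobId up front, return it to the page, and update ProgressStore.Jobs as it works.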
Some browsers (in particular, IE) only allow two requests to the same domain at the same time. If there are any other requests happening at the same time, then you might run into this limitation. One way around it is to have a few different aliases for the domain (some sites use "www1.example.com", "www2.example.com", etc.).
You should be able to use Firebug or Fiddler to determine how many requests are in progress, etc.
Create an asynchronous handler (IHttpAsyncHandler) for your second Ajax request.
Pass any parameters required via the .ashx query string in order to process what you want, because the HttpContext won't have what you'll need; you will barely have access to the Application object.
Behind the scenes, ASP.NET will create a thread for you from the CLR pool, not the application pool, so you'll have an extra performance gain with IHttpAsyncHandler.
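A minimal sketch of that suggestion (the handler class, the jobId parameter and the LookupStatus helper are hypothetical; the web.config registration for the .ashx handler is omitted):

using System;
using System.Web;

public class ProgressStatusHandler : IHttpAsyncHandler
{
    public bool IsReusable { get { return true; } }

    // Synchronous entry point; not used when the asynchronous path is taken.
    public void ProcessRequest(HttpContext context)
    {
        throw new NotSupportedException();
    }

    public IAsyncResult BeginProcessRequest(HttpContext context, AsyncCallback cb, object extraData)
    {
        string jobId = context.Request.QueryString["jobId"];

        // Run the status lookup off the request thread; the delegate's BeginInvoke
        // hands back an IAsyncResult and invokes cb when the work completes.
        Func<string, string> lookup = LookupStatus;
        return lookup.BeginInvoke(jobId, cb, new object[] { context, lookup });
    }

    public void EndProcessRequest(IAsyncResult result)
    {
        var state = (object[])result.AsyncState;
        var context = (HttpContext)state[0];
        var lookup = (Func<string, string>)state[1];

        context.Response.ContentType = "application/json";
        context.Response.Write(lookup.EndInvoke(result));
    }

    // Hypothetical lookup; replace with however progress is actually tracked.
    private static string LookupStatus(string jobId)
    {
        return "{\"percent\": 42}";
    }
}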
