Request URL Too Long (20K characters) IIS 7 - iOS

I have an iPad application which sends data to a .NET application. The iPad application was written by a bunch of monkeys who implemented all the requests as GET instead of POST.
The application is live now, and with the client's data it is sending requests over 20k characters, which gives me this response (using Safari, which has been tested to work with URLs of at least 80k characters):
Generic 414 Error
Instead of the detailed IIS response I would get if, say, the request exceeded the requestFiltering/maxURL value in the web.config, which looks like this:
IIS 404.14 Error
Since I am getting the generic error message instead of the IIS-specific message, it makes me think this is not due to something I can fix in configuration settings (I have maxURL set to 2 billion, just to be safe...)
I understand that the requests should be using POST, but I don't really have time to rewrite the iPad application at the moment, and all of my research has only turned up unhelpful responses which say "you should limit GET requests to 2K characters" or "you should use a POST instead of a GET". If that is the only feedback you have, please don't bother answering. (For instance, I am aware of this question and its answers.)
I need to know if I can throw in a quick workaround to make this function until I have time to do it the right way. I'm also wondering if anyone knows about hard limits on URL length on either the iOS or IIS side, because I can't find any specifics.
Edit: My httpRuntime parameters are also set to accept far more than 20k characters.

I know this is an old one, but in case someone faces it like I did yesterday: setting web.config parameters didn't help me either. What I found was this MS article: http://support.microsoft.com/kb/820129/en-us.
I've added two DWORD keys to HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\HTTP\Parameters:
MaxFieldLength = 65534
MaxRequestBytes = 100000
NOTE: you need to restart your server, or at least the HTTP service, to make these keys take effect. After the restart I managed to send a request with a query string length of up to ~32k characters (perhaps because http.sys handles the URL as two-byte Unicode internally, so 65,534 bytes come out to roughly 32k characters). So I guess this is the limit for URL length on Windows 2003 and up.
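For reference, the same keys can be created from an elevated command prompt; a sketch using the values above (followed by the HTTP service restart the note mentions):

    reg add HKLM\System\CurrentControlSet\Services\HTTP\Parameters /v MaxFieldLength /t REG_DWORD /d 65534
    reg add HKLM\System\CurrentControlSet\Services\HTTP\Parameters /v MaxRequestBytes /t REG_DWORD /d 100000
    net stop http /y
    net start http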

If you are seeing this in multiple clients, it's likely to be your server settings. In .NET 4.0's httpRuntime section, the maximum value for both maxUrlLength and maxQueryStringLength is a 32-bit signed integer, 2147483647.
<httpRuntime maxUrlLength="2147483647" maxQueryStringLength="2147483647" />
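Note that IIS request filtering imposes its own, separate caps (the requestFiltering/maxURL value the first question mentions). A sketch of raising those alongside httpRuntime; the 65534 values here are illustrative, not required:

    <system.webServer>
      <security>
        <requestFiltering>
          <requestLimits maxUrl="65534" maxQueryString="65534" />
        </requestFiltering>
      </security>
    </system.webServer>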

How to block requests to server with user name / password?

We have realized that the URL http://Keyword:redacted@example.com/ redirects to http://example.com/ when copied and pasted into the browser's address bar.
As far as I understand this might be used in some FTP connections, but we have no such use on our website. We suspect that we are being targeted by an attack, and we have been warned by Google that we are passing PII (mostly email addresses) in our URL requests to their Google Adsense network. We have not been able to find the source, but we have been warned that the violation is in the form http://Keyword:redacted@example.com/
How can we stop this from happening?
What URL redirect method can we use to reject such requests and return an error message?
FYI, I experienced a similar issue on a client website and followed up with Adsense support. The matter was escalated to a specialist team, who investigated and determined that flagged violations with the format http://Keyword:redacted@example.com/ will be considered false positives. I'm not sure if this applies to all publishers or was specific to our case, but it might be worth following up with Adsense support.
There is nothing you can do. This is handled entirely by your browser long before it even thinks about "talking" to your server.
That's a strange URL for people to copy/paste into the browser's address bar unless they have been told/trained to do so. Your best bet is to tell them to STOP IT! :-)
I suppose you could look at the HTTP Authorization headers and report an error if they come in populated... (This would be $_SERVER['PHP_AUTH_USER'] in PHP.) I've never looked at these values when the server hasn't requested them, so I'm not sure whether it would work or not...
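A minimal sketch of that check in PHP, assuming you want to reject such requests outright (the response text is illustrative):

    <?php
    // PHP_AUTH_USER / PHP_AUTH_PW are only populated when the client sent
    // Basic-auth credentials (e.g. from a user:pass@host URL).
    // Note: whether they are populated at all depends on the SAPI in use.
    if (isset($_SERVER['PHP_AUTH_USER']) || isset($_SERVER['PHP_AUTH_PW'])) {
        header('HTTP/1.1 400 Bad Request');
        exit('Requests with credentials embedded in the URL are not accepted.');
    }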
The syntax http://abc:def@something.com means you're sending userid 'abc' and password 'def' as Basic authentication parameters. Your browser will pull out the userid and password and send them along as authentication information, leaving the URL without them.
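Concretely, given http://abc:def@something.com/, the browser requests http://something.com/ and attaches a header along these lines:

    GET / HTTP/1.1
    Host: something.com
    Authorization: Basic YWJjOmRlZg==

where YWJjOmRlZg== is simply the Base64 encoding of abc:def.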
As Peter Bowers mentioned, you could check the authorization headers and see if they're coming in that way, but you can't stop others from doing it if they want. If it happens a lot then I'd suspect that somewhere there's a web form asking users to enter their user/password and it's getting encoded that way. One way to sleuth it out would be to see if you can identify someone by the userid specified.
Having Keyword:redacted sounds odd. It's possible Google Adsense changed the values to avoid including confidential info.

Request Length ERR_CONNECTION_RESET in MVC 4 Application on IIS 7.5

I am having an issue with my application using MVC 4 and IIS 7.5: I am getting a net::ERR_CONNECTION_RESET error. The page request size is 4.9MB. All of the content loads, but the request says that it has not finished yet, and none of my JavaScript is applied. I have other pages in the application that all load fine and whose JavaScript is applied with no issues. There seems to be something going on with this particular page.
Checking around, I discovered I needed to set maxRequestLength and maxAllowedContentLength in the web.config. I set both to 8 MB, with maxRequestLength being in kilobytes and maxAllowedContentLength in bytes. This still resulted in the same thing. I double-checked in IIS to make sure that maxRequestLength and maxAllowedContentLength were both being set correctly, which they were.
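(For reference, the two settings described above look roughly like this in web.config, with 8 MB expressed in each attribute's own unit:

    <system.web>
      <httpRuntime maxRequestLength="8192" />  <!-- kilobytes -->
    </system.web>
    <system.webServer>
      <security>
        <requestFiltering>
          <requestLimits maxAllowedContentLength="8388608" />  <!-- bytes -->
        </requestFiltering>
      </security>
    </system.webServer>
)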
Next I adjusted my query to bring back a smaller amount of data; the page request size was well under 900KB and everything seemed to work fine. I kept modifying my query to bring back more results to see what was the largest request size at which the page still loaded fine. To my surprise, once the page request length reached 918KB or greater, the request would keep going for about 2 minutes before resulting in a net::ERR_CONNECTION_RESET error. Keep in mind this error only shows in Firebug, as the page seems to display all data fine except that none of the JavaScript is applied.
I only discovered this issue when putting my application on the production server. Everything worked fine on localhost. I believe this to be something going on with the server and IIS 7.5, since even after putting ELMAH in the application I was unable to capture any errors.
At this point I have run out of ideas and things to try. Any additional help would be great.
I had the same problem. That error is usually caused by a problem with the database connection. Try checking the firewall permissions first. Also, if your connection string uses Trusted_Connection and your Windows user doesn't have a password, that can be a problem; either set a password or change the connection string to use the sa login, as sketched below.
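A sketch of the two connection-string styles being contrasted (server, database and password names are placeholders):

    Server=myServer;Database=myDatabase;Trusted_Connection=True;
    Server=myServer;Database=myDatabase;User Id=sa;Password=myPassword;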
I really hope that it works.

SOAP server doesn't work correctly behind some proxy/firewall

I have a SOAP server/client application written in Delphi XE that worked fine for some time, until a user ran it on Windows 7 x64 behind a corporate proxy/firewall. The application sends and receives a TSOAPAttachment object in the request.
The Problem:
Once the first request from this user is received and processed, the server cannot successfully process any request (from any user) that comes after it.
The server still responds to the request, but the SOAPAttachment of each request seems corrupted after the first one from this user, which is why it can't process the request successfully.
After adding many debug logs to the server, I noticed the TSOAPAttachment.SourceStream in the request's parameter becomes inaccessible (or empty), and TSOAPAttachment.CacheFile is also empty. Therefore, whenever the server tries to use the SourceStream, it gets an Access Violation error.
Further investigation found that the BorlandSoapAttachment(n) file generated in the temp folder by the first request still exists and is locked (it should be deleted when a request completes normally), and the BorlandSoapAttachment(n+1) files of the following requests are piling up.
The SOAP server will work again after restarting IIS or recycling the application pool.
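(For reference, that recovery step can be scripted from an elevated prompt; the application pool name below is a placeholder:

    iisreset
    %windir%\system32\inetsrv\appcmd recycle apppool /apppool.name:"MySoapAppPool"
)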
It is quite certain that this is caused by the proxy or the user's network, because when the same machine runs outside that network, it works fine.
To add more mystery to the problem, running the application on WinXP behind the same proxy causes no problem AT ALL!
Any help or recommendation is very much appreciated, as we have been stuck in this situation for some time.
Big thanks in advance.
If you are really sure that you have debugged all the server logic that handles the attachments, trying to discover any piece of code that could fail specifically on Windows 7, I would suggest:
1) Use a network sniffer (Wireshark is good for this task): make two subsequent requests with the same data/parameter values and compare the HTTP contents. This analysis should be done both on the client (to see if the data always leaves the client machine with the same content) and on the server, to inspect the incoming data;
2) I faced a similar situation in the past, and my attempts to really understand the problem did not succeed. I worked around the problem by sending files as Base64-encoded string parameters rather than as SOAP attachments, as in the sketch below. The side effect of using Base64 is an increase of ~33% in the size of the data to be sent, which can be significant if you are transferring large files.
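A minimal sketch of that workaround in Delphi, using the RTL's EncdDecd unit (the function name and usage are illustrative, not from the original answer):

    uses SysUtils, Classes, EncdDecd;

    // Returns the file contents as a Base64 string, suitable for passing
    // as an ordinary string parameter instead of a TSOAPAttachment.
    function FileToBase64(const AFileName: string): string;
    var
      Input: TFileStream;
      Output: TStringStream;
    begin
      Input := TFileStream.Create(AFileName, fmOpenRead or fmShareDenyWrite);
      try
        Output := TStringStream.Create('');
        try
          EncodeStream(Input, Output); // standard RTL Base64 encoder
          Result := Output.DataString;
        finally
          Output.Free;
        end;
      finally
        Input.Free;
      end;
    end;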
Remember that SOAP attachments create temp files on the server, and Windows 7 has different file-access rules than Windows XP. I don't know if this could explain the first call being processed and the others not, but maybe there is something related to file access.
Maybe it is a UAC (User Account Control) problem under Windows 7. Try running the client on Windows 7 "As Administrator" and see if it works properly.

Delphi - Connecting and logging in to a webpage

EDIT
There has been quite a development. The current problem is this:
I compared requests sent from a browser and sent from my app. There were some differences, and I managed to correct most of them. Some are still unfixed, since I haven't figured out how yet. I am using Indy.
How can I send (or add) cookies in the request?
I tried this: IdHTTP.CookieManager.AddCookie('bakatheme=BrectanTheme', IdHTTP1.URL) but it doesn't work. Also, the Indy help says it is supposed to be AddCookie(String, String), but my Delphi only accepts (String, TIdURI) - I am not sure if it is the right URI I am passing.
In the headers I have this code: AcceptEncoding:='gzip,deflate,sdch'; yet when I inspect the outgoing request, it states: AcceptEncoding: gzip,deflate,sdch,identity but I am certain I don't have "identity" anywhere in the code.
Those are the two things in which my request differs from the browser's. Now I am getting a 500 Internal Server Error in return; can it be caused by the lack of cookies or by the second thing?
Thank you very much.
I haven't exactly tried it myself, but here's an example I found about website login using Indy:
http://www.ciuly.com/delphi/indy/persistent-login-example-for-geocacheing-no-ssl/
OK, let's comment:
How can i send (or add) cookies into the request?
You should not do that. Indy handles this for you (but if you really want to, there is a TIdCookieManager). It seems to me that you don't know how cookies work: it's not a thing you can arbitrarily add to a request. It comes from the server and it identifies you.
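That said, if you do want to push a cookie in manually, here is a minimal sketch based on the (String, TIdURI) overload the question reports; the URL is a placeholder, and it assumes IdHTTP1 has a TIdCookieManager assigned to its CookieManager property:

    uses IdURI;

    var
      URI: TIdURI;
    begin
      URI := TIdURI.Create('http://www.example.com/'); // placeholder for the real site
      try
        IdHTTP1.CookieManager.AddCookie('bakatheme=BrectanTheme', URI);
      finally
        URI.Free;
      end;
    end;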
In the Headers I have this code: AcceptEncoding:='gzip,deflate,sdch';
AcceptEncoding tells the server that it may compress the response using those algorithms. Indy supports gzip, deflate, sdch and identity, and it updates the request header to add the one you left out.
You should take a look at those links to learn how http works:
W3
Wikipedia

Web Hosting URL Length Limit?

I am designing a web application which ties in to my iPhone application. It sends massively long URLs to the web server (about 15,000 characters). I was using NearlyFreeSpeech.net, but they only support URLs up to 2,000 characters. I was wondering if anybody knows of web hosting that will support really long URLs? Thanks, Isaac
Edit: My program needs to open a picture in Safari. I could do this two ways:
send it Base64-encoded in the URL and just echo the query parameters; or
first POST it to the server from my application; the server would store the photo in a database and send back a unique ID, which I would append to a URL that I would open in Safari to retrieve the photo from the database and then delete it.
You see, I am lazy, and I know Mobile Safari can support URIs up to 80,000 characters, so I think this is an OK way to do it. If there is something really wrong with this, please tell me.
Edit: I ended up doing it the proper POST way. Thanks.
If you're sending 15,000-character-long URLs, in all likelihood:
(image: "You're doing it wrong")
Use something like an HTTP POST instead.
The limitations you're running up against aren't so much an issue with the hosts - it's more the fact that web servers have a limit for the length of a URL. According to this page, Apache limits you to around 4k characters, and IIS limits you to 16k by default.
Although it's not directly answering your question, and there is no official maximum length of a URL, browsers and servers have practical limits - see http://www.boutell.com/newfaq/misc/urllength.html for some details. In short, since IE (at least some versions in use) doesn't support URLs over 2,083 characters, it's probably wise to stay below that length.
If you need to just open it in Safari, and the server doesn't need to be involved, why not use a data: URI?
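A data: URI embeds the image bytes in the URI itself, so no server round trip is needed at all. The shape is (the Base64 payload is truncated here for illustration; a real URI carries the full encoded image):

    data:image/png;base64,iVBORw0KGgoAAAANSUhEUgAA...

Safari will render that directly as an image.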
Sending long URIs over the network is basically never the right thing to do. As you noticed, some web hosts don't support long URIs. Some proxy servers may also choke on long URLs, which means that your app might not work for users who are behind those proxies. If you ever need to port your app to a different browser, other browsers may not support URIs that long.
If you need to get data up to a server, use a POST. Yes, it's an extra round trip, but it will be much more reliable.
Also, if you are uploading data to the server using a GET request, then you are vulnerable to all kinds of cross-site request forgery attacks; basically, an attacker can trick the user into uploading, say, goatse to their account simply by getting them to click on a link (perhaps hidden by TinyURL or another URL shortening service, or just embedded as a link in a web page when they don't look closely at the URL they're clicking on).
You should never use GET for sending data to the server, beyond query parameters that don't actually change anything on the server.
