URL Encoded characters in IE 9 - url

I have a problem hitting a URL and getting an HTTP 404 error (no web page found). Some findings led me to conclude it is an encoding issue with one of the query parameters, so I tried to apply URL encoding, i.e. HttpUtility.UrlEncode("<= 1 Week"). This produces "%3c%3d1+Week", but the problem persists. If no UrlEncode is done, the query string is "%3C=1%20Week".
Does anyone know why this is happening?
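For reference, the two outputs come from two different conventions: form encoding (application/x-www-form-urlencoded) turns a space into +, while RFC 3986 percent-encoding uses %20. A minimal sketch of the same distinction, shown in PHP only because that is the language used elsewhere on this page (PHP's urlencode/rawurlencode roughly mirror HttpUtility.UrlEncode and Uri.EscapeDataString):
<?php
// The same value encoded two ways.
$value = "<= 1 Week";

// Form-style encoding: space becomes "+", similar to HttpUtility.UrlEncode.
echo urlencode($value), "\n";    // %3C%3D+1+Week

// RFC 3986 percent-encoding: space becomes "%20", similar to Uri.EscapeDataString.
echo rawurlencode($value), "\n"; // %3C%3D%201%20Week
?>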

Related

400 error code when URL contains % symbol? (NGINX)

How can I prevent the server from returning a 400 error code when the URL contains a % symbol, using NGINX?
Nginx configuration for my website:
....
rewrite ^/download/(.+)$ /download.php?id=$1 last;
....
When I tried to access this URL:
http://mywebsite.net/download/some-string-100%-for-example
I got this error:
400 Bad Request
With this url :
http://mywebsite.net/download/some-string-%25-for-example
it works fine!
It's because it needs to be URL encoded first.
This will explain:
http://www.w3schools.com/tags/ref_urlencode.asp
URLs can only be sent over the Internet using the ASCII character-set.
Since URLs often contain characters outside the ASCII set, the URL has to be converted into a valid ASCII format.
URL encoding replaces unsafe ASCII characters with a "%" followed by two hexadecimal digits.
URLs cannot contain spaces. URL encoding normally replaces a space with a plus (+) sign or with %20.
The URL interpreter gets confused when it sees a % that is not followed by two hexadecimal digits.
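If the download links are generated by your own application, percent-encoding the dynamic part of the path before it goes into the href avoids the bad request entirely. A small PHP sketch, assuming the raw id is plain text:
<?php
// Percent-encode the path segment so "%" becomes "%25" before it appears in a link.
$id   = 'some-string-100%-for-example';
$href = '/download/' . rawurlencode($id);
echo $href; // /download/some-string-100%25-for-example
?>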
Why would you try to solve this by changing the Nginx configuration?
This can't be fixed on the server side; it's a problem on the client side.
https://headteacherofgreenfield.wordpress.com/2016/03/23/100-celebrations/
In that URL the post title is "100% Celebrations!" but the permalink is auto-generated as 100-celebrations. That's because they know that putting "100%" in the URL would cause an encoding problem.
If even WordPress doesn't do it your way, why should you?
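Alternatively, if you control how those links are generated, you can do what WordPress does and strip awkward characters when building the slug in the first place. A rough PHP sketch (slugify is just an illustrative name, not an existing helper):
<?php
// Reduce a title to a URL-safe slug, so characters like "%" never reach the URL.
function slugify($title) {
    $slug = strtolower($title);
    $slug = preg_replace('/[^a-z0-9]+/', '-', $slug); // collapse anything unsafe to "-"
    return trim($slug, '-');
}

echo slugify('100% Celebrations!'); // 100-celebrations
?>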

ERROR STATUS: URL contains outer http

I have a ColdFusion page which accepts parameters from the URL and shows them in fields. My URL looks like this, and it works:
http://www.example.com/test.cfm?activeUrl=www.msn.com&secure=False
But the following is not working. I have added http before www in the activeUrl value:
http://www.example.com/test.cfm?activeUrl=httpwww.msn.com&secure=False
It is giving me the following error: "ERROR STATUS: URL contains outer http"
Can anyone help me solve this problem?
To me, it seems like something related to the IIS configuration.
The string httpwww.msn.com is what's causing the error. It should be http://www.msn.com, and it should also be URL encoded when passed as a parameter.
Oh, I had not checked the application.cfc onRequestStart method. It has some conditions on the query string, which is what is producing this error. :(
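As a side note, when a full URL (scheme and all) genuinely has to be passed as a parameter, percent-encoding the value keeps the :// and any & inside it from being misread. A sketch in PHP purely for illustration, since the page in question is ColdFusion:
<?php
// Build the query string with the URL value percent-encoded.
$params = array(
    'activeUrl' => 'http://www.msn.com',
    'secure'    => 'False',
);
echo 'http://www.example.com/test.cfm?' . http_build_query($params);
// http://www.example.com/test.cfm?activeUrl=http%3A%2F%2Fwww.msn.com&secure=False
?>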

SOAPUI REST url having + in JSON issue

I am using SOAPUI v4.6.4.
I am running a REST test in SoapUI, with the media type set to application/json.
I am trying to pass this value as part of a JSON string and I get a 404 server error:
{"enteredTime":"2001-12-17T09:30:47+05:00"}
It is having issues with the + sign.
I tried encoding it as 2001-12-17T09%3A30%3A47%2B05%3A00. The 404 error is gone, but the application code is NOT expecting the encoded value.
Any solutions are highly appreciated.
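For what it's worth, the underlying behaviour can be reproduced outside SoapUI: in a query string a literal + decodes to a space, so the timezone offset is silently mangled, while a percent-encoded value survives as long as the receiving side actually decodes it. A quick PHP sketch of that round trip (the real fix is for the application to decode the parameter, which the question suggests it currently does not):
<?php
// A raw "+" in a query string decodes to a space, mangling the timezone offset.
echo urldecode('2001-12-17T09:30:47+05:00'), "\n"; // 2001-12-17T09:30:47 05:00

// Percent-encoding the value keeps the "+" intact once the server decodes it.
$encoded = rawurlencode('2001-12-17T09:30:47+05:00');
echo $encoded, "\n";                               // 2001-12-17T09%3A30%3A47%2B05%3A00
echo urldecode($encoded), "\n";                    // 2001-12-17T09:30:47+05:00
?>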

Large number of likes, but now realise they are to an invalid URL

My site at www.kruaklaibaan.com (yes I know it's hideous) currently has 3.7 million likes but while working to build a proper site that doesn't use some flowery phpBB monstrosity I noticed that all those likes are registered against an invalid URL that doesn't actually link back to my site's URL at all. Instead the likes have all been registered against a URL-encoded version:
www.kruaklaibaan.com%2Fviewtopic.php%3Ff%3D42%26t%3D370
This is obviously incorrect. Since I already have so many likes, I was hoping to either get those likes updated to the correct URL or get them to just point to the base URL of www.kruaklaibaan.com
The correct URL they SHOULD have been registered against is (not URL-encoded):
www.kruaklaibaan.com/viewtopic.php?f=42&t=370
Is there someone at Facebook I can discuss this with? 3.7m likes is a little too many to start over with without a lot of heartache. It took 2 years to build those up.
Short of getting someone at Facebook to update the URL, the only option within your control that I could think of that would work is to create a custom 404 error page. I have tested such a page with your URL and the following works.
First you need to set the Apache directive for ErrorDocument (or equivalent in another server).
ErrorDocument 404 /path/to/404.php
This will cause any 404 pages to hit the script, which in turn will do the necessary check and redirect if appropriate.
I tested the following script and it works perfectly.
<?php
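// Redirect only the known URL-encoded permalink; anything else falls through to a plain 404.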
if ($_SERVER['REQUEST_URI'] == '/%2Fviewtopic.php%3Ff%3D42%26t%3D370') {
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: /viewtopic.php?f=42&t=370");
    exit();
} else {
    header('HTTP/1.0 404 Not Found');
}
?><html><body>
<h1>HTTP 404 Not Found</h1>
<?php echo $_SERVER['REQUEST_URI']; ?>
</body></html>
This is a semi-dirty way of achieving this; however, I tried several variations in Apache 2.2 using mod_alias's Redirect and mod_rewrite's RewriteRule, neither of which I was able to get working with a URL containing percent-encoded characters. I suspect that with nginx you may have better success at handling this more gracefully in the server.

Maximum length of URL fragments (hash)

Is there a length limit for the fragment part of a URL (also known as the hash)?
The hash is client-side only, so the rules for HTTP may not apply to it.
It depends on the browser.
I found that in Safari, Chrome, and Firefox, a URL with a long hash is legal, but if the same data is sent to the server as a parameter, the browser will display a 414 or 413 error.
For example:
A URL like http://www.stackoverflow.com/?abc#{hash value with 100 thousand characters} will be OK, and you can use location.hash to get the hash value in JavaScript. But a URL like http://www.stackoverflow.com/?abc&{query with 100 thousand characters} will be rejected: if you paste this link into the address bar, a 413 error code is returned with the message that the client issued a request that was too long. If it is a link in a web page, on my machine, Nginx responds with a 414 error.
I don't know the situation in IE.
So I think the URL length limit only applies to what is transmitted to the HTTP server; the browser checks it sometimes, but not always, and a long value is always allowed as a hash.
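You can see why the fragment never trips a server-side limit by pulling a long URL apart: the fragment is a separate component that the browser keeps to itself, and only the path and query string end up in the request line. A small PHP illustration:
<?php
// The fragment is parsed as its own component and is never sent to the server.
$url   = 'http://www.stackoverflow.com/?abc#' . str_repeat('x', 100000);
$parts = parse_url($url);

echo strlen($parts['fragment']), "\n"; // 100000 -- fine as far as URL syntax goes
echo $parts['query'], "\n";            // abc    -- only this reaches the server
?>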
There is definitely a limit on the length of the whole URL.
Read
RFC2616 - Hypertext Transfer Protocol
Maximum URL length is 2,083 characters in Internet Explorer

Resources