Do web page URLs always have a subdomain?

I'm writing a URL parser. Do URLs I'd type into a web browser always have a subdomain?
I mean, will there always be something where www is in this example, and is there always a '.' between the subdomain and the domain?
http://www.domain.com/path/page.htm
Thanks.

No they don't.
If you go to http://example.com, you will see it doesn't redirect to a www subdomain.
And yes, there will always be a . between the parts of the hierarchy.

There are tons of possibilities; you can find:
http://www.example.com/path/page.html?query=string
http://example.com/path/page.html?query=string
http://sub.www.example.com/path/page.html?query=string
http://user:password@www.example.com/path/page.html?query=string
http://user@www.example.com/path/page.html?query=string
http://www.example.com:4040/path/page.html?query=string
http://user@www.example.com:4040/path/page.html?query=string
etc.
So NO, there is not necessarily a subdomain in the URL.
And there isn't a '.' between the sub-domain and the domain when the sub-domain is empty.
If you are using PHP, you should know that similar parsers already exist, like parse_url: http://fr.php.net/parse_url
And I'm sure there is also a parser in every (major) language.
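Not PHP, but for comparison, here's a minimal sketch with Python's standard urllib.parse showing that the parser is equally happy with or without a subdomain (the URLs are just the examples above):
from urllib.parse import urlparse

for url in ('http://www.example.com/path/page.html?query=string',
            'http://example.com:4040/path/page.html?query=string'):
    parts = urlparse(url)
    # hostname may or may not contain a subdomain; port is None when absent
    print(parts.scheme, parts.hostname, parts.port, parts.path, parts.query)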

Related

Can friendly-id gem work with capital letters in url e.g. /users/joe-blogs and /users/Joe-Blogs both work

I like the friendly_id gem, but one problem I'm seeing is that when I type in a URL with a capital letter in it, such as /users/Joe-Blogs, it can't find the page. It's a little trivial, but most sites can handle something like this and will serve the page whether it has a capital letter or not. Does anyone know a fix for this?
Edit: to clarify, this is for when users enter a URL manually and put capitals in it just because it's a name, like author/Joe-Blogs. I've seen other sites handle this, but Rails seems to just give a 404.
friendly_id uses parameterize to create the slugs.
I think the best way to solve your problem is to parameterize the param before using it in the find.
# controller
User.find(params[:id].parameterize)
Or parameterize the url where the link originated from.
As an addition to Vic's answer, you'll want to look at URL normalization:
The following normalizations are described in RFC 3986 to result in equivalent URLs:
Converting the scheme and host to lower case.
The scheme and host components of the URL are case-insensitive. Most normalizers will convert them to lowercase.
Example: HTTP://www.Example.com/ → http://www.example.com/
In short, it's against convention to use capitalization in your URLs.
You may also wish to look at URI#normalize; more importantly, you should work to remove the capitalization from your URLs:
URI.parse(params[:id]).normalize
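If it helps to see what that normalization rule amounts to, here is a minimal sketch in Python (not Rails; purely illustrative of which parts RFC 3986 lets you lowercase):
from urllib.parse import urlsplit, urlunsplit

def normalize_case(url):
    # RFC 3986: the scheme and host are case-insensitive, the path is not
    parts = urlsplit(url)
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path, parts.query, parts.fragment))

print(normalize_case('HTTP://www.Example.com/users/Joe-Blogs'))
# -> http://www.example.com/users/Joe-Blogs  (the path keeps its capital letters)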

Can I redirect www.abc.com/a b c/test.html to www.abc.com/a-b-c/test.html with MVC3

I am changing the way links show on my web site. I changed from allowing spaces in the URL to a new format where the URL has dashes where the spaces used to be.
This affects only ONE string in the middle of the URL.
Google has indexed many of my pages with the old spaces in the URL, but now they show up as 404s. Is it possible for me to put some code in place (temporarily) that can redirect those URLs with spaces to the ones with dashes? I think it's a 301 redirect, i.e. a permanent redirect.
Thanks,
We went through the same thing recently. We ended up creating a LegacyController, which basically called into RedirectToActionPermanent or RedirectToRoutePermanent (HTTP 301 - Moved Permanently).
Ideally, you should let IIS7 do the redirects, but we couldn't, because we needed to call our DB in order to figure out where to go.
If your redirect is as simple as you say it is (e.g no "dynamic" info in the URL), then you should use IIS.
Why don't you try to configure your routing to support both legacy and new routes?
Basically, /a b c/page and /a-b-c/page should be mapped to the same controller action.
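Not MVC3-specific, but the shape of the fix is the same in any stack; here is a minimal, purely illustrative Python (WSGI) sketch that 301s any space-containing path to its dashed equivalent (in MVC3 the equivalent would be a catch-all legacy route whose action calls RedirectToRoutePermanent):
def redirect_spaces(app):
    # WSGI middleware: permanently redirect /a b c/test.html to /a-b-c/test.html
    def middleware(environ, start_response):
        path = environ.get('PATH_INFO', '')
        if ' ' in path:
            location = path.replace(' ', '-')
            start_response('301 Moved Permanently', [('Location', location)])
            return [b'']
        return app(environ, start_response)
    return middleware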

Can I create a clean URL using WebBroker and Delphi?

Can I create a clean URL for WebBroker webpages/applications ?
A typical WebBroker URL normally looks like:
hxxp://www.mywebsite.com/myapp.dll?name=fred
or
hxxp://www.mywebsite.com/myapp.dll/names/fred
What I would prefer is:
hxxp://www.mywebsite.com/names/fred
Any idea how I can achieve this with Delphi/WebBroker ? (ISAPI/Apache)
The typical way of doing this is to use Apache's mod_rewrite to map the clean URL onto the URL with parameters. Many, many applications do this to create 'human readable' and more search engine friendly URLs.
For example, you might add this rule to make action=sales&y=2009 look like sales-2009.htm:
RewriteRule ^sales-2009\.htm$ index.php?action=sales&y=2009 [L]
When the user goes to 'sales-2009.htm', the request is actually rewritten internally to the PHP page with the appropriate parameters. To the end user, though, it still displays sales-2009.htm in the browser's URL bar.
You can, of course, use regular expressions with mod_rewrite, so the rewriting can be made much more flexible. You could, for example, make a single expression in the above example that would map any year to the correct parameter.
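For instance, a single rule that handles any year, and one that would give the question's WebBroker app its clean /names/fred URL (index.php and myapp.dll are just the paths already used in this thread):
# Capture any four-digit year and pass it through as the y parameter
RewriteRule ^sales-([0-9]{4})\.htm$ index.php?action=sales&y=$1 [L]
# Map the clean URL onto the DLL's path-info style URL
RewriteRule ^names/(.+)$ /myapp.dll/names/$1 [L]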

Uppercase and lowercase urls in PHP

I have created folders in my root, for example http://www.zipholidays.co.uk/Cuba or http://www.zipholidays.co.uk/Florida.
When I type http://www.zipholidays.co.uk/cuba (Cuba in lowercase), it shows page not found.
I'm using an Apache server. People are linking to pages with lowercase, uppercase, mixed case, whatever. What do I do to make the pages case insensitive?
mod_speling, perhaps? (Yes, it really is spelled with a single 'l'; its CheckSpelling directive also corrects case mismatches.)
If you make your pages case insensitive, you'll have some duplicate content problems as you will have two pages with the same content.
A good solution would be to do a 301 redirect from every 404 page when the lowercase equivalent exists.
For example, in your default 404 page, you put:
<?php
// PATH_TO_YOUR_APPLICATION is a placeholder for the filesystem path to your document root
$lower = strtolower($_SERVER['REQUEST_URI']);
if (file_exists(PATH_TO_YOUR_APPLICATION . $lower)) {
    header('Location: ' . $lower, true, 301);
    die();
}
?>
So when you load a 404 page, if the same URL in lowercase exists, you redirect there. Otherwise you can display your own missing page content.
I wouldn’t make my URLs case insensitive. Instead I would follow a strict guideline for creating such URLs. I would for example only use lowercase URL paths and redirect requests with URL paths with uppercase letters to the lowercase variant.
You can even do that with mod_rewrite (it requires a RewriteMap, defined in the server or virtual host config, pointing at the internal tolower function):
RewriteMap tolower int:tolower
RewriteCond ${tolower:%{REQUEST_URI}} .+
RewriteRule ^[^A-Z]*[A-Z] %0 [L,R=301]
Maybe you can replace Linux with Windows as the server OS.
Apache on Linux is case sensitive in file paths, and therefore also in the URIs. The Windows file system isn't case sensitive, so it doesn't matter there.
You should not do that, because Google will think that you have duplicate pages on your site and may penalize you for it. That's a basic SEO rule.

Redirect 301 with hash part (anchor) #

One of our websites has a URL like this: example.oursite.com. We decided to move our site to a URL like this: www.oursite.com/example. To do this, we wrote a rewrite rule on our Apache server that redirects to our new URL with a 301 code.
Many websites link to us with URLs of the form example.oursite.com/#id=23. The problem is that the redirection erases the hash part of the URL in IE. As far as I know, the hash part is never sent to the server.
I wanted to implement the redirection with JavaScript to keep the hash part, but then the search engines would not be aware that our URL changed (no 301 code returned).
I want the search engines to be notified of our new URL (301) because we need to transfer the page rank to our new URL.
Is there a way to redirect with a 301 code and keep the hash part (#id=23) in the URL?
Search engines do in fact care about hash tags; they frequently use them to highlight specific content on a page.
To the question, however: anchor locations are unfortunately not sent to the server as part of the HTTP request. If you want to redirect a user, you will need to do this in JavaScript on the client side.
Good article: http://web.archive.org/web/20090508005814/http://www.mikeduncan.com/named-anchors-are-not-sent/
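You can see that split for yourself; a quick, purely illustrative sketch with Python's urllib.parse (any HTTP client behaves the same way):
from urllib.parse import urlsplit

parts = urlsplit('http://example.oursite.com/#id=23')
print(parts.path)      # '/'     -> only this (plus any query string) goes into the HTTP request
print(parts.fragment)  # 'id=23' -> kept by the browser, never sent to the server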
Seeing as the server will never see the # (ruling out 301 Redirects) and Google has deprecated their AJAX Crawling scheme, it seems that a front-end solution is the only way!
How I did it:
(function() {
    var redirects = [
        ['#!/about', '/about'],
        ['#!/contact', '/contact'],
        ['#!/page-x', '/pageX']
    ];
    for (var i = 0; i < redirects.length; i++) {
        if (window.location.hash == redirects[i][0]) {
            window.location.replace(redirects[i][1]);
        }
    }
})();
I'm assuming that because Google crawlers do indeed execute Javascript, the new pages will be indexed properly.
I've put it in a <script> tag directly underneath the <title> tag, so that it get executed before any other JS/CSS. Note that this script should only be required for your index file.
I am fairly certain that the hash/page anchor/bookmark part of a URL is not indexed by search engines, and therefore has no effect on your page ranking. Doing a google search for "inurl:#" returns zero documents, so that backs up my assumption. Links from external sites will be indexed without the hash.
You are right in that the hash part isn't sent to the server, so as far as I am aware, there isn't a good way to be able to create a redirection url with the hash in it.
Because of this, it's up to the browser to correctly manage the hash during a redirect. Firefox 3.5 appears to do this successfully. If you append a hash to a URL that has a known redirect, you will see the URL change in the address bar to the new location, but the hash stays on there successfully.
Edit: In response to the comment below, if there isn't a hash sign in the external URL for the part you need, then it is entirely possible to rewrite the URL. An Apache rewrite rule would take care of it:
RewriteCond %{HTTP_HOST} !^www\.oursite\.com [NC]
RewriteCond %{HTTP_HOST} !^$
RewriteRule ^/(.*) http://www.oursite.com/exemple/$1 [L,R=301]
If you're not using Apache, then you'll have to look into the server docs for something similar.
Google has a special syntax for AJAX applications that is based on hash URLs: http://code.google.com/web/ajaxcrawling/docs/getting-started.html
You could create a page on the old address that catches all requests and redirects to the new site with the correct address and code.
I did something like that, but it was in asp.net, which I guess it's not the language you use. Anyway there should be a way to do this in any language.
When returning status 301, your server is supposed to return a 'Location:' header which points to the new location. In practice, the way this is implemented varies; some servers provide the full URL (netloc and path), some just provide the new path and expect the browser to look for that path on the original netloc. It sounds like your rewrite rule is stripping the path.
An easy way to see what the returned Location header is, in a Python 2 shell (the module is http.client in Python 3):
>>> import httplib
>>> conn = httplib.HTTPConnection('exemple.oursite.com')
>>> conn.request('HEAD', '/')
>>> res = conn.getresponse()
>>> print res.getheader('location')
I'm afraid I don't know enough about mod_rewrite to tell you how to do the rewrite rule correctly, but this should give you an idea of what your server is actually telling clients to do.
The search bots don't care about hash tags. And if you are using them for some kind of Flash or AJAX navigation, you have more serious problems than your 301 redirects not working: unless you have the content in an alternate form, the search engines are not indexing your site and you are definitely suffering as far as SEO goes.
I registered my account so I can't edit.
zombat: I'm sorry, I made a mistake in my comment. The link to our video is exemple.oursite.com/#video_id=233. In this case, my rewrite rule in Apache doesn't work.
Nick Berardi: We changed the way our links work. We don't use # anymore, only for backward compatibility.
