Unicode URL of website and SEO issue

I am working on a Persian website and want to change the URL structure of its pages to be more SEO friendly, but I don't know whether using Unicode URLs will have a positive effect on the site's SEO or not.
The pages are encoded as UTF-8. When I copy a link location in Firefox and paste it into the address bar, something like this (for example) appears:
http://mysite.com/pages/36161-%D8%B4%DB%8C%D9%85%D9%89.html
Is this OK with search engines and SEO?
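(For reference, that percent-encoded string is just the UTF-8 bytes of the Persian slug, escaped byte by byte; both spellings name the same page. A quick round-trip check, sketched in Python purely for illustration:)

    from urllib.parse import quote, unquote

    # The percent-encoded form Firefox shows on copy/paste...
    encoded = "36161-%D8%B4%DB%8C%D9%85%D9%89.html"
    # ...decodes back to the original UTF-8 Persian text:
    print(unquote(encoded))         # 36161-شیمى.html
    # ...and re-encoding yields the same bytes again:
    print(quote(unquote(encoded)))  # 36161-%D8%B4%DB%8C%D9%85%D9%89.html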

I encountered a similar problem on my site. After a few tests and a long time, I have concluded that Google deals well with these addresses and you have no reason to worry.
In my case the URLs were in Hebrew, and there is not much difference between the two languages as far as Googlebot is concerned.
The major problem I had was with the URLs in the sitemap: they looked really bad, but Google still indexed them.
Will this transition be good for SEO? I guess it will, but don't let friendly URLs distract you: they are only one ranking criterion, and there is no reason to rely on them alone.
You get a +1 for friendly URLs, but that's no reason to forget about the rest of your on-site SEO.
It is very important that you redirect the old URLs to the new ones with 301 redirects, so the old addresses don't return 404 errors that will get you penalized by the search engines.
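For concreteness, a minimal sketch of that redirect mapping (Flask here, with a hypothetical old-to-new table; any server or framework works the same way):

    from flask import Flask, redirect

    app = Flask(__name__)

    # Hypothetical mapping from the old URL structure to the new slugs.
    OLD_TO_NEW = {
        "36161.html": "36161-شیمى.html",
    }

    @app.route("/pages/<name>")
    def pages(name):
        if name in OLD_TO_NEW:
            # 301 tells search engines the page moved permanently, so the
            # old URL's ranking transfers instead of rotting into a 404.
            return redirect(f"/pages/{OLD_TO_NEW[name]}", code=301)
        return f"content for {name}"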

What are URL codes called?

I came across a blog post teaching ways to bypass the cache for web development purposes. My personal favourite is to add /? to the end of a web address in the URL bar.
Are there any more little tricks like that? If so, what are they, and where can I find a cheat sheet?
Appending /? may work for some URLs, but not for all.
It works if the server/site is configured in a way that, for example, http://example.com/foo and http://example.com/foo/? deliver the same document. But this is not the case for all servers/sites, and the defaults can be changed anyway.
There is no name for this. You just manipulate the canonical URL, hoping to craft a URL that points to the same document, without getting redirected.
Other common variants?
I’d expect that appending ? would work even more often than /? (both, of course, only work if the URL has no query component already).
http://example.com/foo
http://example.com/foo?
You’ll also find sites that allow any number of additional slashes where only one slash used to be.
http://example.com/foo/bar
http://example.com/foo////bar
Not sure if it affects the cache, but specifying the domain as an FQDN, by adding a dot after the TLD, would work for many sites, too.
http://example.com/foo
http://example.com./foo
Some sites might not have case-sensitive paths.
http://example.com/foo
http://example.com/fOo
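Collected together, a little Python sketch that generates the variants above (whether any of them actually reaches the same document, or dodges a given cache, depends entirely on the server):

    def variants(url):
        # Alternate spellings that lenient servers *may* treat as the
        # same document; none of these are guaranteed to work.
        scheme, rest = url.split("://", 1)
        host, _, path = rest.partition("/")
        return [
            f"{scheme}://{host}/{path}?",            # appended empty query
            f"{scheme}://{host}/{path}/?",           # trailing slash + empty query
            f"{scheme}://{host}//{path}",            # extra slash in the path
            f"{scheme}://{host}./{path}",            # FQDN: dot after the TLD
            f"{scheme}://{host}/{path.swapcase()}",  # case change, if paths are case-insensitive
        ]

    for v in variants("http://example.com/foo"):
        print(v)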

Bingbot converts Unicode characters to unintelligible symbols

I get a lot of errors from my site when Bing tries to index pages whose URLs contain Unicode characters.
For example, for:
http://www.example.com/kjøp
Bing tries to index:
http://www.example.com/kjÃ¸p
Then I get an error, "System.NullReferenceException: Object reference not set to an instance of an object.", because there is no such controller.
Google works fine with such links. How can I help Bing understand Norwegian letters?
You can confirm that Bing does not index these URLs correctly by doing an inurl: search like this: https://www.bing.com/search?q=inurl%3A%C3%B8
Only 6 pages are indexed, which cannot be correct.
Unfortunately you won't be able to fix Bing. You may, however, be able to compensate for its shortcomings by making some changes to your site. It is a burden you shouldn't have to deal with, but the other option is to do nothing and continue not getting those pages properly indexed.
Bing will likely have issues with URLs containing characters in this list...
https://www.i18nqa.com/debug/utf8-debug.html
Your webserver needs to look for URL requests containing these characters. You will then replace the wrong characters with the correct ones and do a 301 redirect to the correct page. The specifics depend on what kind of server and programming language you are using. In your case it is most likely IIS and MVC, so you would most likely look into Microsoft's URL Rewrite extension. https://www.iis.net/downloads/microsoft/url-rewrite
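As a rough illustration of that approach (sketched in Python/Flask rather than IIS, and assuming the damage is the usual UTF-8-read-as-Windows-1252 corruption catalogued in that chart):

    from flask import Flask, redirect, request

    app = Flask(__name__)

    def demojibake(text):
        # Reverse UTF-8 bytes that were misread as Windows-1252 (the
        # corruption shown in the chart above); if the text doesn't
        # round-trip, it wasn't mojibake, so return it unchanged.
        try:
            return text.encode("windows-1252").decode("utf-8")
        except UnicodeError:
            return text

    @app.before_request
    def redirect_mojibake():
        fixed = demojibake(request.path)
        if fixed != request.path:
            # 301 so Bing replaces the garbled URL with the correct one.
            return redirect(fixed, code=301)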
Before doing this, however, I would see what errors Bing's webmaster tools might provide.
https://www.bing.com/toolbox/webmaster
The other option is to not use those characters in your URLs at all. My recommendation is to take the time to implement the wrong-to-right translation. Bing will eventually fix this, but it could be quite a while.

Rails - search engine indexing of redirect action

I have a multilingual site with the same content in different languages, with descriptive SEO URLs incorporating the title of each page's article. To switch between languages of translated articles, I have an action which looks up the translated title using the previous language and redirects to it. This all works fine, except I noticed that, despite there being no view, Google has indexed said redirect URLs.
Is this bad practice? I don't want to 301 redirect, as it seems having links on every page to 301 redirects is a really bad idea. Do I somehow include a meta tag, or is there some other approach?
The reason I currently have this is that I want each article page to link to all of its translations using flags at the top of each page. The more I think about it, the more I think I should just generate the direct URLs, as this itself may have SEO benefits. The reason I didn't go down this path originally was page rendering speed: I'd have to look up multiple articles solely for their URL slugs, and expire the caches of all languages upon any title change (it's wiki-style user-generated content). Also, in some cases a translation wouldn't exist, in which case I would need to link instead, say, to the category of the article, with a flash message.
So, thinking this through while writing: maybe this is the preferable, if more difficult to implement, solution?
Hey Mark, from a search engine perspective you definitely don't want to rely on redirects everywhere, if for no other reason than performance. Search engines allocate a certain amount of crawl bandwidth to each site based on ranking; if you're redirecting every page, you're eating up more of that bandwidth than you need to, and potentially not getting as much content crawled as you otherwise could.
Your second solution, generating the localized URLs and putting them at the top of the page, is the best option for search engines. That will give a unique URL for each page, and will provide a direct link to each page that Google and Bing (and, by extension, Yahoo) can follow and index.
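For instance (a framework-agnostic Python sketch with a hypothetical slug lookup; in Rails this would live in a view helper): pre-compute each article's translated slugs once, then emit direct links, e.g. as hreflang alternates:

    # Hypothetical per-article lookup, e.g. cached alongside the article.
    TRANSLATED_SLUGS = {
        "en": "my-article",
        "fr": "mon-article",
        "de": "mein-artikel",
    }

    def translation_links(slugs):
        # Direct per-language URLs: one unique, crawlable URL per
        # translation, with no redirect hop in between.
        links = []
        for lang, slug in sorted(slugs.items()):
            url = f"https://example.com/{lang}/articles/{slug}"
            links.append(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
        return "\n".join(links)

    print(translation_links(TRANSLATED_SLUGS))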
I provided a set of best practices for SEO and localized sites in another Stack Overflow Q&A; here's a link, I think you'll find it valuable too: Internationalization and Search Engine Optimization
Good luck!
I'm building an app that supports ten languages, including English, simplified and traditional Chinese, French, Spanish, Russian, Japanese, German, and Hindi.
I tried a number of things, but what I ended up doing was making :en the default, switching based on where the request was coming from, and letting users set a default language when they sign up. So if the request comes from mainland China I use :scn (simplified Chinese), and if it comes from Hong Kong I use :tcn (traditional Chinese).
This way the application maintains the language as state and there is no redirection.
I think any redirection is going to be troublesome, so I wouldn't do that. Also, I am working on a dynamic sitemap that will list all of the links for Google, with ten different translations per 'page'.
I haven't deployed my application yet, so I cannot check the Chinese search engines etc. to see if they are indexing my content.
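A rough sketch of such a sitemap (plain Python, hypothetical URLs), using the sitemap xhtml:link hreflang extension so each entry lists all of its translations:

    PAGES = {
        "about": {
            "en": "https://example.com/en/about",
            "zh-Hans": "https://example.com/scn/about",
            "zh-Hant": "https://example.com/tcn/about",
        },
    }

    def sitemap_xml(pages):
        out = ['<?xml version="1.0" encoding="UTF-8"?>',
               '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"',
               '        xmlns:xhtml="http://www.w3.org/1999/xhtml">']
        for slugs in pages.values():
            for url in slugs.values():
                out.append("  <url>")
                out.append(f"    <loc>{url}</loc>")
                # Every language variant lists every translation, itself included.
                for lang, href in sorted(slugs.items()):
                    out.append(f'    <xhtml:link rel="alternate" '
                               f'hreflang="{lang}" href="{href}"/>')
                out.append("  </url>")
        out.append("</urlset>")
        return "\n".join(out)

    print(sitemap_xml(PAGES))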

What should I know about search engine crawling?

I don't mean SEO things. What should I know? Such as:
Do engines run JavaScript?
Do they use cookies?
Will cookies carry across crawl sessions (say, cookies from today and a crawl next week or next month)?
Are selected JS files not loaded for any reason (such as a suspected ad being skipped for optimization reasons)?
I don't want to accidentally have every indexed page show some kind of error or warning message like 'please turn on cookies' or 'browser not supported', or fail to be indexed because I did something silly such as having my sitemap point to /r?id=5 and then not have it indexed because it is a redirect (I would use a 301, however).
From here: http://www.google.com/support/webmasters/bin/answer.py?answer=35769
Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
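To approximate that text-browser view without installing Lynx, fetch a page with no cookies and no JavaScript and look at the raw HTML; a quick sketch (Python standard library, with an illustrative user-agent string):

    import urllib.request

    req = urllib.request.Request(
        "http://example.com/",
        headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1)"},
    )
    # No cookie jar, no JavaScript execution: the HTML that comes back is
    # roughly what a text browser or a basic spider gets to work with.
    with urllib.request.urlopen(req) as resp:
        print(resp.read(500).decode("utf-8", errors="replace"))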
Read Google's Webmaster guidelines

Is there any disadvantage (in SEO terms) to using a country-specific subdomain over the country's TLD?

I'm developing a site at the moment which requires localization for a number of different countries. We own our site's name on many of the countries' TLDs (though not all of them). From a developer's perspective, many things are simplified if we could simply redirect all traffic for "domainname.co.uk" to "uk.domainname.com" (or "domainname.fr" to "fr.domainname.com"), but my boss is concerned that there may be an adverse SEO impact from doing this.
So, I'm wondering if anyone knows whether there is indeed any SEO impact from doing this. The country-specific content is still there, just served from a country-specific subdomain rather than the country's TLD.
Sorry if this is all a bit confusing! If anyone can offer any help, that would be fantastic.
Many thanks.
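For concreteness, the redirect scheme being described might look like this (a sketch in Python/Flask with hypothetical domains; in production this would usually be a web server rewrite rule):

    from flask import Flask, redirect, request

    app = Flask(__name__)

    CCTLD_TO_SUBDOMAIN = {                 # hypothetical mapping
        "domainname.co.uk": "uk.domainname.com",
        "domainname.fr": "fr.domainname.com",
    }

    @app.before_request
    def forward_cctld():
        target = CCTLD_TO_SUBDOMAIN.get(request.host)
        if target:
            url = f"https://{target}{request.path}"
            if request.query_string:
                url += "?" + request.query_string.decode()
            # 301 so any link equity on the ccTLD is passed along.
            return redirect(url, code=301)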
From an SEO point of view, it is always better to use domainname.com/fr. Why? Because links to domainname.com/uk and domainname.com/fr both contribute to the same domain's PageRank. If you have individual domains, the links are diluted across them.
What Richie says is not right, because you can tell Google the specific geographic target using Google Webmaster Tools.
Here is an example, searching only sites "from Argentina" (.ar TLD), where the top result is a generic .com:
[Screenshot: http://img2.imageshack.us/img2/8862/capturejl.png]
A country-specific search engine like google.co.uk will understand that domainname.co.uk is a UK site, but it won't understand that about uk.domainname.com.
If I select google.co.uk's "pages from the UK" option, I'd expect to see the former but not the latter.
(Edit: Yes, you can configure this for Google and some other search engines, but there's more to SEO than one or two specific search engines.)
