How to set different languages for different spiders on a website?

I have a multilanguage website. Currently, the website language is chosen according to the web browser's language.
Is there any way to set the language according to the search engine spider instead? For example:
Display the website in Chinese for the Baidu spider,
Display the website in Russian for the Yandex spider?

This is called crawler identification. When a request is made to your website, the User-Agent header contains information about the browser or crawler making it.
The value of this field differs from crawler to crawler, so you can associate different values with different languages. You can also take a look at the large list of user agents.
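As a minimal sketch of the idea (the token-to-language mapping below is illustrative, not exhaustive; real detection should rely on a maintained user-agent list):

```python
# Sketch: choose a language based on crawler tokens in the User-Agent header.
# The mapping is illustrative; a real site would use a maintained UA list.
CRAWLER_LANGUAGES = {
    "Baiduspider": "zh",  # Baidu's crawler
    "YandexBot": "ru",    # Yandex's crawler
}
DEFAULT_LANGUAGE = "en"

def language_for_user_agent(user_agent: str) -> str:
    """Return a language code for known crawler tokens, else the default."""
    for token, lang in CRAWLER_LANGUAGES.items():
        if token in user_agent:
            return lang
    return DEFAULT_LANGUAGE

# Example: Baidu's documented user-agent string
ua = "Mozilla/5.0 (compatible; Baiduspider/2.0; +http://www.baidu.com/search/spider.html)"
print(language_for_user_agent(ua))  # -> "zh"
```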
I'm still pretty sure that by doing this you'll lower your rank in search engines, since providing different responses to crawlers than to real users is known as cloaking, but I don't have solid references to support this statement.
In any case, crawlers are expected to gather resources in different languages, and they know how to deal with multilingual websites, except maybe the ones which try to follow every worst practice. Also, the search engines you quoted are not limited to one language. Yandex is available in Turkish, for example. As for Baidu, according to Wikipedia, it serves China, Japan, Thailand, Egypt and India.

Related

Good or Bad for SEO: Keeping URLs in English for a non-English website?

I'm planning to release a community website whose primary audience is not English speaking. This means that URLs pointing to /profile, /forums, and so on will be in English rather than the audience's native language. I'm not concerned about users accessing different URL paths in English while using the website, but I am wondering: if I were to use non-English URLs, would a search engine pick up pages on the website better or worse?
Anyone care to share their opinions?
In my opinion, it would be better to have URLs that reflect the primary language of your users, as it would make it easier for them to find your website on search engines (supposing they search using their primary language). From an SEO perspective, if possible, try to also include in your URLs the relevant search terms you think your audience would use. If you have a forum, for example, include the full thread title in thread URLs if possible, and so on.
Sources: my own experience with building and managing powershell.it and sqlserver.it, two of the most important Italian technology-related communities.
The best place to start on this issue would be Google's Webmaster Central section on Internationalization.
If you will have versions of the same URL in multiple languages, you can connect them using the rel="alternate" hreflang mechanism, which is explained on Google's Webmaster Tools page.
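As a rough sketch of what the generated markup looks like, here is a hypothetical helper that emits the alternate links for each language version of a page (the domain and language codes are made up for illustration):

```python
# Sketch: emit rel="alternate" hreflang links that connect the language
# versions of one page. Domain and language codes are illustrative.
def hreflang_links(base_url: str, path: str, languages: list[str]) -> str:
    return "\n".join(
        f'<link rel="alternate" hreflang="{lang}" '
        f'href="{base_url}/{lang}{path}" />'
        for lang in languages
    )

print(hreflang_links("http://example.com", "/profile", ["en", "it", "el"]))
# <link rel="alternate" hreflang="en" href="http://example.com/en/profile" />
# <link rel="alternate" hreflang="it" href="http://example.com/it/profile" />
# <link rel="alternate" hreflang="el" href="http://example.com/el/profile" />
```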
1. Summary
Using non-English URLs for non-English websites is fine.
2. Argumentation
Google Senior Webmaster Trends Analyst John Mueller said in a recent SEO snippets video that using non-English URLs for non-English websites is fine and that Google is able to crawl, index and rank them.
This includes non-Latin characters in your URLs. John Mueller said “as long as URLs are valid and unique, that’s fine.” He added, “So to sum it up, yes, non-English words and URLs are fine, and we recommend using them for non-English websites.”
Read full article here.
3. Disclaimer
The information in this answer was current as of March 2018 and may be obsolete in the future.

Rails - search engine indexing of redirect action

I have a multilingual site with the same content in different languages, with descriptive SEO URLs incorporating the title of each page's article. To switch between languages of translated articles, I have an action which looks up the translated title using the previous language and redirects to it. This all works fine, except I noticed that, despite there being no view, Google has indexed said redirect URLs.
Is this bad practice? I don't want to 301 redirect, as it seems having links on every page to 301 redirects is a really bad idea. Do I somehow include a meta tag, or is there some other approach?
The reason I currently have this is that I want each article page to link to all of its translations using flags at the top of each page. The more I think about it, the more I should just generate the direct URL, as this itself may have SEO benefits. The reason I didn't go down this path originally was page rendering speed: I'd have to look up multiple articles solely for their URL slug and expire the caches of all languages upon any title change (it's wiki-style user-generated content). Also, in some cases a translation wouldn't exist, in which case I would need to link instead to, say, the category of the article with a flash message.
So, thinking this through while writing, maybe this is the preferable, if more difficult to implement, solution?
Hey Mark, from a search engine perspective you definitely don't want to rely on redirects everywhere, if for nothing other than performance. Search engines allocate a certain amount of crawl bandwidth to each site based on ranking; if you're redirecting every page, you're eating up more of that bandwidth than you need to, and potentially not getting as much content crawled as you could otherwise.
Your second solution, generating the localized URLs and putting them at the top of the page, is the best option for search engines. That gives a unique URL for each page and provides a direct link that Google and Bing (and, by extension, Yahoo) can follow and index.
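To make that concrete, here is a language-agnostic sketch (in Python rather than Rails, since the actual models aren't shown; the slug lookup and the category fallback URL are assumptions) of building direct translation links instead of routing flags through a redirect action:

```python
# Sketch: map each language to a direct, crawlable URL for an article.
# `translated_slugs` stands in for whatever slug lookup the app's models
# provide; it is an assumption, not a real Rails API.
def translation_links(article_id: int,
                      translated_slugs: dict[str, str | None]) -> dict[str, str]:
    links = {}
    for lang, slug in translated_slugs.items():
        if slug:   # translation exists: link straight to its descriptive URL
            links[lang] = f"/{lang}/articles/{slug}"
        else:      # no translation: fall back to the article's category page
            links[lang] = f"/{lang}/categories/{article_id}"
    return links

print(translation_links(42, {"en": "my-article", "fr": "mon-article", "de": None}))
# {'en': '/en/articles/my-article', 'fr': '/fr/articles/mon-article',
#  'de': '/de/categories/42'}
```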
I provided a set of best practices for SEO and localized sites in another Stack Overflow Q&A; here's a link, I think you'll find it valuable too: Internationalization and Search Engine Optimization
Good luck!
I have an app that I'm building that supports ten languages, including English, simplified and traditional Chinese, French, Spanish, Russian, Japanese, German, and Hindi.
I tried a number of things, but what I ended up doing was making :en the default and then switching based on where the request was coming from; when users sign up they can set a default language. So if the request comes from mainland China I use :scn (simplified Chinese), and if it comes from Hong Kong I use :tcn (traditional Chinese).
This way the application maintains a state of a language and there is no redirection.
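A stripped-down sketch of that selection logic, with a made-up region lookup (a real app would derive the region from GeoIP or similar):

```python
# Sketch: pick a locale from the request's region, letting a signed-up
# user's saved preference win. Region codes here are illustrative.
REGION_LOCALES = {
    "CN": "scn",  # mainland China -> simplified Chinese
    "HK": "tcn",  # Hong Kong -> traditional Chinese
}

def pick_locale(region: str, user_locale: str | None = None) -> str:
    if user_locale:                          # user's own setting always wins
        return user_locale
    return REGION_LOCALES.get(region, "en")  # :en is the default

print(pick_locale("CN"))        # -> "scn"
print(pick_locale("HK", "en"))  # -> "en" (user override)
print(pick_locale("FR"))        # -> "en" (default)
```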
I think any redirection is going to be troublesome, so I wouldn't do that. Also, I am working on a dynamic sitemap that will list all of the links for Google, which will have ten different translations per 'page'.
I haven't deployed my application yet, so I cannot check the Chinese search engines etc. to see if they are indexing my content.

Is there any disadvantage (in SEO terms) to using a country-specific subdomain over the country's TLD?

I'm developing a site at the moment which requires localization for a number of different countries. We own our site's name on many of the countries' TLDs (though not all of them). From a developer's perspective, many things are simplified if we could simply redirect all traffic for "domainname.co.uk" to "uk.domainname.com" (or "domainname.fr" to "fr.domainname.com") — but my boss is concerned that there may be an adverse SEO impact from doing this.
So, I'm wondering if anyone knows if there is indeed any SEO impact from doing this. The country-specific content is still there, just served from a country-specific subdomain rather than the TLD.
Sorry if this is all a bit confusing! If anyone can offer any help, that would be fantastic.
Many thanks.
From the SEO point of view, it is always better to use domainname.com/fr. Why? Because links to domainname.com/uk and domainname.com/fr all contribute to the same domain's PageRank. If you have individual domains, the links are diluted between them.
What Richie says is not right, because you can tell Google the specific geo target using Google Webmaster Tools.
Here is an example, searching only sites "from Argentina" (.ar TLD), where the top result is a generic .com (screenshot: http://img2.imageshack.us/img2/8862/capturejl.png).
A country-specific search engine like google.co.uk will understand that domainname.co.uk is a UK site, but it won't understand that about uk.domainname.com.
If I select google.co.uk's "pages from the UK" option, I'd expect to see the former but not the latter.
(Edit: Yes, you can configure this for Google and some other search engines, but there's more to SEO than one or two specific search engines.)

URL format for an internationalized web app?

Scenario
The web server gets a request for http://domain.com/folder/page. The Accept-Language header tells us the user prefers Greek, with the language code el. That's good, since we have a Greek version of the page.
Now we could do one of the following with the URL:
Return a Greek version keeping the current URL: http://domain.com/folder/page
Redirect to http://domain.com/folder/page/el
Redirect to http://domain.com/el/folder/page
Redirect to http://el.domain.com/folder/page
Redirect to http://domain.com/folder/page?hl=el
...other alternatives?
Which one is best? Pros, cons from a user perspective? developer perspective?
I would not go for option 1 if your pages are publicly available, i.e. you are not required to log in to view them.
The reason is that a search engine will not index the different language versions of the page, since they all share the same URL.
The same reason goes against option 5: a search engine is less likely to identify two pages as separate if the language identification is in the query string.
Let's look at option 4, placing the language in the host name. I would use that option if the different language versions of the site contain completely different content. On a site like Wikipedia, for example, the Greek version contains its own complete set of articles, and the English version contains another set.
So if you don't have completely different content (which it doesn't seem like from your post), you are left with option 2 or 3. I don't know if there are any compelling arguments for one over the other, but no. 3 looks nicer in my eyes, so that is what I would use.
But just a comment for inspiration: I'm currently working on a web application that has three major parts, one public and two for two different user types. I have chosen the following URL scheme (with en referring to the language, of course):
http://www.example.com/en/x/y/z for the public part.
http://www.example.com/part1/en/x/y/z for the one private part
http://www.example.com/part2/en/x/y/z for the other private part.
The reason for this is that if I were to split the three parts up into separate applications, it would be a simple reconfiguration in the web server, since the name of the part is at the top of the path, e.g. if we were to use a commercial CMS system for the public part of the site.
Edit:
Another argument against option 1 is that if you ONLY listen to Accept-Language, you are not giving the user a choice. The user may not know how to change the language set up in a browser, or may be using a friend's computer set up in a different language. You should at least give the user a choice (storing it in a cookie or the user's profile).
I'd choose number 3, redirect to http://example.com/el/folder/page, because:
Language selection is more important than page selection, so the selected language should come first in a truly human-readable URL.
Only one domain gets all of Google's PageRank. That's good for SEO.
You could advertise your site locally with the language code built in. E.g. in Greece you would advertise http://example.com/el/, so every local visitor lands on the Greek version and avoids language-choosing frustration.
Alternatively, you can go for number 5: it is fine for Google and friends, but not as nice for a user.
Also, we should refrain from redirecting a user anywhere unless required. Thus, in my mind, a user opening http://example.com/folder/page should not get a redirect, but a page in the default language.
Number four is the best option, because it specifies the language code early in the URL. If you are going to provide any redirects, always be sure to use a canonical link tag.
Pick option 5, and I don't believe it is bad for SEO.
This option is good because it shows that the content for say:
http://domain.com/about/corporate/locations is the same as the content in
http://domain.com/about/corporate/locations?hl=el except that the language differs.
The hl parameter should override the Accept-Language header so that the user can easily control the matter; the header would only be used when the hl parameter is missing (see the sketch at the end of this answer). Granted, linking is a little complicated by this. It should probably be addressed either through a cookie which keeps the redirection going to the language chosen by the hl parameter (as it may have been changed by the user from the Accept-Language setting), or by having all the links on the page processed to add the current hl parameter.
The SEO issues can be addressed by creating index files for everything, like Stack Overflow does; these could include multiple sets of indices for the different languages, hopefully encouraging the pages to show up in results for the non-default language.
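A small sketch of that precedence (the argument names are hypothetical; in a real app they would come from the query string, cookies, and headers of the request):

```python
# Sketch: resolve the page language with the precedence described above:
# explicit ?hl= parameter, then a remembered cookie, then Accept-Language,
# then the site default.
DEFAULT_LANG = "en"

def resolve_language(hl_param: str | None,
                     cookie_lang: str | None,
                     accept_language: str | None) -> str:
    if hl_param:          # explicit user choice in the URL wins
        return hl_param
    if cookie_lang:       # a previous choice remembered in a cookie
        return cookie_lang
    if accept_language:   # fall back to the browser preference,
        # taking the first tag: "el,en;q=0.8" -> "el"
        return accept_language.split(",")[0].split(";")[0].strip()
    return DEFAULT_LANG

print(resolve_language(None, None, "el,en;q=0.8"))  # -> "el"
print(resolve_language("en", "el", None))           # -> "en"
```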
The use of 1 takes away the differentiator in the URL. The use of 2 and 3 suggests that the page is different, possibly beyond just language, like Wikipedia's. And the use of 4 suggests that the server itself is separate, perhaps even geographically.
Because there is a surprisingly poor correlation of geographic location to language preferences, the issue of providing geographically close servers should be left to a proper CDN setup.
My own choice is #3: http://domain.com/el/folder/page. It seems to be the most popular out there on the web. All the other alternatives have problems:
http://domain.com/folder/page --- Bad for SEO?
http://domain.com/folder/page/el --- Doesn't work for pages with parameters. This looks weird: ...page?par1=x&par2=y/el
http://domain.com/el/folder/page --- Looks good!
http://el.domain.com/folder/page --- More work needed to deploy since it requires adding subdomains.
http://domain.com/folder/page?hl=el --- Bad for SEO?
It depends. I would choose number four personally, but many successful companies have different ways of achieving this.
Wikipedia uses subdomains for various languages (el.wikipedia.org).
So does Yahoo (es.yahoo.com for Spanish), although it doesn't support Greek.
So does Gravatar (el.gravatar.com)
Google uses a /intl/el/ directory.
Apple uses a /gr/ directory (albeit in English and limited to an iPhone page)
It's really up to you. What do you think your customers will like the most?
None of them. A 'normal user' wouldn't understand (and so wouldn't remember) any of those abbreviations.
In order of preference I'd suggest:
http://www.domain.gr/folder/page
http://www.domain.com/
http://domain.com/gr/folder/page
3 or 4.
3: Can be dealt with easily using htaccess/mod_rewrite. The only downside is that you'd have to write some method of automatically injecting the language code as the first segment of the URI (sketched below).
4: Probably the best method. Using host headers, everything can be sent to the same web application/content, and you can then use code to extract the language code and go from there.
Simples. ;)
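A framework-agnostic sketch of option 3's URI handling, peeling the language code off the first path segment (the supported-language set is made up; in Apache the same split is typically done with a mod_rewrite rule):

```python
# Sketch: extract the language code from the first URI segment, falling
# back to a default when the segment isn't a supported language.
SUPPORTED = {"en", "el", "fr"}
DEFAULT = "en"

def split_language(path: str) -> tuple[str, str]:
    """'/el/folder/page' -> ('el', '/folder/page')."""
    first, _, rest = path.lstrip("/").partition("/")
    if first in SUPPORTED:
        return first, "/" + rest
    return DEFAULT, path

print(split_language("/el/folder/page"))  # -> ('el', '/folder/page')
print(split_language("/folder/page"))     # -> ('en', '/folder/page')
```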
I prefer 3 or 4

Is there a search engine, including an indexing bot, which can be used to build a custom catalogue by feeding the bot certain properties?

Our application (C#/.NET) needs to run a lot of search queries. Google's limit of 50,000 queries per day is not enough. We need something that would crawl websites according to specific rules we set (for example, country domains) and gather URLs, text, keywords, and website names to create our own internal catalogue, so we wouldn't be limited by any massive external search engine like Google or Yahoo.
Is there any free open source solution we could use to install it on our server?
No point in re-inventing the wheel.
DataparkSearch might be the one you need. Or review this list of other Open Source Search Engines.
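For evaluating such tools against your rules, a deliberately minimal, standard-library-only sketch of the rule-based crawl (domain filter, URL and title gathering) might look like this; a production crawler would also need robots.txt handling, rate limiting, and persistence:

```python
# Minimal sketch of a rule-based crawler: follows links only on allowed
# country domains and records URL -> page title. Standard library only;
# real use needs robots.txt checks, throttling, and better error handling.
import re
import urllib.request
from collections import deque
from urllib.parse import urljoin, urlparse

ALLOWED_SUFFIXES = (".de", ".fr")  # example rule: specific country domains

def crawl(seed: str, limit: int = 50) -> dict[str, str]:
    catalogue, queue, seen = {}, deque([seed]), {seed}
    while queue and len(catalogue) < limit:
        url = queue.popleft()
        host = urlparse(url).hostname or ""
        if not host.endswith(ALLOWED_SUFFIXES):
            continue  # skip domains outside the configured rules
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except Exception:
            continue  # unreachable or non-text page: move on
        title = re.search(r"<title[^>]*>(.*?)</title>", html, re.S | re.I)
        catalogue[url] = title.group(1).strip() if title else ""
        for href in re.findall(r'href="(http[^"]+)"', html):
            link = urljoin(url, href)
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return catalogue
```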
