Possible duplicate: IP to Country?
I would like to know how I can detect the language and country of a visitor to my website. It is my understanding that it is best to get this information from the IP address, but I don't know how to do that. Is this the method that large e-commerce sites use? If there is a database of location based on IP, will it remain valid for a while, or does that sort of information change rapidly?
It looks like using the IP address is neither the ultimate nor the best way to detect at least the language, and also the country, of a visitor.
As for the language, a good and effective approach would be to use the HTTP Accept-Language header to get the language of the user's browser or OS, and then adapt the language of the site to that information.
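For illustration, here is a minimal PHP sketch of reading that header; it only takes the first (most preferred) entry, whereas real negotiation would also weigh the q-values:

// Minimal sketch: take the first (most preferred) entry from Accept-Language,
// e.g. "fr-FR,fr;q=0.9,en;q=0.8" becomes "fr".
$header  = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : 'en';
$entries = explode(',', $header);                       // ordered list of language entries
$lang    = strtolower(substr(trim($entries[0]), 0, 2)); // keep the two-letter code
// $lang now holds something like "fr" and can drive which translation you serve.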
As MCannon mentions, it is in any case always good to let the user change the language (as an example, my OS and browser are not in English, but for some sites I'd still rather read, and thus have, the English version of the site).
As for the location of the user, I ran into the same problem myself and lethalMango recommended a very good website and API to me, which I am recommending to you as well: check out the site IP Info, where you can find a wonderful database and access it through their API.
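To make the pattern concrete, a lookup in PHP might look roughly like the sketch below; the endpoint URL and response fields are placeholders, not the actual API of that site, so check their documentation for the real ones:

// Illustrative only: the endpoint and field names are placeholders,
// not the real API of any particular provider.
$ip   = $_SERVER['REMOTE_ADDR'];
$json = @file_get_contents('https://geoip.example.com/' . $ip . '/json');
if ($json !== false) {
    $info    = json_decode($json, true);
    $country = isset($info['country']) ? $info['country'] : null; // e.g. "FR"
}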
Hope it helps!
I'm currently working on a site that shows results which 'pop up'. Everything stays on the same page, which means results can't be shared via URL, since the URLs are all the same. Check out the current beta to understand what I mean: www.wheretrip.com/prelaunch. We use Node.js with sockets for the backend and JavaScript for most of the frontend.
So if people want to share the trip they found to Spain on a certain date, for example, that can't be shared.
Does someone know a way this problem could be solved?
Kind regards,
Sando
What is the best way to determine the language of Twitter posts?
There is the language parameter that comes with the streaming API but it doesn't really seem to be very accurate. Even many Japanese posts are labelled as English.
What have others done to sort out the languages?
I've had very good results with this PHP package:
http://pear.php.net/package/Text_LanguageDetect/
It is fast and open source. We use it to select English only posts for a site we run at http://2012twit.com.
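For reference, basic usage looks roughly like this; it is a sketch based on the package's documented API, so double-check the method names against the PEAR docs:

// Sketch of Text_LanguageDetect usage (verify against the PEAR documentation).
require_once 'Text/LanguageDetect.php';

$detector = new Text_LanguageDetect();
$text     = 'This is a sample tweet written in English.';
$language = $detector->detectSimple($text); // e.g. "english", or null if unsure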
Google has language detection within their Translate API, if using evil external services is a go-er:
http://code.google.com/apis/language/translate/v1/reference.html#detectResult
I am wondering whether there is any (programming) way to prevent any search engine from indexing the content of a website.
You can specify this in robots.txt:
User-agent: *
Disallow: /
As the other answers already say, Robots.txt is the standard that every proper search engine adheres to. This should be enough in most cases.
If you really want try to programmatically block malicious bots that do not listen to robots.txt, check out this question I asked a few months ago on how to tell bots apart from human visitors. You may find some good starting points there.
Create a robots.txt file for your site. For more info - see this link.
Most search engine bots identify themselves using a unique user agent.
You can block specific user agents using robots.txt
Here is a list of some user agents.
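For example, to disallow one specific crawler (the bot name here is just a placeholder; use the user-agent string from such a list):

User-agent: ExampleBot
Disallow: /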
Since you did not mention a programming language, I'll give my input from a PHP perspective: there is a WordPress plugin called Bad Behavior which does exactly what you are looking for. It is configurable via a code script listing an array of user-agent strings; based on what is crawling your site, the plugin checks the user-agent string and ID (or IP address) against that array and, if there is a match, either rejects or accepts the agent.
It might be worth your while to have a peek at the code to see how it is done from a programmer's perspective.
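Stripped down, the idea is something like the sketch below; this is only an illustration of the approach, not the plugin's actual code, and the agent names are placeholders:

// Illustration of the idea only -- not Bad Behavior's real code.
$blockedAgents = array('badbot', 'evilcrawler');   // placeholder user-agent fragments
$userAgent     = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';

foreach ($blockedAgents as $agent) {
    if (strpos($userAgent, $agent) !== false) {    // known bad agent detected
        header('HTTP/1.1 403 Forbidden');
        exit('Access denied.');
    }
}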
If you are using a language other than PHP and this does not satisfy what you are looking for, then I apologize for posting this answer.
Hope this helps,
Best regards,
Tom.
I'm wondering if there is an easy way to look up a user's local time zone in Rails using only an IP address. I don't want users to have to input their time zone themselves. Do I have to use JavaScript or is there a different way?
The MaxMind GeoLite City (IP-to-city) database seems to support time zones, and there's a FAQ on their site referring to this. You could do a two-step process, IP to location and then location to time zone, using the GeoLite City database together with one of the solutions provided in the FAQ.
Or, for a simple one-step JavaScript approach, getTimezoneOffset() seems to be the crux of the solution.
There appear to be several vendors offering APIs and callable services to go from IP address to location, and clearly once you have that, determining the time zone is only a further lookup.
Your alternative of using javascript to ask the browser "where am I, what's the time zone" and Ajaxing that down to your server also sounds plausible.
Of course a sufficiently determined user can probably spoof their way to appearing to be at a different ip address, but presumably that doesn't matter too much to you ... their choice.
You can use an IP-address-to-time API to find the time by IP address.
Look at http://worldtimeengine.com/ for more details.
Dan
What's the best way to localize a website into multiple languages?
I'm working on a website, and our manager wants it to be like:
http://www.website.com - defaults to English
http://fr.website.com - French
http://de.website.com - German
He says it's good for SEO; some developers want to make it based on a cookie and the user's Accept-Language header, so the URL would always be http://website.com but the content would depend on the cookie/Accept-Language.
What do you think?
Thanks!
This article appears to have a good guide to your question: http://www.antezeta.com/blog/domains-seo/
Essentially, they recommend localizing by TLD first, followed by subdomain, followed by directories.
Cookies are a bad idea because Google will not be able to index your localized content.
This might be a late answer but I will give it anyway (my hope is that it will benefit others).
Should http://www.example.com/ default to English?
No. You should always detect the user's preferred language. That is, the web browser will give you an Accept-Language header listing the languages the end user is able to understand. If it happens that the most preferred one is not one your web site/web application supports, you should try to fall back to the next language from Accept-Language. Only when nothing fits should you fall back to your default language (usually English, United States).
Should we use the language as part of the domain?
It seems like a good idea. Once you have detected the language, you might want to redirect the user to the appropriate page. It could be something like http://french.example.com/, http://german.example.com/, or http://www.example.com/index.html?lang=fr.
It is good to have such a mechanism implemented; that way one can actually bookmark the correct language. Of course, if somebody navigates to your web site with the language as a parameter, you skip detection, as it is pointless at that point.
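A rough PHP sketch of that detect-then-redirect flow is below; the supported languages and subdomain names are made-up examples, and a real implementation would also check that the visitor is not already on the target subdomain:

// Rough sketch: pick the first supported language from Accept-Language,
// then redirect to the matching subdomain (names are examples only).
$supported = array('en' => 'www', 'fr' => 'french', 'de' => 'german');
$target    = 'www'; // default: English

if (!isset($_GET['lang'])) {   // skip detection when the language was passed explicitly
    $header = isset($_SERVER['HTTP_ACCEPT_LANGUAGE']) ? $_SERVER['HTTP_ACCEPT_LANGUAGE'] : '';
    foreach (explode(',', $header) as $entry) {
        $code = strtolower(substr(trim($entry), 0, 2));
        if (isset($supported[$code])) {
            $target = $supported[$code];
            break;
        }
    }
    header('Location: http://' . $target . '.example.com/');
    exit;
}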
To sum up:
You should detect the language that the web browser sends you and appear as though you have multiple web sites (one per language). That way the user can choose which one to bookmark. Web search engines will probably index the contents separately (though they will also look at robots.txt), so either way it is good to appear as several language-specific web sites.
I once heard a teacher of mine say that when he does this, he simply makes PHP files called "eng.php", "fr.php", and so on.
These files contain associative arrays. The keys are always the same but the translations are different.
Then you need only require the correct language file at the top of your PHP files, and when you look up the keys, the text will always be in the correct language.
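A minimal sketch of one way to set that up (file names and keys are just examples):

// fr.php -- one associative array per language, same keys in every file
return array(
    'welcome' => 'Bienvenue',
    'contact' => 'Contactez-nous',
);

// index.php -- load the chosen language file and look up keys
$lang    = 'fr';                    // decided by detection, a cookie, etc.
$strings = require $lang . '.php';  // pulls in fr.php (or eng.php, ...)
echo $strings['welcome'];           // prints "Bienvenue"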
Most open-source approaches to localization and globalization involve a lot of developer overhead and complexity in maintenance as copy and code become more complex.
My current company, Localize.js, solves this pain point by tracking website phrase changes, automating the ordering of translations, and dynamically rendering languages for you.
https://localizejs.com/
Feel free to email me at johnny#localizejs.com if you have any questions.