I have a Rails app which is localized into multiple languages. After enhancements are made in English, the localization takes some additional time to complete, so it was decided to release the latest version of the website in English and retain the older version for the non-English languages. That way the release dates need not be pushed back until localization is complete. I am not sure how to implement this. Any ideas about how I should go about it would be very helpful.
Thanks
Seeing your comments on Chubas's answer: you can bind a simple application to http://website.com which checks the user's locale (or whatever else) and redirects them to http://website.com/en or http://website.com/int, each running as a separate application.
It may not seem very pretty, but it's simple, and many enterprise websites actually use it. With a bit more setup you can even have configurable names like fr, de, etc. instead of int (all pointing to the same application).
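For illustration, here is a minimal sketch of that front application as a Rack endpoint. The /en and /int paths follow the example above, and the locale check is deliberately simplified; the real /en and /int apps would run separately behind it:

    # config.ru -- run with `rackup`; everything here is a sketch.
    run lambda { |env|
      accept = env['HTTP_ACCEPT_LANGUAGE'].to_s
      # Send English speakers to the new app, everyone else to the old one.
      target = accept.start_with?('en') ? '/en' : '/int'
      [302, { 'Location' => target }, []]
    }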
There is a whole internationalization (abbreviated i18n) module available in Rails. You can easily switch languages with it, or serve them separately based on some parameter (say, the URL).
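A minimal sketch of that parameter-based approach; the locale parameter name and the controller shown are illustrative, not part of the original answer:

    # app/controllers/application_controller.rb -- sketch only
    class ApplicationController < ActionController::Base
      before_action :set_locale

      private

      # Use ?locale=de (or a /:locale route segment) to pick the dictionary,
      # falling back to the app's default locale when it is absent.
      def set_locale
        I18n.locale = params[:locale] || I18n.default_locale
      end
    end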
When there are ready-to-use translator plugins available from companies like Microsoft, Google, and Yahoo, why does one have to implement globalization in an application using resource files like .resx? Why not simply plug one of those translators into the application and give the user the freedom of choosing his own language/culture? Which one is better than the other? Thanks in advance.
Automated machine translation is not the same as providing customized translations for different languages. Machine translation gets things wrong far too often, and can easily phrase something in a way that is offensive or embarrassing. It also doesn't take into account localization at all.
And more importantly, public machine translation services only work on public sites. Most globalized sites have pages that only signed-in users can reach. In that case, it is easier to provide translations yourself.
If you're making money off your customers, you're better off investing in real translation over a free service that ultimately marginalizes any users who don't speak the primary language the site is written in.
As I understand it, these translators aren't as accurate. My last boss hired a translator, and we translated the data into a separate language database table.
Auto-translation is not effective and WILL get you in trouble where a serious application is concerned. There is a very simple linguistic test you can perform on your application: first, translate from the original language into the target language; then take the result and translate it back. If you get satisfactory results, you're good to go.
In fact, for some simple applications, that would be a recommended way. However, it MIGHT come back and bite you in the buttocks.
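To make that round-trip test concrete, here is a sketch; translate is a hypothetical stand-in for whichever machine-translation client you are evaluating:

    # Round-trip a string through the target language and back, so a human
    # can judge whether the meaning survived. `translate` is hypothetical.
    def round_trip(text, target)
      there = translate(text, from: 'en', to: target)
      back  = translate(there, from: target, to: 'en')
      { original: text, round_tripped: back } # compare these by eye
    end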
We're working on a Ruby on Rails web application which is currently in English but should be translated into more languages. We deploy a new version of the application to production every two weeks. The translators are a separate team.
We have a special page in the admin area for making translations. Dictionaries are stored as YAML files. We can let the translators edit the dictionaries in production and open a language to visitors once it is ready. Another way is to let the translators work on a staging server and merge their translations before deployment to production.
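For the staging-server option, the merge step can be as simple as a deep merge of the translators' YAML over the developers' copy. A sketch, assuming the standard config/locales layout (the file names are illustrative):

    # merge_locales.rb -- sketch of merging translator edits before deploy.
    require 'yaml'

    def deep_merge(base, overrides)
      base.merge(overrides) do |_key, old, new|
        old.is_a?(Hash) && new.is_a?(Hash) ? deep_merge(old, new) : new
      end
    end

    dev  = YAML.load_file('config/locales/de.yml')   # developers' keys
    xlat = YAML.load_file('staging/locales/de.yml')  # translators' texts
    File.write('config/locales/de.yml', deep_merge(dev, xlat).to_yaml)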
Does somebody know a good process for synchronizing the work of developers and translators?
Thanks in advance.
37signals launched Tolk, a tool they used to translate Basecamp into several languages. I've not used it myself, but it seems a handy tool for automating some parts of the process. It may be worth giving it a look.
Have you tried http://www.gnu.org/software/gettext/?
If you go with that, translators and developers can work in parallel. Translators can use Poedit to find the text in the application and make the translation.
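In a Ruby app this would look roughly like the following; the ruby-gettext gem is my assumption, since the answer only names GNU gettext itself. Strings wrapped in _() get extracted into .po files, which translators then open in Poedit:

    # Sketch using the ruby-gettext gem.
    require 'gettext'
    include GetText

    bindtextdomain('myapp', path: 'locale') # locale/ holds the compiled .mo files
    puts _('Welcome back') # looked up in the current locale's catalog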
I am creating a multilingual site that will need to support at minimum five different languages, including Korean and Chinese. The site is written in ColdFusion, so Java is the native language underneath. I want to make the site as easy as possible for the next programmer to update, and to let third-party translators use tools that will be easier for them than digging through a SQL database.
So far I have come across Resource Bundles, GNU's GetText, and TMX.
Which do you recommend and why?
Resource bundles are my preferred choice. I have found they are kind to future programmers, self-documenting in many ways, and make it easy to manage subcontracted translators. They have kept things simple, reduced my technology stack, and have yet to fail me.
Good luck with your project, and thank you for making life easier for the next developer. I wish more people thought like you.
Comparing resource bundles and TMX is an apples-to-oranges comparison. Resource bundles and gettext are sort of the same thing; TMX is a format for computer-aided translation and for transferring work between tools/translators.
What you seem to be overlooking is managing the resource bundles (or whatever you choose). For large i18n projects, resource bundles get big and complicated. If you have multiple translation vendors (not recommended, if at all avoidable) it all becomes like herding cats (what's translated, in which languages, by which translator, etc.). Find a management tool you like (ICU4J's RBManager is what we normally use; Jason Sheedy's RBMan is pretty decent too) and then see what format it uses.
And once again (in case anybody's forgotten my "bah humbug" stance), I urge you not to use machine translators like Google or Bing for anything serious. People could die.
An internationalisation library for Adobe ColdFusion which uses the resource bundle package style that Adobe Flex uses.
http://resourcemanager.riaforge.org/
You might try the i18n support in an MVC framework like ColdBox.
Use i18n resource bundles for ColdFusion. Paul Hastings has a great set of CFCs at: http://www.sustainablegis.com/blog/cfg11n/index.cfm?mode=cat&catid=F46401DD-50FC-543B-1F1FBE4F2BAD6B83
I have a freeware scientific app that is used by thousands of people in nearly 100 countries. Many have offered to translate for free. Now that D2009 makes this easier (with integrated and external localization tools, plus native Unicode support) I'd like to make this happen for a few languages and steadily add as many as user energy will support.
I'm thinking that I'll distribute a spreadsheet with a list of strings (dozens but not hundreds) to be translated, have them return it, and compare submissions in the same language from 2-3 users then work to resolve discrepancies by consensus. Then I'll incorporate the localizations using the Integrated Translation Environment, and distribute localized updates.
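To compare the returned submissions, a small script (Ruby here, but any scripting language works) can flag the rows where translators disagree. A sketch, assuming each submission comes back as a two-column key,text CSV (the file names are illustrative):

    # Flag strings where two translators' submissions differ.
    require 'csv'

    a = CSV.read('german_translator1.csv').to_h
    b = CSV.read('german_translator2.csv').to_h

    (a.keys | b.keys).each do |key|
      next if a[key] == b[key]
      puts "#{key}: #{a[key].inspect} vs #{b[key].inspect}" # resolve by consensus
    end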
Has anyone delegated translation to users? Any gotchas, D2009-specific or otherwise?
EDIT: Has anyone compared the localization support built into D2009 versus dxgettext?
I have never been a friend of proprietary localization tools for freeware or open-source applications. Using dxgettext, the Delphi port of GNU gettext, looks like a much better option to me:
Integration into the program (even much later than its development) is easy.
Extraction of translatable strings can be done by command line programs and is therefore easily introduced into an automated build.
A new translation can be added simply by creating a new directory with the correct structure, copying the empty translation file into it, and starting to translate the strings. This is something each user can do for themselves; there's no need to involve the original author in the creation of a new translation. There is also instant gratification with this process: once the program is restarted, the new translations are shown immediately.
Changing an existing translation is even easier than creating a new one. Thus if a user finds spelling or other errors, or room for improvement in the translation, they can correct them easily and send the changes to the author.
New program versions work with old translations; the system degrades very gracefully. New and untranslated strings are simply shown unmodified.
Translations can be made using only notepad, but there are several free tools for creating and managing translation files too; see the links on the dxgettext page. They are localized themselves, and have some advantages over a spreadsheet as well:
The location of the strings in the source code can be shown (makes sense only for Open Source apps, of course).
The percentage of translated strings is shown.
Modifications to already translated strings are highlighted too.
The whole system is mature and future-proof. I have used dxgettext for Delphi 4 programs, and no changes should be necessary even for Delphi 2009, as translation files have always been UTF-8 encoded.
Using a spreadsheet for the translation doesn't seem a workable solution to me once you have more than a few languages. Suppose a new program version adds 2 new strings and changes 10 strings only slightly. Wouldn't you need to add the new strings to, and highlight the changed strings in, each of the several dozen spreadsheet files, and send them all to your translators again? Using dxgettext you just mail the changed .po file to all of them.
Edit:
There is an interesting comment about the problems there may be with dxgettext and libraries. I have never experienced this, as I stopped using resource strings altogether. The biggest part of our programs is in German, and only a few are in English or translated into several languages.
Our internal libraries use "_(...)" around all translatable strings. There are defines ENGLISH and USEGETTEXT that are set on a per-project basis. If ENGLISH or USEGETTEXT is defined, the English texts are compiled into the DCUs; otherwise the German texts are compiled into the DCUs. If USEGETTEXT is not defined, "_()" is compiled as a function that returns its parameter as-is; otherwise the dxgettext translation lookup is used.
I have... There can be some challenges.
A string does not mean much in itself; it needs a context.
As a corollary, the same string may need more than one translation.
Screen real estate: beware of varying lengths depending on the language; for instance, French tends to be more verbose than English.
Unless you are proficient in a given language, you won't be able to evaluate the discrepancies.
I've used the TsiLang Translation Suite for enabling end users to translate. I modified the code to allow encryption so that if someone does a really good job, they can protect their name within the translation file, but in general the idea is that people can share their translations and add or edit any small part they wish to. Given that it all happens within the app, and with instant visibility, it works really nicely.
As you have mentioned, D2009 comes with localization tools. Why not simply use them? AFAIK you can distribute the external translation manager (etm.exe). Do you need anything else?
Also, localization is more than just translating text. ETM also supports translation of .dfm resources.
For completeness, here is another Delphi localization tool, called Delphi Localizer, that I recently found and that looks to be well designed and polished. The tool is free for commercial use, with the exception of government projects (I'm not exactly sure why).
FWIW I have used the TsiLang Translation Suite in the past and am currently working on another project using the localization tools shipped with DevExpress VCL. The latter integrates nicely with their components as well as with third-party components.
I am a Ruby on Rails developer and I have developed several plugins. I may be selling them to some websites, which can then use my plugins in their applications. But I want to ensure that the plugin code, once given to them, is not used for any other application; if it is, I need to know where it is deployed.
I just need a way to track the number of deployments of a given plugin.
You can't. Ruby is interpreted, so they can simply remove the tracking code and use the plugin wherever they want. You might want to build C extensions for Ruby if you really want resellable components.
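To make the point concrete, the kind of tracking code in question is just a few lines that any buyer can delete; a sketch (the endpoint URL is illustrative):

    # Naive "phone home" on plugin load -- trivially removable, which is
    # exactly why it cannot assure anything.
    require 'net/http'
    require 'socket'

    begin
      Net::HTTP.post_form(URI('https://example.com/deployments'),
                          'plugin' => 'my_plugin', 'host' => Socket.gethostname)
    rescue StandardError
      nil # never break the host app if the ping fails
    end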
You cannot ensure it. You can make misuse unprofitable by obfuscating the code, which will limit it, on the assumption that analyzing the obfuscated code will cost a lot more than just paying you for another license. Of course, that doesn't guarantee anything at all.