We use a desktop tool to localize our .NET application. It can export the localizable strings to XLS files, which I send to the users who help us translate the program strings.
Everything is fine as long as only one user is involved in translating a given language. But if two users start localizing the program into the same language, we run into translation collisions.
I have thought about hosting our translation XLS files in SVN, but maybe someone can recommend an inexpensive online service that lets us avoid SVN (it has a learning curve our customers would have to climb).
You may want to consider crowdin.net.
It offers a really comfortable workflow for localization projects.
P.S. The prices are acceptable, by the way.
I am currently researching externalised translation services and how to integrate them into our development workflow.
I have come across various services and find it difficult to compare them:
transifex
crowdin
localizejs
tran.sl
oneskyapp
smartling
We are managing a large content website using two methods:
gettext for the "static" text
different versions of the content (one for each language) managed through a CMS.
The difficulty for us is commissioning translations manually; it just doesn't work well. We would like to automate the process instead:
whenever the gettext files are updated, content is sent automatically to a translation service.
whenever the content is updated, it is also pushed to a translation service (see the sketch below for the kind of hook we have in mind).
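A rough sketch of that hook for the gettext side, in Python purely for illustration; the endpoint, token and upload format are hypothetical placeholders, since every provider has its own API:

```python
import hashlib
import pathlib

import requests  # third-party HTTP client

# Hypothetical endpoint and credentials -- every service defines its own API.
SERVICE_URL = "https://api.example-translation-service.com/v1/files"
API_TOKEN = "replace-me"


def file_fingerprint(path: pathlib.Path) -> str:
    """Hash the file so we only upload catalogues that actually changed."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def push_changed_po_files(locale_dir: str, seen: dict) -> None:
    """Upload every .po/.pot file whose content changed since the last run."""
    for po_file in pathlib.Path(locale_dir).rglob("*.po*"):
        digest = file_fingerprint(po_file)
        if seen.get(str(po_file)) == digest:
            continue  # unchanged since the last push
        with po_file.open("rb") as fh:
            response = requests.post(
                SERVICE_URL,
                headers={"Authorization": f"Bearer {API_TOKEN}"},
                files={"file": (po_file.name, fh)},
            )
        response.raise_for_status()
        seen[str(po_file)] = digest


if __name__ == "__main__":
    push_changed_po_files("locale", seen={})
```

The CMS side would be the mirror image: an after-save callback or webhook that posts the changed content to the same service.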
All of the services above seem designed to meet those requirements, so the question is: which criteria should I use to compare them?
The answer has a couple of different aspects.
Firstly, it will depend on the specific CMS you are using and how much you want to change your system. Some of those services have to incorporate your website into their platform, or incorporate quite a few things into your system, to achieve that degree of full automation. So you'll have to check with each one, I guess (although I have no direct experience of these services).
Otherwise, you might be advised to consult a localisation agency, choose a localisation plug-in for your CMS, and settle for a degree less automation. The process has to detect the changes, export the modified pages/contents as an XML or XLIFF file and send it to the translation service; it also has to recognise receipt of the translated file and re-import it. How much of that will happen without a supportive, prompting click or two from you, I'm not sure.
(The gettext PO files can usually be provided directly to a translator or agency, who import them into their translation memory system, recognise the changes since the last time they had them, and only translate the changed segments. Here it pays off to work with the same translator or agency over time.)
Secondly, it depends on how many languages you are targeting. All of the above gets potentially more complicated when you are translating into more than one language.
Thirdly, who is translating? The services that offer real 'crowdsourcing' often end up in a scenario where very short pieces of text without context are delivered to whichever translator happens to be awake and online at that moment. My recommendation would be a specialised localisation agency working with the relevant CMS plug-ins; they should be able to offer higher quality by giving your content to the same translator every time, or one of a small team, who then gets a chance to develop a feel for your content and translate accordingly.
Hope that helps,
I have several projects I've worked on that are set up for internationalization.
From the programming perspective, I have pretty much everything set up and have put all of the strings into an XML or properties file. I would like to get these files translated into other languages, such as: Italian (it), Spanish (es), German (de), Brazilian Portuguese (pt-br), Chinese Simplified (zh-cn), Chinese Traditional (zh-tw), Japanese (ja), Russian (ru), Hungarian (hu), Polish (pl), and French (fr).
I've considered using services like Google Translate; however, I feel that these automatic translation tools are still a bit weak.
In summary, I'm curious whether others have used professional translation services for their programs. If so, which ones would you recommend, and how did you coordinate translation updates with the translation teams? Any idea what I should expect to pay? Or is there a better way of doing this that I'm not aware of?
Machine translation services like Google, Bing, etc. are not a good choice. As you mention, these services are still in their infancy, and more importantly, using them will most likely give your non-English customers a bad impression of your application.
If you want top quality translation, you will need to employ the services of a professional translation agency. Translators need to understand your application in order to translate the text correctly, so providing them with the application itself or screen captures of the English product will help.
You will pay per word - the rates vary from agency to agency, and also from language to language.
The other alternative is using crowd-sourced translations, from GetLocalization for example.
To summarize, proper localization is not just a matter of translating the text. You need to build a relationship with your translators and ensure they understand your application and the context of the strings they are translating; otherwise you will end up with a linguistically poor application that will reflect badly on your company.
When ready-to-use translator plugins are available from companies like Microsoft, Google and Yahoo, why does one have to implement globalization in an application using resource files like .resx? Why not simply plug one of those translators into the application and give the user the freedom to choose his own language/culture? Which approach is better? Thanks in advance.
Automated machine translation is not the same as providing customized translations for different languages. Machine translation gets things wrong far too often, and can easily phrase something in a way that is offensive or embarrassing. It also doesn't take into account localization at all.
And more importantly, public machine translation services only work on public sites. Most globalized sites have pages that only signed-in users can reach. In that case, it is easier to provide the translations yourself.
If you're making money off your customers, you're better off investing in real translation over a free service that ultimately marginalizes any users who don't speak the primary language the site is written in.
As I understand it, these translators aren't as accurate. My last boss hired a translator, and we put the translated data into a separate language database table.
Auto-translation is not effective and WILL get you into trouble where a serious application is concerned. There is a very simple linguistic test you can perform on your application: first, translate from the original language to the target language; then take the result and translate it back. If you get satisfactory results, you're good to go.
In fact, for some simple applications, that would be a recommended way. However, it MIGHT come back and bite you in the buttocks.
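For illustration, a rough sketch of that round-trip check in Python; translate() below is a stub standing in for whichever machine-translation client you actually use (no real API is assumed):

```python
import difflib


def translate(text: str, source: str, target: str) -> str:
    """Stub: replace with a call to your machine-translation provider.
    It returns the text unchanged here, just so the sketch runs."""
    return text


def round_trip_similarity(text: str, source: str, target: str) -> float:
    """Translate to the target language and back, then compare the result
    with the original string; 1.0 means a perfectly stable round trip."""
    forward = translate(text, source, target)
    back = translate(forward, target, source)
    return difflib.SequenceMatcher(None, text.lower(), back.lower()).ratio()


# Flag strings whose round trip drifts too far from the original.
for msgid in ["Save changes", "File not found", "Are you sure?"]:
    score = round_trip_similarity(msgid, "en", "de")
    if score < 0.8:  # arbitrary threshold; tune it for your own strings
        print(f"check manually: {msgid!r} (similarity {score:.2f})")
```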
We're working on a Ruby on Rails web application that is currently in English but should be translated into more languages. We deploy a new version of the application to production every two weeks. The translators are a separate team.
We have a special page in the admin area for making translations. The dictionaries are stored as YAML files. One option is to let translators edit the dictionaries in production and open a language to visitors once it is ready. Another is to let translators work on the staging server and merge the translations before deployment to production.
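Whichever way we go, an automated check for keys that exist in the source dictionary but are missing from a locale would make the hand-over visible before a deploy. A minimal sketch, assuming flat Rails-style YAML dictionaries and the PyYAML package (the file paths are just examples):

```python
import yaml  # PyYAML


def flatten(d, prefix=""):
    """Turn nested locale keys into dotted paths, e.g. 'users.index.title'."""
    items = {}
    for key, value in d.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            items.update(flatten(value, path))
        else:
            items[path] = value
    return items


def missing_keys(source_file, target_file):
    """Keys present in the source dictionary but absent from the target."""
    with open(source_file, encoding="utf-8") as f:
        source = flatten(next(iter(yaml.safe_load(f).values())))  # drop the top-level "en" key
    with open(target_file, encoding="utf-8") as f:
        target = flatten(next(iter(yaml.safe_load(f).values())))  # drop the top-level "de" key
    return sorted(set(source) - set(target))


if __name__ == "__main__":
    for key in missing_keys("config/locales/en.yml", "config/locales/de.yml"):
        print("needs translation:", key)
```

Run from the deploy script, this gives a simple gate: a language is only opened to visitors once its list comes back empty.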
Does somebody know a good process for synchronizing the work of developers and translators?
Thanks in advance.
37signals launched Tolk, a tool they used to translate Basecamp into several languages. I've not used it myself, but it seems a handy tool for automating some parts of the process. It may be worth a look.
Have you tried http://www.gnu.org/software/gettext/ ?
If you go with that, translators and developers can work in parallel. Translators can use POEDIT to find the text in the application and make the translation.
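To make that division of labour concrete, here is a minimal runtime sketch using Python's standard gettext module, purely as an illustration (the catalogue name and directory are just examples; the equivalent gettext bindings for Ruby play the same role in a Rails app). Developers wrap user-facing strings in _(), translators edit the extracted .po catalogue in POEDIT, and the application loads the compiled catalogue at startup.

```python
import gettext

# Load the compiled catalogue for the requested language; fall back to the
# untranslated source strings if the catalogue does not exist yet.
translation = gettext.translation(
    "myapp",                 # catalogue name (myapp.mo)
    localedir="locale",      # locale/<lang>/LC_MESSAGES/myapp.mo
    languages=["de"],
    fallback=True,
)
_ = translation.gettext

# Developers only ever touch the source string; the translations live
# entirely on the translators' side of the fence.
print(_("Welcome back!"))
print(_("You have %d unread messages") % 3)
```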
I have a freeware scientific app that is used by thousands of people in nearly 100 countries. Many have offered to translate for free. Now that D2009 makes this easier (with integrated and external localization tools, plus native Unicode support) I'd like to make this happen for a few languages and steadily add as many as user energy will support.
I'm thinking that I'll distribute a spreadsheet with a list of strings (dozens but not hundreds) to be translated, have them return it, and compare submissions in the same language from 2-3 users then work to resolve discrepancies by consensus. Then I'll incorporate the localizations using the Integrated Translation Environment, and distribute localized updates.
Has anyone delegated translation to users? Any gotchas, D2009-specific or otherwise?
EDIT: Has anyone compared the localization support built into D2009 versus dxgettext?
I have never been a friend of proprietary localization tools for Freeware or Open Source applications. Using dxgettext, the Delphi port of GNU gettext, looks like a much better option to me:
Integration into the program (even much later than its development) is easy.
Extraction of translatable strings can be done by command line programs and is therefore easily introduced into an automated build.
A new translation can be added simply by creating a new directory with the correct structure, copying the empty translation file into it, and starting to translate the strings. This is something each user can do for themselves, there's no need to involve the original author for creation of a new translation. There is also instant gratification with this process - once the program is restarted the new translations are shown immediately.
Changing an existing translation is even easier than creating a new one. Thus, if a user finds spelling or other errors, or sees room for improvement in the translation, they can correct them easily and send the changes to the author.
New program versions work with old translations, the system degrades very gracefully - new and untranslated strings are simply shown unmodified.
Translations can be made using only Notepad, but there are also several free tools for creating and managing translation files; see the links on the dxgettext page. They are localized themselves, and have some advantages over a spreadsheet as well:
The location of the strings in the source code can be shown (makes sense only for Open Source apps, of course).
The percentage of translated strings is shown.
Modifications to already translated strings are highlighted too.
The whole system is mature and future-proof - I have used dxgettext for Delphi 4 programs, and there should be no changes necessary for Delphi 2009 even - translation files have always been UTF-8 encoded.
Using a spreadsheet for the translation doesn't seem a workable solution to me once you have more than a few languages. Suppose a new program version adds 2 new strings and changes 10 strings only slightly - wouldn't you need to add the new strings to, and highlight the changed strings in, each of the several dozen spreadsheet files and send them all to your translators again? Using dxgettext, you just mail the changed po file to all of them.
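For comparison, the update step on the gettext side looks roughly like this. The sketch drives the standard xgettext and msgmerge command-line tools from Python against Python sources purely for illustration; for a Delphi project you would point dxgettext's own extractor at the .pas/.dfm files instead, but the merge step is the same: unchanged strings keep their translations, and only the new and modified ones come back untranslated or fuzzy.

```python
import pathlib
import subprocess

POT = "locale/myapp.pot"

# 1. Re-extract all translatable strings into a fresh template file.
sources = [str(p) for p in pathlib.Path("src").rglob("*.py")]
subprocess.run(["xgettext", "--from-code=UTF-8", "-o", POT, *sources], check=True)

# 2. Merge the new template into every language's existing catalogue.
#    Changed strings are marked fuzzy, new ones appear untranslated.
for po_file in pathlib.Path("locale").rglob("*.po"):
    subprocess.run(
        ["msgmerge", "--update", "--backup=none", str(po_file), POT],
        check=True,
    )
    print("ready to send to the translator:", po_file)
```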
Edit:
There is an interesting comment about the problems there may be with dxgettext and libraries. I have never experienced this, as I have stopped using resource strings altogether. The biggest part of our programs is in German, and only a few are in English or translated into several languages.
Our internal libraries use "_(...)" around all translatable strings. There are defines ENGLISH and USEGETTEXT that are set on a per-project basis. If ENGLISH or USEGETTEXT is defined, the English texts are compiled into the DCUs; otherwise the German texts are compiled into the DCUs. If USEGETTEXT is not defined, "_()" is compiled as a function that returns its parameter as-is; otherwise the dxgettext translation lookup is used.
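The same pattern carries over to other environments: make _() a no-op that returns its argument unless the translation lookup is switched on. A rough Python analogue of the define-based approach above, not the actual Delphi code (the compiler defines become an ordinary configuration flag):

```python
import gettext
import os

USEGETTEXT = os.environ.get("USEGETTEXT") == "1"

if USEGETTEXT:
    # Real lookup: translations come from the compiled gettext catalogues.
    _ = gettext.translation("mylib", localedir="locale", fallback=True).gettext
else:
    # No-op: the string written in the code is shown as-is.
    def _(text: str) -> str:
        return text

print(_("Connection lost"))
```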
I have... There can be some challenges.
a string does not mean much in itself; it needs a context.
corollary: the same string may need more than one translation (see the msgctxt sketch after this list).
screen real estate: beware of varying length depending on the language; for instance, French tends to be more verbose than English.
unless you are proficient in a given language, you won't be able to evaluate the discrepancies.
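On the second point (one string, several translations), gettext-style systems solve this with message contexts. A minimal sketch using Python's pgettext, available in the standard library since Python 3.8, purely to show the idea; the catalogue name and contexts are made up:

```python
import gettext

translation = gettext.translation(
    "myapp", localedir="locale", languages=["fr"], fallback=True
)

# The same English string, disambiguated by context so translators can
# give each occurrence its own translation (e.g. a menu command vs. a state).
menu_label = translation.pgettext("menu entry", "Open")
door_state = translation.pgettext("door status", "Open")

print(menu_label, door_state)
```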
I've used TsiLang Translation Suite for enabling end users to translate. I modified the code to allow encryption so that if someone does a really good job they can protect their name against a translation file, but in general the idea is that people can share their translations, and add/edit any small part they wish to. Given that it all happens within the app, and with instant visibility, it works really nicely.
As you have mentioned, D2009 comes with localization tools. Why not simply use them? AFAIK you can distribute the external translation manager (etm.exe). Do you need anything else?
Also, localization is more than just translating text. ETM also supports translation of .dfm resources.
For completeness, here is another Delphi localization tool called Delphi Localizer I recently found that looks to be well designed and polished. The tool is free for commercial use with the exception of Government projects (not exactly sure why the exception).
FWIW, I have used TsiLang Translation Suite in the past and am currently working on another project using the localization tools shipped with DevExpress VCL. The latter integrates nicely with their components as well as third-party components.