We have several large products we'd like to integrate with a consistent localization strategy.
We're already doing the right things from a code point of view, i.e. keeping strings in resource files.
I'm looking for something that will organize localized strings in a database and generate the appropriate resource files (e.g. .RESX files for .NET, .js files, etc.) during the build process. Ideally, it would also be able to read the files back in, detecting strings that have been added or removed.
The database would allow us to reuse translations in different products, switch to different technologies, and track what translations are missing in each release.
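To illustrate the kind of build step we're after, here's a rough Python sketch (the table layout, file names, and keys are all hypothetical, just to show the shape of it):

import sqlite3
import xml.etree.ElementTree as ET

def export_resx(db_path, product, lang, out_path):
    # Pull this product's strings for one language from the shared database.
    rows = sqlite3.connect(db_path).execute(
        "SELECT key, text FROM translations WHERE product = ? AND lang = ?",
        (product, lang))
    root = ET.Element("root")
    for key, text in rows:
        data = ET.SubElement(root, "data", name=key)
        data.set("xml:space", "preserve")
        ET.SubElement(data, "value").text = text
    ET.ElementTree(root).write(out_path, encoding="utf-8", xml_declaration=True)

export_resx("strings.db", "ProductA", "fr", "Strings.fr.resx")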
Has anyone found a good product that handles these requirements? What have others done to manage localized assets?
Found some good links in the answers to this question: Do you know of a good program for editing/translating resource (.rc) files?
There are a number of products which we're now evaluating:
http://www.lingobit.com/
http://www.sisulizer.com/
http://www.multilizer.com/
WinTrans - http://www.schaudin.com/
None of them has quite the database-based approach we were initially looking for, but they seem to have the core functionality. Lingobit is an early favorite, but we haven't trialed it in much detail yet. Does anyone have a recommendation between those products (or similar)?
Check out GlobalSight or Alchemy's Catalyst
Catalyst is a standalone translation memory and localization engine that can be used in your build process (and is used by many large software companies). GlobalSight is a relatively new and open source translation database and workflow tool that looks very promising.
I am working on automating the translation workflow and improving the localization process of a Rails website as a whole. I am using SimpleBackend, so only YAML files are used for storing translations.
The current locales directory consists of folders, then sub-folders (in some cases), with those sub-folders containing .yml files. I am considering integrating the project with a third-party tool like Transifex for translation management, so perhaps using a single YAML file for each language would be better for managing the workflow.
If someone can highlight the pros and cons of both structures, it would really help me decide whether I should switch from the nested file structure to the single-file pattern or not. Also, the project is an open-source project with active contributors, so I'm looking for a long-term solution.
Thanks!
I think whatever tools you are using to make the process flow smoothly factor a lot in this decision. You should explore how exactly Transifex wants things to be structured in output while trying to keep your current input structure, and give that a shot before making a decision.
However, in my opinion, for a large app with a lot of translatable text, my preference would be to allow multiple YAML files in your default locale, and one or two consolidated YAML files for each foreign translation. If there isn't a lot of translatable text in your app, maybe a single file is fine for you, but given it's already split up, there's a good chance that's the better choice. On a team with many contributors you can otherwise end up with a very high-churn file (possibly with a lot of merge conflicts) that everyone changes all the time.
Splitting into separate files lets you logically separate out text to match a domain in your app, like a separate yaml file for mailers (or even each mailer), and one for each domain (or controller). Either way, it puts you in control of your organization strategy.
However, there isn't a lot of value, IMO, in separating your foreign translations to mirror that structure. The systems I have experience with (not Transifex) generate your foreign translation files for you, so you just need to sync with the web interface and commit the results.
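If you do decide to consolidate, the merge itself is mechanical. A rough Python sketch, assuming PyYAML and the standard config/locales layout (output paths are made up):

import glob
import os
import yaml  # PyYAML, assumed available

def deep_merge(dst, src):
    # Recursively merge nested locale dictionaries.
    for key, value in src.items():
        if isinstance(value, dict) and isinstance(dst.get(key), dict):
            deep_merge(dst[key], value)
        else:
            dst[key] = value

merged = {}
for path in sorted(glob.glob("config/locales/**/*.yml", recursive=True)):
    with open(path) as f:
        deep_merge(merged, yaml.safe_load(f) or {})

# One consolidated file per top-level locale key ("en", "fr", ...)
os.makedirs("consolidated", exist_ok=True)
for locale, tree in merged.items():
    with open(f"consolidated/{locale}.yml", "w") as f:
        yaml.safe_dump({locale: tree}, f, allow_unicode=True)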
We have a SaaS web application (multi-tenant) for creating online documentation (like a wiki).
Right now we only support English. We want to support other languages. We have all the interface controls set up to work, but we just need to figure out how to handle the actual user content.
We are thinking about storing each page for each language separately.
So if the user creates a page "how to reset password" and wants it in English, Dutch, and Spanish, we would store three different versions of that page in the database.
I would love any advice on this!!
Using my own experience as an example, you have two choices really:
Each language is completely separate
When a user wants to translate the contents of their documentation - they simply start a new, separate, complete version of the docs - and translate each page as they go. You could copy the existing docs as a start point but each language would in any case be independent. For example, this is how the docs for CakePHP are currently organized. Each language is effectively independent which allows the most flexibility. By flexibility I mean, for example, a page in one language doesn't have to exist in all the others - or with the same url/name.
The disadvantage here is keeping the versions in sync, or knowing when a translation needs updating (new changes to master language version).
Store translations as versions of the original
Alternatively you could store translations as versions of the master/main language. That is how we did it before. It made it easier to track what was being translated, but at the cost of flexibility.
The disadvantage here is that it's potentially too rigid: it may not be easy to add a page that is language-specific, and adding a new page in a non-main language could also cause problems.
This is analogous to "we would store three different versions of that page in the database".
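A minimal sketch of that second model, in Python for illustration (the field names are hypothetical); the point is that each translation records which revision of the master it was based on, so staleness becomes a simple comparison:

from dataclasses import dataclass

@dataclass
class Page:
    slug: str
    master_lang: str      # e.g. "en"
    master_revision: int  # bumped on every edit to the master version

@dataclass
class Translation:
    page_slug: str
    lang: str               # e.g. "nl" or "es"
    body: str
    based_on_revision: int  # master_revision this translation was made from

def needs_retranslation(page: Page, tr: Translation) -> bool:
    # The master has moved on since this translation was produced.
    return tr.based_on_revision < page.master_revision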
Both of the mentioned approaches are quite valid; pick the one that solves the most problems whilst giving the least issues :)
The best way to deal with documentation in multilanguage applications is to avoid it, by embedding the documentation in your application; not as entire pages, just as sentences.
Try to make the usage of your application as simple and intuitive as possible, include hints in the UI, and for those who need more, provide a forum split by language.
Now, be sure that you have at least one person for each language that will report back any issues in the interface, so you could improve the texts or the design based on the input from them.
I am looking for a good way to translate an existing Sitecore installation (the English language is available) into 4 other languages (Russian, Chinese, Portuguese etc.). A dedicated translation company will translate all the texts we deliver into the specified languages, but I'm curious how other companies set this up.
I thought about exporting all the Sitecore items which have to be translated using the Database Language Export function in Sitecore and having the translation company edit those files. By just replacing the language tags in the XML we should be able to import this file as the newly created other language. However, I'm afraid that this XML structure will be totally useless for a translation company and that they will drown in the codes inside this XML.
How can we do this efficiently? Is there any other way than just giving those translation people access to the Sitecore environment and having them edit the languages there? Any Shared Source module to achieve this? I still have a lot of questions; is there anyone with some experience in achieving this?
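For what it's worth, the language-tag replacement I have in mind would be something like this Python sketch (I'm assuming here that the export marks item versions with a language attribute; file names are made up):

import xml.etree.ElementTree as ET

def relabel_language(src, dst, old="en", new="nl-NL"):
    # Assumption: the export marks item versions with language="..." attributes.
    tree = ET.parse(src)
    for node in tree.iter():
        if node.get("language") == old:
            node.set("language", new)
    tree.write(dst, encoding="utf-8", xml_declaration=True)

relabel_language("export-en.xml", "import-nl.xml")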
Your primary options are either the language export/import functionality (as you mention), or a workflow-based solution that integrates with your translation agency's Translation Management System (if they have one -- hopefully they do).
The former is better for the initial translation. Typically, your agency should be able to handle translation of content within XML files; a good one can. If you create all the needed language versions beforehand and copy English content into them, it will make the files easier to work with, as they'll already have tags for the new languages in them. I've seen the creation of these layers done with Revolver (http://www.codeflood.net/revolver/), but it could also be done with custom code or workflow.
For ongoing maintenance of your translated content, you'll probably want to integrate through workflow. Clay Tablet Technologies (http://www.clay-tablet.com/) have a middleware component with Sitecore integration that can make this easier, depending on your translation agency. You can also do your own workflow-based integration, with workflow commands that allow your users to send content for translation. Then you'd need some sort of listener that pulls the translated content back in and continues the workflow.
Hope this helps!
You could also check out Lionbridge (http://en-us.lionbridge.com/sitecore-and-lionbridge-announce-partnership-to-help-companies-thrive-across-borders.htm) as a solution.
From my own experience our customers normally use the Sitecore import/export function as a first step and then use Lionbridge or Clay Tablet as a service.
One important thing to think about with translations is the ongoing work. The initial translation is rather simple, but the second and subsequent ones might be more troublesome. What if different changes have been made in different languages? If local changes were made to the content of, say, the French version, you couldn't just send the English version for the second translation, since you would also have to accommodate the regional changes in the content.
Having worked with literally dozens of Sitecore clients worldwide, and helped get content to and from all the largest (and many smaller) translation firms, I can attest to the inefficiency of trying to do translation in situ, that is, in Sitecore. I liken it to asking an electrician to come over and rewire your house, but as they reach for their toolbox from the truck you tell them, "Nope, you need to do it by hand".
The very best way to manage anything more than a page or two of content for translation is to export it seamlessly. Deliver it to the LSP in a proper format (XML or XLIFF) and, when possible, auto import it to their TMS. Once translated, the content should then flow seamlessly back into Sitecore.
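For reference, a single field wrapped in a minimal XLIFF 1.2 unit might look like the following (a Python sketch; the item path and field name are made up):

# Hypothetical sketch: wrap one Sitecore field value in a minimal XLIFF 1.2 unit.
XLIFF_TMPL = """<?xml version="1.0" encoding="utf-8"?>
<xliff version="1.2" xmlns="urn:oasis:names:tc:xliff:document:1.2">
  <file original="{item}" source-language="en" target-language="{target}" datatype="plaintext">
    <body>
      <trans-unit id="{field}">
        <source>{text}</source>
        <target/>
      </trans-unit>
    </body>
  </file>
</xliff>"""

print(XLIFF_TMPL.format(item="/sitecore/content/home", target="ru-RU",
                        field="Title", text="How to reset your password"))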
You can code this yourself, but the pitfalls are non-trivial just on the Sitecore side, if you want intuitive UIs, scalability, and all the features that meet the needs of translation. That's to say nothing of the challenges of connecting to the systems LSPs use. (For example, who here knows the relative merits/risks of using SDL's Nexus connector versus their CTA for connecting to a TMS?)
As kindly mentioned above, there are commercially available solutions that meet all these needs and more. So if you've got even a modest amount of content, and want to send it to any translation provider of your choice, I'd be happy to discuss how we can help.
The main issue with translation isn't technical at all; the XML export is a simple enough format and all agencies should be able to deal with it with no problems. As others have suggested, maintenance after the initial translation is slightly more problematic, but they also point to tools to achieve this.
The main issue we've found with translation is actually linguistic: how to achieve consistency of phrasing that matches the original but is sufficiently adjusted to local requirements. Translation companies usually have software to aid this (libraries of the phrases they translate, etc.), but working with an exported XML file doesn't provide the context of seeing content in situ. A particular item may be translated correctly, and the site consistently, but as each page may be built from multiple items there can easily be conflicts between content as presented.
That makes working in the Sitecore backend (maybe with field security settings to limit what can be edited) or in the Page Editor (possibly pre-filling fields with English values) a viable idea.
My web site (on Linux servers) needs to support multiple languages.
What is the best practice to have/store multiple languages versions of the same site?
Some I can think of:
store in DB
different view file for each language
gettext
hard coded words in PHP files (like in phpBB)
With web sites, you really have several categories of content to consider for localization:
The article-type content elements that you would in many cases create, edit and publish in a CMS.
The smaller content blocks that are common to every page (or a sub-group of pages), such as tagline, blurb, text around a contact form, but also imported content such as a news ticker or ads and affiliate links. Some of these may only appear for one language (for example, if you don't offer some services in some regions, or don't have, say, language-appropriate imported content for a particular language: it can be better to remove an element rather than offering English to people who may not speak it).
The purely functional elements, like "Click here to comment", "More...", high-level navigation, etc., which are sometimes part of your template. Some of these may be inside images.
For 1. the main decision is using a CMS or not. If yes, you absolutely need to choose one that supports multiple languages. I'm not up-to-date with recent developments in PHP CMS's, but several of the Django CMS apps (Django-CMS-2, FeinCMS) support multi-language content. Don't forget that date stamps, for example, need to be localized, too (or you can get around this by choosing ISO dates, though that may not always be possible). If you don't use a CMS, and everything is in your HTML files, then gettext is the way to go, and keep the .mo files (and your offline .po files) in folders by language.
For 2. if you have a CMS with good multi-lingual support, get as much as possible inside the CMS. The reason is that these bits do change, and you want to edit your template as little as possible. If you write code yourself, think of ways of exporting all in-CMS strings per language, to hand them to translators. Otherwise, again, gettext. The main issue is that these elements may require hard-coding language-selection code (if $language = X display content1 ...)
For 3., if it's in your template, use gettext. For images, the per-language folders will come in handy, and for heaven's sake choose images whose generation can be automated, or you (or your graphic artist) will go mad editing hundreds of custom images with strings in languages you don't understand.
For both 2. and 3., abstracting from the language selection may help in selecting the appropriate blocks or content directory (where localized images or .mo files are kept); see the sketch below.
What you definitely want to avoid is keeping a pile of HTML files with extensive text content in them that would be a nightmare to maintain.
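On that abstraction point, something as small as this goes a long way (a Python sketch for illustration; the directory names are just examples):

import os

SUPPORTED = {"en", "fr", "de"}

def localized_dir(lang, kind):
    # Fall back to the default language when a translation set doesn't exist.
    lang = lang if lang in SUPPORTED else "en"
    return os.path.join("locale-assets", kind, lang)  # e.g. locale-assets/images/fr

print(localized_dir("fr", "images"))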
EDIT: Everything about gettext, .po and .mo files is in the GNU gettext manual (more than you ever wanted to know), or there's a slightly dated but friendlier tutorial. For PHP, there are the PHP gettext functions, and also the Zend Locale documentation.
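The model is the same in any language; for illustration, here it is with Python's built-in gettext module (assuming locale/fr/LC_MESSAGES/messages.mo has been compiled from your offline .po file):

import gettext

# Load the French catalog; fall back to the original strings if it's missing.
t = gettext.translation("messages", localedir="locale", languages=["fr"], fallback=True)
_ = t.gettext
print(_("Click here to comment"))  # returns the original string if untranslated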
I recommend using Zend_Translate's gettext adapter, which parses .mo files. Very efficient, with caching. Your calls would look like:
$translation = new Zend_Translate('gettext', '/path/to/source-de.mo', 'de'); // load the adapter (path is an example)
echo $translation->_("Hello World");
This finds the locale-specific translation for the specified string.
Check out i18n support for php: http://php-flp.sourceforge.net/getting_started_english.htm
OK, true confession first: maybe it's just me, but sometimes "best practices for program settings" on Windows machines feels like it's changed more than Microsoft's data access strategies. I'm still running XP, and somewhere along the way I just kind of "glazed over" about where MS wanted me to store all my app's data, etc. I controlled all the machines I coded for, so it really didn't matter.
Now I'm writing apps for "in the wild," supporting Win98SE on up. I have to pay attention to all this again. :-\
For reasons having to do mostly with easy migration to new computers, I'm not a big fan of using the registry for app settings -- I prefer using INI files, and have some older INI components that I use for the task (Raize, usually). I'm open to suggestions to other third-party components, if they'll make this easier / less hassle.
Basically, I need to store app settings (like remembering option settings, etc).
I've read:
Registry vs. INI file for storing user configurable application settings
Where to store program settings instead of HKEY_LOCAL_MACHINE?
Where should my win32 program keep its files?
Best place to store configuration files and log files on Windows for my program?
...so at least I'm not alone in this question... ; )
(w/apologies for what is somewhat of a repeat question, albeit from a slightly different angle).
It SOUNDS like I can just use %APPDATA%/MyProgram, and store all data there, BUT is this UNIVERSALLY TRUE across all Windows flavors from Win98SE on up? If not, what's the best approach, and when did that approach come into existence?
What I'm really looking for, honestly, is the easiest way to make this problem go away: I just want one (if possible) simple, easy, reliable way to grab "My Program's Data Folder" in any and all instances. Will the above accomplish that?
Head over to Making Delphi programs Vista-Ready and scroll down to the bottom for: "Where to save your application data?"
function GetRoamingUserAppDataPath : string;
//works so long as people have at least IE 4. (and Win95 or better)
%APPDATA% doesn't seem to be present on Win95. I would use SHGetSpecialFolderPath which is available on Win98 or Win95 w/IE4.
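If it helps, here's what that call looks like from Python via ctypes, just to show the API's shape (Windows only, and note this uses the wide-character variant):

import ctypes

CSIDL_APPDATA = 0x001A  # roaming Application Data folder
MAX_PATH = 260

buf = ctypes.create_unicode_buffer(MAX_PATH)
# BOOL SHGetSpecialFolderPathW(HWND hwnd, LPWSTR pszPath, int csidl, BOOL fCreate)
if ctypes.windll.shell32.SHGetSpecialFolderPathW(None, buf, CSIDL_APPDATA, False):
    print(buf.value)  # e.g. C:\Users\<name>\AppData\Roaming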
The answers you got so far are good. I think, however, that you are mixing things in your question that don't belong together, so it's not possible to answer it completely.
Your question title states
INI-type settings and/or DB files
and these can be two quite different things. You have to differentiate between files that are per-user, files that are common for all users but read-only, and common files that every user should be able to work with.
The first category is easy, use a suitable directory under one of the CSIDL_APPDATA or CSIDL_PERSONAL folders.
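To make that concrete, a per-user INI under the roaming profile (CSIDL_APPDATA) is as simple as this, sketched in Python with made-up program and setting names:

import configparser
import os

# Per-user, roaming settings file under %APPDATA%
ini_path = os.path.join(os.environ["APPDATA"], "MyProgram", "settings.ini")
os.makedirs(os.path.dirname(ini_path), exist_ok=True)

cfg = configparser.ConfigParser()
cfg["Options"] = {"language": "en", "remember_window": "true"}
with open(ini_path, "w") as f:
    cfg.write(f)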
The second category is also easy, files just need to be installed in the proper location by the setup program, which must be run with proper permissions, because standard (limited) users will not be able to write to the correct locations.
The last category however is difficult, because there simply isn't a directory in all Windows installations that can be assumed to be writeable for all users. Especially in business settings with locked-down user accounts it can be that the user has no write permissions for the local disc at all, only to their user directory somewhere on a server in the network. So there is no single simple way to grab that data location for your program and be sure it will work - it's something you will possibly have to revisit for all programs and all use cases anew.
For bonus points you should also always consider whether the files should go into the roaming profile and be available on all machines in a domain, or whether they are machine-specific.
One tip I would give is to move away from desktop style database files like Paradox or Access files for applications that need to share data between users on a machine. Only with real (local) database servers will you be able to have multi-user data on locked-down accounts / machines.
I can't comment on whether it works on all versions but I certainly agree that .ini files are the right answer. When XP said to use the registry I tried it--for one program. In-house deployment showed that the settings needed to be copied from machine to machine--the next update was back to .ini files.
I think the whole registry bit was an anti-piracy measure of theirs that they wanted to disguise as a best practice.
I tend to use XML files in %APPDATA% to hold my settings, only really because XML is easier than INI files to deal with from C#. Regardless, %APPDATA% does seem a safe bet.
I can't advise on whether 98SE supports this.