I have a Python script that loads a Glade-GUI that can be translated. Everything works fine under Linux, but I am having a lot of trouble understanding the necessary steps on Windows.
All that seems necessary under Linux is:
import locale
[...]
locale.setlocale(locale.LC_ALL, locale.getlocale())
locale.bindtextdomain(APP_NAME, LOCALE_DIR)
[...]
class SomeClass():
    self.builder = Gtk.Builder()
    self.builder.set_translation_domain(APP_NAME)
locale.getlocale() returns, for example, ('de_DE', 'UTF-8'), and LOCALE_DIR just points at the folder that holds the compiled mo-files.
Under Windows this makes things more difficult:
locale.getlocale() in the Python console returns (None, None) and locale.getdefaultlocale() returns ("de_DE", "cp1252"). Furthermore, calling locale.setlocale(locale.LC_ALL, "de_DE") spits out this error:
locale.setlocale(locale.LC_ALL, "de_DE")
File "C:\Python34\lib\locale.py", line 592, in setlocale
return _setlocale(category, locale)
locale.Error: unsupported locale setting
I leave it to the reader to speculate why Windows does not accept the most common language codes. So instead one is forced to use one of the lines below:
locale.setlocale(locale.LC_ALL, "deu_deu")
locale.setlocale(locale.LC_ALL, "german_germany")
Furthermore, the locale module on Windows does not have the bindtextdomain function. In order to use it one needs to import ctypes:
import ctypes
libintl = ctypes.cdll.LoadLibrary("intl.dll")
libintl.bindtextdomain(APP_NAME, LOCALE_DIR)
libintl.bind_textdomain_codeset(APP_NAME, "UTF-8")
So my questions, apart from how this works, are:
Which intl.dll do I need to include? (I tried the gnome/libintl-8.dll from this source: http://sourceforge.net/projects/pygobjectwin32/, (pygi-aio-3.14.0_rev19-setup.exe))
How can I check if the e.g. locale deu_deu gets the correct /mo/de/LC_MESSAGES/appname.mo/?
Edit
My folder structure (Is it enough to have a de folder? I tried using a deu_deu folder but that did not help):
├── gnome_preamble.py
├── installer.cfg
├── pygibank
│   ├── __init__.py
│   ├── __main__.py
│   ├── mo
│   │   └── de
│   │       └── LC_MESSAGES
│   │           └── pygibank.mo
│   ├── po
│   │   ├── de.po
│   │   └── pygibank.pot
│   ├── pygibank.py
│   └── ui.glade
└── README.md
I put the repository here: https://github.com/tobias47n9e/pygobject-locale
And the compiled Windows installer (64 bit) is here: https://www.dropbox.com/s/qdd5q57ntaymfr4/pygibank_1.0.exe?dl=0
Short summary of the answer
The mo-files should go into the gnome-packages in this way:
├── gnome
│   └── share
│       └── locale
│           └── de
│               └── LC_MESSAGES
│                   └── pygibank.mo
You are close. This is a very complicated subject.
As I wrote in Question 10094335 and in Question 3678174:
To set up the locale to the user's current locale, do not call:
locale.setlocale(locale.LC_ALL, locale.getlocale())
Simply call:
locale.setlocale(locale.LC_ALL, '')
As explained in Python setlocale reference documentation.
This sets the locale for all categories to the user’s default setting (typically specified in the LANG environment variable).
Note that Windows doesn't have the LANG environment variable set, so you need to do this before that line:
import sys
import os
import locale

if sys.platform.startswith('win'):
    if os.getenv('LANG') is None:
        lang, enc = locale.getdefaultlocale()
        os.environ['LANG'] = lang
This will also make gettext work for in-Python translations.
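A minimal sketch of that in-Python side, assuming the APP_NAME and LOCALE_DIR placeholders from the question (gettext silently falls back to the untranslated strings when no matching .mo file is found):

```python
import gettext

# "appname" and "locale" stand in for the question's APP_NAME / LOCALE_DIR.
APP_NAME = "appname"
LOCALE_DIR = "locale"

# install() picks the language from LANG (set above on Windows) and
# puts _() into builtins; fallback behavior is the default here.
gettext.install(APP_NAME, localedir=LOCALE_DIR)

print(_("Hello"))  # translated if a matching .mo exists, "Hello" otherwise
```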
You can check how this works in the source code here:
https://github.com/python/cpython/blob/master/Modules/_localemodule.c#L90
In particular, the error you're getting:
locale.Error: unsupported locale setting
is raised here:
https://github.com/python/cpython/blob/master/Modules/_localemodule.c#L112
which is just a generic error message indicating that the C setlocale call failed with the given parameters.
The C call setlocale is defined in the locale.h header. In Linux, this is:
Linux locale.h
In Windows, this is the one used:
Windows locale.h
In Windows locale.h documentation you can read:
The set of language and country/region strings supported by setlocale are listed in Language Strings and Country/Region Strings.
And that points to:
Visual Studio 2010 Language Strings
Visual Studio 2010 Country/Region Strings
As you can see, for the 2010 version the setlocale function expects the locale in the format you found out, deu_deu, which differs from the de_DE format expected by the Linux version. Your only option is to use a list of OS-dependent locale names to set up the locale. Very, very sad indeed.
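A minimal sketch of that sad workaround: try OS-dependent spellings of the same locale until one is accepted (the candidate names below are assumptions, not a complete list):

```python
import locale

def try_locales(candidates):
    """Return the locale string for the first name setlocale accepts, or None."""
    for name in candidates:
        try:
            return locale.setlocale(locale.LC_ALL, name)
        except locale.Error:
            continue
    return None

# Linux/POSIX-style names first, then the old Windows spellings.
german = try_locales(["de_DE.UTF-8", "de_DE", "de-DE", "deu_deu", "german_germany"])
```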
There is another issue here. If you change the version of the toolchain, you can see that newer versions of the setlocale function behave more closely to what Linux/POSIX does:
Visual Studio 2015 Language Strings
american english en-US
Visual Studio 2010 is the last release to support the old format; starting from version 2012 the new locale format is expected.
As you can imagine, the one you need to use depends on the version of the toolchain the CPython interpreter you're using was built with. I don't know which version you are using, but according to the official Python Developer's Guide:
Python 3.5 and later use Microsoft Visual Studio 2015. [...]
Python 3.3 and 3.4 use Microsoft Visual Studio 2010. [...]
Most Python versions prior to 3.3 use Microsoft Visual Studio 2008. [...]
That is all related to the Python locale module. Now, the gettext module, and the gettext-related functions in the locale module, rely on another C library called libintl.
libintl is the C library that is part of gettext and provides all this translation magic:
libintl reference documentation
One relevant part of this documentation says:
Note that on GNU systems, you don’t need to link with libintl because the gettext library functions are already contained in GNU libc.
But in Windows, because of the issues explained in Question 10094335, you need to load the libintl library that is being used by PyGObject, that is, the very same one it was linked against during the build. That is done with the steps you already wrote.
Which intl.dll do I need to include? (I tried the gnome/libintl-8.dll from this source: http://sourceforge.net/projects/pygobjectwin32/, (pygi-aio-3.14.0_rev19-setup.exe))
So, yes: the one that was linked against when the PyGObject AIO was built.
How can I check if the e.g. locale deu_deu gets the correct /mo/de/LC_MESSAGES/appname.mo/
Configure a few messages and check whether they show up translated. Just a note: "/mo/de/LC_MESSAGES/appname.mo/" is not a folder; appname.mo is a file.
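One way to check the lookup without starting the GUI is gettext.find, which reports the .mo file that would be used (the domain name "appname" and the "mo" directory below are placeholders from the question):

```python
import gettext

# Returns the full path of mo/de/LC_MESSAGES/appname.mo if that file
# exists, and None otherwise -- handy for verifying the folder layout.
mo_path = gettext.find("appname", localedir="mo", languages=["de"])
print(mo_path)
```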
Check my first answer to how to create the translation .po file from the Glade file.
Related
I have a complex config search path consisting of multiple locations where each location looks similar to this:
├── conf
│   └── foo
│       ├── foo.yaml
│       └── bar.yaml
└── files
    ├── foo.txt
    └── bar.txt
with foo.yaml:
# @package _group_
path: "../../files/foo.txt"
and bar.yaml:
# @package _group_
path: "../../files/bar.txt"
Now the problem is: how do I find the correct location of the files specified in the configurations? I am aware of the to_absolute_path() method provided by hydra, but it interprets the path relative to the directory in which the application was started. However, I would like to interpret that path relative to the position of the configuration file. I cannot do this manually in my code, because I don't know how hydra resolved the configuration file or where exactly it lives.
Is there some mechanism to determine the location of a config file from hydra? I really want to refrain from putting hard coded absolute paths in my configurations.
You can't get the path of a config file. In fact, it may not be a file at all (as in the case of Structured Configs), or it can be inside a Python wheel (even a zipped wheel).
You can do something like
path = os.path.join(os.path.dirname(__file__), "relative_path_from_config")
You can also use APIs designed for loading resource files from Python modules.
Here is a good answer on the topic.
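For example, importlib.resources from the standard library also works when the package lives inside a zipped wheel; a sketch, where the package and resource names passed in are hypothetical:

```python
from importlib import resources

def read_packaged_text(package: str, name: str) -> str:
    """Read a text resource shipped inside a Python package,
    whether it sits on disk or inside a zip/wheel."""
    return resources.files(package).joinpath(name).read_text()

# e.g. read_packaged_text("my_pkg.conf", "foo.yaml")  # hypothetical names
```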
I've an external dependency declared in WORKSPACE as a new_git_repository and provided a BUILD file for it.
proj/
├── BUILD
├── external
│   ├── BUILD.myDep
│   └── code.bzl
└── WORKSPACE
in the BUILD.myDep file, I want to load code.bzl nearby, but when I load it (load("//:external/code.bzl", "some_func")) bazel tries to load @myDep//:external/code.bzl instead!
Of course it's not a target in the @myDep repository, but in my local workspace.
Seems I rubber-ducked Stack Overflow, since the solution appeared while writing the question!
However, the solution is to explicitly mention the local workspace when loading the .bzl file:
Suppose we have declared the name in the WORKSPACE as below:
workspace(name = "local_proj")
Now instead of load("//:external/code.bzl", "some_func"), just load it explicitly as a local workspace file:
load("@local_proj//:external/code.bzl", "some_func")
NOTE: When using this trick, just be careful about potential dependency loops (i.e. loading a generated file that is itself produced by a rule depending on the same external repo!)
Problem
I write down lectures at university in LaTeX (which is really convenient for this purpose), and I want the tex files to automatically compile to PDF.
I have a couple of .tex files in my repository like this:
.
├── .gitlab-ci.yml
└── lectures
    ├── math
    │   ├── differentiation
    │   │   ├── lecture_math_diff.tex
    │   │   ├── chapter_1.tex
    │   │   └── chapter_2.tex
    │   └── integration
    │       ├── lecture_math_int.tex
    │       ├── chapter_1.tex
    │       └── chapter_2.tex
    └── physics
        └── mechanics
            ├── lecture_physics_mech.tex
            ├── chapter_1.tex
            └── chapter_2.tex
So the main file, for example lecture_math_diff.tex, uses
\include{chapter_1}
\include{chapter_2}
commands to form the whole lecture.
And as a result, I want to have my build artifacts in PDF like this:
├── math
│   ├── lecture_math_diff.pdf
│   └── lecture_math_int.pdf
└── physics
    └── lecture_physics_mech.pdf
What can be done here? Do I have to write a sh script to collect all the tex files, or use gitlab runners?
One approach would be to use a short script (e.g. python or bash) and to run latexmk to generate the PDF files.
latexmk is a perl script which compiles latex files automatically. A short introduction can be found here.
With python3 the script could look like the following one:
# filename: make_lectures.py
import os

# configuration:
keyword_for_main_tex = "lecture"

if __name__ == "__main__":
    tex_root_directory = os.getcwd()
    for root, _, files in os.walk("."):
        for file_name in files:
            # check if the file name ends with `tex` and starts with the keyword
            if file_name.endswith("tex") and file_name.startswith(keyword_for_main_tex):
                os.chdir(root)  # go into the directory
                os.system("latexmk -lualatex " + file_name)  # run latexmk on the main file
                os.chdir(tex_root_directory)  # go back to the root directory in case of relative paths
This script assumes that only files to be compiled to PDF start with the keyword lecture (as in the question). But the if statement, which checks for files to build, could also be extended to more elaborate comparisons such as matching regular expressions.
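A sketch of that extension, matching main files with a regular expression instead of the prefix/suffix checks (the pattern is an assumption based on the naming scheme in the question):

```python
import re

# Assumed naming scheme: main files look like lecture_math_diff.tex.
MAIN_TEX = re.compile(r"^lecture.*\.tex$")

def is_main_file(file_name):
    """True for files like lecture_math_diff.tex, False for chapter_1.tex."""
    return MAIN_TEX.match(file_name) is not None
```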
latexmk is called with the command line flag -lualatex here to demonstrate how to configure the build process globally. A local configuration possibility (for each single project) is given with .latexmkrc files, which are read and processed by latexmk.
If we call latexmk as a shell command, we have to make sure that it is installed on our gitlab runner (and also texlive). If Docker container runners are registered (see here how it is done), then you just need to specify the name of an image from DockerHub, which leads to the example gitlab-ci.yml file below:
compile_latex_to_pdf:
  image: philipptempel/docker-ubuntu-tug-texlive:latest
  script: python3 make_lectures.py
  artifacts:
    paths:
      - ./*.pdf
    expire_in: 1 week
Feel free to change the image to any other image you like (e.g. blang/latex:latest). Note that the artifacts extraction assumes that no other PDF files are in the repository.
A final remark: I did not try it, but it should also be possible to install texlive and latexmk directly on the gitlab runner (if you have access to it).
You can have a look at https://github.com/reallyinsane/mathan-latex-maven-plugin. With the maven or gradle plugin you can also use "dependencies" for your projects.
I have this setup:
├── bin
│   ├── all.dart
│   ├── details
│   │   ├── script1.dart
│   │   └── script2.dart
│   │   .....
all.dart simply imports script1.dart and script2.dart and calls their main. The goal is to have a bunch of scripts under details that can be run individually. Additionally, I want a separate all.dart script that can run them all at once. This will make debugging individual scripts simpler while still allowing all of them to run.
all.dart
import 'details/script1.dart' as script1;
import 'details/script2.dart' as script2;
main() {
script1.main();
script2.main();
}
script1.dart
main() => print('script1 run');
script2.dart
main() => print('script2 run');
So, this is working and I see the expected print statements when running all.dart, but I have two issues.
First, I have to softlink packages under details. Apparently pub does not propagate the packages softlinks down to subfolders. Is this expected, or is there a workaround?
Second, there are errors flagged in all.dart at the point of the second import statement. The analyzer error is:
The imported libraries 'script1.dart' and 'script2.dart' should not have the same name ''
So my guess is: since I'm importing other scripts as if they were libraries, and since they do not have a library script[12]; statement at the top, they both have the same name, the empty name?
Note: Originally I had all of these under lib, and I could run them as scripts by specifying a suitable --package-root on the command line, even though they were libraries with main. But then to debug I need to run in Dart Editor, which is why I'm moving them to bin. Perhaps the editor should allow libraries under lib with a main to be run as a script, since they run outside the editor just fine? The actual distinction between script and library seems a bit unnecessary (as other scripting languages allow files to be both).
How do I clean this up?
I'm not sure what the actual question is.
If a library has no library statement, then the empty string is used as its name.
Just add a library statement with a unique name (for example, library script1;) to fix this.
Adding symlinks to subdirectories solves the problem with the imports for scripts in subdirectories.
I do this regularly.
It was mentioned several times on dartbug.com that symlinks should go away entirely, but I have no idea how long this will take.
I have never tried to put script files with a main in lib, but it is against the package layout conventions, and I guess that is why DartEditor doesn't support it.
I am following the blog How to generate language fragment bundles to localize Carbon products by Tanya Madurapperuma, and I am having the following problem. Once the language bundles are generated with the ant localize command, they are placed in the CARBON_HOME/repository/components/dropins/ folder. The problem is that when I run the tool, I am not able to change the language to Spanish. I would appreciate help in figuring out what I may be missing.
Note: All resources.properties files are translated into Spanish.
If you have the jars with the translated resources.properties files in your dropins folder, you need to restart the server and set the Locale setting of your browser to Spanish.
Locale should be changed in the browser, and then the server will pick the matching resources files to use.
UPDATE:
There are some problems here.
First, there's a bug if you have multiple directories in the resources directory. For now, make sure that you have only one directory inside the resources directory when you run the localize task.
You should have the properties files inside a directory with the bundle name, without the tree structure. So your resources directory should look like this:
../resources/
└── org.wso2.carbon.i18n_4.2.0
    ├── JSResources_es.properties
    └── Resources_es.properties
You need to include the locale code as _es in your file names, as shown above.
Also, the localize tool seems to append i18n at the end of the folder structure of the built jar. This works with UI bundles, but in the case of org.wso2.carbon.i18n it ends up as org/wso2/carbon/i18n/i18n. So open the built jar in the dropins folder and remove the extra i18n folder, so that the jar tree structure looks like the following:
../repository/components/dropins/org.wso2.carbon.i18n.languageBundle_4.2.0.jar
├── META-INF
│   └── MANIFEST.MF
└── org
    └── wso2
        └── carbon
            └── i18n
                ├── JSResources_es.properties
                └── Resources_es.properties
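If you want to verify the jar's internal tree without a GUI archive tool, a small Python sketch like this lists the entries (the jar path passed in is a placeholder; a jar is just a zip file):

```python
import zipfile

def list_jar(path):
    """Return the entry names inside a jar (which is just a zip archive)."""
    with zipfile.ZipFile(path) as jar:
        return jar.namelist()

# e.g. list_jar("repository/components/dropins/"
#               "org.wso2.carbon.i18n.languageBundle_4.2.0.jar")
```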
Did you get this to work?
The place where I suspect you might have gone wrong is the folder structure in the resources folder. (You can place your resource files anywhere and execute the command as ant localize -Dresources.directory=path_to_your_resources_directory.)
Also note that a resources folder should follow the proper naming conventions of the OSGi bundle.
Ex: org.wso2.carbon.claim.mgt.ui_4.2.0 (this entire thing is the name of the folder)
If you still couldn't get this to work, mail me your resources folder at tanyamadurapperuma@gmail.com