So I've documented my whole API with Swagger Editor, and now I have my .yaml file. I'm confused about how to take that file and generate the Node.js scaffolding, so that all the functions are already defined and I just fill them in with the appropriate code.
Swagger Codegen generates server stubs and client SDKs for a variety of languages and frameworks, including Node.js.
To generate a Node.js server stub, run codegen with the -l nodejs-server argument.
Windows example:
java -jar swagger-codegen-cli-2.2.2.jar generate -i petstore.yaml -l nodejs-server -o .\PetstoreServer
You get:
.
├── api
| └── swagger.yaml
├── controllers
| ├── Pet.js
| ├── PetService.js
| ├── Store.js
| ├── StoreService.js
| ├── User.js
| └── UserService.js
├── index.js
├── package.json
├── README.md
└── .swagger-codegen-ignore
Here is my issue: I want to include/exclude specific files when building a service called django. But when experimenting with the Dockerfile.dockerignore file, I can't achieve what I want, and I can't figure out which syntax to adopt in order to do so.
Repo structure is the following:
.
└── root_dir
├── django
│ ├── Dockerfile
│   ├── Dockerfile.dockerignore
│ └── stuff
├── documentation
│ └── stuff
├── useless dir
└── another useless dir
My docker-compose.yml used to make the build is the following:
version: '3.5'
services:
[...]
django:
build:
context: $ROOT_DIR
dockerfile: $ROOT_DIR/django/Dockerfile
[...]
As you can see, the context is root_dir and not django (because I need to copy a few things from the documentation dir at build time).
What I need when building my image:
the django dir
the documentation dir
What I want to ignore when building my image:
everything but the django and documentation dirs
Dockerfile and Dockerfile.dockerignore
and a few other things (virtual envs, *.pyc files, etc., but let's stick with Dockerfile and Dockerfile.dockerignore for this question!)
What my Dockerfile.dockerignore looks like in order to do so:
# Ignore everything
*
# Allows django and documentation rep (context is root of project)
!/django
!/documentation
# Ignoring some files
**/Dockerfile
**/Dockerfile.dockerignore
but my issue is that the files I want to ignore are still present in my image, and this is not good for me. I think I have tried every syntax possible (**/, */*/, ...) but I can't find one that suits my needs.
Many thanks if you can help me with that!
EDIT: for those wondering about that Dockerfile.dockerignore file, you can have a look here: https://docs.docker.com/engine/reference/commandline/build/#use-a-dockerignore-file
The Docker ignore file name should be .dockerignore
You can check the documentation here https://docs.docker.com/engine/reference/builder/#dockerignore-file
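With the classic builder, the same rules go into a single .dockerignore file placed in the build context root (a sketch based on the rules from the question; note that later lines override earlier ones, so the re-ignore patterns must come after the ! exceptions):

```
# .dockerignore at the root of the build context
# Ignore everything...
*
# ...except the django and documentation dirs
!/django
!/documentation
# ...but re-ignore these files inside the allowed dirs
**/Dockerfile
**/*.pyc
```

The per-Dockerfile Dockerfile.dockerignore variant from the question only works when BuildKit is enabled.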
Running this command before docker build fixed it for me:
export DOCKER_BUILDKIT=1
FYI: I did not need to install 'buildkit' first.
Problem
I write down lectures at university in LaTeX (which is really convenient for this purpose), and I want the .tex files to be automatically compiled to PDF.
I have a couple of .tex files in my repository, like this:
.
├── .gitlab-ci.yml
└── lectures
├── math
| ├── differentiation
| | ├── lecture_math_diff.tex
| | ├── chapter_1.tex
| | └── chapter_2.tex
| └── integration
| ├── lecture_math_int.tex
| ├── chapter_1.tex
| └── chapter_2.tex
└── physics
└── mechanics
├── lecture_physics_mech.tex
├── chapter_1.tex
└── chapter_2.tex
So a main file, for example lecture_math_diff.tex, uses
\include{chapter_1}
\include{chapter_2}
commands to pull in the chapters and form the whole lecture.
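For reference, a minimal main file for this layout could look like the following (a sketch; the real preamble will contain your own packages and macros):

```latex
% lecture_math_diff.tex -- minimal sketch of a main file
\documentclass{article}

\begin{document}

\include{chapter_1}
\include{chapter_2}

\end{document}
```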
As a result, I want my build artifacts to be PDFs, like this:
├── math
| ├── lecture_math_diff.pdf
| └── lecture_math_int.pdf
└── physics
└── lecture_physics_mech.pdf
What can be done here? Do I have to write a shell script to collect all the tex files, or can this be handled by the GitLab runners?
One approach would be to use a short script (e.g. Python or Bash) that runs latexmk to generate the PDF files.
latexmk is a Perl script which compiles LaTeX files automatically. A short introduction can be found here.
With Python 3 the script could look like the following:
# filename: make_lectures.py
import os

# configuration:
keyword_for_main_tex = "lecture"

if __name__ == "__main__":
    tex_root_directory = os.getcwd()
    for root, _, files in os.walk("."):
        for file_name in files:
            # check if the file name ends with "tex" and starts with the keyword
            if file_name.endswith("tex") and file_name.startswith(keyword_for_main_tex):
                os.chdir(root)  # go into the directory
                os.system("latexmk -lualatex " + file_name)  # run latexmk on the main file
                os.chdir(tex_root_directory)  # go back to the root directory in case of relative paths
This script assumes that only files to be compiled to PDF start with the keyword lecture (as in the question). But the if statement that selects the files to build could also be extended to a more elaborate comparison, such as matching regular expressions.
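For example, the selection check could be swapped for a regular expression (a sketch; the pattern below just mirrors the keyword convention from the question):

```python
import re

# Hypothetical convention: main files start with "lecture" and end in ".tex".
MAIN_TEX_PATTERN = re.compile(r"^lecture.*\.tex$")

def is_main_tex_file(file_name):
    """Return True if the file name matches the main-file convention."""
    return bool(MAIN_TEX_PATTERN.match(file_name))
```

This keeps the build rule in one place, so changing the naming convention later only means editing the pattern.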
latexmk is called with the command line flag -lualatex here to demonstrate how to configure the build process globally. A local, per-project configuration is possible with .latexmkrc files, which are read and processed by latexmk.
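A per-project .latexmkrc might look like this (a sketch; in recent latexmk versions $pdf_mode = 4 selects lualatex):

```perl
# .latexmkrc -- per-project latexmk configuration (sketch)
$pdf_mode = 4;        # 4 = build PDFs with lualatex
$out_dir  = 'build';  # put intermediate files and PDFs into ./build
```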
If we call latexmk as a shell command, we have to make sure that it is installed on our GitLab runner (along with texlive). If Docker container runners are registered (see here how it is done), then you just need to specify the name of an image from Docker Hub, which leads to the example gitlab-ci.yml file below:
compile_latex_to_pdf:
  image: philipptempel/docker-ubuntu-tug-texlive:latest
  script: python3 make_lectures.py
  artifacts:
    paths:
      - ./*.pdf
    expire_in: 1 week
Feel free to change the image to any other image you like (e.g. blang/latex:latest). Note that the artifacts extraction assumes that no other PDF files are in the repository.
A final remark: I did not try it, but it should also be possible to install texlive and latexmk directly on the GitLab runner (if you have access to it).
You can have a look at https://github.com/reallyinsane/mathan-latex-maven-plugin. With the Maven or Gradle plugin you can also use "dependencies" for your projects.
I have a Python script that loads a Glade-GUI that can be translated. Everything works fine under Linux, but I am having a lot of trouble understanding the necessary steps on Windows.
All that seems necessary under Linux is:
import locale
[...]
locale.setlocale(locale.LC_ALL, locale.getlocale())
locale.bindtextdomain(APP_NAME, LOCALE_DIR)
[...]
class SomeClass:
    def __init__(self):
        self.builder = Gtk.Builder()
        self.builder.set_translation_domain(APP_NAME)
locale.getlocale() returns, for example, ('de_DE', 'UTF-8'), and LOCALE_DIR just points at the folder that contains the compiled .mo files.
Under Windows, things are more difficult:
locale.getlocale() in the Python console returns (None, None), and locale.getdefaultlocale() returns ("de_DE", "cp1252"). Furthermore, trying to set locale.setlocale(locale.LC_ALL, "de_DE") will spit out this error:
locale.setlocale(locale.LC_ALL, "de_DE")
File "C:\Python34\lib\locale.py", line 592, in setlocale
return _setlocale(category, locale)
locale.Error: unsupported locale setting
I leave it to the reader to speculate why Windows does not accept the most common language codes. Instead, one is forced to use one of the lines below:
locale.setlocale(locale.LC_ALL, "deu_deu")
locale.setlocale(locale.LC_ALL, "german_germany")
Furthermore, the locale module on Windows does not have the bindtextdomain function. In order to use it, one needs to import ctypes:
import ctypes
libintl = ctypes.cdll.LoadLibrary("intl.dll")
libintl.bindtextdomain(APP_NAME, LOCALE_DIR)
libintl.bind_textdomain_codeset(APP_NAME, "UTF-8")
So my questions, apart from how this all works, are:
Which intl.dll do I need to include? (I tried the gnome/libintl-8.dll from this source: http://sourceforge.net/projects/pygobjectwin32/, (pygi-aio-3.14.0_rev19-setup.exe))
How can I check that e.g. the locale deu_deu picks up the correct mo/de/LC_MESSAGES/appname.mo file?
Edit
My folder structure (is it enough to have a de folder? I tried using a deu_deu folder, but that did not help):
├── gnome_preamble.py
├── installer.cfg
├── pygibank
│ ├── __init__.py
│ ├── __main__.py
│ ├── mo
│ │ └── de
│ │ └── LC_MESSAGES
│ │ └── pygibank.mo
│ ├── po
│ │ ├── de.po
│ │ └── pygibank.pot
│ ├── pygibank.py
│ └── ui.glade
└── README.md
I put the repository here: https://github.com/tobias47n9e/pygobject-locale
And the compiled Windows installer (64 bit) is here: https://www.dropbox.com/s/qdd5q57ntaymfr4/pygibank_1.0.exe?dl=0
Short summary of the answer
The mo-files should go into the gnome-packages in this way:
├── gnome
│ └── share
│ └── locale
│ └── de
| └── LC_MESSAGES
| └── pygibank.mo
You are close. This is a very complicated subject.
As I wrote in Question 10094335 and in Question 3678174:
To set up the locale to the user's current locale, do not call:
locale.setlocale(locale.LC_ALL, locale.getlocale())
Simply call:
locale.setlocale(locale.LC_ALL, '')
As explained in the Python setlocale reference documentation:
This sets the locale for all categories to the user’s default setting (typically specified in the LANG environment variable).
Note that Windows doesn't have the LANG environment variable set up, so you need to do this before that line:
import sys
import os
import locale

if sys.platform.startswith('win'):
    if os.getenv('LANG') is None:
        lang, enc = locale.getdefaultlocale()
        os.environ['LANG'] = lang
This will also make gettext work for in-Python translations.
You can check how this works in the source code here:
https://github.com/python/cpython/blob/master/Modules/_localemodule.c#L90
In particular, the error you're getting:
locale.Error: unsupported locale setting
is raised here:
https://github.com/python/cpython/blob/master/Modules/_localemodule.c#L112
This is just a generic error message indicating that the C call setlocale failed with the given parameters.
The C function setlocale is declared in the locale.h header. On Linux, this is:
Linux locale.h
On Windows, this is the one used:
Windows locale.h
In the Windows locale.h documentation you can read:
The set of language and country/region strings supported by setlocale are listed in Language Strings and Country/Region Strings.
And that points to:
Visual Studio 2010 Language Strings
Visual Studio 2010 Country/Region Strings
As you can see, for the 2010 version the setlocale function expects the locale in the format you found out, deu_deu, which differs from the de_DE format expected by the Linux version. Your only option is to use a list of OS-dependent locales to set up the locale. Very, very sad indeed.
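Such an OS-dependent list can be tried in order until one spelling is accepted (a sketch; the candidate names cover the POSIX, BCP 47-style, and old Visual Studio spellings for German, and which one succeeds depends on the platform and installed locales):

```python
import locale

# Candidate spellings for German, from POSIX style to old Visual Studio style.
GERMAN_CANDIDATES = ["de_DE.UTF-8", "de_DE", "de-DE", "deu_deu", "German_Germany"]

def set_first_supported_locale(candidates):
    """Try each locale name in turn; return the accepted name or None."""
    for name in candidates:
        try:
            locale.setlocale(locale.LC_ALL, name)
            return name
        except locale.Error:
            continue
    return None
```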
There is another issue here. If you change the version of the toolchain, you can see that newer versions of the setlocale function work more closely to what Linux/POSIX does:
Visual Studio 2015 Language Strings
(For example, American English is now en-US.)
Visual Studio 2010 was the last release to support the old format; starting with version 2012, the new locale format is expected.
As you can imagine, the format you need to use depends on the version of the toolchain that the CPython interpreter you're using was built with. I don't know which version you are using, but according to the official Python Developer's Guide:
Python 3.5 and later use Microsoft Visual Studio 2015. [...]
Python 3.3 and 3.4 use Microsoft Visual Studio 2010. [...]
Most Python versions prior to 3.3 use Microsoft Visual Studio 2008. [...]
That is all related to the Python locale module. Now, the gettext module and the gettext-related functions in the locale module rely on another C library called libintl.
libintl is the C library that is part of gettext and does all this translation magic:
libintl reference documentation
One relevant part of this documentation says:
Note that on GNU systems, you don’t need to link with libintl because the gettext library functions are already contained in GNU libc.
But on Windows, because of the issues explained in Question 10094335, you need to load the libintl library that is being used by PyGObject, that is, the very same one it was linked against during the build. That is done by following the steps you already wrote.
Which intl.dll do I need to include? (I tried the gnome/libintl-8.dll from this source: http://sourceforge.net/projects/pygobjectwin32/, (pygi-aio-3.14.0_rev19-setup.exe))
So, yes: the one that was used for linking when the PyGObject AIO was built.
How can I check if the e.g. locale deu_deu gets the correct /mo/de/LC_MESSAGES/appname.mo/
Configure a few messages and check whether they show up translated. Just a note: /mo/de/LC_MESSAGES/appname.mo is not a folder, appname.mo is a file.
Check my first answer on how to create the translation .po file from the Glade file.
My problem is that I can't run EUnit tests for a single app or module without including the root app. My directory layout looks a bit like this:
├── apps
│ ├── app1
│ └── app2
├── deps
│ ├── amqp_client
│ ├── meck
│ ├── rabbit_common
│ └── ranch
├── rebar.config
├── rel
└── src
├── rootapp.app.src
├── rootapp.erl
├── rootapp.erl
└── rootapp.erl
Now, what I can do is:
$ rebar eunit skip_deps=true
which runs the tests for all apps. Also, I can do:
$ cd apps/app1/
$ rebar eunit skip_deps=true
which runs the tests for app1 (I have a rebar.config in apps/app1 as well).
However, if I try
$ rebar eunit skip_deps=true apps=app1
it does... nothing. No output. Trying verbose mode gives me:
$ rebar -vv eunit skip_deps=true apps=app1
DEBUG: Consult config file "/Users/myuser/Development/erlang/rootapp/rebar.config"
DEBUG: Rebar location: "/usr/local/bin/rebar"
DEBUG: Consult config file "/Users/myuser/Development/erlang/erlactive/src/rootapp.app.src"
DEBUG: Skipping app: rootapp
When I include the root app, it works:
$ rebar eunit skip_deps=true apps=rootapp,app1
Despite the fact that I actually want to test app1, not rootapp, this is really inconvenient, since the SublimeErl plugin for Sublime Text 2 will always set apps to the app that contains the module under test. So the tests will always fail, because actually no tests will run at all.
Long story short: Is there something I can configure in any of the rebar.config files to make it possible to run the tests for one app in /apps without including the root app?
Personally, I prefer to put the main app into its own OTP-compliant folder in apps. Just create a new app rootapp in apps and include it in your rebar.config:
{sub_dirs, ["apps/app1",
"apps/app2",
"apps/rootapp"]}.
You might also have to include the apps directory into your lib path:
{lib_dirs, ["apps"]}.
You might want to have a look at Fred Hebert's blog post "As bad as anything else".
With this setup you should be able to run:
rebar skip_deps=true eunit
which will run all EUnit tests of the apps in apps.
When developing OpenLaszlo applications, it's sometimes useful to generate the ActionScript 3 source code of an application written in lzx, e.g. when you want to compile OpenLaszlo into an Adobe AIR application.
What is the simplest way to generate the ActionScript 3 source code into a predefined folder?
The lzc command line tool, which can be found in $LPS_HOME/WEB-INF/lps/server/bin/, has an option for that:
--lzxonly
for as3 runtime, emit intermediate as files,
but don't call backend as3 compiler
By default, the OpenLaszlo compiler will generate the ActionScript 3 code into the system-specific Java temp folder, but the $JAVA_OPTS environment variable can be used to change that folder.
Here's an example of how the command can be used in combination with $JAVA_OPTS on Linux:
a) Create a simple LZX file, e.g.
<canvas>
<button text="Hello world" />
</canvas>
and save it as test.lzx.
b) Set the $JAVA_OPTS variable
The following syntax is for Linux or OS X:
export JAVA_OPTS="-Djava.io.tmpdir=./tmp -Xmx1024M"
c) Compile the LZX into ActionScript 3
> lzc --lzxonly test.lzx --runtime=swf10
Compiling: test.lzx to test.swf10.swf
The tmp folder will contain the generated ActionScript 3 files
tmp
├── lzccache
└── lzswf9
└── build
└── test
├── app.swf
├── build.sh
├── LzApplication.as
├── $lzc$class_basebutton.as
├── $lzc$class_basecomponent.as
├── $lzc$class_basefocusview.as
├── $lzc$class_button.as
├── $lzc$class__componentmanager.as
├── $lzc$class_focusoverlay.as
├── $lzc$class__m2u.as
├── $lzc$class__m2v.as
├── $lzc$class__m2w.as
├── $lzc$class__m2x.as
├── $lzc$class__m2y.as
├── $lzc$class__m2z.as
├── $lzc$class__m30.as
├── $lzc$class__m31.as
├── $lzc$class__mm.as
├── $lzc$class__mn.as
├── $lzc$class__mo.as
├── $lzc$class__mp.as
├── $lzc$class_statictext.as
├── $lzc$class_style.as
├── $lzc$class_swatchview.as
├── LZC_COMPILER_OPTIONS
├── LzPreloader.as
└── LzSpriteApplication.as
The folder structure follows the scheme {JAVA_TEMP_FOLDER}/lzswf9/build/{LZX_FILENAME_WITHOUT_ENDING}, which in our case is:
tmp/lzswf9/build/test/
The main application file is LzSpriteApplication.as, and you can look into the build.sh file to get an idea of how the Flex SDK's mxmlc command is used to compile the generated source code.