.bzl file in external dependencies - bazel

I have an external dependency declared in my WORKSPACE as a new_git_repository, and I have provided a BUILD file for it:
proj/
├── BUILD
├── external
│   ├── BUILD.myDep
│   └── code.bzl
└── WORKSPACE
In the BUILD.myDep file, I want to load the nearby code.bzl, but when I load it (load("//:external/code.bzl", "some_func")), Bazel tries to load @myDep//:external/code.bzl instead!
Of course that is not a target in the @myDep repository, but in my local workspace.

It seems I rubber-ducked Stack Overflow, since the solution appeared while I was writing the question!
Anyway, the solution is to explicitly reference the local workspace when loading the .bzl file.
Suppose we have declared the name in the WORKSPACE as below:
workspace(name = "local_proj")
Now instead of load("//:external/code.bzl", "some_func"), just load it explicitly as a local workspace file:
load("#local_proj//:external/code.bzl", "some_func")
NOTE: When using this trick, be careful about potential dependency loops (i.e. loading a generated file that is itself produced by a rule depending on the same external repo!)
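For illustration, here is a minimal sketch of how the pieces fit together; the body of some_func is my own assumption, since the question only names the function:

# external/code.bzl (lives in the local workspace)
def some_func(name):
    # hypothetical helper, standing in for whatever the real macro does
    native.filegroup(
        name = name,
        srcs = native.glob(["**"]),
    )

# BUILD.myDep (used as the build file for the new_git_repository)
load("@local_proj//:external/code.bzl", "some_func")

some_func(name = "all_files")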

Related

With Bazel's `http_archive`, is there a way to add already existing files to the extracted sources?

With Bazel I'm building an external library using http_archive, together with some patches that bring additional features:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name="some-lib",
build_file="#my-project//some-lib:BUILD.some-lib",
url="https://example/download/some-lib.tar.gz",
sha256="8d9405baf113a9f25e4fb961d56f9f231da02e3ada0f41dbb0fa4654534f717b",
patches=[
"//some-lib/patches:01-add-additional-features.dif",
],
patch_args=["-p1"],
patch_tool="patch",
)
The file structure looks like this:
some-lib
├── BUILD
├── BUILD.some-lib
├── include/additional_features.h
├── some-lib.bzl
└── patches
    ├── 01-add-additional-features.dif
    └── BUILD
This basically works, but I still struggle with adding include/additional_features.h to the extracted source folder.
I first tried to just list the file in the filegroup I use to later run configure_make, like this:
filegroup(
    name = "all_srcs",
    srcs = glob(["**"]) + ["@my-project//some-lib:include/additional_features.h"],
)
then I'm getting
no such target '//some-lib:includes/additional_features.h': target 'includes/additional_features.h' not declared in package 'some-lib'; however, a source file of this name exists.
My next idea was to use the tools http_archive provides to make the file part of the source folder.
While you can use patches to modify the extracted folder, you'd need a dedicated diff file just to create the extra header file, which you would then have to create in advance (i.e. via a Bazel rule) and declare as a dependency, etc., which I'd like to avoid to keep things simple.
There is also patch_cmds, which alongside patches can be used to modify the extracted source folder by running arbitrary Bash commands, so I tried something like this:
patch_cmds = [
    "cp @my-project//some-lib:include/additional_features.h include/",
],
but this does not work for me; I'm getting
Error in fail: Error applying patch command cp @my-project//some-lib:include/additional_features.h include/:
cp: cannot stat '@my-project//some-lib:include/additional_features.h': No such file or directory
So it looks like the syntax for specifying a path the way I do with build_file does not work with patch_cmds, or that the file can't be accessed at that stage.
Does one of the approaches I tried actually work, and I just didn't use the right syntax?
What's the Bazel way to add (a bunch of) readily available files (i.e. files in the same repository as the Bazel rules I provide) to an http_archive-based source directory?
Try putting exports_files(["include/additional_features.h"], visibility=["//visibility:public"]) in some-lib/BUILD, so that Bazel will let you reference source files from the //some-lib package in the external repository (and elsewhere).
In fact, the "no such target" error message itself hints at exports_files.
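As an illustration, a minimal sketch of the two files involved (the workspace name my-project is taken from the http_archive call above; placing the filegroup in BUILD.some-lib is my assumption):

# some-lib/BUILD (in the main repository)
exports_files(
    ["include/additional_features.h"],
    visibility = ["//visibility:public"],
)

# BUILD.some-lib (the build file injected into the external repo)
filegroup(
    name = "all_srcs",
    srcs = glob(["**"]) + ["@my-project//some-lib:include/additional_features.h"],
)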

Is it possible to specify paths in a configuration file that are relative to the configuration file location?

I have a complex config search path consisting of multiple locations where each location looks similar to this:
├── conf
│   └── foo
│       ├── foo.yaml
│       └── bar.yaml
└── files
    ├── foo.txt
    └── bar.txt
with foo.yaml:
# @package _group_
path: "../../files/foo.txt"
and bar.yaml:
# @package _group_
path: "../../files/bar.txt"
Now the problem is: how do I find the correct location of the files specified in the configurations? I am aware of the to_absolute_path() method provided by Hydra, but it interprets the path relative to the directory in which the application was started. However, I would like to interpret that path relative to the location of the configuration file. I cannot do this manually in my code, because I don't know how Hydra resolved the configuration file and where exactly it is located.
Is there some mechanism to determine the location of a config file from Hydra? I really want to refrain from putting hard-coded absolute paths in my configurations.
You can't get the path of a config file. In fact, it may not be a file at all (as is the case for Structured Configs), or it can be inside a Python wheel (even in a zipped wheel).
You can do something like
path = os.path.join(os.path.dirname(__file__), "relative_path_from_config")
You can also use the APIs designed for loading resource files from Python modules.
Here is a good answer on the topic.
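For example, a minimal sketch using importlib.resources (Python 3.9+); the package name myapp and the resource path are assumptions for illustration:

from importlib.resources import files

def read_bundled_text(relative_name: str) -> str:
    # files() resolves resources even when the package ships inside a
    # zipped wheel, which a plain __file__ lookup cannot handle.
    return files("myapp").joinpath(relative_name).read_text(encoding="utf-8")

print(read_bundled_text("files/foo.txt"))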

Compile multiple LaTeX files w/ GitLab CI

Problem
I write down lectures at university in LaTeX (which is really convenient for this purpose), and I want the .tex files to be compiled to PDF automatically.
I have a couple of .tex files in my repository, like this:
.
├── .gitlab-ci.yml
└── lectures
    ├── math
    │   ├── differentiation
    │   │   ├── lecture_math_diff.tex
    │   │   ├── chapter_1.tex
    │   │   └── chapter_2.tex
    │   └── integration
    │       ├── lecture_math_int.tex
    │       ├── chapter_1.tex
    │       └── chapter_2.tex
    └── physics
        └── mechanics
            ├── lecture_physics_mech.tex
            ├── chapter_1.tex
            └── chapter_2.tex
So a main file, for example lecture_math_diff.tex, uses
\include{chapter_1}
\include{chapter_2}
commands to form the whole lecture.
And as a result, I want to have my build artifacts as PDFs, like this:
├── math
│   ├── lecture_math_diff.pdf
│   └── lecture_math_int.pdf
└── physics
    └── lecture_physics_mech.pdf
What can be done here? Do I have to write a shell script to collect all the .tex files, or should I use GitLab runners?
One approach would be to use a short script (e.g. Python or Bash) and to run latexmk to generate the PDF files.
latexmk is a Perl script which compiles LaTeX files automatically. A short introduction can be found here.
With Python 3 the script could look like the following:
# filename: make_lectures.py
import os

# configuration:
keyword_for_main_tex = "lecture"

if __name__ == "__main__":
    tex_root_directory = os.getcwd()
    for root, _, files in os.walk("."):
        for file_name in files:
            # check if the file name ends with `tex` and starts with the keyword
            if file_name[-3:] == "tex" and file_name[0:len(keyword_for_main_tex)] == keyword_for_main_tex:
                os.chdir(root)  # go into the directory
                os.system("latexmk -lualatex " + file_name)  # run latexmk on the main file
                os.chdir(tex_root_directory)  # go back to the root directory in case of relative paths
This script assumes that only files to be compiled to PDF start with the keyword lecture (as in the question). But the if statement that selects the files to build could also be extended to a more elaborate check, such as matching regular expressions.
latexmk is called with the command line flag -lualatex here to demonstrate how to configure the build process globally. A local configuration possibility (per project) is given by .latexmkrc files, which are read and processed by latexmk.
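For instance, a minimal .latexmkrc sketch, placed in the project directory (my assumption: a latexmk version recent enough that $pdf_mode = 4 selects lualatex):

# .latexmkrc -- per-project latexmk configuration (Perl syntax)
$pdf_mode = 4;  # 4 = build PDFs with lualatex instead of pdflatex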
If we call latexmk as a shell command, we have to make sure that it is installed on our GitLab runner (along with TeX Live). If Docker container runners are registered (see here for how this is done), then you just need to specify the name of an image from Docker Hub, which leads to the example gitlab-ci.yml file below:
compile_latex_to_pdf:
  image: philipptempel/docker-ubuntu-tug-texlive:latest
  script: python3 make_lectures.py
  artifacts:
    paths:
      - ./*.pdf
    expire_in: 1 week
Feel free to change the image to any other image you like (e.g. blang/latex:latest). Note that the artifacts extraction assumes that no other PDF files are in the repository.
A final remark: I did not try it, but it should also be possible to install TeX Live and latexmk directly on the GitLab runner (if you have access to it).
You can have a look at https://github.com/reallyinsane/mathan-latex-maven-plugin. With the Maven or Gradle plugin you can also use "dependencies" for your projects.

Dart scripts that invoke scripts by importing them

I have this setup:
├── bin
│   ├── all.dart
│   └── details
│       ├── script1.dart
│       ├── script2.dart
│       └── ...
all.dart simply imports script1.dart and script2.dart and calls their main functions. The goal is to have a bunch of scripts under details that can be run individually, plus a separate all.dart script that can run them all at once. This makes debugging individual scripts simpler while still allowing all of them to run.
all.dart
import 'details/script1.dart' as script1;
import 'details/script2.dart' as script2;
main() {
  script1.main();
  script2.main();
}
script1.dart
main() => print('script1 run');
script2.dart
main() => print('script2 run');
So this is working, and I see the expected print statements when running all.dart, but I have two issues.
First, I have to symlink packages under details. Apparently pub does not propagate the packages symlinks down to subfolders. Is this expected, or is there a workaround?
Second, there are errors flagged in all.dart at the point of the second import statement. The analyzer error is:
The imported libraries 'script1.dart' and 'script2.dart' should not have the same name ''
So my guess is that since I'm importing other scripts as if they were libraries, and since they do not have a library script[12]; statement at the top, they both have the same name: the empty name?
Note: Originally I had all of these under lib, and I could run them as scripts by specifying a suitable --package-root on the command line, even though they were libraries with a main. But to debug I need to run in Dart Editor, which is why I'm moving them to bin. Perhaps the editor should allow libraries under lib with a main to be run as scripts, since they run fine outside the editor? The distinction between script and library seems a bit unnecessary (other scripting languages allow files to be both).
How do I clean this up?
I'm not sure what the actual question is.
If a library has no library statement, then the empty string is used as its name.
Just add a library statement with a unique name to fix this.
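For example, a minimal sketch of that fix (the library names are my own choice):

// script1.dart
library script1;

main() => print('script1 run');

// script2.dart
library script2;

main() => print('script2 run');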
Adding symlinks to subdirectories solves the problem with the imports for scripts in subdirectories. I do this regularly.
It has been mentioned several times on dartbug.com that symlinks should go away entirely, but I have no idea how long this will take.
I have never tried to put script files with a main in lib, but it is simply against the package layout conventions, and I guess this is why Dart Editor doesn't support it.

How to generate language fragment bundles to localize Carbon products

Following the blog post How to generate language fragment bundles to localize Carbon products by Tanya Madurapperuma, I am having the following problem. Once the language bundles are generated with the ant localize command, they end up in the CARBON_HOME/repository/components/dropins/ folder. The problem is that when I run the tool, the language does not change to Spanish as I expect. I would appreciate help figuring out what I may be missing.
Note: All resources.properties files are translated into Spanish.
If you have the jars with the translated resources.properties files in your dropins folder, you need to restart the server and set the locale of your browser to Spanish.
The locale should be changed in the browser; the server will then pick the matching resource files to use.
UPDATE:
There are some problems here.
First, there's a bug if you have multiple directories in the /resources directory. For now, make sure that you have only one directory inside the resources directory when you run the localize task.
You should have the properties files inside a directory with the bundle name, without the tree structure. So your resources directory should look like this:
../resources/
└── org.wso2.carbon.i18n_4.2.0
    ├── JSResources_es.properties
    └── Resources_es.properties
You need to include the locale code as _es in your files as shown above.
Also, the localize tool seems to append i18n at the end of the folder structure of the built jar. This works for UI bundles, but in the case of org.wso2.carbon.i18n it ends up as org/wso2/carbon/i18n/i18n. So open the built jar in the dropins folder and remove the extra i18n folder, so that the jar's tree structure looks like the following:
../repository/components/dropins/org.wso2.carbon.i18n.languageBundle_4.2.0.jar
├── META-INF
│   └── MANIFEST.MF
└── org
    └── wso2
        └── carbon
            └── i18n
                ├── JSResources_es.properties
                └── Resources_es.properties
Did you get this to work?
The place where I suspect you might have gone wrong is the folder structure in the resources folder. (You can place your resource files anywhere and execute the command as ant localize -Dresources.directory=path_to_your_resources_directory.)
Also note that a resources folder should follow the proper naming conventions of the OSGi bundle.
Ex: org.wso2.carbon.claim.mgt.ui_4.2.0 (this entire string is the name of the folder)
If you still can't get this to work, mail me your resources folder at tanyamadurapperuma@gmail.com.
