Bazel Package: How to export build artifacts to another directory with external dependencies?

I am new to Bazel. I have a project which is built with Bazel. It also uses some third-party libraries which are also built with Bazel (from source).
I am now trying to export my project as a standalone library for others to use. I can compile my project as a .so and copy the related .hpp files to a directory.
But I don't know how to deal with the third-party libraries. Is there a fancy way to do this, or any examples I can refer to?
My goals are:
Compile the project as a .so file and copy it to a specific directory. (DONE)
Copy all the header files, preserving their tree structure. (DONE)
Copy all the external libraries into that specific directory as well. (NEED HELP)
Copy the external libraries' header files into the same header directory. (NEED HELP)
output:
include/
    my_library_name/
    third_party_name1/
    third_party_name2/
library/
    libmy_library.so
    libthird_party_name1.so
    libthird_party_name2.so

If you have control over the BUILD files of the third-party libraries, you could expose the header files separately and then use pkg_tar rules to collect the target files in the main project.
For example, assuming a folder structure like the following:
.
|-- project
|   |-- BUILD
|   |-- mylib
|   |   |-- BUILD
|   |   |-- mylib.cpp
|   |   `-- mylib.h
|   `-- WORKSPACE
`-- thirdparty
    |-- lib1
    |   |-- BUILD
    |   |-- lib1.cpp
    |   `-- lib1.h
    |-- lib2
    |   |-- BUILD
    |   |-- lib2.cpp
    |   `-- lib2.h
    `-- WORKSPACE
your third-party libraries' BUILD files could expose a filegroup and a cc_library:
# thirdparty/lib1/BUILD
filegroup(
    name = "headers",
    srcs = ["lib1.h"],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "lib1",
    hdrs = [":headers"],
    srcs = ["lib1.cpp"],
    visibility = ["//visibility:public"],
)
The same pattern will apply to thirdparty/lib2/BUILD and project/mylib/BUILD.
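For illustration, project/mylib/BUILD could follow the same pattern (a minimal sketch, using the file names from the tree above):
# project/mylib/BUILD
filegroup(
    name = "headers",
    srcs = ["mylib.h"],
    visibility = ["//visibility:public"],
)

cc_library(
    name = "mylib",
    hdrs = [":headers"],
    srcs = ["mylib.cpp"],
    visibility = ["//visibility:public"],
)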
Now in the main project you could have a main BUILD script that collects all the files into a tar archive:
# project/BUILD
load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")

pkg_tar(
    name = "release_headers",
    srcs = [
        "//mylib:headers",
        "@thirdparty//lib1:headers",
        "@thirdparty//lib2:headers",
    ],
    package_dir = "include",
)

pkg_tar(
    name = "release_libs",
    srcs = [
        "//mylib",
        "@thirdparty//lib1",
        "@thirdparty//lib2",
    ],
    package_dir = "lib",
)

pkg_tar(
    name = "release",
    deps = [
        ":release_headers",
        ":release_libs",
    ],
)
Building :release should create a tar file with the desired structure:
$> bazel build :release
...
Target //:release up-to-date:
bazel-bin/release.tar
...
$> tar -tf bazel-bin/release.tar
./
./include/
./include/mylib.h
./include/lib1.h
./include/lib2.h
./lib/
./lib/libmylib.a
./lib/libmylib.so
./lib/liblib1.a
./lib/liblib1.so
./lib/liblib2.a
./lib/liblib2.so
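Note that with the rules above all headers end up directly under include/. If you want the per-library subdirectories from the question (include/mylib/, include/lib1/, ...), one option is to build one header tar per library with its own package_dir and merge them via deps. A sketch using the targets defined earlier (the rule names here are only suggestions):
# project/BUILD (sketch: nested include/ layout instead of the flat one above)
pkg_tar(
    name = "mylib_headers",
    srcs = ["//mylib:headers"],
    package_dir = "include/mylib",
)

pkg_tar(
    name = "lib1_headers",
    srcs = ["@thirdparty//lib1:headers"],
    package_dir = "include/lib1",
)

pkg_tar(
    name = "lib2_headers",
    srcs = ["@thirdparty//lib2:headers"],
    package_dir = "include/lib2",
)

pkg_tar(
    name = "release_headers_nested",
    deps = [
        ":mylib_headers",
        ":lib1_headers",
        ":lib2_headers",
    ],
)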
You can also have the pkg_tar rules for the third-party libraries in the third-party workspace for better modularity.
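For example, thirdparty/lib1/BUILD could package its own headers and library, and the main project's release tar would then just merge the resulting archives via deps. A sketch (target names are only suggestions):
# thirdparty/lib1/BUILD (sketch)
load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")

pkg_tar(
    name = "lib1_headers_pkg",
    srcs = [":headers"],
    package_dir = "include",
    visibility = ["//visibility:public"],
)

pkg_tar(
    name = "lib1_lib_pkg",
    srcs = [":lib1"],
    package_dir = "lib",
    visibility = ["//visibility:public"],
)
In project/BUILD, the release rule would then list "@thirdparty//lib1:lib1_headers_pkg" and "@thirdparty//lib1:lib1_lib_pkg" in its deps instead of packaging those files itself.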

Related

Esbuild cannot resolve node_modules that contains relative paths

I have a node_module with a structure like this:
@module_name
+-- dist
|   +-- folder1
|   |   +-- folder2
|   |       +-- file1.d.ts
|   +-- src
|       +-- file2.d.ts
file1.d.ts is meant to expose, through re-exports, some types from file2.d.ts:
// file1.d.ts
export * from '../../src/file2';
But when I try to run esbuild I receive the error:
✘ [ERROR] Could not resolve "../../src/file2"
I have seen this problem in different libraries that use relative paths.

Backup Jenkins log folder via ThinBackup

I want to back up all .log* files under $JENKINS_HOME/logs via ThinBackup's "Backup additional files" feature. The folder structure is as follows:
/var/lib/jenkins/logs/
|-- health-checker.log
|-- slaves
|   |-- Slave\ 1
|   |   `-- slave.log
|   |-- Slave\ 2
|   |   `-- slave.log
|   `-- Slave\ 3
|       `-- slave.log
`-- tasks
    |-- Connection\ Activity\ monitoring\ to\ agents.log
    |-- Connection\ Activity\ monitoring\ to\ agents.log.1
    |-- Download\ metadata.log
    |-- Download\ metadata.log.1
    |-- Fingerprint\ cleanup.log
    |-- Fingerprint\ cleanup.log.1
    |-- Periodic\ background\ build\ discarder.log
    |-- Periodic\ background\ build\ discarder.log.1
    |-- Workspace\ clean-up.log
    |-- Workspace\ clean-up.log.1
    |-- telemetry\ collection.log
    `-- telemetry\ collection.log.1
What I have now is the following regex:
^(logs|tasks|.*\.log.*)
which catches the top level health-checker.log and also all logs below tasks. But how do I extend this regex to also include all logs from all slaves (Slave 1, Slave 2 and Slave 3)?
I tried the following regex
^(logs|tasks|.*\.log.*)^(logs|slaves|Slave\ 1|.*\.log.*)
which does not work. I also cannot find any further explanation of the regex format ThinBackup uses.
P.S. Our security department requires backups of all logs for the last 90 days.
I think this plugin is buggy when it comes to the include and exclude regexes. If you exclude .*xml it shouldn't back up any files ending in xml, but some files (not all) are still backed up.

How to include a link to index.html of content generated by another program in reST

My RST directory looks like this:
.
|-- Makefile
|-- build
|   |-- doctrees
|   `-- html
|       |-- codecov       <-- Generated by coverage.py
|       |-- index.html
|       |   ....          <-- Generated by Sphinx
|-- make.bat
`-- source
    |-- _static
    |-- changelog.rst
    |-- conf.py
    |-- contact.rst
    |-- getting_started.rst
    |-- index.rst
    `-- introduction.rst
In my index.rst, I would like to create a relative link titled Code Coverage that points to codecov/index.html. I am not sure how to do that because it is outside my source folder. The codecov folder is auto-generated when I run code coverage in Python. How can I accomplish this?
.. toctree::
   :caption: Table of Contents
   :maxdepth: 2

   introduction
   getting_started
   changelog
   contact
Indices and tables
==================
* :ref:`genindex`
* :ref:`search`
You have at least two options.
1. Use an external link:
`Code Coverage <../_build/codecov/index.html>`_
2. Put the link in a toctree directive:
.. toctree::

   Code Coverage <https://www.example.com/_build/codecov/index.html>
There may be other options, but let's see if either satisfies your need.

waf cross-project dependencies

I have a trivial waf project:
root
|-- a
|   `-- wscript
|-- b
|   `-- wscript
`-- wscript
The root wscript is:
def configure(conf):
    pass

def build(bld):
    bld.recurse('a b')
a/wscript is:
def build(bld):
    bld(rule='touch ${TGT}', target='a.target')
b/wscript is:
def build(bld):
    bld(rule='cp ${SRC} ${TGT}', source='a.target', target='b.target')
What I'm trying to achieve is a build system that first touches a.target, then copies it to b.target. I want the rules for a.target and b.target to stay in their respective wscript files.
When I try to run it, I get the following error instead:
source not found: 'a.target' in bld(target=['b.target'], idx=1, tg_idx_count=2, meths=['process_rule', 'process_source'], rule='cp ${SRC} ${TGT}', _name='b.target', source='a.target', path=/<skip>/waf_playground/b, posted=True, features=[]) in /<skip>/waf_playground/b
If I put both rules into a single wscript file, everything works like a charm.
Is there a way for a target to depend on another target defined in another wscript?
When you specify source/target, that is expressed relative to the current wscript file.
$ waf configure build
...
source not found: 'a.target' in bld(source='a.target', ...)
...
$ tree build
build/
├── a
│ └── a.target
...
Knowing that, the fix is to refer to the a.target source file correctly in b/wscript:
def build(bld):
    bld(rule='cp ${SRC} ${TGT}', source='../a/a.target', target='b.target')
The task now correctly finds the source file:
$ waf build
Waf: Entering directory `.../build'
[1/2] Creating build/a/a.target
[2/2] Compiling build/a/a.target
Waf: Leaving directory `.../build'
'build' finished successfully (0.055s)
$ tree build
build/
├── a
│   └── a.target
├── b
│   └── b.target
...

How to include .hrl files across multiple modules with rebar3

I have several modules' directories. For each module, I have separate include (containing *.hrl files) and src (containing *.erl files) folders. How can I share *.hrl files from one module with another without duplicating them?
With rebar, I added {erl_opts, [{i, "folderContainsIncludeFile"}]} and it worked.
But with rebar3, compilation fails with: can't find include file "include/xx.hrl".
I take it, then, that you don't have an umbrella project; you just have multiple directories at the same level, each one with its own project managed with rebar3.
Something like:
root_folder
|- project1
| |- src
| |- include
| | `- one.hrl
| `- rebar.config
|- project2
| |- src
| | `- two.erl
| |- include
| `- rebar.config
…
And you want to include one.hrl into two.erl.
If that's the case, you should consider one of these alternatives:
A. Moving to an umbrella project structure, like…
root_folder
|- rebar.config          <<<<<<<< notice this file is here now
`- apps
   |- project1
   |  |- src
   |  `- include
   |     `- one.hrl
   |- project2
   |  |- src
   |  |  `- two.erl
   |  `- include
   …
B. Using individual repos for each project and configuring them as dependencies of each other. The structure is like the one you currently have, but now you have to add deps to the rebar.config files. For instance, in our example, you should add the following line to project2's rebar.config:
{deps, [project1]}.
(if you manage to publish project1 in hex.pm)
C. Properly setting $ERL_LIBS so that it includes the paths where your apps are built with rebar3. Something like…
ERL_LIBS=$ERL_LIBS:/path/to/root_folder/project1/_build/lib:/path/to/root_folder/project2/_build/lib:…
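With either of the first two options, once project1 is on the code path the usual way to pull in the header from two.erl is include_lib, which resolves the leading path component as an application name. A minimal sketch, assuming the layout above (hello/0 is just a placeholder function):
%% project2/src/two.erl -- a sketch, assuming project1 is built alongside
%% (as an umbrella app or a dependency) so that it is on the code path
-module(two).

%% include_lib resolves "project1" as an application and looks for
%% include/one.hrl inside it, so the header is not duplicated
-include_lib("project1/include/one.hrl").

-export([hello/0]).

hello() ->
    ok.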
Hope this helps :)
