Configure bazel toolchain without modifying project/workspace

I have a Bazel project (the new tcmalloc) I'm trying to integrate into a typical GNU Make project that uses its own build of compiler/libc++. The goal is to not fork the upstream project.
If I pass all the C++ options correctly to Bazel (one set of which is -nostdinc++ -I<path to libc++>), Bazel is unhappy: The include path '/home/vlovich/myproject/deps/toolchain/libc++/trunk/include' references a path outside of the execution root. (tcmalloc is a git submodule sibling at deps/tcmalloc). It's possible to get this "working" by giving Bazel a custom script to invoke as the compiler that injects those flags so that Bazel never sees them. However, I'd like to just define a toolchain to work properly.
I've read all the documentation I could find on this topic but it's not clear to me how to glue all these docs together.
Specifically, it's not really clear where I should place the toolchain definition files or how to tell Bazel to find those definitions. Is there a way to give Bazel a directory that it uses to find toolchain definitions? Am I expected to create a top-level WORKSPACE at /home/vlovich/myproject & register tcmalloc and my toolchain there, & then invoke bazel from /home/vlovich/myproject instead of /home/vlovich/myproject/deps/tcmalloc?

Toolchain support is rather complicated, and it is hard to understand if you are not a Bazel maintainer.
You can use the CC and CXX environment variables to set a different compiler, like: CC=your_c_compiler CXX=your_c++_compiler bazel build .... You can also write your own custom wrapper script that acts as a normal C++ compiler.
That -I<path to libc++> does not work because all normal include paths have to be declared in the srcs attribute or via dependencies indicated by the deps attribute. For system-wide dependencies, use -isystem instead. Read more about it at https://stackoverflow.com/a/44061589/4638604
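One way to keep those headers visible to Bazel is to declare them as an ordinary target inside the workspace. A minimal sketch, assuming a top-level WORKSPACE at /home/vlovich/myproject and a BUILD file placed at deps/toolchain/BUILD (the target name libcxx_headers is made up); the includes attribute is what Bazel turns into -isystem flags:
cc_library(
    name = "libcxx_headers",
    hdrs = glob(["libc++/trunk/include/**"]),  # headers must live under the workspace root
    includes = ["libc++/trunk/include"],       # propagated to dependents as -isystem
)
Targets can then depend on //deps/toolchain:libcxx_headers instead of passing a raw -I flag that points outside the execution root.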

Related

What is the role of "py_binary"?

Does py_binary ultimately generate an executable file, or just an alias for a py script? What are its benefits? If it is a real executable file, doesn't that lose the point of using Python?
Making something executable can be as simple as doing chmod +x and slapping a #!/foo/bar line on top; the thing itself is still whatever interpreted code it was before.
In the case of Bazel, it adds a wrapper script that sets up an execution environment before dispatching to the Python code. Consider e.g. Bazel's runfiles, but also other py_library targets.
In addition, you can use the target in places where an executable is required as an attribute of another target. A single Python file doesn't have any dependencies Bazel knows about, so it would technically fit there, but it would not integrate well with Bazel.
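A minimal sketch (file and target names are made up) of how a py_binary pulls in py_library dependencies that a bare script could not express:
py_library(
    name = "helpers",   # hypothetical supporting library
    srcs = ["helpers.py"],
)

py_binary(
    name = "tool",      # main defaults to tool.py because it matches the target name
    srcs = ["tool.py"],
    deps = [":helpers"],
)
bazel run //:tool then executes the generated wrapper, which assembles the runfiles tree (tool.py plus helpers.py) and sets up sys.path before handing control to the interpreter.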

How can I pass a specific macro to each compile in bazel?

Here's a simplified version of the BUILD file:
cc_library(
    name = "ab",
    srcs = ['a.c', 'b.c', 'logger.h'],
)
logger.h contains the implementation of a logging function that uses the macro XOC_FILE_ID. XOC_FILE_ID has to contain the name of the source file.
Using __FILE__ instead would not help because __FILE__ expands to the string "logger.h" inside the file logger.h.
That's why I need to compile these files with different defines:
gcc -c [...] -DXOC_FILE_ID="a.c" a.c
gcc -c [...] -DXOC_FILE_ID="b.c" b.c
My failed approaches:
set the attribute local_defines using the value {source_file}: local_defines = ['XOC_FILE_ID="{source_file}"']: does not get replaced
set the attribute local_defines using the make variable $<: local_defines = ['XOC_FILE_ID="$<"']: Bazel aborts, telling me that $(<) is not defined
same approach for attribute copts
Of course, I could try to make Bazel call a compiler wrapper script. However, this would mean that I have to explicitly set PATH to my wrapper script(s) before each call to Bazel. Isn't there a better solution?
You have access to {source_file} in a toolchain definition.
This means you have to write your own toolchain definition.
I tried two ways of writing a toolchain:
Use the Bazel tutorial on toolchains. Afterwards my build was broken: the default compile options of Bazel were missing, and cc_library did not create shared libraries any more.
Follow a hint pointing to a post on bazel-discuss and reuse the toolchain that Bazel itself creates from your environment. That's what I'm going to describe now (for Bazel 3.5.1).
If you want to use a compiler that is not in $PATH, update $PATH to make the compiler available and run bazel clean. Bazel will pick it up.
create a toolchain directory (maybe my-toolchain/) in your workspace
bazel build @bazel_tools//tools/cpp:toolchain
copy BUILD, all *.bzl files, cc_wrapper.sh, and builtin_include_directory_paths from $(bazel info output_base)/external/local_config_cc/ to your toolchain directory; copy the files the symbolic links point to instead of copying the symbolic links themselves
Adapt the BUILD file in my-toolchain/ to your needs, like adding '-DXOC_FILE_ID=\\"%{source_file}\\"' to the compile_flags of cc_toolchain_config (see the sketch after this list).
add these lines to your .bazelrc to make Bazel use your new toolchain by default:
build:my-toolchain --crosstool_top=//my-toolchain:toolchain
build --config=my-toolchain
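For illustration, the adapted part of the copied BUILD file could look roughly like this. A sketch only: keep the attribute values Bazel generated for your machine, and note that the surrounding flags shown here are placeholders, not the generated ones:
load(":cc_toolchain_config.bzl", "cc_toolchain_config")  # the .bzl copied from local_config_cc

cc_toolchain_config(
    name = "local",
    # ... all the generated attributes (cpu, compiler, tool_paths, ...) stay as copied ...
    compile_flags = [
        # ... the generated default compile flags stay here ...
        '-DXOC_FILE_ID=\\"%{source_file}\\"',  # expanded per source file by the toolchain
    ],
)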

Can I ignore some folder (containing bazel configuration) while building the project recursively?

For some reasons, practical or not, the rxjs npm package stores BUILD.bazel configuration files in the package, so when I'm trying to build my project (which has a node_modules folder) Bazel automatically tries to build something that it's not supposed to build at all.
My question would be - what is canonical way of ignoring some specific folder while building bazel project recursively?
The only way to achieve what I'm looking for that I know of is to point to it explicitly in the command line
bazel build //... --deleted_packages=node_modules/rxjs/src (see user manual)
But I don't want to type this every time.
Bazel recently added a feature for ignoring folders (similar to gitignore).
Simply add node_modules to the .bazelignore file in the root of your project.
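For example, a .bazelignore file at the workspace root containing just the directory name (one path per line, relative to the root):
node_modules
After that, bazel build //... skips everything under node_modules with no extra command-line flags.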
Yes, this is expressible as a bazel target pattern:
bazel build -- //... -//node_modules/rxjs/src/...
Full documentation is available at https://docs.bazel.build/versions/master/user-manual.html#target-patterns

gtest dependency for Bazel java_tools build?

I am trying to follow the instructions for contributors here:
https://bazel.build/contributing.html
I have a successful build off of master (i.e. bazel build //src:bazel), but the doc suggests also "you might want to build the various tools Bazel uses." I am trying to do that, for example:
cd src/java_tools/singlejar
bazel build //...
but it fails with:
ERROR: /Users/.../bazel/third_party/protobuf/3.2.0/BUILD:621:1: no such target '//external:gtest': target 'gtest' not declared in package 'external' defined by /Users/plaird/scone/public/bazel/WORKSPACE and referenced by '//third_party/protobuf/3.2.0:test_plugin'.
Do I need to build gtest locally, and then add it to the WORKSPACE file?
bazel build //..., no matter where you invoke it, will build everything in the project. It looks like what you probably want is bazel build //src/java_tools/singlejar/..., which will build all targets under that directory.
In general, though, you probably don't need to compile singlejar separately. I've been working on Bazel for several years and 99% of the time you don't have to build the tools separately.
In terms of the error you're getting, it would be nice if we could get //... building, but it hasn't been a huge priority. The protobuf code build is weird and I don't recommend trying to debug it; just jump into whatever you want to actually work on.

Build compiler 'protobuf' from source and use it with its shared objects from within CMake

I'm using a CMake build in a Jenkins environment and want to build the protobuf compiler from source.
This all works, but in the next step I'm trying to use the compiler to translate my proto files, which doesn't work because it cannot find its own shared objects. I've tried defining the search path in the CMakeLists.txt file, but it won't detect the shared object location in my repository tree $PRJ_MIDDLEWARE/protobuf/lib. I've tried telling CMake or the system where to search by defining:
set(CMAKE_LIBRARY_PATH ${CMAKE_LIBRARY_PATH} "$ENV{PRJ_MIDDLEWARE}/protobuf/lib")
set(ENV{LD_LIBRARY_PATH} "$ENV{PRJ_MIDDLEWARE}/protobuf/lib:$ENV{LD_LIBRARY_PATH}")
But it always fails when trying to invoke the protoc compiler I just built. I tried invoking ldconfig from CMake, but that doesn't work because the Jenkins user doesn't have the right to do this. Currently my only solution is to log in to the build server and do this manually as root. But that is not how I want to do this... the next release moves to a new directory, so this has to be done again. Do I have other options? Preferably directly from CMake, from Jenkins, or maybe even Protocol Buffers?
Thank you
Two ideas come to mind:
Build the protobuf compiler as a static binary (I don't know if that's possible, but it usually is).
Set the LD_LIBRARY_PATH environment variable before invoking cmake to point to the location of the protoc shared libraries.
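A sketch of the second idea, reusing the $PRJ_MIDDLEWARE path from the question; the variable is set only for the single invocation (mirroring the CC=... style above), so the Jenkins environment stays untouched:
LD_LIBRARY_PATH="$PRJ_MIDDLEWARE/protobuf/lib:$LD_LIBRARY_PATH" cmake --build build-dir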
