How are Bazel's cc_library include paths expanded?

I'm using Bazel 0.15.2 to build a third-party dependency (say, "tp_dep") for my project. The dependency normally builds with "make", so I've had to write my own BUILD file for it.
Its many Makefiles localize include paths to the various subdirectories, so my targets end up with several custom copts flags to add those paths to the search path. The problem I'm seeing is that "-Iexternal/tp_dep/path/to/directory" in my copts field doesn't make #include "file_from_path_to_directory.h" resolve.
Note: I have "path/to/directory/file_from_path_to_directory.h" in my target's hdrs list.
When I inspect the output from --verbose_failures, I see that dependencies from other cc_library targets are included as "bazel-out/host/genfiles/external/tp_dep/some/path".
I thus changed my copts line to "-iquote$(GENDIR)/external/tp_dep/path/to/directory", but the header still isn't found. I've also tried $(BINDIR) instead of $(GENDIR).
What is the correct way to add these kinds of include directives?
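One approach worth sketching here (the globs and paths below are placeholders, not taken from the actual project): instead of hand-written -I/-iquote flags in copts, declare the search directories with cc_library's includes attribute. Those entries are package-relative, and Bazel expands each one against both the source root and the generated-files roots, for the target and its dependents:
# Hypothetical BUILD file written for the external repo @tp_dep.
cc_library(
    name = "tp_dep",
    srcs = glob(["**/*.cc"]),                    # placeholder glob
    hdrs = glob(["path/to/directory/**/*.h"]),
    # Package-relative include directories; Bazel adds them for the source
    # tree and for the generated-files tree, so
    # #include "file_from_path_to_directory.h" resolves without copts.
    includes = ["path/to/directory"],
)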

Related

How to verify BAZEL project for correctness?

How can I verify that my entire project does not contain errors (say, references to targets which are not declared anywhere)?
In a static language, whenever my code references something that doesn't exist, I get compiler errors. Is there a way to perform an equivalent check with bazel?
bazel build --nobuild //... has a similar effect. It evaluates all the rules (and fails with any errors), but doesn't actually build anything.
Add any additional flags you would pass to the full build you're checking against. Most flags cause rules to evaluate differently, so you might see different errors depending on which flags you use.
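For example, to check against the aarch64 configuration discussed below (the flag value is purely illustrative):
bazel build --nobuild --cpu=aarch64 //...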
A set of Bazel targets can build correctly for some configurations but not others. For example, if there's a select without a default like this:
cc_library(
    name = "something",
    srcs = select({
        ":cpu_k8": ["something_k8.cc"],
    }),
)
then it will build with --cpu=k8 but not --cpu=aarch64. This means you have to specify the same set of flags when checking as with a full build.
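As an aside (a sketch, not part of the original answer): giving the select a default branch is what makes such a target analyzable under every --cpu value:
cc_library(
    name = "something",
    srcs = select({
        ":cpu_k8": ["something_k8.cc"],
        # Hypothetical fallback sources for all other configurations.
        "//conditions:default": ["something_generic.cc"],
    }),
)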

How to deal with implicit dependency (e.g C++ include) for incremental build in custom Skylark rule

Problem
I wonder how to inform Bazel about dependencies that are unknown at declaration time but known at build time (a.k.a. implicit dependencies, dynamic dependencies, ...). For instance, when compiling C++ sources, a .cpp source file depends on some header files, and this information is not available when writing the BUILD file; it needs to be retrieved at build time. Whatever the mechanism for getting the information (dry run, generating a depfile, parsing stdout), it has to run at build time and the result has to be fed back into Bazel's build graph.
Since Skylark does not allow I/O, for instance to read a generated depfile or to parse stdout for a dependency list, I have no clue how to deal with this.
What I am really after, behind these implicit dependencies, is a correct incremental build.
Example
To experiment with this problem I have created a simple tool, just_a_tool.exe, which takes an input file, reads a list of files from it, and concatenates the content of all these files into an output file.
command line example:
just_a_tool.exe --input input.txt --depfile dep.d output.txt
dep.d contains the list of all the read files.
Issue
If I change the content of test1.txt, test2.txt, or test3.txt, Bazel does not rebuild the output file. Of course it does not, because it does not know about these dependencies.
Example files
just_a_tool.bzl
def _impl(ctx):
    exec_path = "C:/Code/JustATool/just_a_tool.exe"
    for f in ctx.attr.source.files:
        source_path = f.path
    output_path = ctx.outputs.out.path
    dep_file = ctx.actions.declare_file("dep.d")
    args = ["--input", source_path, "--dep_file", dep_file.path, output_path]
    ctx.actions.run(
        outputs=[ctx.outputs.out, dep_file],
        executable=exec_path,
        inputs=ctx.attr.source.files,
        arguments=args
    )

jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source" : attr.label(mandatory=True, allow_files=True, single_file=True)
    },
    outputs = {"out": "%{name}.txt"}
)
BUILD
load("//tool:just_a_tool.bzl", "jat_convert")
jat_convert(
    name="my_output",
    source=":input.txt"
)
input.txt
test1.txt
test2.txt
test3.txt
Goal
I want to do correct and fast incremental build for the following situation:
Generate reflection data from C++ sources; this custom tool's execution depends on header files included in my source files.
Use an internal tool to build asset files which can include other files.
Run a custom preprocessor on my shaders that allows a #include feature.
Thanks!
Bazel's extension language doesn't support creating actions with a dynamic set of inputs, where this set depends on the output of a previous action. In other words, custom rules cannot run an action, read the action's output, then create actions with those inputs or update (or prune the set of) inputs of already created actions.
Instead I suggest adding attribute(s) to your rule where the user can declare the set of files that the sources may include. I call this "the universe of headers". The actions you create depend on this user-defined universe, so the set of action inputs is completely defined. Of course this means these actions potentially depend on more files than the .cpp files they process actually include.
This approach is analogous to how the cc_* rules work: a file in cc_*.srcs can include other files in the srcs of the same rule and from hdrs of dependencies, but nothing else. Thus the union of srcs + hdrs of (direct & transitive) dependencies defines the universe of header files that a cpp file may include.
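A minimal sketch of that approach applied to the rule from the question; the universe attribute, the tool label, and the BUILD values below are illustrative placeholders, not part of the original answer:
def _impl(ctx):
    dep_file = ctx.actions.declare_file("dep.d")
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        # Listing the user-declared universe as inputs makes Bazel re-run the
        # action whenever any of those files changes.
        inputs = ctx.files.source + ctx.files.universe,
        executable = ctx.executable.tool,
        arguments = [
            "--input", ctx.files.source[0].path,
            "--depfile", dep_file.path,
            ctx.outputs.out.path,
        ],
    )

jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True, single_file = True),
        # The "universe": every file the input list may reference.
        "universe": attr.label_list(allow_files = True),
        # Hypothetical: pass the tool as a label instead of a hard-coded path.
        "tool": attr.label(executable = True, cfg = "host", allow_files = True),
    },
    outputs = {"out": "%{name}.txt"},
)
A corresponding BUILD entry would then declare the dependencies explicitly:
jat_convert(
    name = "my_output",
    source = ":input.txt",
    universe = ["test1.txt", "test2.txt", "test3.txt"],
    tool = "//tool:just_a_tool",
)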

How to create sources jars for java_library in bazel

As part of our efforts to create a Bazel-Maven transition interop tool (one that creates Maven-sized jars from the more granular Bazel jars), there is a need to create source jars.
For java_binary targets there is a mechanism to create them using the -src.jar suffix.
e.g., for a java_binary target called foo, run bazel build //:foo-src.jar
But, using the same mechanism for java_library target named bar I get:
ERROR: no such target '//:bar-src.jar': target 'bar-src.jar' not declared in package '' (did you mean 'libbar-src.jar'?) defined by /Users/.../java_project/BUILD.
Is there another mechanism for java_library?
As indicated by the error, the source target is called //:libbar-src.jar (with the lib prefix). See the list of outputs of java_library for reference.
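So for the bar example above, the command is bazel build //:libbar-src.jar.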

Is there any way to include a file with a bang (!) in the path in a genrule?

I've got an iOS framework that has a dependency on the (presumably Google maintained) pod called '!ProtoCompiler'. In order to build my framework I'm going to need it in the sandbox. So, I have a genrule and can try to include it with
srcs = glob(['Pods/!ProtoCompiler/**/*']) but I get the following error:
ERROR: BUILD:2:1: //Foo:framework-debug: invalid label 'Pods/!ProtoCompiler/google/protobuf/any.proto' in element 1118 of attribute 'srcs' in 'genrule' rule: invalid target name 'Pods/!ProtoCompiler/google/protobuf/any.proto': target names may not contain '!'.
As is, this seems like a total blocker for me using bazel to do this build. I don't have the ability to rename the pod directory as far as I can tell. As far as I can tell, the ! prohibition is supposed to be for target labels, is there any way I can specify that this is just a file, not a label? Or are those two concepts completely melded in bazel?
(Also, if I get this to work I'm worried about the fact that this produces a .framework directory and it seems like rules are expected to produce files only. Maybe I'll zip it up and then unzip it as part of the build of the test harness.)
As far as I can tell, the ! prohibition is supposed to be for target labels, is there any way I can specify that this is just a file, not a label? Or are those two concepts completely melded in bazel?
They are mostly melded.
Bazel associates a label with every source file in a package that appears in a BUILD file, so that you can write srcs=["foo.cc", "//bar:baz.cc"] in a build rule and it will work regardless of whether foo.cc and baz.cc are source files, generated files, or the names of build rules that produce files suitable for this particular srcs attribute.
That said, you can of course have any file in the package, but if its name won't allow Bazel to derive a label from it, then you can't reference it in the BUILD file. Since glob is evaluated during loading and is expanded to a list of labels, using glob won't work around this limitation.
(...) it seems like rules are expected to produce files only. Maybe I'll zip it up and then unzip it as part of the build of the test harness.
Yes, that's the usual approach.
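A rough sketch of that zip-then-unzip idea (the target names, the output name, and the assumption that the framework's contents are ordinary file outputs are all mine, not from the answer):
genrule(
    name = "framework_zip",
    srcs = [":framework-debug"],  # hypothetical rule producing the framework files
    outs = ["Foo.framework.zip"],
    # Pack everything into a single regular file; the test harness unzips it.
    cmd = "zip -qr $@ $(SRCS)",
)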

waf does not correctly detect C++ #include dependencies

I have C++ header file dependencies that I specify in my waf script with the includes=... parameter to bld.program().
I know the waf build configuration sees the includes because my program compiles correctly.
However, when I change a header file, waf does not detect the change. That is, when I run waf build after changing the contents of an included header, nothing gets recompiled.
Isn't waf supposed to determine #include "..." dependencies automatically?
How can I troubleshoot this?
I have looked in the build/c4che directory to see if I could make sense of the configuration files stored there. Mention of "include" in the waf generated .py files is suspiciously absent.
I am using waf version 1.9.0.
I have also tried this with waf 1.8.19 and got the same result.
EDIT: I replaced my original complicated wscript with the much simpler one listed below, and I still get the same behavior.
Here is my wscript:
top = '.'
out = 'build'
CXXFLAGS = ['-fopenmp', '-Wall', '-Werror', '-std=c++11', '-Wl,--no-as-needed']

def options(ctx):
    ctx.load('compiler_cxx')

def configure(ctx):
    ctx.load('compiler_cxx')
    ctx.env.CXXFLAGS = CXXFLAGS

def build(ctx):
    ctx.program(source="test_config_parser.cpp", target="test_config_parser", includes=["../include"], lib=['pthread', 'gomp'])
Your problem is that you use an include directory outside of the project's directory. By default waf does not treat external includes (like system includes) as dependencies, in order to speed things up. Solutions I know of:
1/
Organize your project to have your include directory under the waf top directory:
top_dir/
    wscript
    include/
        myinclude.h
    sources/
        mysource.cpp
2/
Change the top directory. I think top = '..' should work (not tested).
3/
Tell waf to go absolute by adding these lines at the beginning of build():
import waflib.Tools.c_preproc
waflib.Tools.c_preproc.go_absolute=True
waflib.Tools.c_preproc.standard_includes=[]
4/
Use gcc dependencies by loading the gccdeps waf module.
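For that option, an untested sketch of the configure change (assuming gccdeps is available under waflib/extras in your waf):
def configure(ctx):
    ctx.load('compiler_cxx')
    # gccdeps makes waf rely on the compiler's -MD/-MF dependency output
    # instead of waf's own include scanner.
    ctx.load('gccdeps')
    ctx.env.CXXFLAGS = CXXFLAGS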
Solution 1/ is probably the best.
By the way, I prefer to have my build directory out of the source tree. Use out = '../build' in your wscript if you want to build out of the source tree.
my2c
