What's the difference between cc_library and cc_import when importing a prebuilt .so library?
I noticed that:
cc_library can take multiple prebuilt libraries, but cc_import can take just one.
cc_library supports strip_include_prefix, but cc_import does not.
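For illustration, a sketch of the two patterns being contrasted (file and target names are hypothetical):

cc_import(
    name = "foo_import",
    hdrs = glob(["include/**/*.h"]),
    shared_library = "libfoo.so",  # exactly one library per cc_import
)

cc_library(
    name = "foo_lib",
    hdrs = glob(["include/**/*.h"]),
    srcs = ["libfoo.so", "libbar.so"],  # several prebuilt libraries allowed
    strip_include_prefix = "include",   # no cc_import counterpart
)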
Is there a way to control the Bazel build to generate the temp files I want for a list of source files, instead of just using the command-line option "--save_temps"?
One way is to use a cc_binary and add the "-E" option in the "copts", but the object file's name will always end in ".o". Such ".o" files get overwritten by other build targets, and I don't know how to control the compiler's output file name in Bazel.
Any better ideas?
cc_library has an output group with the static library, which you can then extract. Something like this:
filegroup(
    name = "extract_archive",
    srcs = [":some_cc_library"],
    output_group = "archive",
)
Many tools will accept the static archive instead of an object file. If the tool you're using does, then that's easy. If not, things get a bit more complicated.
Extracting the object file from the static archive is a bit trickier. You could use a genrule with the $(AR) Make variable, but that won't work with some C++ toolchains that require additional flags to configure architectures etc.
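For example, a minimal genrule sketch along those lines (assuming a GNU-style ar and an archive that contains a single object file):

# Sketch only: "ar p" prints the sole archive member to stdout.
genrule(
    name = "extract_object",
    srcs = [":extract_archive"],
    outs = ["some_cc_library.o"],
    cmd = "$(AR) p $< > $@",
    toolchains = ["@bazel_tools//tools/cpp:current_cc_toolchain"],
)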
The better (but more complicated) answer is to follow the guidance in integrating with C++ rules. You can get the ar from the toolchain and the flags to use it in a custom rule, and then create an action to extract it. You could also access the OutputGroupInfo from the cc_library in the rule directly instead of using filegroup if you've already got a custom rule.
Thanks all for your suggestions.
Now I think I can solve this problem in two steps (it seems Bazel does not allow combining two rules into one):
Step 1: add a -E option like a normal cc_library; we can call it a pp_library. That part is easy.
Step 2: in a new rule whose input is the pp_library target, find the object files (they can be found via action.outputs.to_list()) and copy them to a new place with ctx.actions.run_shell().
I used Bazel: copy multiple files to binary directory as a reference.
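A minimal sketch of the step-2 copying rule, following that reference (the rule and attribute names are my own):

# Sketch only: copies each input file into a subdirectory named after the
# target, so the ".o" files no longer collide with other targets' outputs.
def _copy_objs_impl(ctx):
    outs = []
    for f in ctx.files.srcs:
        out = ctx.actions.declare_file(ctx.label.name + "/" + f.basename)
        ctx.actions.run_shell(
            inputs = [f],
            outputs = [out],
            command = "cp -f '%s' '%s'" % (f.path, out.path),
        )
        outs.append(out)
    return [DefaultInfo(files = depset(outs))]

copy_objs = rule(
    implementation = _copy_objs_impl,
    attrs = {
        "srcs": attr.label_list(allow_files = True),
    },
)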
I have automatically-generated .cc sources and a Starlark rule running the .cc generator:
BUILD file:
generate_cc(
    name = "foo_generated",
)  # runs an executable that generates foo.h, foo.cc
I'd like the above foo_generated to act also as a cc_library, so that it can be a valid dependency of a subsequent cc_library:
cc_library(
    name = "bar",
    deps = [":foo_generated"],  # foo_generated used like a cc_library()
)
Can generate_cc be implemented in a single rule, without macros, so that a generate_cc target works in other cc_library targets' deps?
(I realize that generate_cc could be a macro that calls the actual rule and then calls a cc_library rule, thereby creating two separate targets / labels - this is what I'd like to avoid).
If a rule implementation could call another rule, then generate_cc's implementation could
wrap the sources it generates in a cc_library
return the CcInfo provider returned by cc_library
as in (hypothetical .bzl file):
def generate_cc_impl(ctx):
    # generate .h, .cc files
    # ...
    cc_info = native.cc_library(...)  # wrap .h, .cc files
    return cc_info
But I suppose calling one rule from another is not possible?
Rules cannot call other rules. However, support was added fairly recently for rules to reuse most of the native C++ functionality, which covers this use case. There's a section of documentation about implementing Starlark rules that depend on C++ rules and/or that C++ rules can depend on.
The my_c_archive example shows a lot of the boilerplate to use this functionality (finding the cc_toolchain and feature_configuration in particular). cc_common.compile is the function to create actions to compile your source files. cc_common.create_linking_context_from_compilation_outputs will convert the CcCompilationOutputs from compile into a CcLinkingContext for creating the CcInfo to return.
You can choose to pull some/all of the files out of the CcCompilationOutputs and CcLinkingOutputs to return as your rule's DefaultInfo, depending on your use case.
create_linking_context_from_compilation_outputs returns (CcLinkingContext, CcLinkingOutputs) for reference. I created bazel#10253 just now to add that to the docs.
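Putting those pieces together, a rough sketch of what generate_cc's implementation might look like (boilerplate condensed from the my_c_archive example; the generator action itself is elided):

load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")

def _generate_cc_impl(ctx):
    hdr = ctx.actions.declare_file(ctx.label.name + ".h")
    src = ctx.actions.declare_file(ctx.label.name + ".cc")
    # ... run the generator action that produces hdr and src ...

    cc_toolchain = find_cpp_toolchain(ctx)
    feature_configuration = cc_common.configure_features(
        ctx = ctx,
        cc_toolchain = cc_toolchain,
        requested_features = ctx.features,
        unsupported_features = ctx.disabled_features,
    )
    (compilation_context, compilation_outputs) = cc_common.compile(
        name = ctx.label.name,
        actions = ctx.actions,
        feature_configuration = feature_configuration,
        cc_toolchain = cc_toolchain,
        srcs = [src],
        public_hdrs = [hdr],
    )
    (linking_context, _linking_outputs) = cc_common.create_linking_context_from_compilation_outputs(
        name = ctx.label.name,
        actions = ctx.actions,
        feature_configuration = feature_configuration,
        cc_toolchain = cc_toolchain,
        compilation_outputs = compilation_outputs,
    )
    return [CcInfo(
        compilation_context = compilation_context,
        linking_context = linking_context,
    )]

generate_cc = rule(
    implementation = _generate_cc_impl,
    attrs = {
        "_cc_toolchain": attr.label(
            default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
        ),
    },
    fragments = ["cpp"],
    toolchains = ["@bazel_tools//tools/cpp:toolchain_type"],
)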
I am trying to provide some preprocessor definitions at compile time based on whether the user runs bazel test or bazel build.
Specifically, I want to have a conditional dependency of a cc_library.deps and a conditional definition in cc_library.defines.
I found that select() is the way to go, but I cannot figure out how to detect which command the user ran.
I'm not aware of any way to detect the current command (build vs test) using select(), but I think you can achieve something similar with custom keys.
You could define a config_setting block like the following:
# BUILD
config_setting(
    name = "custom",
    values = {
        "define": "enable_my_flag=true",
    },
)
and use it in your library to control the defines:
# BUILD - continued
cc_library(
    name = "mylib",
    hdrs = ["mylib.h"],
    srcs = ["mylib.cc"],
    defines = select({
        ":custom": ["MY_FLAG"],
        "//conditions:default": [],
    }),
)
Now building the library with bazel build :mylib will result in the default case, with no defines present; but if you build with bazel build :mylib --define enable_my_flag=true, the other branch will be selected and MY_FLAG will be defined.
This can be easily extended to the test case, for example by adding the --define to your .bazelrc:
# .bazelrc
test --define enable_my_flag=true
Now every time you run bazel test :mylib_test, the define flag will be appended and the library will be built with MY_FLAG defined.
Out of curiosity, why do you want to run the test on a library built with a different set of defines/dependencies? That might defeat the purpose of the test, since in the end you're testing something different from the library you're going to use.
I'm using Bazel 0.15.2 to build a third-party dependency (say, "tp_dep") for my project that normally builds with "make". This means I've had to write my own BUILD file for that dependency.
It has many Makefiles that localize include paths to the various subdirectories, so my targets end up with several custom copts flags to add those paths to the search path. The problem I'm seeing is that "-Iexternal/tp_dep/path/to/directory" in my copts field doesn't end up being useful for #include "file_from_path_to_directory.h".
Note: I have "path/to/directory/file_from_path_to_directory.h" in my target's hdrs list.
When I inspect the output with --verbose_failures, I see that dependencies from other cc_library targets are included as "bazel-out/host/genfiles/external/tp_dep/some/path".
I thus changed my copts entry to "-iquote$(GENDIR)/external/tp_dep/path/to/directory", and I still see trouble. I've also tried $(BINDIR) instead of $(GENDIR).
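For concreteness, the kind of target being described might look like this (everything except the quoted paths above is my reconstruction):

cc_library(
    name = "tp_dep_part",  # hypothetical name
    srcs = glob(["path/to/directory/*.c"]),
    hdrs = ["path/to/directory/file_from_path_to_directory.h"],
    copts = [
        "-Iexternal/tp_dep/path/to/directory",
        # also tried:
        # "-iquote$(GENDIR)/external/tp_dep/path/to/directory",
        # "-iquote$(BINDIR)/external/tp_dep/path/to/directory",
    ],
)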
What is the correct way to add these kinds of include directives?
Problem
I wonder how to inform Bazel about dependencies that are unknown at declaration time but known at build time (a.k.a. implicit dependencies, dynamic dependencies, ...). For instance, when compiling C++ sources, a .cpp source file depends on some header files, and this information is not available when writing the BUILD file; it needs to be retrieved at build time. Whatever the mechanism for getting that information (a dry run, generating a depfile, parsing stdout), it has to happen at build time, and the result has to be fed back into Bazel's build graph.
Since Starlark does not allow I/O, for instance reading a generated depfile or parsing a stdout result containing a dependency list, I have no clue how to deal with this.
Behind implicit dependencies, what I am really after is correct incremental builds.
Example
To experiment with this problem I created a simple tool, just_a_tool.exe, which takes an input file, reads a list of files from it, and concatenates the content of all these files into an output file.
Example command line:
just_a_tool.exe --input input.txt --depfile dep.d output.txt
dep.d contains the list of all the files that were read.
Issue
If I change the content of test1.txt, test2.txt, or test3.txt, Bazel does not rebuild the output.txt file. Of course: it does not know those dependencies exist.
Example files
just_a_tool.bzl
def _impl(ctx):
    exec_path = "C:/Code/JustATool/just_a_tool.exe"
    for f in ctx.attr.source.files:
        source_path = f.path
    output_path = ctx.outputs.out.path
    dep_file = ctx.actions.declare_file("dep.d")
    args = ["--input", source_path, "--depfile", dep_file.path, output_path]
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = exec_path,
        inputs = ctx.attr.source.files,
        arguments = args,
    )
jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True, single_file = True),
    },
    outputs = {"out": "%{name}.txt"},
)
BUILD
load("//tool:just_a_tool.bzl", "jat_convert")

jat_convert(
    name = "my_output",
    source = ":input.txt",
)
input.txt
test1.txt
test2.txt
test3.txt
Goal
I want correct and fast incremental builds for the following situations:
Generate reflection data from C++ sources; this custom tool's execution depends on the header files included by my source files.
Use an internal tool to build asset files which can include other files.
Run a custom preprocessor on my shaders that allows a #include feature.
Thanks!
Bazel's extension language doesn't support creating actions with a dynamic set of inputs, where this set depends on the output of a previous action. In other words, custom rules cannot run an action, read the action's output, then create actions with those inputs or update (or prune the set of) inputs of already created actions.
Instead, I suggest adding attribute(s) to your rule where the user can declare the set of files that the sources may include; I call this "the universe of headers" (see the sketch below). The actions you create depend on this user-defined universe, so the set of action inputs is completely defined. Of course, this means these actions potentially depend on more files than the .cpp files they process actually include.
This approach is analogous to how the cc_* rules work: a file in cc_*.srcs can include other files in the srcs of the same rule and from hdrs of dependencies, but nothing else. Thus the union of srcs + hdrs of (direct & transitive) dependencies defines the universe of header files that a cpp file may include.
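A minimal sketch of that suggestion applied to jat_convert, using current Bazel attribute syntax (the "universe" attribute name is my choice, and the tool label is assumed):

def _jat_convert_impl(ctx):
    dep_file = ctx.actions.declare_file(ctx.label.name + ".d")
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = ctx.executable._tool,
        # The action depends on the source plus the whole declared universe,
        # so editing any file in "universe" re-runs the action.
        inputs = depset([ctx.file.source] + ctx.files.universe),
        arguments = [
            "--input", ctx.file.source.path,
            "--depfile", dep_file.path,
            ctx.outputs.out.path,
        ],
    )

jat_convert = rule(
    implementation = _jat_convert_impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_single_file = True),
        # All files the source may read; over-approximating is safe.
        "universe": attr.label_list(allow_files = True),
        "_tool": attr.label(
            default = "//tool:just_a_tool",  # assumed label for the tool
            executable = True,
            cfg = "exec",
        ),
    },
    outputs = {"out": "%{name}.txt"},
)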