Using Bazel, I have repetitive load() calls at the beginning of all my BUILD.bazel files.
Moreover, I now see that to test my code (which extends Bazel to another language) I need to call some macro in all of my build files.
Is there any way to apply custom code in all subpackages, without the need to write anything in the BUILD.bazel files?
You can put load statements into the tools/build_rules/prelude_bazel file in your workspace. For example, the Skydoc rules documentation mentions adding the following to your prelude_bazel file:
load(
    "@io_bazel_skydoc//skylark:skylark.bzl",
    "skydoc_repositories",
    "skylark_library",
    "skylark_doc",
)
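The prelude is implicitly prepended to every BUILD file in the workspace, which is exactly the apply-to-all-subpackages behavior you are asking for. With the prelude above in place, a BUILD.bazel file can use the loaded symbols without any load() statement; a minimal sketch (the target name and source file are hypothetical):

# No load() needed: skylark_library comes from the prelude.
skylark_library(
    name = "my_skylark_lib",
    srcs = ["my_rules.bzl"],
)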
I have rule A implemented with a macro that uses declare_directory to produce a set of files:
output = ctx.actions.declare_directory("selected")
Names of those files are not known in advance. The implementation returns the directory created by declare_directory with the following:
return DefaultInfo(
    files = depset([output]),
)
Rule A is listed in the "srcs" attribute of rule B, which is also implemented with a macro. Unfortunately, the list of files passed to B's implementation through the "srcs" attribute contains only the "selected" directory created by rule A, not the files residing in it.
I know that the Args class supports expansion of directories, so I could pass the names of all files in the "selected" directory to a single action. What I need, however, is a separate action for every individual file, for parallelism and caching. What is the best way to achieve that?
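(For reference, a minimal sketch of the single-action fallback described above, assuming a consuming rule whose srcs contains the directory artifact; the output name and shell command are illustrative:)

def _b_impl(ctx):
    # ctx.files.srcs contains the "selected" directory artifact itself,
    # not the files inside it.
    tree = ctx.files.srcs[0]
    out = ctx.actions.declare_file(ctx.label.name + ".out")
    args = ctx.actions.args()
    # Args expands a directory into the paths of the files inside it at
    # execution time -- but all of them still feed one single action.
    args.add_all([tree])
    ctx.actions.run_shell(
        outputs = [out],
        inputs = [tree],
        arguments = [args],
        command = 'cat "$@" > %s' % out.path,
    )
    return [DefaultInfo(files = depset([out]))]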
This is one of the intended use cases of directory outputs (called TreeArtifacts in the implementation), and it's implemented using ActionTemplate:
https://github.com/bazelbuild/bazel/blob/c2100ad420618bb53754508da806b5624209d9be/src/main/java/com/google/devtools/build/lib/actions/ActionTemplate.java#L24-L57
However, this is not exposed to Starlark, and it currently has only a couple of usages, in the Android rules (AndroidBinary.java) and the C++ rules (CcCompilationHelper.java). The Android and C++ rules are going to be migrated to Starlark, so this functionality might eventually become available there, but I'm not sure of any concrete timelines. It would probably be good to file a feature request on GitHub.
Is there a way to make a Bazel build generate the desired temp files for a list of source files, instead of just using the command-line option --save_temps?
One way is to use a cc_binary and add the -E option to copts, but the object file name will always end in ".o", and such ".o" files get overwritten by other build targets. I don't know how to control the compiler's output file name in Bazel.
Any better ideas?
cc_library has an output group with the static library, which you can then extract. Something like this:
filegroup(
    name = "extract_archive",
    srcs = [":some_cc_library"],
    output_group = "archive",
)
Many tools will accept the static archive instead of an object file. If the tool you're using does, then that's easy. If not, things get a bit more complicated.
Extracting the object file from the static archive is a bit trickier. You could use a genrule with the $(AR) Make variable, but that won't work with some C++ toolchains that require additional flags to configure architectures etc.
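For example, a hedged sketch of that genrule (this assumes a GNU-style ar whose "x" subcommand extracts into the current directory, and that the archive's only member is named foo.o; both are assumptions):

genrule(
    name = "extract_object",
    srcs = [":extract_archive"],
    outs = ["foo.o"],
    # Remember the archive's absolute path, then extract from inside the
    # output directory, since `ar x` writes to the current directory.
    cmd = "a=$$PWD/$(location :extract_archive); cd $(RULEDIR) && $(AR) x $$a",
    # Makes the $(AR) make variable available.
    toolchains = ["@bazel_tools//tools/cpp:current_cc_toolchain"],
)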
The better (but more complicated) answer is to follow the guidance in integrating with C++ rules. You can get the ar from the toolchain and the flags to use it in a custom rule, and then create an action to extract it. You could also access the OutputGroupInfo from the cc_library in the rule directly instead of using filegroup if you've already got a custom rule.
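A rough, untested sketch of that custom-rule direction (the rule and attribute names are mine; the --output flag assumes a recent GNU ar, and the exact toolchain wiring may differ between Bazel versions):

load("@bazel_tools//tools/cpp:toolchain_utils.bzl", "find_cpp_toolchain")

def _extract_object_impl(ctx):
    cc_toolchain = find_cpp_toolchain(ctx)
    # Take the static archive from the cc_library's "archive" output group.
    archive = ctx.attr.lib[OutputGroupInfo].archive.to_list()[0]
    out = ctx.actions.declare_file(ctx.label.name + ".o")
    ctx.actions.run_shell(
        inputs = depset([archive], transitive = [cc_toolchain.all_files]),
        outputs = [out],
        arguments = [cc_toolchain.ar_executable, archive.path, out.path],
        # List the first member, extract it next to the declared output,
        # then rename it. `--output` assumes GNU binutils >= 2.34.
        command = 'm=$("$1" t "$2" | head -n 1) && ' +
                  '"$1" x --output="$(dirname "$3")" "$2" "$m" && ' +
                  'mv "$(dirname "$3")/$m" "$3"',
    )
    return [DefaultInfo(files = depset([out]))]

extract_object = rule(
    implementation = _extract_object_impl,
    attrs = {
        "lib": attr.label(providers = [CcInfo]),
        "_cc_toolchain": attr.label(
            default = Label("@bazel_tools//tools/cpp:current_cc_toolchain"),
        ),
    },
    toolchains = ["@bazel_tools//tools/cpp:toolchain_type"],
    fragments = ["cpp"],
)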
Thanks all for your suggestions.
Now I think I can solve this problem in two steps (it seems Bazel does not allow combining two rules into one):
Step 1: add the -E option just like a normal cc_library; we can call it a pp_library. That part is easy.
Step 2: write a new rule whose input is the pp_library target, find the object files inside that rule (they can be found via action.outputs.to_list()), and copy them to a new place with ctx.actions.run_shell().
I used Bazel: copy multiple files to binary directory as a reference.
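A rough sketch of what that second rule could look like (untested; it assumes a Bazel version where LibraryToLink exposes the object files, and the rule and attribute names are mine):

def _copy_objs_impl(ctx):
    # Gather the object files produced by the pp_library target.
    objs = []
    for linker_input in ctx.attr.dep[CcInfo].linking_context.linker_inputs.to_list():
        for lib in linker_input.libraries:
            objs.extend(lib.objects or [])
            objs.extend(lib.pic_objects or [])

    # Copy each object file to a location derived from this rule's name,
    # so outputs of different targets no longer overwrite each other.
    outs = []
    for obj in objs:
        out = ctx.actions.declare_file(ctx.label.name + "/" + obj.basename)
        ctx.actions.run_shell(
            inputs = [obj],
            outputs = [out],
            command = "cp '{}' '{}'".format(obj.path, out.path),
        )
        outs.append(out)
    return [DefaultInfo(files = depset(outs))]

copy_objs = rule(
    implementation = _copy_objs_impl,
    attrs = {"dep": attr.label(providers = [CcInfo])},
)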
In my code, there is a place where I need to take different actions based on the input class type.
So I wrote two lines to check an input object's class type:
debugPrint("Let me know the next action: $action");
debugPrint((action is LoadPomodorosAction).toString());
And the output is:
I/flutter (24128): Let me know the next action: Instance of 'LoadPomodorosAction'
I/flutter (24128): false
What does this mean? The object action is an "Instance of 'LoadPomodorosAction'", yet at the same time its class type is not LoadPomodorosAction.
How do I adjust my code so that I can determine the class type of action?
I suspected that maybe there was something wrong with the runtimeType, but how do I inspect the runtimeType?
I've tried replicating your issue and I'm not able to reproduce it. A common cause of this symptom is importing the same file through two different paths: if the file defining LoadPomodorosAction is imported once with a package: import and once with a relative import, Dart treats them as two distinct libraries, so the type your is check compares against is not the type the object was created with. To explain further, here are the details about the difference between relative and package imports, as discussed in this SO post:
package imports
package: imports work from everywhere to import files from lib/*.
relative imports
Relative imports are always relative to the importing file. If
lib/model/test.dart imports 'example.dart', it imports
lib/model/example.dart.
If you want to import test/model_tests/fixture.dart from any file
within test/*, you can only use relative imports because package
imports always assume lib/.
This also applies to all other non-lib/ top-level directories, like drive_test/, example/, tool/, ...
lib/main.dart
There is currently a known issue with entry-point files in lib/*
like lib/main.dart in Flutter.
https://github.com/dart-lang/sdk/issues/33076
Dart always assumed entry-point files to be in top-level directories other than lib/ (like bin/, web/, tool/, example/, ...). Flutter broke this assumption. Therefore you currently must not use relative imports in entry-point files inside lib/.
See also
How to reference another file in Dart?
Previously, this bug was reported on GitHub as an issue between relative and absolute paths. It seems that it was resolved per this GitHub post.
Problem
I wonder how to inform Bazel about dependencies that are unknown at declaration time but known at build time (a.k.a. implicit dependencies, dynamic dependencies, ...). For instance, when compiling C++ sources, a .cpp source file depends on some header files, and this information is not available when writing the BUILD file; it needs to be retrieved at build time. Whatever the solution for getting that information (dry run, generating a depfile, parsing stdout), it has to happen at build time and the result has to make it back into Bazel's build graph.
Since Skylark does not allow I/O, for instance reading a generated depfile or parsing stdout for a dependency list, I have no clue how to deal with this.
With implicit dependencies, what I am ultimately after is correct incremental builds.
Example
To experiment with this problem I have created a simple tool, just_a_tool.exe, which takes an input file, reads a list of files from it, and concatenates the content of all those files into an output file.
command line example:
just_a_tool.exe --input input.txt --depfile dep.d output.txt
dep.d contains the list of all the read files.
Issue
If I change the content of test1.txt, test2.txt, or test3.txt, Bazel does not rebuild the output.txt file. Of course it doesn't, because it does not know those dependencies exist.
Example files
just_a_tool.bzl
def _impl(ctx):
    exec_path = "C:/Code/JustATool/just_a_tool.exe"
    dep_file = ctx.actions.declare_file("dep.d")
    args = [
        "--input", ctx.file.source.path,
        "--depfile", dep_file.path,
        ctx.outputs.out.path,
    ]
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = exec_path,
        inputs = ctx.attr.source.files,
        arguments = args,
    )

jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True, single_file = True),
    },
    outputs = {"out": "%{name}.txt"},
)
BUILD
load("//tool:just_a_tool.bzl", "jat_convert")
jat_convert(
name="my_output",
source=":input.txt"
)
input.txt
test1.txt
test2.txt
test3.txt
Goal
I want to do correct and fast incremental builds for the following situations:
Generating reflection data from C++ sources, where this custom tool's execution depends on the header files included by my source files.
Using an internal tool to build asset files which can include other files.
Running a custom preprocessor on my shaders that allows a #include feature.
Thanks!
Bazel's extension language doesn't support creating actions with a dynamic set of inputs, where this set depends on the output of a previous action. In other words, a custom rule cannot run an action, read the action's output, and then create actions with those inputs or update (or prune the set of) inputs of already created actions.
Instead, I suggest adding attribute(s) to your rule where the user can declare the set of files that the sources may include. I call this "the universe of headers". The actions you create depend on this user-defined universe, so the set of action inputs is completely defined. Of course, this means these actions potentially depend on more files than the .cpp files they process actually include.
This approach is analogous to how the cc_* rules work: a file in cc_*.srcs can include other files in the srcs of the same rule and in the hdrs of its dependencies, but nothing else. Thus the union of srcs + hdrs of the (direct & transitive) dependencies defines the universe of header files that a .cpp file may include.
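Applied to the example above, a sketch of that approach would add a "universe" attribute to jat_convert and feed it into the action's inputs (the attribute name is mine):

def _impl(ctx):
    dep_file = ctx.actions.declare_file("dep.d")
    args = [
        "--input", ctx.file.source.path,
        "--depfile", dep_file.path,
        ctx.outputs.out.path,
    ]
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = "C:/Code/JustATool/just_a_tool.exe",
        # The user-declared universe is part of the action's inputs, so
        # editing test1.txt now invalidates and re-runs the action.
        inputs = ctx.attr.source.files.to_list() + ctx.files.universe,
        arguments = args,
    )

jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True, single_file = True),
        "universe": attr.label_list(allow_files = True),
    },
    outputs = {"out": "%{name}.txt"},
)

and in the BUILD file:

jat_convert(
    name = "my_output",
    source = ":input.txt",
    universe = ["test1.txt", "test2.txt", "test3.txt"],
)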
Let's say that I have an exe with a function called compareTwoThings. I'd like compareTwoThings to be able to take, as arguments, two directories that each contain identically named files: a .json and a .fs. In compareTwoThings, I want to be able to read in each .json (easy-peasy) and each .fs file. The contents of each .fs file will be known.
How can I read in each .fs file and use the values in those .fs files without them being part of the overall project structure? Can I? I understand that for the project to be built and "see" into the .fs files, they need to be added to the .fsproj file, but can I not use open on an external file that has a module name?
Example dir structure of the proj:
myProj
|-proj
| |-compareTwoThings.fs
| |-myProj.fs
| |-myProj.fsproj
|-Thing1
| |-Thing1.fs
| |-Thing1.json
|-Thing2
| |-Thing2.fs
| |-Thing2.json
And ultimately, the CLI statement would be something like
myProj compareTwoThings [dir to Thing1] [dir to Thing2] [output dir]
I feel like I'm overlooking something very simple here.
Edit: I do not believe that this question is related, as I'm asking how to open a non-project .fs file.