In one of my rules, I want to unzip a file via unzip -d so that I can access some headers.
Currently, I am doing so by using whatever unzip is on my PATH.
ctx.actions.run(
    ...,
    executable = "unzip",
)
Alternatively, I could use /usr/bin/unzip to avoid relying on my PATH. However, it's unclear what the idiomatic approach is. Based on my reading of the docs, I would ideally have some sort of Bazel binary target for unzip that the rule would invoke. It's somewhat unclear, though, how I would go about defining that binary target, perhaps as an sh_binary.
The right solution would be to have an unzip utility in your repo (either in the source tree or imported as an external repository).
Without that, I think the most principled approach is to write a repository rule that looks up unzip on the PATH using repository_ctx.which (or simply expects /usr/bin/unzip to exist) and creates a BUILD file (via repository_ctx.file) with an sh_binary that wraps the unzip tool.
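A minimal sketch of such a repository rule (the names _unzip_repo_impl, unzip_repo, and the wrapper script are my own placeholders):

def _unzip_repo_impl(repository_ctx):
    # Find unzip on the PATH, falling back to /usr/bin/unzip.
    unzip_path = repository_ctx.which("unzip") or "/usr/bin/unzip"
    # Write a tiny wrapper script that execs the system binary.
    repository_ctx.file(
        "unzip.sh",
        "#!/bin/sh\nexec %s \"$@\"\n" % unzip_path,
        executable = True,
    )
    # Expose the wrapper as an sh_binary target.
    repository_ctx.file("BUILD", """
sh_binary(
    name = "unzip_tool",
    srcs = ["unzip.sh"],
    visibility = ["//visibility:public"],
)
""")

unzip_repo = repository_rule(implementation = _unzip_repo_impl)

Instantiating it in the WORKSPACE file with unzip_repo(name = "unzip") then makes @unzip//:unzip_tool available.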
In both cases you have to add a label attribute "_unzip_tool" to the consuming rule:
"_unzip_tool": attr.label(
executable = True,
cfg = "host",
default = Label("#unzip//:unzip_tool"),
),
and use it like so:
ctx.actions.run(
    ...,
    executable = ctx.executable._unzip_tool,
)
Is there a way to control a Bazel build so that it generates the temp files I want for a list of source files, instead of just using the command line option "--save_temps"?
One way is to use a cc_binary and add the "-E" option to "copts", but the object file name will then always end in ".o". Such ".o" files will be overwritten by the other build targets. I don't know how to control the compiler's output file name in Bazel.
Any better ideas?
cc_library has an output group with the static library, which you can then extract. Something like this:
filegroup(
    name = "extract_archive",
    srcs = [":some_cc_library"],
    output_group = "archive",
)
Many tools will accept the static archive instead of an object file. If the tool you're using does, then that's easy. If not, things get a bit more complicated.
Extracting the object file from the static archive is a bit trickier. You could use a genrule with the $(AR) Make variable, but that won't work with some C++ toolchains that require additional flags to configure architectures etc.
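For illustration, such a genrule could look like this (the member name some_object.o is a placeholder for the archive member's real name, and newer Bazel versions may additionally require making the C++ toolchain available to the genrule):

genrule(
    name = "extract_obj",
    srcs = [":extract_archive"],
    outs = ["some_object.o"],
    # Extract the named member into the working directory, then move it
    # to the declared output location.
    cmd = "$(AR) x $(location :extract_archive) some_object.o && mv some_object.o $@",
)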
The better (but more complicated) answer is to follow the guidance in integrating with C++ rules. You can get the ar from the toolchain and the flags to use it in a custom rule, and then create an action to extract it. You could also access the OutputGroupInfo from the cc_library in the rule directly instead of using filegroup if you've already got a custom rule.
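As a rough sketch of that last point (rule and attribute names are mine), a custom rule can read the output group directly:

def _archive_of_impl(ctx):
    # Forward the static archive(s) from the cc_library's "archive" output group.
    return [DefaultInfo(files = ctx.attr.lib[OutputGroupInfo].archive)]

archive_of = rule(
    implementation = _archive_of_impl,
    attrs = {"lib": attr.label(providers = [OutputGroupInfo])},
)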
Thanks all for your suggestions.
Now I think I can solve this problem in two steps (it seems Bazel does not allow combining two rules into one):
Step 1: add a -E option like a normal cc_library; we can call it a pp_library. This part is easy.
Step 2: write a new rule whose input is the pp_library target, find the object files in that rule (they can be found via action.outputs.to_list()), and copy them to a new place via ctx.actions.run_shell().
I used Bazel: copy multiple files to binary directory as a reference.
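Something along these lines might work for step 2 (the rule name, attribute name, and .i extension are my own assumptions; duplicate basenames would need extra handling):

def _collect_pp_impl(ctx):
    outs = []
    for f in ctx.attr.pp_lib.files.to_list():
        # Copy each preprocessed file next to this rule's outputs.
        out = ctx.actions.declare_file(f.basename + ".i")
        ctx.actions.run_shell(
            inputs = [f],
            outputs = [out],
            command = "cp %s %s" % (f.path, out.path),
        )
        outs.append(out)
    return [DefaultInfo(files = depset(outs))]

collect_pp = rule(
    implementation = _collect_pp_impl,
    attrs = {"pp_lib": attr.label(mandatory = True)},
)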
I want to extract some specific resource from jar (which, in turn, is downloaded as a part of http archive) and while I know how to achieve this in principle I don't know what is the most bazelish and minimal way to achieve this.
I've naïvely tried (after reading this answer) to do something like this:
new_http_archive(
    name = "some_jar_contents",
    url = "@some_archive//:lib/some_jar.jar",
    build_file_content = """
filegroup(
    name = "srcs",
    srcs = glob(["*"]),
    visibility = ["//visibility:public"],
)
""",
)
However, I'm predictably getting java.net.MalformedURLException: no protocol:
The problem you are getting is caused by the fact that Bazel does not understand the URL @some_archive//:lib/some_jar.jar.
It cannot infer the protocol to use for fetching the Jar, and the exception is thrown.
The new_http_archive rule is intended for fetching a compressed archive from a remote location, then building and exposing it as an external target to the current repo.
You need to change the url parameter to an actual URL so that Bazel can fetch the Jar with the resource you want to extract.
Then, in the build_file_content parameter, use a genrule to extract the desired resource file and export it.
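For instance, the build_file_content could contain something like this (the path inside the Jar is a placeholder):

genrule(
    name = "some_resource",
    srcs = ["lib/some_jar.jar"],
    outs = ["some_resource.txt"],
    # unzip -p prints a single archive member to stdout.
    cmd = "unzip -p $(location lib/some_jar.jar) path/in/jar/some_resource.txt > $@",
    visibility = ["//visibility:public"],
)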
A working example can be found in this private gist.
The example does the following things:
Fetches the Jar for AutoValue in WORKSPACE
Uses genrule to extract the autovalue.vm file from the Jar
Has a Python program that reads and prints the content of autovalue.vm
Helpful resources:
genrule
"Make" Variables
Updating the runfiles tree structure
Problem
I wonder how to inform Bazel about dependencies that are unknown at declaration time but known at build time (a.k.a. implicit dependencies, dynamic dependencies, ...). For instance, when compiling C++ sources, the .cpp source file depends on some header files, and this information is not available when writing the BUILD file; it needs to be retrieved at build time. Whatever the mechanism for getting the information (dry run, generating a depfile, parsing stdout), it has to run at build time and the result has to be fed back into Bazel's build graph.
Since Skylark does not allow I/O, for instance reading a generated depfile or parsing a stdout result containing a dependency list, I have no clue how to deal with this.
Behind implicit dependencies, what I am really after is correct incremental builds.
Example
To experiment with this problem I have created a simple tool, just_a_tool.exe, which takes an input file, reads a list of files from it, and concatenates the content of all these files into an output file.
command line example:
just_a_tool.exe --input input.txt --depfile dep.d output.txt
dep.d contains the list of all the files that were read.
Issue
If I change the content of test1.txt, test2.txt, or test3.txt, bazel does not rebuild output.txt file. Of course, because it does not know there were dependencies.
Example files
just_a_tool.bzl
def _impl(ctx):
    exec_path = "C:/Code/JustATool/just_a_tool.exe"
    for f in ctx.attr.source.files:
        source_path = f.path
    output_path = ctx.outputs.out.path
    dep_file = ctx.actions.declare_file("dep.d")
    args = ["--input", source_path, "--depfile", dep_file.path, output_path]
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = exec_path,
        inputs = ctx.attr.source.files,
        arguments = args,
    )
jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True, single_file = True),
    },
    outputs = {"out": "%{name}.txt"},
)
BUILD
load("//tool:just_a_tool.bzl", "jat_convert")
jat_convert(
name="my_output",
source=":input.txt"
)
input.txt
test1.txt
test2.txt
test3.txt
Goal
I want to do correct and fast incremental build for the following situation:
Generate reflection data from C++ sources; this custom tool's execution depends on the header files included by my source files.
Use an internal tool to build asset files which can include other files
Run a custom preprocessor on my shaders allowing a #include feature
Thanks!
Bazel's extension language doesn't support creating actions with a dynamic set of inputs, where this set depends on the output of a previous action. In other words, custom rules cannot run an action, read the action's output, then create actions with those inputs or update (or prune the set of) inputs of already created actions.
Instead, I suggest adding attribute(s) to your rule where the user can declare the set of files that the sources may include. I call this "the universe of headers". The actions you create depend on this user-defined universe, so the set of action inputs is completely defined. Of course this means these actions potentially depend on more files than the .cpp files they process actually include.
This approach is analogous to how the cc_* rules work: a file in cc_*.srcs can include other files in the srcs of the same rule and from hdrs of dependencies, but nothing else. Thus the union of srcs + hdrs of (direct & transitive) dependencies defines the universe of header files that a cpp file may include.
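Applied to the jat_convert example above, this could look roughly like the following (the attribute name "universe" is my own):

def _impl(ctx):
    dep_file = ctx.actions.declare_file("dep.d")
    ctx.actions.run(
        outputs = [ctx.outputs.out, dep_file],
        executable = "C:/Code/JustATool/just_a_tool.exe",
        # Depend on the whole declared universe, not just the input file,
        # so that editing any file the input may reference triggers a rebuild.
        inputs = ctx.files.source + ctx.files.universe,
        arguments = [
            "--input", ctx.files.source[0].path,
            "--depfile", dep_file.path,
            ctx.outputs.out.path,
        ],
    )

jat_convert = rule(
    implementation = _impl,
    attrs = {
        "source": attr.label(mandatory = True, allow_files = True),
        "universe": attr.label_list(allow_files = True),
    },
    outputs = {"out": "%{name}.txt"},
)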
I am trying to get the value of the full current directory path from within a .bzl rule. I have tried the following:
ctx.host_configuration.default_shell_env.PATH returns "/Users/[user_name]/.rbenv/shims:/usr/local/bin:/usr/bin:/bin:..."
ctx.bin_dir.path returns bazel-out/local-fastbuild/bin
pwd = ctx.expand_make_variables("cmd", "$$PWD", {}) returns the string $PWD. I don't think this method is helpful for me, but maybe I'm just using it wrong.
What I need is the directory under which the command that runs the Bazel .bzl rule is running, for example /Users/[user_name]/git/workspace/path/to/bazel/rule.bzl, or at least the part of the path prior to the WORKSPACE directory.
I can't use pwd because I need this value before I call ctx.actions.run_shell().
Are there no attributes in Bazel configurations that hold this value?
The goal is to have hermetic builds, so you shouldn't depend on the absolute path.
Feel free to use pwd inside the command of ctx.actions.run_shell() (for reproducible builds, be careful to avoid putting the absolute path in the generated files).
Edit.
Technically, there are some workarounds. For example, you can pass the path via the --define flag:
bazel build :all --define=path=$(pwd)
Then the value will be available using ctx.var["path"].
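Inside the rule implementation this could look like (a sketch; "path" is only present when the flag was passed):

def _impl(ctx):
    # Falls back to "" when --define=path=... was not given on the command line.
    abs_path = ctx.var.get("path", "")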
Based on your comment below, you want the path to declare an output. Let me repeat: you shouldn't use an absolute path to declare the output file. Declare an output in your package, then ask the tool you call to write to that output.
For example, when you call gcc, you can use -o to specify the output. When a tool writes to stdout, use the shell to redirect it. If the tool is really not flexible, you may want to wrap it with your own script (e.g. call the tool and copy the output file).
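A sketch of the stdout-redirect case (the tool label and file names are placeholders):

genrule(
    name = "tool_output",
    srcs = ["input.txt"],
    outs = ["tool_output.txt"],
    tools = ["//tools:mytool"],
    # The tool writes to stdout; the shell redirects it into the declared output.
    cmd = "$(location //tools:mytool) $(location input.txt) > $@",
)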
Using an absolute path here is not the right solution. For example, it should be possible to execute the action on a remote machine (where your absolute path won't make sense).
Zipping the outputs may be a reasonable solution; it's useful when you cannot know in advance the number or the names of the output files.
Context
I am writing a repository rule that invokes another Bazel project. My current approach is to build the additional project as a deploy jar. I would like a user to be able to instantiate the rule like:
jar_path = "some/relative/path"
my_rule(name = "something", p_arg = "m_arg", binary = jar_path)
and then given the jar_path and the arguments, I would like the repository rule to execute the following command in the shell:
java -jar $(SOME_JAR) $(ARGUMENTS_PROVIDED_BY_RULE)
Problem
First, it's unclear how best to accomplish the deploy jar approach. So far, I have attempted two different approaches, with varying levels of success. For examples, I have skimmed through the scala_rules, the maven_rules, and the Skylark cookbook.
Second, and more importantly, I am not sure whether the deploy jar is the best route to accomplishing my goals. Again, my interest is to invoke a target from an external Bazel project that is currently hosted on GitHub. (So, feasibly, I could try to fetch the project using the http_archive rule.)
Below, I describe the attempts I have made.
Approach 1
My first approach involved trying to execute the command using the command field in ctx.action. I tried several variations of
java -jar {computed_absolute_path_of_deploy_jar} {args_passed_from_instantiation}.
My biggest issue here was with determining the absolute path of the deploy jar. The file's root path would contain some additional information. For example, it would look something like this:
/abs/olute/path[ something ]/rela/tive/path
As a side note, I'm not sure if this is a bug/nit, but File.root.path evaluated to None, despite File.none not being None.
Approach 2
The next thing I tried was to mimic the input binary example from the docs. This was also unsuccessful: the actual binary could not be found. Here is how I configured it.
First, I relaxed the repository rule into a regular skylark rule.
def _test_binary(ctx):
    ctx.action(
        ...,
        arguments = [ctx.attr.p_arg],
        executable = ctx.executable.binary,
    )

test_binary = rule(
    ...
    attrs = {
        "binary": attr.label(mandatory = True, cfg = "host", allow_files = True, executable = True),
        ...
    },
)
Then, in my external project, I loaded the Skylark rule in the WORKSPACE file. Finally, I called the macro from one of my BUILD files as follows:
load("#something_rule//:something_rule.bzl", "test_binary")
test_binary(name = "hello", p_arg = "hello", binary = "script.sh")
The script is a one-liner, java -jar something_deploy.jar -- -arg:$1, and it is in the same directory as the BUILD file.
Bazel complains that src/script.sh does not exist, I presume because it is looking for the file in /private/var/tmp/-bazel_username/somehash/relative_path. In response, I tried to pass the absolute path, which is not allowed.
Cheers.
It looks like you're mixing up repository rules with build extensions ("normal" rules). A good rule of thumb is:
Repository rules are for getting sources onto your system or symlinking them to a place Bazel can see them.
Build extensions are for everything else: compiling, copying files, running binaries, etc.
I don't actually think you need to use either, for this. You say that the other project is on GitHub, so you can add the following to your WORKSPACE file:
http_archive(
    name = "other_project",
    ...
)
Then, in your BUILD file:
genrule(
    name = "run-a-jar",
    srcs = ["@other_project//some/relative:path"],
    cmd = "java -jar $(location @other_project//some/relative:path) -- arg1 arg2 > $@",
    outs = ["jar-output"],
)
You shouldn't need to use the _deploy.jar target, since you're not moving the jar out of its project (_deploy.jar is useful when you need to relocate it).
Other things from your question:
I'm not sure if this is a bug/nit, but File.root.path evaluated to None
Are you sure it didn't evaluate to ""? The path is relative to the execution root, so for sources, it will always be "" (for outputs, it'll be bazel-out/local-fastbuild/bin or similar).
Bazel complains that src/script.sh does not exist.
Passing -s to Bazel can really help with debugging Skylark rules: you can see exactly where it is looking.
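For example (the target is illustrative):

bazel build -s //src:hello

This prints every subcommand Bazel executes, including the working directory and full command line of each action.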