Can I set a default test_arg for go_tests and override from CLI? - bazel

I use gazelle to generate BUILD files for a go package that has some non-go directories.
I'd like to add -test.short to the go_test runs by default, and then turn it back off, optionally, from CLI.
Adding --test_arg="-test.short" to the CLI does not work, since it gets passed to the non-Go tests.
If I could add something to WORKSPACE that modified the default args for go_test based on a select, I'd be good here. Or if I could persuade gazelle to generate my_go_test instead of go_test, I could do some Skylark. Am I missing any way to accomplish this?

I think you can use Bazel's config_setting and select to make this work. config_setting lets you define a predicate which is true or false, depending on command-line arguments. You can provide a --define argument that the config_setting will test. Then you can optionally pass an argument to tests using select.
Something like this might work for you. This will pass the -test.short argument to the test if you pass --define=short=true on the command line. No argument would be passed by default.
config_setting(
    name = "short",
    values = {
        "define": "short=true",
    },
)

go_test(
    name = "go_default_test",
    srcs = ["hello_test.go"],
    args = select({
        ":short": ["-test.short"],
        "//conditions:default": [],
    }),
)
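With that in place, short mode is off by default and can be enabled per invocation (a usage sketch; the package path is illustrative, the target name follows the snippet above):

    # Default: no -test.short is passed to the test binary
    bazel test //mypkg:go_default_test

    # Opt in to short mode from the CLI
    bazel test //mypkg:go_default_test --define=short=true

Because the argument is attached only to this go_test via select, the --define flag is harmless to the non-Go tests in the same invocation.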

Related

How do I create a configuration point in a BUILD file with Bazel?

I would like to pass a variable to some of my build rules, e.g. this Webpack step:
load("@npm//webpack-cli:index.bzl", webpack = "webpack_cli")

webpack(
    name = "bundle",
    args = [
        "--config",
        "$(execpath webpack.config.js)",
        "--output-path",
        "$(@D)",
    ],
    data = [
        "index.html",
        "webpack.config.js",
        "@npm//:node_modules",
    ] + glob([
        "src/**/*.js",
    ]),
    env = {
        "FOO_BAR": "abc",
    },
    output_dir = True,
)
Some builds will be done with FOO_BAR=abc and others with a different value. I don't know the full set of possible values!
I don't think that --action_env is applicable here since it is not a genrule.
I would also like to be able to set a default value in my BUILD script.
How can I accomplish this with Bazel?
If you knew the set of values, then the usual tools are config_setting() and select(), but since you don't know the possible values, then that won't work here.
It looks like webpack is actually a npm_package_bin or nodejs_binary underneath:
# Generated helper macro to call webpack-cli
def webpack_cli(**kwargs):
    output_dir = kwargs.pop("output_dir", False)
    if "outs" in kwargs or output_dir:
        npm_package_bin(tool = "@npm//webpack-cli/bin:webpack-cli", output_dir = output_dir, **kwargs)
    else:
        nodejs_binary(
            entry_point = {"@npm//:node_modules/webpack-cli": "bin/cli.js"},
            data = ["@npm//webpack-cli:webpack-cli"] + kwargs.pop("data", []),
            **kwargs
        )
and in both cases env will do make-variable substitution:
https://bazelbuild.github.io/rules_nodejs/Built-ins.html#nodejs_binary-env
https://bazelbuild.github.io/rules_nodejs/Built-ins.html#npm_package_bin-env
So if you know at least what the variables will be, you can do something like
env = {
    "FOO_BAR": "$(FOO_BAR)",
},
and use --define=FOO_BAR=123 from the Bazel command line.
nodejs_binary has additional attributes related to environment variables:
https://bazelbuild.github.io/rules_nodejs/Built-ins.html#nodejs_binary-configuration_env_vars
https://bazelbuild.github.io/rules_nodejs/Built-ins.html#nodejs_binary-default_env_vars
If the number of environment variables to set, or the name of the environment variable itself, is not known, then you might need to open a feature request with rules_nodejs.
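For the "default value in my BUILD script" part of the question, a common workaround (standard Bazel, not rules_nodejs-specific) is to put the default --define in a .bazelrc file; a flag passed explicitly on the command line comes later and therefore overrides it:

    # .bazelrc: default value for the make variable
    build --define=FOO_BAR=abc

Then an individual build can override it:

    bazel build //:bundle --define=FOO_BAR=xyz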

Propagating copts/defines to all of a target's dependencies

I have a project that involves multiple BUILD files in a single WORKSPACE, within a fairly complex build system. My goal in short: for some specific target, I want all of its recursive dependencies to be built with an extra set of attributes (copts/defines) compared to when those dependency targets are built in any other way. I have not yet found a way to do this cleanly.
For example, target G is normally built with copts = []. If target P depends on target G, and I run bazel build :P, I want both targets to be built with copts = ["-DMY_DEFINE"], along with all dependencies of target G, etc.
The cc_binary.defines argument propagates in the opposite direction: all targets that depend on some target A will receive all of target A's defines.
Limitations:
prefer to avoid custom command line flags, I don't control how people call bazel {build,test}
duplicating the entire tree of dependency targets is not practical
It doesn't appear possible to set the value of a config_setting from within a BUILD file or a target, so it seems a select-based solution couldn't work.
Previous work:
https://groups.google.com/g/bazel-discuss/c/rZps4nqYqt8/m/YS_pZD6oAQAJ - 2017, recommends "parallel trees" or custom macros (of which we already have many, it would be challenging to wrap them in another)
Propagate copts to all dependencies in Bazel - I believe these all depend on custom command line flags as well
Creating a user-defined build setting doesn't require command-line flags. If you set flag = False, then it actually can't be set on the command line. You can use a user-defined transition to set it instead.
I think something like this will do what you're looking for (save it in extra_copts.bzl):
def _extra_copts_impl(ctx):
    context = cc_common.create_compilation_context(
        defines = depset(ctx.build_setting_value),
    )
    return [CcInfo(compilation_context = context)]

extra_copts = rule(
    implementation = _extra_copts_impl,
    build_setting = config.string_list(flag = False),
)

def _use_extra_copts_implementation(ctx):
    return [ctx.attr._copts[CcInfo]]

use_extra_copts = rule(
    implementation = _use_extra_copts_implementation,
    attrs = {"_copts": attr.label(default = "//:extra_copts")},
)

def _add_copts_impl(settings, attr):
    return {"//:extra_copts": ["MY_DEFINE"]}

_add_copts = transition(
    implementation = _add_copts_impl,
    inputs = [],
    outputs = ["//:extra_copts"],
)

def _with_extra_copts_implementation(ctx):
    infos = [d[CcInfo] for d in ctx.attr.deps]
    return [cc_common.merge_cc_infos(cc_infos = infos)]

with_extra_copts = rule(
    implementation = _with_extra_copts_implementation,
    attrs = {
        "deps": attr.label_list(cfg = _add_copts),
        "_allowlist_function_transition": attr.label(
            default = "@bazel_tools//tools/allowlists/function_transition_allowlist",
        ),
    },
)
and then in the BUILD file:
load("//:extra_copts.bzl", "extra_copts", "use_extra_copts", "with_extra_copts")

extra_copts(name = "extra_copts", build_setting_default = [])

use_extra_copts(name = "use_extra_copts")

cc_library(
    name = "G",
    deps = [":use_extra_copts"],
)

with_extra_copts(
    name = "P_deps",
    deps = [":G"],
)

cc_library(
    name = "P",
    deps = [":P_deps"],
)
extra_copts is the build setting. It returns a CcInfo directly, which means it's straightforward to do any other C++ library swapping with the same approach. Its default is effectively an "empty" CcInfo which won't do anything to libraries that depend on it.
with_extra_copts wraps a set of dependencies, configured to use a different CcInfo. This is the rule that actually changes the value, to create the second version of G with different flags.
_add_copts is the transition which with_extra_copts uses to change the value of the extra_copts build setting. It could examine attr to do something more sophisticated than adding a hard-coded list.
use_extra_copts pulls the CcInfo out of extra_copts so a cc_library can use it.
To avoid rewriting the builtin C++ rules, this uses wrapper rules to pull the copts out and do the transition. You might want to create macros to bundle the wrapper rules along with the corresponding cc_library. Alternatively, you could use rules_cc's my_c_archive as a starting point to create custom rules that reuse the core implementation of the builtin C++ rules while integrating the transition and use of the build setting into a single rule.
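For example, a bundling macro might look like this (a sketch only; the cc_library_with_extra_copts name and "_plain" suffix are made up for illustration):

    # extra_copts.bzl (continued): hypothetical convenience macro that
    # pairs a library with its transitioned wrapper.
    def cc_library_with_extra_copts(name, **kwargs):
        # The real library, usable directly without the extra defines.
        native.cc_library(name = name + "_plain", **kwargs)

        # Wrapper whose transition rebuilds the library with the
        # extra_copts build setting applied.
        with_extra_copts(name = name, deps = [":" + name + "_plain"])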

Conditionally create a Bazel rule based on --config

I'm working on a problem in which I only want to create a particular rule if a certain Bazel config has been specified (via '--config'). We have been using Bazel since 0.11 and have a bunch of build infrastructure that works around former limitations in Bazel. I am incrementally porting us up to newer versions. One of the features that was missing was compiler transitions, and so we rolled our own using configs and some external scripts.
My first attempt at solving my problem looks like this:
load("@rules_cc//cc:defs.bzl", "cc_library")

# use this with a select to pick targets to include/exclude based on config
# see __build_if_role for an example
def noop_impl(ctx):
    pass

noop = rule(
    implementation = noop_impl,
    attrs = {
        "deps": attr.label_list(),
    },
)

def __sanitize(config):
    if len(config) > 2 and config[:2] == "//":
        config = config[2:]
    return config.replace(":", "_").replace("/", "_")

def build_if_config(**kwargs):
    config = kwargs['config']
    kwargs.pop('config')
    name = kwargs['name'] + '_' + __sanitize(config)
    binary_target_name = kwargs['name']
    kwargs['name'] = binary_target_name
    cc_library(**kwargs)
    noop(
        name = name,
        deps = select({
            config: [binary_target_name],
            "//conditions:default": [],
        }),
    )
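As an aside, __sanitize is plain string manipulation, so its effect can be checked outside Bazel with an ordinary Python rendition:

```python
def sanitize(config):
    # Strip a leading "//", then turn label separators into underscores.
    if len(config) > 2 and config[:2] == "//":
        config = config[2:]
    return config.replace(":", "_").replace("/", "_")

print(sanitize("//:my_config"))  # _my_config
print(sanitize("//tools:cfg"))   # tools_cfg
print(sanitize("//a/b:cfg"))     # a_b_cfg
```

So build_if_config(name = "some_lib", config = "//:my_config") creates a noop target named some_lib__my_config alongside the some_lib library.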
This almost gets me there, but the problem is that if I want to build a library as an output, then it becomes an intermediate dependency, and therefore gets deleted or never built.
For example, if I do this:
build_if_config(
    name = "some_lib",
    srcs = ["foo.c"],
    config = "//:my_config",
)
and then I run
bazel build --config my_config //:some_lib
Then libsome_lib.a does not make it to bazel-out, although if I define it using cc_library, then it does.
Is there a way that I can just create the appropriate rule directly in the macro instead of creating a noop rule and using a select? Or another mechanism?
Thanks in advance for your help!
As I noted in my comment, I was misunderstanding how Bazel figures out its dependencies. The "create a file" section of the Rules Tutorial explains some of the details, and I followed along here for some of my solution.
Basically, the problem was not that the built files were not sticking around; it was that they were never getting built. Bazel did not know to look in the deps attribute and build those things: it seems I had to create an action which uses the deps, and then register the output by returning a (list of) DefaultInfo.
Below is my new noop_impl function
def noop_impl(ctx):
    if len(ctx.attr.deps) == 0:
        return None

    # ctx.attr has the attributes of this rule
    dep = ctx.attr.deps[0]

    # DefaultInfo is apparently some sort of globally available
    # class that can be used to index Target objects
    infile = dep[DefaultInfo].files.to_list()[0]

    outfile = ctx.actions.declare_file('lib' + ctx.label.name + '.a')
    ctx.actions.run_shell(
        inputs = [infile],
        outputs = [outfile],
        command = "cp %s %s" % (infile.path, outfile.path),
    )

    # we can also instantiate a DefaultInfo to indicate what output
    # we provide
    return [DefaultInfo(files = depset([outfile]))]
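One limitation of the snippet above is that it forwards only the first file of the first dep. A sketch (same rule shape, presented as an untested variation) that re-exports every default output of every dep, with no copy action at all:

    def noop_impl(ctx):
        # Merge the default outputs of all deps and re-export them as
        # this rule's own outputs, so Bazel builds each one and keeps
        # it in bazel-out.
        return [DefaultInfo(files = depset(transitive = [
            dep[DefaultInfo].files for dep in ctx.attr.deps
        ]))]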

How to create and use custom build flags in Bazel?

I've a rule with a conditional attribute:
some_rule(
    name = "my_rule",
    some_attr = select({
        ":ts_diagnostics_mode_extended": ["--extendedDiagnostics"],
    }),
)
and with the config setting:
config_setting(
    name = "ts_diagnostics_mode_extended",
    values = {"define": "ts_diagnostics_mode=extended_diagnostics"},
)
However, when building with bazel build :my_target --define ts_diagnostics_mode=extended_diagnostics I get
Configurable attribute "some_attr" doesn't match this configuration (would a default condition help?).
What's missing?
--define flags are handled specially by config_setting, via define_values, because they are multi-valued. I think this will work:
config_setting(
    name = "ts_diagnostics_mode_extended",
    define_values = {"ts_diagnostics_mode": "extended_diagnostics"},
)
While define_values indeed works, your original example with values should also work. define_values is only necessary when you want the config_setting to have multiple entries.
See this line in the define_values documentation:
--define can still appear in values with normal flag syntax, and can be mixed freely with this attribute as long as dictionary keys remain distinct.
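So define_values earns its keep once a single config_setting needs to match more than one --define at a time, for example (a hypothetical combination of flags):

    config_setting(
        name = "extended_and_stats",
        define_values = {
            "ts_diagnostics_mode": "extended_diagnostics",
            "ts_stats": "true",
        },
    )

This setting matches only when both --define=ts_diagnostics_mode=extended_diagnostics and --define=ts_stats=true are passed.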

How can I include the current workspace name in the default argument value of a rule?

Let's say I have a rule:
blah = rule(
    attrs = {
        "foo": attr.string(default = "@HELP@"),
    },
)
I want the default value of foo to contain the name of the workspace that invokes the rule. How can I accomplish this?
(Note: An acceptable approach is to leave a placeholder in the value and replace it when the rule uses the attribute, but I can't figure out how to get the current workspace there either. The closest I can find is ctx.label.workspace_root, but that is empty for the "main" workspace, and e.g. external/foo for other things.)
ctx.workspace_name does not give the correct answers. For example, if I print("'%s' -> '%s'" % (ctx.label.workspace_root, ctx.workspace_name)), I get results like:
'external/foo' -> 'main'
'external/bar' -> 'main'
...which is wrong; those should be 'foo' and 'bar', not 'main' ('main' being my main/root workspace). Note that labels from those contexts are e.g. '@foo//:foo', so Bazel does apparently know the correct workspace name.
You can use a placeholder attribute and then use ctx.workspace_name in the implementation.
def _impl(ctx):
    print("ws: %s" % ctx.workspace_name)

blah = rule(
    implementation = _impl,
)
As far as getting the workspace name, this seems sub-optimal, but also seems to work:
def _workspace(ctx):
    """Compute name of current workspace."""

    # Check for meaningful workspace_root
    workspace = ctx.label.workspace_root.split("/")[-1]
    if len(workspace):
        return workspace

    # If workspace_root is empty, assume we are the root workspace
    return ctx.workspace_name
Per Kristina's answer and comment in the original question, this can then be used to replace a placeholder in the parameter value.
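Since the helper is plain string logic, it can be exercised outside Bazel with stand-in objects (the Fake* classes below are illustrative, not part of any Bazel API):

```python
class FakeLabel:
    def __init__(self, workspace_root):
        self.workspace_root = workspace_root

class FakeCtx:
    def __init__(self, workspace_root, workspace_name):
        self.label = FakeLabel(workspace_root)
        self.workspace_name = workspace_name

def workspace(ctx):
    # Last path segment of workspace_root, e.g. "external/foo" -> "foo"
    ws = ctx.label.workspace_root.split("/")[-1]
    if len(ws):
        return ws
    # Empty workspace_root: we are in the main workspace
    return ctx.workspace_name

print(workspace(FakeCtx("external/foo", "main")))  # foo
print(workspace(FakeCtx("", "main")))              # main
```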