Consider the following Bazel rule written in a WORKSPACE file:
container_pull(
    name = "release-base",
    registry = "mydockernet:9443",
    repository = "release-base",
    digest = "sha256:...",
    tag = "1.8.2",
)
The problem is that the tag value 1.8.2 is also written in a YAML config file, and we want to respect the DRY principle (read the value from the config file instead of duplicating it in Bazel files). Is there a way to handle this?
It's not YAML, but you can define things in another .bzl file and then load them into your WORKSPACE:
load("common.bzl", "MYVERSION")
container_pull(
name = "release-base",
registry = "mydockernet:9443",
repository = "release-base",
digest = "sha256:...",
tag = MYVERSION,
)
then in common.bzl:
MYVERSION = "1.8.2"
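If you want the single source of truth to stay in the YAML file itself, one option is a small repository rule that reads the YAML at fetch time and writes out a .bzl file you can load. This is a minimal sketch, assuming a config.yaml at the workspace root containing a line like version: 1.8.2; all names here are hypothetical:

# version_repo.bzl (hypothetical)
def _version_repo_impl(repository_ctx):
    content = repository_ctx.read(repository_ctx.path(repository_ctx.attr.yaml_file))
    version = None
    for line in content.splitlines():
        if line.strip().startswith("version:"):
            version = line.split(":", 1)[1].strip().strip('"')
    if not version:
        fail("no 'version:' entry found in %s" % repository_ctx.attr.yaml_file)
    repository_ctx.file("BUILD", "")  # the generated repo needs a (possibly empty) BUILD file
    repository_ctx.file("version.bzl", "MYVERSION = %r\n" % version)

version_repo = repository_rule(
    implementation = _version_repo_impl,
    attrs = {"yaml_file": attr.label(allow_single_file = True)},
)

Then in WORKSPACE:

load("//:version_repo.bzl", "version_repo")

version_repo(
    name = "version_cfg",
    yaml_file = "//:config.yaml",
)

load("@version_cfg//:version.bzl", "MYVERSION")

Note this is only a line-based scrape of the YAML, not a real YAML parser, but for a single version: key it keeps the value defined in exactly one place.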
Suppose I have a Bazel macro that is using a generator rule to generate an output file given an input file:
def my_generator(
        name,
        input_file,
        output_file,
        **kwargs):
    args = []
    args.extend(["--arg1", "$(location %s)" % output_file])
    args.extend(["--arg2", "$(location %s)" % input_file])
    cmd_params = " ".join(args)
    native.genrule(
        name = name,
        srcs = [input_file],
        outs = [output_file],
        cmd = "python $(location //path/to:target_generator) %s" % cmd_params,
        tools = ["//path/to:target_generator"],
        **kwargs
    )
I was previously using this macro as:
my_generator(
    name = "gen1",
    input_file = ":targetToGeneratetextFile",
    output_file = "outputfile.txt",
    visibility = ["//myproject/path/to/current/package/test:__subpackages__"],
)
where a target is passed as input_file. This was working.
Then I wanted to reuse it with a different input but to generate the same output, where the input is now a file within the project but in another folder.
my_generator(
    name = "gen2",
    input_file = "//path/to/the/file/realFile.txt",
    output_file = "outputfile.txt",
    visibility = ["//myproject/path/to/current/package/test:__subpackages__"],
)
I am getting two errors this way:
As it is, Bazel cannot find realFile.txt; it tries to interpret the whole path as a package:
no such package '//path/to/the/file/realFile.txt': BUILD file not found in any of the following directories. Add a BUILD file to a directory to mark it as a package
If I copy the file into the current package folder, Bazel is able to read it.
Bazel is complaining that gen1 and gen2 are writing/overwriting the same output file outputfile.txt:
Error in genrule: generated file 'outputfile.txt' in rule 'gen2' conflicts with existing generated file from rule 'gen1', defined at ...
How can I solve these issues?
I think that the problem is that both calls are always executed, whereas I would like them to run depending on some target, i.e., target A should only run gen1 and target B only gen2. I do not know if that is possible, but for example moving each of these calls inside the target it belongs to might be a solution that avoids this issue.
EDIT
I was thinking of a solution like:
my_generator(
    name = "gen2",
    input_file = select({
        ":opt1": [":targetToGeneratetextFile"],
        ":opt2": ["realTextFile.txt"],
        "//conditions:default": [":targetToGeneratetextFile"],
    }),
    output_file = "outputfile.txt",
    visibility = ["//myproject/path/to/current/package/test:__subpackages__"],
)
with a proper config_setting, and then calling it from the target with the proper flag, but I am getting the error:
expected value of type 'string' for element 0 of attribute 'srcs' in 'genrule' rule, but got select({":opt1": [":targetToGeneratetextFile"], ":opt2": ["realTextFile.txt"],"//conditions:default": [":targetToGeneratetextFile"],
})
The label //path/to/the/file/realFile.txt is shorthand for //path/to/the/file/realFile.txt:realFile.txt, aka <repository root>/path/to/the/file/realFile.txt/realFile.txt. Depending on where the deepest-nested folder with a BUILD file is (which determines the package), you're looking for something like //path/to/the/file:realFile.txt or //path/to:the/file/realFile.txt instead.
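Also note that once the label is fixed, a source file can generally only be referenced from another package if that package makes it visible; depending on your setup you may need something like this in the BUILD file of realFile.txt's package:

exports_files(["realFile.txt"])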
You can't have two rules which write the same file, because then Bazel can't tell which way to build it if you bazel build the file. Some alternatives:
Put them in separate packages (aka separate folders with BUILD files)
Name them differently, like gen1_outputfile.txt and gen2_outputfile.txt, or gen1/outputfile.txt and gen2/outputfile.txt. You could automate this in the macro like outs = [name + "/outputfile.txt"].
Use a single rule to generate it with an appropriate select, like your edit.
With the select, you're trying to create something like this:
genrule(
    srcs = select({..., "//conditions:default": [":targetToGeneratetextFile"]}),
    ...
)
but as written you have this instead:
genrule(
    srcs = [select({..., "//conditions:default": [":targetToGeneratetextFile"]})],
    ...
)
Effectively, between the list in the select's value and the macro body, you're creating a nested list. I would change your macro argument to input_files and then do srcs = input_files in the body, so the caller of the macro can bundle things into lists as desired.
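For example, a minimal sketch of the reworked macro and a call site, reusing the hypothetical generator label from the question (an assumption, not a real target):

def my_generator(name, input_files, output_file, **kwargs):
    native.genrule(
        name = name,
        srcs = input_files,  # already a list, or a select() over lists
        outs = [output_file],
        cmd = "python $(location //path/to:target_generator) --arg1 $(location %s) --arg2 $(SRCS)" % output_file,
        tools = ["//path/to:target_generator"],
        **kwargs
    )

my_generator(
    name = "gen2",
    input_files = select({
        ":opt1": [":targetToGeneratetextFile"],
        "//conditions:default": ["realTextFile.txt"],
    }),
    output_file = "gen2_outputfile.txt",  # unique name also avoids the conflict with gen1
)

Because the select's values are already lists, the macro passes them straight to srcs instead of wrapping them in another list.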
My project depends on some external libraries which I have to bazelfy myself. Thus, my WORKSPACE:
http_archive(
    name = "external_lib_component1",
    build_file = "//third_party:external_lib_component1.BUILD",
    sha256 = "xxx",
    urls = ["https://example.org/external_lib_component1.tar.gz"],
)

http_archive(
    name = "external_lib_component2",
    build_file = "//third_party:external_lib_component2.BUILD",
    sha256 = "yyy",
    urls = ["https://example.org/external_lib_component2.tar.gz"],
)
...
The two entries above are similar, and external_lib_component{1, 2}.BUILD share a lot of code.
What is the best way to share code (macros) between them?
Just putting a shared_macros.bzl file into third_party/ won't work, because it will not be copied into
the archive location on build (only the build_file is copied).
Place a .bzl file such as ./third_party/shared_macros.bzl into your tree, as you've mentioned.
Then in the //third_party:external_lib_component1.BUILD and //third_party:external_lib_component2.BUILD you provide for your external dependencies, you can load symbols from that shared file using:
load("#//third_party:shared_macros.bzl", ...)
Labels starting with @// refer to packages in the main repository, even when used in an external dependency (labels starting with plain // would otherwise be rooted in the repository they are used from). You can check the docs on labels, in particular the last paragraph.
Alternatively you can also refer to the "parent" project by its name. If your WORKSPACE file has:
workspace(name = "parent")
You could say:
load("#parent//third_party:shared_macros.bzl", ...)
Note: in Bazel versions prior to 2.0.0 you might want to add --incompatible_remap_main_repo if you mix both of the above approaches in your project.
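For illustration, a minimal sketch of a shared macro and one of the external BUILD files using it; the cc_library content is a made-up example, not taken from the question:

# third_party/shared_macros.bzl, in the main repository
def external_lib(name, srcs):
    # shared boilerplate used by both external_lib_component*.BUILD files
    native.cc_library(
        name = name,
        srcs = srcs,
        visibility = ["//visibility:public"],
    )

# third_party/external_lib_component1.BUILD
load("@//third_party:shared_macros.bzl", "external_lib")

external_lib(
    name = "component1",
    srcs = glob(["**/*.cc"]),
)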
We frequently need common combinations of rules per tech stack.
That currently wastes a lot of space in WORKSPACE, and it has to be kept in sync across multiple repos. It's 50+ lines after buildifier and contains too many URLs, versions, and hashes.
Now say I have a "technology stack" repo and do something like
load("#techstack_repo//mylang.bzl", "load_rules")
load_rules()
where load_rules would load pinned versions of e.g. rules_go, bazel-gazelle, rules_docker and rules_proto, and initialize all of them in the right order so they are visible in the WORKSPACE?
I did not get this to work in my tests because load apparently cannot be called inside a function in a .bzl file; it's a statement, not a function.
Is there a way to do this?
Here's an example of what I tested for Java:
load("#io_bazel_rules_docker//repositories:repositories.bzl", container_repositories = "repositories")
load("#io_bazel_rules_docker//repositories:deps.bzl", container_deps = "deps")
load("#io_bazel_rules_docker//container:container.bzl", "container_pull")
load("#rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
load(
"#io_grpc_grpc_java//:repositories.bzl",
"IO_GRPC_GRPC_JAVA_ARTIFACTS",
"IO_GRPC_GRPC_JAVA_OVERRIDE_TARGETS",
"grpc_java_repositories",
)
load("#rules_jvm_external//:defs.bzl", "maven_install")
def prepare_stack(maven_deps = []):
    container_repositories()
    container_deps()
    container_pull(
        name = "java_base",
        # https://console.cloud.google.com/gcr/images/distroless/GLOBAL/java-debian10
        # tag = "11",  # OpenJDK 11 as of 2020-03-04
        digest = "sha256:eda9e5ae2facccc9c7016f0c2d718d2ee352743bda81234783b64aaa402679b6",
        registry = "gcr.io",
        repository = "distroless/java-debian10",
    )
    rules_proto_dependencies()
    rules_proto_toolchains()
    maven_install(
        artifacts = maven_deps + IO_GRPC_GRPC_JAVA_ARTIFACTS,
        # for improved debugging in IDE
        fetch_sources = True,
        generate_compat_repositories = True,
        override_targets = IO_GRPC_GRPC_JAVA_OVERRIDE_TARGETS,
        repositories = [
            "https://repo.maven.apache.org/maven2/",
            "https://repo1.maven.org/maven2",
        ],
        strict_visibility = True,
    )
    grpc_java_repositories()
... all http_archive calls for the rule repos are in WORKSPACE, and I want to move them in here too, but that did not work at all.
As is, I get this error:
ERROR: Failed to load Starlark extension '@rules_python//python:pip.bzl'.
Cycle in the workspace file detected. This indicates that a repository is used prior to being defined.
The following chain of repository dependencies lead to the missing definition.
- @rules_python
This could either mean you have to add the '@rules_python' repository with a statement like `http_archive` in your WORKSPACE file (note that transitive dependencies are not added automatically), or move an existing definition earlier in your WORKSPACE file.
Adding rules_python does not help either.
I found a solution:
Split it into two files.
One with imports like this:
load("#bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
load("#bazel_tools//tools/build_defs/repo:utils.bzl", "maybe")
def declare():
maybe(
git_repository,
name = "rules_cc",
commit = "34ca16f4aa4bf2a5d3e4747229202d6cb630ebab",
remote = "https://github.com/bazelbuild/rules_cc.git",
shallow_since = "1584036492 -0700",
)
# ... for me requires at least rules_cc, rules_python, bazel_skylib
# for later proto, docker, go, java support
and another using the declared external sources:
# go
load("#io_bazel_rules_go//go:deps.bzl", "go_register_toolchains", "go_rules_dependencies")
load("#bazel_gazelle//:deps.bzl", "gazelle_dependencies")
# protobuf
load("#rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
# container
load("#io_bazel_rules_docker//container:container.bzl", "container_pull")
load("#io_bazel_rules_docker//repositories:repositories.bzl", container_repositories = "repositories")
load("#io_bazel_rules_docker//repositories:deps.bzl", container_deps = "deps")
load("#io_bazel_rules_docker//go:image.bzl", go_image_repositories = "repositories")
def init_rules():
go_rules_dependencies()
go_register_toolchains()
gazelle_dependencies()
rules_proto_dependencies()
rules_proto_toolchains()
container_repositories()
container_deps()
go_image_repositories()
container_pull(
name = "go_static",
digest = "sha256:9b60270ec0991bc4f14bda475e8cae75594d8197d0ae58576ace84694aa75d7a",
registry = "gcr.io",
repository = "distroless/static",
)
It's a bit of a hassle, but it works: fetch this repo with http_archive or git_repository, load the first file and call declare(), then load the second file and call init_rules().
It may be a little convoluted, but it still helps to unify the stack and simplify your WORKSPACE.
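For example, a consuming WORKSPACE might look roughly like this; the repo name, URL, and file names are assumptions, and the commit hash is left as a placeholder to pin:

load("@bazel_tools//tools/build_defs/repo:git.bzl", "git_repository")

git_repository(
    name = "techstack_repo",  # hypothetical "technology stack" repo
    remote = "https://example.org/techstack.git",
    commit = "...",  # pin a specific commit
)

load("@techstack_repo//:declare.bzl", "declare")

declare()

load("@techstack_repo//:init.bzl", "init_rules")

init_rules()

The two-phase split is what makes this legal: the load of init.bzl only happens after declare() has defined the repositories it depends on.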
I want to build Envoy via Bazel. I manually downloaded some packages to my PC, then changed http_archive to local_repository, but Bazel tells me that the name 'local_repository' is not defined. Does local_repository need any load statement?
local_repository can be used in WORKSPACE, but not in my .bzl file.
WORKSPACE:
workspace(name = "envoy")
load("//bazel:api_repositories.bzl", "envoy_api_dependencies")
envoy_api_dependencies()
load("//bazel:repositories.bzl", "GO_VERSION", "envoy_dependencies")
load("//bazel:cc_configure.bzl", "cc_configure")
envoy_dependencies()
`repositories.bzl`:
local_repository(
    name = "com_google_protobuf",
    path = "/home/user/com_google_protobuf",
)
local_repository is a workspace rule so I think it's not available outside of the WORKSPACE file.
If you want to call local_repository from a .bzl file you can define a function in there, using native, and call it from WORKSPACE, e.g.:
# repositories.bzl
def deps():
    native.local_repository(
        name = "com_google_protobuf",
        path = "/home/user/com_google_protobuf",
    )
# WORKSPACE
load("//:repositories.bzl", "deps")
deps()
I've seen this pattern, for example, in the grpc project.
In a .bzl file, you have to use native.local_repository instead of just local_repository.
All symbols in .bzl files are expected to be defined in Starlark, but local_repository is a special rule that is defined natively within Bazel.
I have a binary that takes as input a single file and produces an unknown number of header and source C++ files into a single directory. I would like to be able to write a target like:
x_library(
name = "my_x_library",
src = "source.x",
)
where x_library is a macro that ultimately produces the cc_library from the output files. However, I can't bundle all the output files inside the rule implementation or inside the macro. I tried this answer but it doesn't seem to work anymore.
What's the common solution to this problem? Is it possible at all?
Small example of a macro using a genrule (not a huge fan) to get one C file and one header and provide them as a cc_library:
def x_library(name, src):
    srcfile = "{}.c".format(name)
    hdrfile = "{}.h".format(name)
    native.genrule(
        name = "files_{}".format(name),
        srcs = [src],
        outs = [srcfile, hdrfile],
        cmd = "$(location generator.sh) $< $(OUTS)",
        tools = ["generator.sh"],
    )
    native.cc_library(
        name = name,
        srcs = [srcfile],
        hdrs = [hdrfile],
    )
Then use it like this:
load(":myfile.bzl", "x_library")
x_library(
name = "my_x_library",
src = "source.x",
)
cc_binary(
name = "tgt",
srcs = ["mysrc.c"],
deps = ["my_x_library"],
)
You should be able to extend that to any number of files (and to C++ content; IIRC the suffixes are used to automagically decide how to invoke the tools), as long as the mapping from generator input to generated content is known and stable (generally a good thing for a build). Otherwise you can no longer use genrule: you need a custom rule (probably a good thing anyway) that uses a TreeArtifact as described in the linked answer. Or two TreeArtifacts, one with a .cc suffix and one with .hh, so that you can pass them to cc_library.
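For reference, a minimal sketch of the TreeArtifact variant: a custom rule that declares a directory output and runs the generator into it. The generator label //tools:generator and its command-line interface are hypothetical assumptions:

def _x_files_impl(ctx):
    # A TreeArtifact: a directory whose contents are only known at execution time.
    out_dir = ctx.actions.declare_directory(ctx.label.name + "_gen")
    ctx.actions.run(
        inputs = [ctx.file.src],
        outputs = [out_dir],
        executable = ctx.executable._generator,
        # assumed CLI: generator <input file> <output directory>
        arguments = [ctx.file.src.path, out_dir.path],
    )
    return [DefaultInfo(files = depset([out_dir]))]

x_files = rule(
    implementation = _x_files_impl,
    attrs = {
        "src": attr.label(allow_single_file = True),
        "_generator": attr.label(
            default = "//tools:generator",  # hypothetical generator binary
            executable = True,
            cfg = "exec",
        ),
    },
)

How well cc_library consumes such a directory directly varies between setups; declaring two directories, one for sources and one for headers as suggested above, is one way to split the outputs for it.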