I am new to Bazel. I am trying to use Thrift 0.10 in my Bazel build, and I need to run the thrift binary to generate the Thrift sources. But I have conflicting versions of Thrift on my Linux box, and somehow the build picks up the wrong one.
Can someone help me solve this problem? Note that I have a thrift.bzl which produces the generated Thrift files.
Here is the current third-party definition:
cc_library(
    name = "thrift",
    srcs = [
        "lib/libthrift.a",
        "lib/libthrift.so",
        "lib/libthrift.so.0.10.0",
        "lib/libthriftc.a",
        "lib/libthriftc.so",
        "lib/libthriftc.so.0.10.0",
        "lib/libthriftz.a",
    ],
    hdrs = glob(["include/**/*.h"]),
    includes = ["include"],
    linkshared = 0,
    tags = make_symlink_tags([
        "lib/libthrift.a",
        "lib/libthriftc.a",
        "lib/libthriftz.a",
        "lib/libthrift.so",
        "lib/libthriftc.so",
        "lib/libthrift.so.0.10.0",
        "lib/libthriftc.so.0.10.0",
        "lib/libthriftz.so.0.10.0",
    ]),
    visibility = ["//visibility:public"],
    deps = ["@boost_repo//:boost"],
)

filegroup(
    name = "thrift_gen",
    srcs = ["@thrift_repo//:bin/thrift"],
    visibility = ["//visibility:public"],
)
thrift.bzl:
_generate_thrift_cc_lib = rule(
    attrs = {
        "src": attr.label(
            allow_files = True,  # FileType(["*.thrift"])
            single_file = True,
        ),
        "thrifts": attr.label_list(
            allow_files = True,  # FileType(["*.thrift"])
        ),
        "base_name": attr.string(),
        "service_name": attr.string(),
        "service": attr.bool(),
        "gen": attr.string(default = "cpp"),
        "_thrift": attr.label(
            default = Label("@thrift_repo//:thrift_gen"),
            executable = True,
            cfg = "host",
        ),
    },
    output_to_genfiles = True,
    outputs = _genthrift_outputs,
    implementation = _generate_thrift_lib,
)
And here is the error:
INFO: Found 11 targets...
ERROR: ...source/mlp/storage/services/thrift/BUILD:10:1: Generating mlp/storage/services/thrift/umm_geometry_constants.cpp failed (Exit 127).
external/thrift_repo/bin/thrift: error while loading shared libraries: libthriftc.so.0.10.0: cannot open shared object file: No such file or directory
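The loading error means the vendored compiler at external/thrift_repo/bin/thrift cannot find the shared libraries that ship next to it in @thrift_repo. One possible direction (a sketch only, not from the question: the .thrift file name, the generated-file list, and the labels used to stage the lib/ files are all assumptions) is to run the compiler with LD_LIBRARY_PATH pointed at the repository's own lib directory, for example from a genrule:
genrule(
    name = "umm_geometry_thrift_cc",
    srcs = ["umm_geometry.thrift"],
    outs = [
        "gen-cpp/umm_geometry_constants.cpp",
        "gen-cpp/umm_geometry_constants.h",
        "gen-cpp/umm_geometry_types.cpp",
        "gen-cpp/umm_geometry_types.h",
    ],
    tools = [
        "@thrift_repo//:bin/thrift",
        # Staging the bundled runtime libraries makes external/thrift_repo/lib
        # exist inside the sandbox; the exact labels depend on what the
        # @thrift_repo BUILD file exports.
        "@thrift_repo//:lib/libthrift.so.0.10.0",
        "@thrift_repo//:lib/libthriftc.so.0.10.0",
    ],
    cmd = "LD_LIBRARY_PATH=external/thrift_repo/lib " +
          "$(location @thrift_repo//:bin/thrift) -o $(@D) --gen cpp $(SRCS)",
)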
Related
I have the following in a BUILD file:
proto_library(
    name = "proto_default_library",
    srcs = glob(["*.proto"]),
    visibility = ["//visibility:public"],
    deps = [
        "@go_googleapis//google/api:annotations_proto",
        "@grpc_ecosystem_grpc_gateway//protoc-gen-openapiv2/options:options_proto",
    ],
)

genrule(
    name = "generate-buf-image",
    srcs = [
        ":buf_yaml",
        ":buf_breaking_image_json",
        ":protos",
    ],
    exec_tools = [
        ":proto_default_library",
        "//buf:generate-buf-image-sh",
        "//buf:generate-buf-image",
    ],
    outs = ["buf-image.json"],
    cmd = "$(location //buf:generate-buf-image-sh) --buf-breaking-image-json=$(location :buf_breaking_image_json) $(location :protos) >$@",
)
While $(location //buf:generate-buf-image-sh) is executing, the files from glob(["*.proto"]) in proto_default_library can be seen in the sandbox, but the proto files of @go_googleapis//google/api:annotations_proto and @grpc_ecosystem_grpc_gateway//protoc-gen-openapiv2/options:options_proto cannot. The same goes for the dependencies of //buf:generate-buf-image-sh.
Do I need to explicitly list out all transitive dependencies so they can be processed by generate-buf-image? Is there a programmatic way to do that?
Since genrules are pretty generic, a genrule sees only the default provider of a target, which usually just has the main outputs of that target (e.g., for java_library, a jar of the classes of that library, for proto_library, the proto files of that library). So to get more detailed information, you would write a Starlark rule to access more specific providers. For example:
WORKSPACE:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
http_archive(
name = "rules_proto",
sha256 = "66bfdf8782796239d3875d37e7de19b1d94301e8972b3cbd2446b332429b4df1",
strip_prefix = "rules_proto-4.0.0",
urls = [
"https://mirror.bazel.build/github.com/bazelbuild/rules_proto/archive/refs/tags/4.0.0.tar.gz",
"https://github.com/bazelbuild/rules_proto/archive/refs/tags/4.0.0.tar.gz",
],
)
load("#rules_proto//proto:repositories.bzl", "rules_proto_dependencies", "rules_proto_toolchains")
rules_proto_dependencies()
rules_proto_toolchains()
defs.bzl:
def _my_rule_impl(ctx):
    output = ctx.actions.declare_file(ctx.attr.name + ".txt")
    args = ctx.actions.args()
    args.add(output)
    inputs = []
    for src in ctx.attr.srcs:
        proto_files = src[ProtoInfo].transitive_sources
        args.add_all(proto_files)
        inputs.append(proto_files)
    ctx.actions.run(
        inputs = depset(transitive = inputs),
        executable = ctx.attr._tool.files_to_run,
        arguments = [args],
        outputs = [output],
    )
    return [DefaultInfo(files = depset([output]))]

my_rule = rule(
    implementation = _my_rule_impl,
    attrs = {
        "srcs": attr.label_list(providers = [ProtoInfo]),
        "_tool": attr.label(default = "//:tool"),
    },
)
ProtoInfo is here: https://bazel.build/rules/lib/ProtoInfo
BUILD:
load(":defs.bzl", "my_rule")
proto_library(
    name = "proto_a",
    srcs = ["proto_a.proto"],
    deps = [":proto_b"],
)

proto_library(
    name = "proto_b",
    srcs = ["proto_b.proto"],
    deps = [":proto_c"],
)

proto_library(
    name = "proto_c",
    srcs = ["proto_c.proto"],
)

my_rule(
    name = "foo",
    srcs = [":proto_a"],
)

sh_binary(
    name = "tool",
    srcs = ["tool.sh"],
)
proto_a.proto:
package my_protos_a;
message ProtoA {
  optional int32 a = 1;
}
proto_b.proto:
package my_protos_b;
message ProtoB {
  optional int32 b = 1;
}
proto_c.proto:
package my_protos_c;
message ProtoC {
  optional int32 c = 1;
}
tool.sh:
output=$1
shift
echo input protos: $@ > $output
$ bazel build foo
INFO: Analyzed target //:foo (40 packages loaded, 172 targets configured).
INFO: Found 1 target...
Target //:foo up-to-date:
bazel-bin/foo.txt
INFO: Elapsed time: 0.832s, Critical Path: 0.02s
INFO: 5 processes: 4 internal, 1 linux-sandbox.
INFO: Build completed successfully, 5 total actions
$ cat bazel-bin/foo.txt
input protos: proto_a.proto proto_b.proto proto_c.proto
I'm trying to get a repository rule to be run again in Bazel when files used by the rule change.
I have the following rule:
def _irule_impl(ctx):
    cmd = [str(ctx.path(ctx.attr._tool))]
    st = ctx.execute(cmd, environment = ctx.os.environ)
    ctx.symlink(st.stdout, "my_r2")
    ctx.execute(["echo", ">>>>>>>>>>>>>>>>> Running implementation"], quiet = False)

irule = repository_rule(
    implementation = _irule_impl,
    attrs = {
        # "_tool" and some other attributes omitted here
        "_tool_deps2": attr.label(
            allow_single_file = True,
            default = "//tools:tool3",
        ),
        "_tool_deps3": attr.label(
            allow_single_file = True,
            default = "//tools:tool_filegroup",
        ),
    },
)
//tools:tool3 is a file, and when I change it, the rule is executed again.
//tools:tool_filegroup is a filegroup:
filegroup(
    name = "tool_filegroup",
    srcs = ["tool2"],
)
but when I change tool2, the repository rule is not executed again when I build.
Is there a way to get this working?
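Based on the observation above that a direct file label (//tools:tool3) does trigger a re-run while the filegroup indirection does not, one workaround to sketch is pointing the attribute at the file itself rather than at a filegroup that merely lists it. The names below are illustrative, and treating direct file labels as tracked inputs is an assumption drawn from that observation:
def _irule_direct_impl(ctx):
    # Resolve the label so the repository rule works with the file itself,
    # not with the filegroup target that wraps it.
    tool2 = ctx.path(ctx.attr._tool_dep)
    ctx.file("BUILD.bazel", "")
    ctx.file("tracked.txt", "saw: %s\n" % tool2)

irule_direct = repository_rule(
    implementation = _irule_direct_impl,
    attrs = {
        "_tool_dep": attr.label(
            allow_single_file = True,
            default = "//tools:tool2",  # the file, not //tools:tool_filegroup
        ),
    },
)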
Here's my WORKSPACE:
load("#bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
RULES_JVM_EXTERNAL_TAG = "4.0"
RULES_JVM_EXTERNAL_SHA = "31701ad93dbfe544d597dbe62c9a1fdd76d81d8a9150c2bf1ecf928ecdf97169"
http_archive(
name = "maven",
strip_prefix = "rules_jvm_external-%s" % RULES_JVM_EXTERNAL_TAG,
sha256 = RULES_JVM_EXTERNAL_SHA,
url = "https://github.com/bazelbuild/rules_jvm_external/archive/%s.zip" % RULES_JVM_EXTERNAL_TAG,
)
load("#maven//:defs.bzl", "maven_install")
maven_install(
artifacts = [
"com.fasterxml.jackson.core:jackson-databind:2.12.1",
"org.apache.commons:commons-lang3:3.11"
],
repositories = [
"https://repo1.maven.org/maven2",
"https://jcenter.bintray.com/"
],
);
Here's my Second/BUILD:
java_binary(
    name = "main",
    srcs = glob(["src/main/java/**/*.java"]),
    deps = [
        "//First:first",
    ],
    main_class = "com.test.MyMain",
)
Here's my First/BUILD:
java_library(
    name = "first",
    srcs = glob(["src/main/java/**/*.java"]),
    deps = [
        "@maven//:com_fasterxml_jackson_core_jackson_databind",
    ],
    visibility = ["//Second:__pkg__"],
)
When I run
bazel build //Second:main
I get:
ERROR: /Users/foobar/Documents/Main/First/BUILD:1:13: error loading package '@maven//': Unable to find package for @bazel_skylib//:bzl_library.bzl: The repository '@bazel_skylib' could not be resolved. and referenced by '//First:first'
ERROR: Analysis of target '//Second:main' failed; build aborted: Analysis failed
INFO: Elapsed time: 0.078s
INFO: 0 processes.
You need to add Bazel Skylib to your workspace. Follow the "workspace setup" instructions here: https://github.com/bazelbuild/bazel-skylib/releases
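For reference, the workspace setup from that page boils down to an http_archive plus bazel_skylib_workspace(); the version and sha256 below are placeholders, so take the exact values from the release notes:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "bazel_skylib",
    sha256 = "<sha256 from the release notes>",
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/<version>/bazel-skylib-<version>.tar.gz",
        "https://github.com/bazelbuild/bazel-skylib/releases/download/<version>/bazel-skylib-<version>.tar.gz",
    ],
)

load("@bazel_skylib//:workspace.bzl", "bazel_skylib_workspace")

bazel_skylib_workspace()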
In my Java-based Bazel project I use a code generator written in Java. The generator is part of the root project and is used in sub-projects as well.
What I want to achieve is to include the output of the root project (a .jar file) as a dependency for the code generation in sub-projects to grant the code generator access to all compiled files of the root project (through the classpath). Is that possible in Bazel?
What I see is that the classpath for key generation in the child project only includes the dependencies of the code generator binary (//parent:SettingsGenerator in the script below).
In my custom rule I invoke the code generator basically like this:
def _generate_settings(ctx):
    ...
    ctx.actions.run(
        inputs = [ctx.file._src] + ctx.files.deps,
        outputs = [keys, settings, loader],
        mnemonic = "GenKeysAndSettings",
        arguments = [args],
        executable = ctx.executable._tool,
    )
    return [DefaultInfo(
        files = depset([keys, settings, loader]),
        runfiles = ctx.runfiles(files = ctx.files.deps),
    )]

gen_settings = rule(
    implementation = _generate_settings,
    attrs = {
        "lang": attr.string(),
        "deps": attr.label_list(
            allow_files = True,
        ),
        "_tool": attr.label(
            cfg = "host",
            executable = True,
            default = Label("//parent:SettingsGenerator"),
        ),
        "_src": attr.label(
            single_file = True,
            default = Label("//parent:Name"),
        ),
    },
)
The parent project BUILD:
load("//parent:settings.bzl", "gen_settings")
gen_settings(
    name = "GenerateSettings",
    lang = "",
)

java_library(
    name = "parent",
    srcs = glob(["src/main/java/**/*.java"]) + [
        ":GenerateSettings",
    ],
    ...
)

java_binary(
    name = "SettingsGenerator",
    srcs = glob(["src/main/java/**/SettingsGenerator.java"]),
    main_class = "my.company.SettingsGenerator",
    ...
)
The child project BUILD:
gen_settings(
    name = "GenerateSettings",
    lang = "Java",
    deps = ["//parent"],
)
...
My workaround is to include the .jar file as input and use a custom classloader in the generator. But it would be nice if I could control the classpath directly from Bazel.
Any help would be appreciated. Thank you.
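For what it's worth, here is a sketch of the "control the classpath directly from Bazel" idea, assuming the deps are java_library targets (so they carry JavaInfo) and assuming hypothetical --classpath and --out flags on the generator; the rule declaration itself can stay as in the question:
def _generate_settings_impl(ctx):
    out = ctx.actions.declare_file(ctx.attr.name + ".settings")

    # Collect the compiled jars of //parent and everything it depends on.
    dep_jars = depset(transitive = [
        dep[JavaInfo].transitive_runtime_jars
        for dep in ctx.attr.deps
        if JavaInfo in dep
    ])

    args = ctx.actions.args()
    args.add_joined("--classpath", dep_jars, join_with = ":")
    args.add("--out", out)

    ctx.actions.run(
        inputs = dep_jars,
        outputs = [out],
        executable = ctx.executable._tool,
        arguments = [args],
        mnemonic = "GenKeysAndSettings",
    )
    return [DefaultInfo(files = depset([out]))]
The jars are then both declared as inputs of the action and handed to the generator explicitly, instead of being loaded through a custom classloader.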
I have a code generator that produces three output files:
client.cpp
server.cpp
data.h
The genrule looks like this:
genrule(
    name = 'code_gen',
    tools = [ '//tools:code_gen.sh' ],
    outs = [ 'client.cpp', 'server.cpp', 'data.h' ],
    local = True,
    cmd = '$(location //tools:code_gen.sh) $(@D)',
)
The 'client.cpp' and 'server.cpp' each have their own cc_library rule.
My question is how to depend on the genrule but only use a specific output file.
What I did was create a macro that defines the genrule with outs set to just the file required, but this resulted in multiple executions of the genrule:
gen.bzl:
def code_generator(
        name,
        out):
    native.genrule(
        name = name,
        tools = [ '//bazel:gen.sh' ],
        outs = [ out ],
        local = True,
        cmd = '$(location //bazel:gen.sh) $(@D)',
    )
BUILD:
load(':gen.bzl', 'code_generator')

code_generator('client_cpp', 'client.cpp')
code_generator('server_cpp', 'server.cpp')
code_generator('data_h', 'data.h')

cc_library(
    name = 'client',
    srcs = [ ':client_cpp' ],
    hdrs = [ ':data_h' ],
)

cc_library(
    name = 'server',
    srcs = [ ':server_cpp' ],
    hdrs = [ ':data_h' ],
)
Is there a way to depend on a genrule so that it runs only once, and then use only selected outputs from it?
You should be able to just use the filename (e.g. :server.cpp) to depend on a specific output of a rule.
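Concretely, with the single code_gen genrule from the question, the two libraries can reference individual output files directly (no macro needed); since both point at outputs of the same :code_gen target, the generator runs only once:
cc_library(
    name = 'client',
    srcs = [ ':client.cpp' ],  # just this output of :code_gen
    hdrs = [ ':data.h' ],
)

cc_library(
    name = 'server',
    srcs = [ ':server.cpp' ],
    hdrs = [ ':data.h' ],
)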