How to use Bazel to build a project that uses OpenCV [closed] - opencv

Closed. This question does not meet Stack Overflow guidelines. It is not currently accepting answers.
Closed 2 years ago.
What is the best way to build C++ code that uses the OpenCV library using Bazel? I.e., what would the BUILD rules look like?
What should the WORKSPACE and BUILD files look like in order to compile the following code with Bazel?
#include "opencv2/opencv.hpp"
#include <iostream>

int main(int, char**) {
  using namespace cv;
  VideoCapture cap(0);  // open the default camera
  Mat save_img;
  cap >> save_img;      // grab a single frame
  if (save_img.empty()) {
    std::cerr << "ERROR >> Something is wrong with camera..." << std::endl;
    return 1;
  }
  imwrite("test.jpg", save_img);
  return 0;
}

There are a couple of options. The easiest way is probably to install locally in the way the OpenCV site recommends:
git clone https://github.com/Itseez/opencv.git
cd opencv/
mkdir build install
cd build
cmake -D CMAKE_BUILD_TYPE=Release -D CMAKE_INSTALL_PREFIX=/path/to/opencv/install ..
make install
Then add the following to your WORKSPACE file:
new_local_repository(
    name = "opencv",
    path = "/path/to/opencv/install",
    build_file = "opencv.BUILD",
)
Create opencv.BUILD in the same directory as WORKSPACE with the following:
cc_library(
    name = "opencv",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob(["include/**/*.hpp"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
Then your code can depend on @opencv//:opencv to link in the .so files under lib/ and reference the headers under include/.
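For example, a BUILD file for the camera-capture program in the question might look like this (the target and source file names here are illustrative):

```python
# BUILD (sketch; target and file names are illustrative)
cc_binary(
    name = "camera_capture",
    srcs = ["camera_capture.cc"],
    deps = ["@opencv//:opencv"],
)
```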
However, this isn't very portable. If you want a portable solution (and you're feeling ambitious), you could add the OpenCV git repo to your workspace and download & build it. Something like:
# WORKSPACE
new_git_repository(
    name = "opencv",
    remote = "https://github.com/Itseez/opencv.git",
    build_file = "opencv.BUILD",
    tag = "3.1.0",
)
And make opencv.BUILD something like:
cc_library(
    name = "core",
    visibility = ["//visibility:public"],
    srcs = glob(["modules/core/src/**/*.cpp"]),
    hdrs = glob([
        "modules/core/src/**/*.hpp",
        "modules/core/include/**/*.hpp",
    ]) + [":module-includes"],
)

genrule(
    name = "module-includes",
    cmd = "echo '#define HAVE_OPENCV_CORE' > $@",
    outs = ["opencv2/opencv_modules.hpp"],
)
...
Then your code could depend on more specific targets, e.g., @opencv//:core.
As a third option, you could declare both cmake and OpenCV in your WORKSPACE file and use a genrule to run cmake on OpenCV from within Bazel.

Here is an up-to-date solution that works with the current version of Bazel (3.1.0). In this little project I wanted to build a C++ program that depends on the newest OpenCV release (4.3.0), but only on a selected set of modules (core, highgui, imgcodecs, imgproc).
No locally installed OpenCV is required; Bazel loads the needed files from GitHub (it works even when an old version of OpenCV is installed):
Content of /WORKSPACE file:
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")
all_content = """filegroup(name = "all", srcs = glob(["**"]), visibility = ["//visibility:public"])"""
http_archive(
    name = "opencv",
    build_file_content = all_content,
    strip_prefix = "opencv-4.3.0",
    urls = ["https://github.com/opencv/opencv/archive/4.3.0.zip"],
)

http_archive(
    name = "rules_foreign_cc",
    strip_prefix = "rules_foreign_cc-master",
    url = "https://github.com/bazelbuild/rules_foreign_cc/archive/master.zip",
)

load("@rules_foreign_cc//:workspace_definitions.bzl", "rules_foreign_cc_dependencies")

rules_foreign_cc_dependencies()
Content of /BUILD file:
load("@rules_foreign_cc//tools/build_defs:cmake.bzl", "cmake_external")

cmake_external(
    name = "opencv",
    cmake_options = [
        "-GNinja",
        "-DBUILD_LIST=core,highgui,imgcodecs,imgproc",
    ],
    lib_source = "@opencv//:all",
    make_commands = [
        "ninja",
        "ninja install",
    ],
    out_include_dir = "include/opencv4",
    shared_libraries = [
        "libopencv_core.so",
        "libopencv_highgui.so",
        "libopencv_imgcodecs.so",
        "libopencv_imgproc.so",
    ],
    visibility = ["//visibility:public"],
)
And finally, your target that depends on opencv, in my case a file /opencv/BUILD:
cc_binary(
    name = "opencv",
    srcs = ["opencv.cpp"],
    data = [
        "LinuxLogo.jpg",
        "WindowsLogo.jpg",
    ],
    deps = ["//:opencv"],
)
If you want to try it out, here is the rest: blackliner/automata
git clone https://github.com/blackliner/automata.git
cd automata
bazel build ...

I succeeded with @kristina's first option.
Install opencv:
git clone https://github.com/Itseez/opencv.git
cd opencv/
mkdir build install
cd build
cmake -D CMAKE_BUILD_TYPE=Release -D CMAKE_INSTALL_PREFIX=/usr/local ..
make install
Change the WORKSPACE file (at tensorflow/WORKSPACE, cloned from GitHub):
new_local_repository(
    name = "opencv",
    path = "/usr/local",
    build_file = "opencv.BUILD",
)
Make an opencv.BUILD file in the same place as the WORKSPACE file:
cc_library(
    name = "opencv",
    srcs = glob(["lib/*.so*"]),
    hdrs = glob(["include/**/*.hpp"]),
    includes = ["include"],
    visibility = ["//visibility:public"],
    linkstatic = 1,
)
You may have to configure the OpenCV library path:
a. Make sure the /etc/ld.so.conf.d/opencv.conf file exists with the content:
/usr/local/lib
b. Run the command:
sudo ldconfig -v

This is what I did for OpenCV 2.4.13.2, core/ only. This approach builds from the OpenCV source and is adapted from the accepted answer above by @kristina.
The first thing is to add the new_http_archive for the OpenCV 2.4 release:
# OpenCV 2.4.13.2
new_http_archive(
    name = "opencv2",
    url = "https://github.com/opencv/opencv/archive/2.4.13.2.zip",
    build_file = "third_party/opencv2.BUILD",
    strip_prefix = "opencv-2.4.13.2",
)
And then, add the file third_party/opencv2.BUILD as:
cc_library(
    name = "dynamicuda",
    hdrs = glob([
        "modules/dynamicuda/include/**/*.hpp",
    ]),
    includes = [
        "modules/dynamicuda/include",
    ],
)

cc_library(
    name = "core",
    visibility = ["//visibility:public"],
    srcs = glob(["modules/core/src/**/*.cpp"]),
    hdrs = glob([
        "modules/core/src/**/*.hpp",
        "modules/core/include/**/*.hpp",
    ]) + [
        ":module_includes",
        ":cvconfig",
        ":version_string",
    ],
    copts = [
        "-Imodules/dynamicuda/include",
    ],
    # Note that opencv core requires zlib and pthread to build.
    linkopts = ["-pthread", "-lz"],
    includes = [
        "modules/core/include",
    ],
    deps = [
        ":dynamicuda",
    ],
)

genrule(
    name = "module_includes",
    cmd = "echo '#define HAVE_OPENCV_CORE' > $@",
    outs = ["opencv2/opencv_modules.hpp"],
)

genrule(
    name = "cvconfig",
    outs = ["cvconfig.h"],
    cmd = """
cat > $@ <<"EOF"
// JPEG-2000
#define HAVE_JASPER
// IJG JPEG
#define HAVE_JPEG
// PNG
#define HAVE_PNG
// TIFF
#define HAVE_TIFF
// Compile for 'real' NVIDIA GPU architectures
#define CUDA_ARCH_BIN ""
// NVIDIA GPU features are used
#define CUDA_ARCH_FEATURES ""
// Compile for 'virtual' NVIDIA PTX architectures
#define CUDA_ARCH_PTX ""
EOF""",
)
genrule(
    name = "version_string",
    outs = ["version_string.inc"],
    cmd = """
cat > $@ <<"EOF"
"\\n"
EOF""",
)
Note that I did not put anything in the version_string.inc. It is just a C++ string literal which does not affect the functionality of OpenCV. If you are really interested in this file see this example.
After this you should be able to add a target with dependencies on @opencv2//:core.
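For instance, a consuming target could look like this (binary and source file names are illustrative):

```python
# BUILD (sketch; names are illustrative)
cc_binary(
    name = "core_demo",
    srcs = ["core_demo.cpp"],
    deps = ["@opencv2//:core"],
)
```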

Here's a simple demo of OpenCV & C++ built with Bazel: https://github.com/jcju/opencv_bazel_win
You may set up the OpenCV path in WORKSPACE and run:
bazel run //src:main

Related

Reuse different parts of downloaded package (directory output)?

New to bazel so please bear with me :) I have a genrule which basically downloads and unpacks a package:
genrule(
    name = "extract_pkg",
    srcs = ["@deb_pkg//file:pkg.deb"],
    outs = ["pkg_dir"],
    cmd = "dpkg-deb --extract $< $(@D)/pkg_dir",
)
Naturally pkg_dir here is a directory. There is another rule which uses this rule as input to create an executable, but the main point is that I now need to add a rule (or something) that will let me use some headers from that package. This rule is used as an input to a cc_library, which is then used in other parts of the repository to get access to the headers. I tried it like this:
genrule(
    name = "pkg_headers",
    srcs = [":extract_pkg"],
    outs = [
        "pkg_dir/usr/include/pkg/h1.h",
        "pkg_dir/usr/include/pkg/h2.h",
    ],
)
But it seems Bazel doesn't like the fact that both rules use the same directory as output, even though the second one doesn't do anything (?):
output file 'pkg_dir' of rule 'extract_pkg' conflicts with output file 'pkg_dir/usr/include/pkg/h1.h' of rule 'pkg_headers'
It works fine if I use a different "root" directory for both rules, but I think there must be some better way to do this.
EDIT
I tried to use declare_directory as follows (compiled from different sources):
unpack_deb.bzl:
def _unpack_deb_impl(ctx):
    input_deb_file = ctx.file.deb
    output_dir = ctx.actions.declare_directory(ctx.attr.name + ".cc")
    print(input_deb_file.path)
    print(output_dir.path)
    ctx.actions.run_shell(
        inputs = [input_deb_file],
        outputs = [output_dir],
        arguments = [input_deb_file.path, output_dir.path],
        progress_message = "Unpacking %s to %s" % (input_deb_file.path, output_dir.path),
        command = "dpkg-deb --extract \"$1\" \"$2\"",
    )
    return [DefaultInfo(files = depset([output_dir]))]

unpack_deb = rule(
    implementation = _unpack_deb_impl,
    attrs = {
        "deb": attr.label(
            mandatory = True,
            allow_single_file = True,
            doc = "The .deb file to be unpacked",
        ),
    },
    doc = """
Unpacks a .deb file and returns a directory.
""",
)
BUILD.bazel:
load(":unpack_deb.bzl", "unpack_deb")

unpack_deb(
    name = "pkg_dir",
    deb = "@deb_pkg//file:pkg.deb",
)

cc_library(
    name = "headers",
    linkstatic = True,
    srcs = ["pkg_dir"],
    hdrs = [
        "pkg_dir.cc/usr/include/pkg/h1.h",
        "pkg_dir.cc/usr/include/pkg/h2.h",
    ],
    strip_include_prefix = "pkg_dir.cc/usr/include",
)
The trick of appending .cc so the input is accepted by cc_library was stolen from this answer. However, the command fails with:
ERROR: missing input file 'blah/blah/pkg_dir.cc/usr/include/pkg/h1.h'
From the library.
When I run with debug, I can see the command being "executed" (strange thing is that I don't always see this printout):
SUBCOMMAND: # //blah/pkg:pkg_dir [action 'Unpacking tmp/deb_pkg/file/pkg.deb to blah/pkg/pkg_dir.cc', configuration: xxxx]
(cd /home/user/.../execroot/src && \
exec env - \
/bin/bash -c 'dpkg-deb --extract "$1" "$2"' '' tmp/deb_pkg/file/pkg.deb bazel-out/.../pkg/pkg_dir.cc)
After execution, bazel-out/.../pkg/pkg_dir.cc exists but is empty. If I run the command manually it extracts files correctly. What might be the reason? Also, is it correct that there's an empty string directly after bash command line string?
Bazel's genrule doesn't work very well with directory outputs. See https://docs.bazel.build/versions/master/be/general.html#general-advice
Bazel mostly works with individual files, although there's some support for working with directories in Starlark rules with https://docs.bazel.build/versions/master/skylark/lib/actions.html#declare_directory
Your best bet is probably to extract all the files you're interested in in the genrule, then create filegroups for the different groups of files:
genrule(
    name = "extract_pkg",
    srcs = ["@deb_pkg//file:pkg.deb"],
    outs = [
        "pkg_dir/usr/include/pkg/h1.h",
        "pkg_dir/usr/include/pkg/h2.h",
        "pkg_dir/other_files/file1",
        "pkg_dir/other_files/file2",
    ],
    cmd = "dpkg-deb --extract $< $(@D)/pkg_dir",
)

filegroup(
    name = "pkg_headers",
    srcs = [
        ":pkg_dir/usr/include/pkg/h1.h",
        ":pkg_dir/usr/include/pkg/h2.h",
    ],
)

filegroup(
    name = "pkg_other_files",
    srcs = [
        ":pkg_dir/other_files/file1",
        ":pkg_dir/other_files/file2",
    ],
)
If you've seen glob, you might be tempted to use glob(["pkg_dir/usr/include/pkg/*.h"]) or similar for the srcs of the filegroup, but note that glob works only with "source files", which means files already on disk, not with the outputs of other rules.
There are rules for creating debs, but I'm not aware of rules for importing them. It's possible to write such rules using Starlark:
https://docs.bazel.build/versions/master/skylark/repository_rules.html
With repository rules, it's possible to avoid having to explicitly write out all the files you want to extract, among other things. Might be more work than you want to do though.
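As a rough illustration, a repository rule that imports a .deb could be sketched like this. This is a hypothetical sketch, not an existing ruleset: it assumes dpkg-deb is available on the host, and all rule, attribute, and target names are illustrative.

```python
# deb_repository.bzl (hypothetical sketch; names are illustrative)
def _deb_repository_impl(repository_ctx):
    # Fetch the .deb into the external repository's root.
    repository_ctx.download(
        url = repository_ctx.attr.url,
        output = "pkg.deb",
        sha256 = repository_ctx.attr.sha256,
    )
    # Unpack it in place; assumes dpkg-deb exists on the host machine.
    repository_ctx.execute(["dpkg-deb", "--extract", "pkg.deb", "."])
    # Generate a BUILD file so consumers can glob the extracted headers
    # without listing each file explicitly.
    repository_ctx.file("BUILD.bazel", """
cc_library(
    name = "headers",
    hdrs = glob(["usr/include/**/*.h"]),
    strip_include_prefix = "usr/include",
    visibility = ["//visibility:public"],
)
""")

deb_repository = repository_rule(
    implementation = _deb_repository_impl,
    attrs = {
        "url": attr.string(mandatory = True),
        "sha256": attr.string(),
    },
)
```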

How to specify output artifact from a cc_library in bazel?

I want to build "foo.c" as a library and then execute "readelf" on the generated .so but not the ".a". How can I write that in Bazel?
The following BUILD.bazel file doesn't work:
cc_library(
    name = "foo",
    srcs = ["foo.c"],
)

genrule(
    name = "readelf_foo",
    srcs = ["libfoo.so"],
    outs = ["readelf_foo.txt"],
    cmd = "readelf -a $(SRCS) > $@",
)
The error is "missing input file '//:libfoo.so'".
Changing the genrule's srcs attribute to ":foo" passes both the ".a" and ".so" file to readelf, which is not what I need.
Is there any way to specify which output of ":foo" to pass to the genrule?
cc_library produces several outputs, which are separated into output groups. If you want only the .so output, you can use a filegroup with the dynamic_library output group.
So, this should work:
cc_library(
    name = "foo",
    srcs = ["foo.c"],
)

filegroup(
    name = "libfoo",
    srcs = [":foo"],
    output_group = "dynamic_library",
)

genrule(
    name = "readelf_foo",
    srcs = [":libfoo"],
    outs = ["readelf_foo.txt"],
    cmd = "readelf -a $(SRCS) > $@",
)

How to create a directory structure in bazel

I want to create the following structure in bazel.
dir1
|_ file1
|_ file2
|_ dir2
   |_ file3
Creating a specific structure doesn't seem trivial.
I'm hoping there's a simple and reusable rule.
Something like:
makedir(
    name = "dir1",
    path = "dir1",
)

makedir(
    name = "dir2",
    path = "dir1/dir2",
    deps = [":dir1"],
)
What I've tried:
I could create a macro with a python script, but I want something cleaner.
I tried creating a genrule with mkdir -p path/to/directory, which didn't work.
The use case is that I want to create a squashfs using bazel.
It's important to note that Bazel provides some packaging functions.
To create a squashfs, the command requires a directory structure populated with artifacts.
In my case, I want to create a directory structure and run mksquashfs to produce a squashfs file.
To accomplish this, I ended up modifying the basic example from bazel's docs on packaging.
load("@bazel_tools//tools/build_defs/pkg:pkg.bzl", "pkg_tar")

genrule(
    name = "file1",
    outs = ["file1.txt"],
    cmd = "echo exampleText > $@",
)

pkg_tar(
    name = "dir1",
    strip_prefix = ".",
    package_dir = "/usr/bin",
    srcs = [":file1"],
    mode = "0755",
)

pkg_tar(
    name = "dir2",
    strip_prefix = ".",
    package_dir = "/usr/share",
    srcs = ["//main:file2.txt", "//main:file3.txt"],
    mode = "0644",
)

pkg_tar(
    name = "pkg",
    extension = "tar.gz",
    deps = [
        ":dir1",
        ":dir2",
    ],
)
If there's an easier way to create a tar or directory structure without the need for intermediate tars, I'll make that top answer.
You could create a Bazel macro that uses a genrule:
def mkdir(name, out_dir, marker_file = "marker"):
    """Create an empty directory that you can use as an input in another rule.

    This will technically create an empty marker file in that directory to avoid Bazel warnings.
    You should depend on this marker file.
    """
    path = "%s/%s" % (out_dir, marker_file)
    native.genrule(
        name = name,
        outs = [path],
        cmd = """mkdir -p $$(dirname $(location :%s)) && touch $(location :%s)""" % (path, path),
    )
Then you can use the outputs generated by this macro in a pkg_tar definition:
mkdir(
    name = "generate_a_dir",
    out_dir = "my_dir",
)

pkg_tar(
    name = "package",
    srcs = [
        # ...
        ":generate_a_dir",
    ],
    # ...
)
You can always create a genrule target or an sh_binary target that will execute a bash command or a shell script (respectively) that creates these directories.
With genrule you can use Bazel's $(location), which makes sure that the directory structure you create ends up under an output path inside Bazel's sandbox environment.
The genrule example shows how to use it exactly.
Here you can find more details on predefined output paths.
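A minimal sketch of that approach (the target and file names here are illustrative):

```python
# BUILD (sketch; names are illustrative)
genrule(
    name = "nested_file",
    outs = ["dir1/dir2/file3"],
    # $(location ...) resolves to a path inside Bazel's output tree, so the
    # directories are created in the sandbox rather than in the source tree.
    cmd = "mkdir -p $$(dirname $(location dir1/dir2/file3)) && " +
          "echo content > $(location dir1/dir2/file3)",
)
```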

Runfiles location substitution

I'm trying to run qemu on the output of a cc_binary rule. For that I have created a custom rule, which is pretty similar to this example, but instead of running cat on the txt file, I want to invoke qemu on the output ELF file (":test_portos.elf") of the cc_binary rule. My files are the following:
run_tests.bzl
def _impl(ctx):
    # The path of ctx.file.target.path is:
    # 'bazel-out/cortex-a9-fastbuild/bin/test/test_portos.elf'
    target = ctx.file.target.path
    command = ("qemu-system-arm -M xilinx-zynq-a9 -cpu cortex-a9 -nographic "
               "-monitor null -serial null -semihosting "
               "-kernel %s" % target)
    ctx.actions.write(
        output = ctx.outputs.executable,
        content = command,
        is_executable = True,
    )
    return [DefaultInfo(
        runfiles = ctx.runfiles(files = [ctx.file.target]),
    )]

execute = rule(
    implementation = _impl,
    executable = True,
    attrs = {
        "command": attr.string(),
        "target": attr.label(
            cfg = "data",
            allow_files = True,
            single_file = True,
            mandatory = True,
        ),
    },
)
BUILD
load("//make:run_tests.bzl", "execute")

execute(
    name = "portos",
    target = ":test_portos.elf",
)

cc_binary(
    name = "test_portos.elf",
    srcs = glob(["*.cc"]),
    deps = [
        "//src:portos",
        "@unity//:unity",
    ],
    copts = [
        "-Isrc",
        "-Iexternal/unity/src",
        "-Iexternal/unity/extras/fixture/src",
    ],
)
The problem is that in the command of the custom rule, the output-tree location of ":test_portos.elf" is used, and not the location of the runfile. I have also tried, as shown in the example, to use $(location :test_portos.elf) together with ctx.expand_location, but the result was the same.
How can I get the location of the "test_portos.elf" runfile and insert it into the command of my custom rule?
It seems that the runfiles are laid out according to the short_path of the File, so this was all I needed to change in my run_tests.bzl file:
target = ctx.file.target.short_path
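In context, the fix is a one-word change in _impl (sketch; the rest of the rule from the question is unchanged):

```python
def _impl(ctx):
    # short_path is the file's path relative to the runfiles root,
    # which is where the generated wrapper script actually runs.
    target = ctx.file.target.short_path  # was: ctx.file.target.path
    ...
```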

In Bazel, how can I make a C++ library depend on a general rule?

I have a library that depends on graphics files that are generated by a shell script.
I would like the library, when it is compiled, to use the shell script to generate the graphics files, which should be copied as if it were a 'data' statement, but whenever I try to make the library depend on the genrule, I get:
in deps attribute of cc_library rule //graphics_assets
genrule rule '//graphics_assets:assets_gen_rule' is misplaced here
(expected cc_inc_library, cc_library, objc_library or
cc_proto_library)
# This is the correct format.
# Here we want to run all the shader .glsl files through the program
# file_utils:archive_tool (which we also build elsewhere) and copy the
# output .arc file to the data.

# 1. List the source files
filegroup(
    name = "shader_source",
    srcs = glob([
        "shaders/*.glsl",
    ]),
)

# 2. Invoke file_utils::archive_tool on the shaders
genrule(
    name = "shaders_gen_rule",
    srcs = [":shader_source"],
    outs = ["shaders.arc"],
    cmd = "echo $(locations shader_source) > temp.txt ; \
        $(location //common/file_utils:archive_tool) \
        --create_from_list=temp.txt \
        --archive $(OUTS) ; \
        $(location //common/file_utils:archive_tool) \
        --ls --archive $(OUTS) ",
    tools = ["//common/file_utils:archive_tool"],
)

# 3. When a binary depends on this tool the arc file will be copied.
# This is the thing I had trouble with.
cc_library(
    name = "shaders",
    srcs = [],  # Something
    data = [":shaders_gen_rule"],
    linkstatic = 1,
)
