Change test execution directory in Bazel?

I have a simple Bazel project layout like this:
.
├── foo
│   ├── BUILD.bazel
│   ├── testdata
│   │   └── a.txt
│   └── test.sh
└── WORKSPACE
The test checks that a.txt exists:
foo/test.sh
#!/bin/bash
FILE=foo/testdata/a.txt
test -f "$FILE"
And it is defined for Bazel like this:
foo/BUILD.bazel
sh_test(
    name = "foo",
    size = "small",
    srcs = ["test.sh"],
    data = glob(["testdata/*.txt"]),
)
However, suppose I don't want my test to depend on its location within the workspace:
#!/bin/bash
FILE=testdata/a.txt # <-------- Path relative to the package directory
test -f "$FILE"
This does not work, of course, because Bazel runs tests from the workspace's runfiles root rather than from the package directory:
$ bazel test --cache_test_results=no --test_output=streamed //foo
...
//foo:foo FAILED in 0.0s
Is there a way to define my test target in Bazel so that it works, without modifying my test script?
In case it matters:
$ bazel --version
bazel 5.3.1

Bazel resolves paths relative to the workspace root, so the path has to be stored somewhere. For instance, you can move it from the .sh file to the BUILD file:
Add Skylib to your project, i.e. extend your WORKSPACE file:
WORKSPACE
load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

http_archive(
    name = "bazel_skylib",
    sha256 = "74d544d96f4a5bb630d465ca8bbcfe231e3594e5aae57e1edbf17a6eb3ca2506",
    urls = [
        "https://mirror.bazel.build/github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
        "https://github.com/bazelbuild/bazel-skylib/releases/download/1.3.0/bazel-skylib-1.3.0.tar.gz",
    ],
)

load("@bazel_skylib//:workspace.bzl", "bazel_skylib_workspace")

bazel_skylib_workspace()
Replace the path in the bash script using text replacement:
foo/BUILD.bazel
load("@bazel_skylib//rules:expand_template.bzl", "expand_template")

expand_template(
    name = "modify_for_bazel",
    out = "test_modified_for_bazel.sh",
    substitutions = {
        "FILE=testdata/a.txt": "FILE=foo/testdata/a.txt",
    },
    template = "test.sh",
)

sh_test(
    name = "foo",
    size = "small",
    srcs = ["test_modified_for_bazel.sh"],
    data = glob(["testdata/*.txt"]),
)
Now bazel test --cache_test_results=no --test_output=streamed //foo works.
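For intuition: expand_template performs a literal, build-time string replacement, roughly equivalent to this sed invocation (illustrative only; the file names are stand-ins and Bazel does this hermetically, not via sed):

```shell
# Create a stand-in for foo/test.sh with the original package-relative path.
printf '%s\n' '#!/bin/bash' 'FILE=testdata/a.txt' 'test -f "$FILE"' > test.sh

# Literal substitution, mirroring the `substitutions` dict above.
sed 's|FILE=testdata/a.txt|FILE=foo/testdata/a.txt|' test.sh > test_modified_for_bazel.sh

grep 'FILE=' test_modified_for_bazel.sh
# prints: FILE=foo/testdata/a.txt
```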
A similar problem is described here: Bazel: Reading a file with relative path to package, not workspace

Related

Deploy py_binary without container bazel

I am attempting to create a py_binary in Bazel and then copy that to a remote machine. For now I am using a dummy app to make sure the pipeline works. I cannot use a Docker image, as I need lower-level control of the hardware than Docker can provide.
My goal is to do the following:
build the software with bazel
create a tar package that contains the py_binary
copy and run that binary on another computer not connected to bazel
To do this I made a simple binary (for context, it just makes some RPC calls to a server I am working on as a side project), and the BUILD file is as follows:
py_binary(
    name = "rosecore_cli",
    srcs = glob(["src/*.py"]),
    deps = [
        "//rosecore/proto:project_py_pb2_grpc",
        "//rosecore/proto:project_py_pb2",
    ],
)

pkg_files(
    name = "binary",
    srcs = [":rosecore_cli"],
    prefix = "/usr/share/rosecore/bin",
)

pkg_filegroup(
    name = "arch",
    srcs = [":binary"],
    visibility = ["//visibility:public"],
)

pkg_tar(
    name = "rosecore_tar",
    srcs = [":arch"],
    include_runfiles = True,
)
When I build, copy the tar file and extract it I get the following error:
AssertionError: Cannot find .runfiles directory for ./usr/share/rosecore/bin/rosecore_cli
Any help would be appreciated :)

`Bazel run` won't find static file dependency in bazel-bin when running nodejs_image

I am trying to get a rules_docker nodejs_image to run using Bazel.
My command is
bazel run :image.binary
Here is my rule:
load("@npm//@bazel/typescript:index.bzl", "ts_project")
load("@io_bazel_rules_docker//nodejs:image.bzl", "nodejs_image")

ts_project(
    name = "typescript_build",
    srcs = glob(["src/**/*"]),
    allow_js = True,
    out_dir = "build",
    deps = ["@npm//:node_modules"],
)

nodejs_image(
    name = "image",
    data = [
        ":package.json",
        ":typescript_build",
        "@npm//:node_modules",
    ],
    entry_point = "build/app.js",
)
Basically, I need the package.json file because it includes some important configuration information when Node executes. If I call bazel build :image and then grab/run that image, everything works fine. But if I call bazel run :image it will basically work except that it can't find the package.json.
When I check the bazel-bin/ folder, I've noticed that the package.json isn't included, but the built typescript and node_modules are. I'm guessing because I'm not running any prior rules on the package.json, it doesn't get added to the bin, but I really don't know how to work around this.
So, it turns out that a rule like copy_to_bin or js_library solves this: their purpose is to copy static files into your bazel-bin.
https://bazelbuild.github.io/rules_nodejs/Built-ins.html#copy_to_bin
ts_project(
    name = "typescript_build",
    srcs = glob(["src/**/*"]),
    allow_js = True,
    out_dir = "build",
    deps = ["@npm//:node_modules"],
)

js_library(
    name = "library",
    srcs = ["package.json"],
    deps = [":typescript_build"],
)

nodejs_image(
    name = "image",
    data = [
        ":library",
        "@npm//:node_modules",
    ],
    entry_point = "build/app.js",
)

Bazel Make variable substitution for the package root?

Suppose I have a Bazel project like this:
tree .
.
├── foo
│   ├── BUILD.bazel
│   └── foo.txt
└── WORKSPACE
1 directory, 3 files
foo/BUILD.bazel:
genrule(
    name = "bar",
    srcs = ["foo.txt"],
    cmd = "cp foo.txt $@",
    outs = ["bar.txt"],
)
I cannot build with bazel build //foo:bar:
bazel build //foo:bar
...
cp: cannot stat 'foo.txt': No such file or directory
It appears that paths in cmd must be relative to the WORKSPACE root, rather than the BUILD root.
This works:
genrule(
    name = "bar",
    srcs = ["foo.txt"],
    # cmd = "cp foo.txt $@",
    cmd = "cp foo/foo.txt $@",
    outs = ["bar.txt"],
)
It's inconvenient to have to specify the full path, particularly when BUILD files might be moved around.
It is also nice to be able to write scripts as if they run from their location in the source-tree (of course they are actually run in the sandbox!)
Is there a Make variable substitution that would allow me to specify this more cleanly?
For example:
genrule(
    name = "bar",
    srcs = ["foo.txt"],
    cmd = "cd $(SRCDIR) && cp foo.txt $@",
    outs = ["bar.txt"],
)
Here $(SRCDIR) could expand to ./foo.
Note that this is a contrived example. I cannot use $(SRCS) since I need to use the input files in different ways. I also cannot use $< since I have more than one src.
Yes, there are such Make variables. In this particular case $< is the most convenient, so the rule declaration will look like this:
genrule(
    name = "bar",
    srcs = ["foo.txt"],
    outs = ["bar.txt"],
    cmd = "cp $< $@",
)
$< can be used if there is only one file in srcs. If there are more, consider using $(SRCS), which expands to the space-separated paths of all inputs from srcs.
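For example, a hypothetical rule with two inputs (a.txt and b.txt are stand-in file names) could concatenate them via $(SRCS):

```python
genrule(
    name = "concat",
    srcs = ["a.txt", "b.txt"],
    outs = ["all.txt"],
    # $(SRCS) expands to the execroot-relative paths of both inputs,
    # e.g. "foo/a.txt foo/b.txt" when the package is //foo.
    cmd = "cat $(SRCS) > $@",
)
```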
Also, there are predefined path substitutions, such as $(execpath) and $(rootpath) which expand labels to their full paths. So, the snippet mentioned above will look similar to this:
genrule(
    name = "bar",
    srcs = ["foo.txt"],
    outs = ["bar.txt"],
    cmd = "cp $(execpath foo.txt) $@",
)
There is also the $(location) expansion, which is a synonym for either execpath or rootpath depending on the context, but it is legacy and its use is not recommended.
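When the inputs must be used in different ways, as in the question, each one can be addressed individually with $(execpath <label>); a sketch (header.txt is a hypothetical second input):

```python
genrule(
    name = "bar",
    srcs = ["foo.txt", "header.txt"],
    outs = ["bar.txt"],
    # Each input is referenced by its own label, so they can play different roles.
    cmd = "cat $(execpath header.txt) $(execpath foo.txt) > $@",
)
```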
Here you can check the official docs on Make variables in Bazel: https://docs.bazel.build/versions/2.0.0/be/make-variables.html

How to specify groupid, artifact and version directly in dependencies section of BUILD file using Bazel?

How to specify groupid, artifact and version directly in dependencies section of BUILD file using Bazel?
I am trying to convert a simple gradle project to bazel project. Can't really use generate_workspace since I have a gradle project (not maven).
I am wondering if there is just a easier way to specify GAV in the dependencies itself in the BUILD file so it would look something like this
java_binary(
    name = "HelloWorld",
    srcs = glob(["src/main/java/**/*.java"]),
    resources = glob(["src/main/resources/**"]),
    deps = ["com.fasterxml.jackson.core:jackson-core:2.8.8"],
    main_class = "com.hello.sample.Foo",
)
Have you tried using maven_jar() directly?
In WORKSPACE:
maven_jar(
    name = "com_google_guava_guava",
    artifact = "com.google.guava:guava:18.0",
    sha1 = "cce0823396aa693798f8882e64213b1772032b09",
)
In BUILD:
java_binary(
    name = "HelloWorld",
    srcs = glob(["src/main/java/**/*.java"]),
    resources = glob(["src/main/resources/**"]),
    deps = ["@com_google_guava_guava//jar"],
    main_class = "com.hello.sample.Foo",
)
See https://docs.bazel.build/versions/master/be/workspace.html#maven_jar
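Note that maven_jar has since been removed from Bazel; on recent versions the equivalent setup uses rules_jvm_external, which also lets you keep the group:artifact:version coordinates in one place. A sketch (the http_archive fetch of rules_jvm_external itself is omitted):

```python
# WORKSPACE (assumes rules_jvm_external has already been fetched)
load("@rules_jvm_external//:defs.bzl", "maven_install")

maven_install(
    artifacts = [
        "com.google.guava:guava:18.0",
    ],
    repositories = ["https://repo1.maven.org/maven2"],
)

# BUILD
java_binary(
    name = "HelloWorld",
    srcs = glob(["src/main/java/**/*.java"]),
    deps = ["@maven//:com_google_guava_guava"],
    main_class = "com.hello.sample.Foo",
)
```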

Jenkins Job DSL - load groovy library from git repo

I want to keep my seed job as small as possible and keep all the logic in a central git repository. Also, I have several independent Jenkins instances that then could share the code. How can I load a groovy library in a Jenkins Job DSL script?
Is there something like the Pipeline Remote File Loader Plugin, so that you only need to do fileLoader.fromGit('lib.groovy', 'https://git.repo')?
Here is my cheat sheet for achieving this in a parameterized Pipeline job, using "Pipeline script from SCM" from git.repo.
What may be of interest to you:
loading mechanism: stash/unstash
"from SCM" location: src = "../${env.JOB_NAME}@script/"
Jenkins
  Pipeline
    Definition: "Pipeline script from SCM"
    SCM: Git
      Repository URL: git.repo
      Branches to build: */master
      Script Path: jobs/build.groovy
  This project is parameterized:
    String Parameter PARAM0
    String Parameter PARAM1
git.repo
├── jobs
│ ├── helpers
│ │ └── utils.groovy
│ └── build.groovy
└── scripts
├── build
│ └── do_build.sh
└── inc.sh
Contents : utils.groovy
├── jobs
│ ├── helpers
│ │ └── utils.groovy
def log(msg) {
println("========== " + msg)
}
return this
Contents : build.groovy
├── jobs
│ └── build.groovy
stage ('Init') {
    /* Loads */
    def src = "../${env.JOB_NAME}@script/"
    def helpers_dir = 'jobs/helpers'
    def scripts_dir = 'scripts'
    /* Stages Scripts */
    def do_build = 'build/do_build.sh'
    utils = load src + helpers_dir + "/utils.groovy"
    dir(src) {
        stash name: scripts_dir, includes: "${scripts_dir}/"
    }
}
stage ('Build') {
    node () {
        unstash scripts_dir
        build_return = sh (returnStdout: true, script: """
            ./${scripts_dir}/${do_build} \
                "${PARAM0}" \
                "${PARAM1}"
        """).readLines()
        builded = build_return.get(build_return.size()-1).tokenize(',')
        utils.log("PARAM0: " + builded[0])
        utils.log("PARAM1: " + builded[1])
    }
}
Contents : inc.sh
└── scripts
└── inc.sh
#!/bin/sh
## scripts common includes
common=included
Contents : do_build.sh
└── scripts
├── build
│ └── do_build.sh
#!/bin/sh
## includes
. $(dirname $(dirname ${0}))/inc.sh
echo ${common}
## ${1} : PARAM0
## ${2} : PARAM1
echo "${1},${2}"
The Job DSL Gradle Example shows how to maintain DSL code in a Git repository.
