I'm running Bazel (via Bazelisk) in CI and struggling to get reasonable output from builds. I'd like to see the result of all tests and/or targets in the output (even if they were completed by the cache) so developers can be sure the thing they've written is being tested.
The most obvious candidate is the --show_task_finish flag, but it doesn't seem to have any effect. I haven't found any flag that reliably prints the results of tests or targets, and I can't print stdout/stderr from passing tests because they generate too much output.
For example, this slightly redacted output I encountered recently is almost completely useless for confirming that a specific target was tested:
bazel test //...
(23:19:36) INFO: Options provided by the client:
Inherited 'common' options: --isatty=0 --terminal_columns=0
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/project/.bazelrc:
Inherited 'common' options: --attempt_to_print_relative_paths --show_timestamps --experimental_allow_tags_propagation
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/.bazelrc:
Inherited 'common' options: --color=yes --curses=no --show_progress_rate_limit=0.25 --show_task_finish --announce_rc
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/project/.bazelrc:
Inherited 'build' options: --keep_going --verbose_failures --local_cpu_resources=HOST_CPUS*0.5 --local_ram_resources=HOST_RAM*0.5
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/.bazelrc:
Inherited 'build' options: --local_cpu_resources=HOST_CPUS --local_ram_resources=HOST_RAM*.67
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/project/.bazelrc:
'test' options: --test_output=errors --test_summary=detailed --test_env=LANG=en_US.utf8 --test_env=LOCALE_ARCHIVE
(23:19:36) INFO: Reading rc options for 'test' from /home/bazelisk/.bazelrc:
'test' options: --test_verbose_timeout_warnings
(23:19:36) INFO: Current date is <blah>
(23:19:36) Loading:
(23:19:36) Loading: 0 packages loaded
(23:19:47) Analyzing: 48 targets (0 packages loaded)
(23:19:47) INFO: Build option --test_env has changed, discarding analysis cache.
(23:19:47) Analyzing: 48 targets (0 packages loaded, 0 targets configured)
(23:19:48) INFO: Analyzed 48 targets (0 packages loaded, 13812 targets configured).
(23:19:48) INFO: Found 25 targets and 23 test targets...
(23:19:48) [0 / 3] [Prepa] BazelWorkspaceStatusAction stable-status.txt
(23:19:50) [2,529 / 2,551] 2 / 23 tests; Testing // ... (22 actions running)
(23:19:53) [2,532 / 2,551] 5 / 23 tests; Testing // ... (19 actions running)
(23:19:56) [2,533 / 2,551] 5 / 23 tests; Testing // ... (18 actions running)
(23:19:59) [2,534 / 2,551] 6 / 23 tests; Testing // ... (17 actions running)
(23:20:05) [2,535 / 2,551] 8 / 23 tests; Testing // ... (16 actions running)
(23:20:11) [2,536 / 2,551] 8 / 23 tests; Testing // ... (15 actions running)
(23:20:16) [2,539 / 2,551] 11 / 23 tests; Testing // ... (12 actions running)
(23:20:22) [2,541 / 2,551] 14 / 23 tests; Testing // ... (10 actions running)
(23:20:32) [2,544 / 2,551] 17 / 23 tests; Testing // ... (7 actions running)
(23:20:44) [2,546 / 2,551] 18 / 23 tests; Testing // ... (5 actions running)
(23:20:54) [2,546 / 2,551] 18 / 23 tests; Testing // ... (5 actions running)
(23:21:06) [2,547 / 2,551] 19 / 23 tests; Testing // ... (4 actions running)
(23:21:29) [2,549 / 2,551] 21 / 23 tests; Testing // ... (2 actions running)
(23:21:59) [2,549 / 2,551] 21 / 23 tests; Testing // ... (2 actions running)
(23:22:27) [2,549 / 2,551] 21 / 23 tests; Testing // ... (2 actions running)
(23:22:50) INFO: Elapsed time: 193.959s, Critical Path: 181.09s
(23:22:50) INFO: 24 processes: 24 processwrapper-sandbox.
(23:22:50) INFO: Build completed successfully, 24 total actions
Test cases: finished with 417 passing and 0 failing out of 417 test cases
Executed 23 out of 23 tests: 23 tests pass.
(23:22:50) INFO: Build completed successfully, 24 total actions
CircleCI received exit code 0
For convenience, the fully expanded flags for this run look like:
bazel test \
--announce_rc \
--attempt_to_print_relative_paths \
--color=yes \
--curses=no \
--experimental_allow_tags_propagation \
--isatty=0 \
--keep_going \
--local_cpu_resources=HOST_CPUS \
--local_ram_resources=HOST_RAM*.67 \
--show_progress_rate_limit=0.25 \
--show_task_finish \
--show_timestamps \
--terminal_columns=0 \
--test_env=LANG=en_US.utf8 \
--test_env=LOCALE_ARCHIVE \
--test_output=errors \
--test_summary=detailed \
--test_verbose_timeout_warnings \
--verbose_failures \
//...
--show_result=1000000 will show all of the targets (make the number as big as necessary to cover all of your targets). That tends to be a lot of output, though. Also note that for tests it only indicates that the test binary was built, not that the test was run.
--test_summary=short is the way to print information about each test.
It looks like with --test_summary=detailed, Bazel prints information only about failed tests: https://docs.bazel.build/versions/main/command-line-reference.html#flag--test_summary
Using the default value of short for --test_summary gives all the targets:
$ for i in $(seq 50); do echo "exit 0" > test$i.sh; done
$ chmod +x *.sh
$ for i in $(seq 50); do echo "sh_test(
name = 'test$i',
srcs = ['test$i.sh'],
)" >> BUILD; done
with detailed:
$ bazel test //... --test_summary=detailed
Starting local Bazel server and connecting to it...
INFO: Analyzed 50 targets (24 packages loaded, 380 targets configured).
INFO: Found 50 test targets...
INFO: Elapsed time: 10.812s, Critical Path: 0.33s
INFO: 201 processes: 101 internal, 100 linux-sandbox.
INFO: Build completed successfully, 201 total actions
Test cases: finished with 50 passing and 0 failing out of 50 test cases
Executed 50 out of 50 tests: 50 tests pass.
INFO: Build completed successfully, 201 total actions
and with short:
$ bazel test //... --test_summary=short
INFO: Analyzed 50 targets (0 packages loaded, 0 targets configured).
INFO: Found 50 test targets...
INFO: Elapsed time: 0.162s, Critical Path: 0.02s
INFO: 1 process: 1 internal.
INFO: Build completed successfully, 1 total action
//:test1 (cached) PASSED in 0.1s
//:test10 (cached) PASSED in 0.1s
//:test11 (cached) PASSED in 0.1s
//:test12 (cached) PASSED in 0.0s
//:test13 (cached) PASSED in 0.1s
//:test14 (cached) PASSED in 0.1s
//:test15 (cached) PASSED in 0.1s
//:test16 (cached) PASSED in 0.1s
//:test17 (cached) PASSED in 0.1s
//:test18 (cached) PASSED in 0.1s
//:test19 (cached) PASSED in 0.1s
//:test2 (cached) PASSED in 0.1s
//:test20 (cached) PASSED in 0.0s
//:test21 (cached) PASSED in 0.1s
//:test22 (cached) PASSED in 0.1s
//:test23 (cached) PASSED in 0.1s
//:test24 (cached) PASSED in 0.1s
//:test25 (cached) PASSED in 0.1s
//:test26 (cached) PASSED in 0.1s
//:test27 (cached) PASSED in 0.1s
//:test28 (cached) PASSED in 0.1s
//:test29 (cached) PASSED in 0.1s
//:test3 (cached) PASSED in 0.0s
//:test30 (cached) PASSED in 0.1s
//:test31 (cached) PASSED in 0.1s
//:test32 (cached) PASSED in 0.1s
//:test33 (cached) PASSED in 0.1s
//:test34 (cached) PASSED in 0.1s
//:test35 (cached) PASSED in 0.1s
//:test36 (cached) PASSED in 0.2s
//:test37 (cached) PASSED in 0.0s
//:test38 (cached) PASSED in 0.1s
//:test39 (cached) PASSED in 0.1s
//:test4 (cached) PASSED in 0.1s
//:test40 (cached) PASSED in 0.0s
//:test41 (cached) PASSED in 0.1s
//:test42 (cached) PASSED in 0.1s
//:test43 (cached) PASSED in 0.1s
//:test44 (cached) PASSED in 0.0s
//:test45 (cached) PASSED in 0.1s
//:test46 (cached) PASSED in 0.1s
//:test47 (cached) PASSED in 0.0s
//:test48 (cached) PASSED in 0.1s
//:test49 (cached) PASSED in 0.1s
//:test5 (cached) PASSED in 0.1s
//:test50 (cached) PASSED in 0.1s
//:test6 (cached) PASSED in 0.1s
//:test7 (cached) PASSED in 0.1s
//:test8 (cached) PASSED in 0.1s
//:test9 (cached) PASSED in 0.1s
Executed 0 out of 50 tests: 50 tests pass.
INFO: Build completed successfully, 1 total action
with detailed and a test failure:
$ bazel test //... --test_summary=detailed
INFO: Analyzed 50 targets (0 packages loaded, 0 targets configured).
INFO: Found 50 test targets...
FAIL: //:test2 (see /home/ahumesky/.cache/bazel/_bazel_ahumesky/d7e5f46ce97861928779430e418f94f3/execroot/__main__/bazel-out/k8-fastbuild/testlogs/test2/test.log)
INFO: Elapsed time: 0.124s, Critical Path: 0.04s
INFO: 2 processes: 2 linux-sandbox.
INFO: Build completed, 1 test FAILED, 2 total actions
//:test2 FAILED in 0.0s
ERROR .test2 (0.0s)
Test cases: finished with 49 passing and 1 failing out of 50 test cases
Executed 1 out of 50 tests: 49 tests pass and 1 fails locally.
INFO: Build completed, 1 test FAILED, 2 total actions
with short and a test failure:
$ bazel test //... --test_summary=short
INFO: Analyzed 50 targets (0 packages loaded, 0 targets configured).
INFO: Found 50 test targets...
FAIL: //:test2 (see /home/ahumesky/.cache/bazel/_bazel_ahumesky/d7e5f46ce97861928779430e418f94f3/execroot/__main__/bazel-out/k8-fastbuild/testlogs/test2/test.log)
INFO: Elapsed time: 0.164s, Critical Path: 0.06s
INFO: 2 processes: 2 linux-sandbox.
INFO: Build completed, 1 test FAILED, 2 total actions
//:test1 (cached) PASSED in 0.1s
//:test10 (cached) PASSED in 0.1s
//:test11 (cached) PASSED in 0.1s
//:test12 (cached) PASSED in 0.0s
//:test13 (cached) PASSED in 0.1s
//:test14 (cached) PASSED in 0.1s
//:test15 (cached) PASSED in 0.1s
//:test16 (cached) PASSED in 0.1s
//:test17 (cached) PASSED in 0.1s
//:test18 (cached) PASSED in 0.1s
//:test19 (cached) PASSED in 0.1s
//:test20 (cached) PASSED in 0.0s
//:test21 (cached) PASSED in 0.1s
//:test22 (cached) PASSED in 0.1s
//:test23 (cached) PASSED in 0.1s
//:test24 (cached) PASSED in 0.1s
//:test25 (cached) PASSED in 0.1s
//:test26 (cached) PASSED in 0.1s
//:test27 (cached) PASSED in 0.1s
//:test28 (cached) PASSED in 0.1s
//:test29 (cached) PASSED in 0.1s
//:test3 (cached) PASSED in 0.0s
//:test30 (cached) PASSED in 0.1s
//:test31 (cached) PASSED in 0.1s
//:test32 (cached) PASSED in 0.1s
//:test33 (cached) PASSED in 0.1s
//:test34 (cached) PASSED in 0.1s
//:test35 (cached) PASSED in 0.1s
//:test36 (cached) PASSED in 0.2s
//:test37 (cached) PASSED in 0.0s
//:test38 (cached) PASSED in 0.1s
//:test39 (cached) PASSED in 0.1s
//:test4 (cached) PASSED in 0.1s
//:test40 (cached) PASSED in 0.0s
//:test41 (cached) PASSED in 0.1s
//:test42 (cached) PASSED in 0.1s
//:test43 (cached) PASSED in 0.1s
//:test44 (cached) PASSED in 0.0s
//:test45 (cached) PASSED in 0.1s
//:test46 (cached) PASSED in 0.1s
//:test47 (cached) PASSED in 0.0s
//:test48 (cached) PASSED in 0.1s
//:test49 (cached) PASSED in 0.1s
//:test5 (cached) PASSED in 0.1s
//:test50 (cached) PASSED in 0.1s
//:test6 (cached) PASSED in 0.1s
//:test7 (cached) PASSED in 0.1s
//:test8 (cached) PASSED in 0.1s
//:test9 (cached) PASSED in 0.1s
//:test2 FAILED in 0.0s
/home/ahumesky/.cache/bazel/_bazel_ahumesky/d7e5f46ce97861928779430e418f94f3/execroot/__main__/bazel-out/k8-fastbuild/testlogs/test2/test.log
Executed 1 out of 50 tests: 49 tests pass and 1 fails locally.
INFO: Build completed, 1 test FAILED, 2 total actions
The Build Event Protocol provides all the data you need.
A tool such as BuildBuddy can visualize the Build Event Protocol for you.
BuildBuddy shows, for example, all tests together with their results; you can click on each individual test and inspect its logs. The Build Event Protocol also supports live streaming, so you can provide your developers with live build and test information.
There is also a commercial solution from EngFlow called Build Dashboard.
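If you don't want to run a BEP viewer, you can also post-process the stream yourself in CI: write it to a JSON-lines file with --build_event_json_file (a real Bazel flag) and extract the test summaries. A minimal sketch — the bep.json contents below are a hand-written two-line stand-in, since real BEP records carry many more fields:

```shell
# In CI you would produce the file with:
#   bazel test //... --build_event_json_file=bep.json
# Hand-written stand-in for two testSummary events:
cat > bep.json <<'EOF'
{"id":{"testSummary":{"label":"//:test1"}},"testSummary":{"overallStatus":"PASSED"}}
{"id":{"testSummary":{"label":"//:test2"}},"testSummary":{"overallStatus":"FAILED"}}
EOF
# Print "<label> <status>" for every test summary event in the stream.
sed -n 's/.*"label":"\([^"]*\)".*"overallStatus":"\([^"]*\)".*/\1 \2/p' bep.json
```

This prints one line per test target regardless of whether the result came from the cache, which is exactly the guarantee the question asks for.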
Related
I'm following the instructions from here and here to install Drake, and I used this file to install all the dependencies. When I run cmake ../drake and then make -j, I get a very long traceback (too long to post here in a format Stack Overflow accepts). This is where the first error starts:
ERROR: /home/prasanth/.cache/bazel/_bazel_prasanth/7a6ad5daa22f12899ee3dcd32eb0729d/external/gz_math_internal/BUILD.bazel:172:20: Compiling external/gz_math_internal/drake_src/src/Angle.cc failed: (Exit 1): cc failed: error executing command /usr/bin/cc -U_FORTIFY_SOURCE -fstack-protector -Wall -Wunused-but-set-parameter -Wno-free-nonheap-object -fno-omit-frame-pointer -g0 -O2 '-D_FORTIFY_SOURCE=1' -DNDEBUG -ffunction-sections ... (remaining 25 arguments skipped)
Alternatively, when I use bazel build //... as mentioned here, I see:
WARNING: The following rc files are no longer being read, please transfer their contents or import their path into one of the standard rc files: /home/prasanth/ros2/robotic_tray_processor/drake/tools/bazel.rc
INFO: Build option --compilation_mode has changed, discarding analysis cache.
INFO: Analyzed 9795 targets (143 packages loaded, 29016 targets configured).
INFO: Found 9795 targets...
ERROR: /home/prasanth/ros2/robotic_tray_processor/drake/examples/kuka_iiwa_arm/models/BUILD.bazel:8:13: Middleman _middlemen/examples_Skuka_Uiiwa_Uarm_Smodels_Sinstall_Udata-runfiles failed: missing input file '//tools/install:installer.py'
ERROR: /home/prasanth/ros2/robotic_tray_processor/drake/examples/kuka_iiwa_arm/models/BUILD.bazel:8:13: Middleman _middlemen/examples_Skuka_Uiiwa_Uarm_Smodels_Sinstall_Udata-runfiles failed: 1 input file(s) do not exist
ERROR: /home/prasanth/ros2/robotic_tray_processor/drake/examples/kuka_iiwa_arm/models/BUILD.bazel:8:13 Middleman _middlemen/examples_Skuka_Uiiwa_Uarm_Smodels_Sinstall_Udata-runfiles failed: 1 input file(s) do not exist
INFO: Elapsed time: 7.483s, Critical Path: 0.01s
INFO: 9 processes: 9 internal.
FAILED: Build did NOT complete successfully
Please help! Thanks in advance.
I am trying to build a Django project to push it to an ECR repository, but I am having trouble building the project with Docker.
Running this command:
docker build -t stuffkeep-docker-app .
Results in this error, and I have not found any way to resolve it yet:
Building wheels for collected packages: backports.zoneinfo, django-allauth, django-rest-framework, fcm-django, gcloud, http-ece, jws, py-vapid, pywebpush
Building wheel for backports.zoneinfo (pyproject.toml): started
Building wheel for backports.zoneinfo (pyproject.toml): finished with status 'error'
error: subprocess-exited-with-error
× Building wheel for backports.zoneinfo (pyproject.toml) did not run successfully.
│ exit code: 1
╰─> [43 lines of output]
/tmp/pip-build-env-9x4p5_f7/overlay/lib/python3.10/site-packages/setuptools/config/setupcfg.py:459: SetuptoolsDeprecationWarning: The license_file parameter is deprecated, use license_files instead.
warnings.warn(msg, warning_class)
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-x86_64-cpython-310
creating build/lib.linux-x86_64-cpython-310/backports
copying src/backports/__init__.py -> build/lib.linux-x86_64-cpython-310/backports
creating build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/_tzpath.py -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/_version.py -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/_zoneinfo.py -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/_common.py -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/__init__.py -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
running egg_info
writing src/backports.zoneinfo.egg-info/PKG-INFO
writing dependency_links to src/backports.zoneinfo.egg-info/dependency_links.txt
writing requirements to src/backports.zoneinfo.egg-info/requires.txt
writing top-level names to src/backports.zoneinfo.egg-info/top_level.txt
reading manifest file 'src/backports.zoneinfo.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching '*.png' under directory 'docs'
warning: no files found matching '*.svg' under directory 'docs'
no previously-included directories found matching 'docs/_build'
no previously-included directories found matching 'docs/_output'
adding license file 'LICENSE'
adding license file 'licenses/LICENSE_APACHE'
writing manifest file 'src/backports.zoneinfo.egg-info/SOURCES.txt'
copying src/backports/zoneinfo/__init__.pyi -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
copying src/backports/zoneinfo/py.typed -> build/lib.linux-x86_64-cpython-310/backports/zoneinfo
running build_ext
building 'backports.zoneinfo._czoneinfo' extension
creating build/temp.linux-x86_64-cpython-310
creating build/temp.linux-x86_64-cpython-310/lib
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -I/usr/local/include/python3.10 -c lib/zoneinfo_module.c -o build/temp.linux-x86_64-cpython-310/lib/zoneinfo_module.o -std=c99
lib/zoneinfo_module.c: In function ‘zoneinfo_fromutc’:
lib/zoneinfo_module.c:600:19: error: ‘_PyLong_One’ undeclared (first use in this function); did you mean ‘_PyLong_New’?
600 | one = _PyLong_One;
| ^~~~~~~~~~~
| _PyLong_New
lib/zoneinfo_module.c:600:19: note: each undeclared identifier is reported only once for each function it appears in
error: command '/usr/bin/gcc' failed with exit code 1
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for backports.zoneinfo
and
ERROR: Could not build wheels for backports.zoneinfo, which is required to install pyproject.toml-based projects
I am aware of this question, which has the same error, but that poster is building for Heroku, which I am not using, and the solutions given are Heroku-focused.
I have the following config_setting defined:
config_setting(
name = "perception_env",
values = {"perception": "true"},
)
print(perception_env)
However, I can't seem to print the variable; Bazel says it doesn't exist.
config_setting is only used for selecting among the possible values in a select(). A config_setting doesn't really have a value of its own; it's an association between a variable (a Bazel flag, a Starlark-defined flag, or platform constraints) and a value. The values attribute is for native flag values ("perception" would have to be a Bazel flag).
For example,
config_setting(
name = "my_config_setting_opt",
values = {"compilation_mode": "opt"}
)
config_setting(
name = "config_setting_dbg",
values = {"compilation_mode": "dbg"}
)
config_setting(
name = "config_setting_fastbuild",
values = {"compilation_mode": "fastbuild"}
)
genrule(
name = "gen_out",
outs = ["out"],
cmd = select({
":my_config_setting_opt": "echo Opt mode > $@",
":config_setting_dbg": "echo Dbg mode > $@",
":config_setting_fastbuild": "echo Fastbuild mode > $@",
}),
)
The three config_settings declare three different associations of the --compilation_mode flag, one for each of its possible values (see https://bazel.build/docs/user-manual#compilation-mode).
The select() then declares three possible values for the cmd attribute of the genrule gen_out, and setting the --compilation_mode flag to different values changes which value for cmd is selected:
$ bazel build out --compilation_mode=dbg && cat bazel-bin/out
INFO: Build option --compilation_mode has changed, discarding analysis cache.
INFO: Analyzed target //:out (0 packages loaded, 11 targets configured).
INFO: Found 1 target...
Target //:out up-to-date:
bazel-bin/out
INFO: Elapsed time: 0.145s, Critical Path: 0.01s
INFO: 2 processes: 1 internal, 1 linux-sandbox.
INFO: Build completed successfully, 2 total actions
Dbg mode
$ bazel build out --compilation_mode=opt && cat bazel-bin/out
INFO: Build option --compilation_mode has changed, discarding analysis cache.
INFO: Analyzed target //:out (0 packages loaded, 11 targets configured).
INFO: Found 1 target...
Target //:out up-to-date:
bazel-bin/out
INFO: Elapsed time: 0.111s, Critical Path: 0.01s
INFO: 2 processes: 1 internal, 1 linux-sandbox.
INFO: Build completed successfully, 2 total actions
Opt mode
$ bazel build out --compilation_mode=fastbuild && cat bazel-bin/out
INFO: Build option --compilation_mode has changed, discarding analysis cache.
INFO: Analyzed target //:out (0 packages loaded, 11 targets configured).
INFO: Found 1 target...
Target //:out up-to-date:
bazel-bin/out
INFO: Elapsed time: 0.145s, Critical Path: 0.01s
INFO: 1 process: 1 internal.
INFO: Build completed successfully, 1 total action
Fastbuild mode
I had this genrule in a BUILD file, but bazel build failed with this error:
in cmd attribute of genrule rule //example:create_version_pom: $(BUILD_TAG) not defined
genrule(
name = "create_version_pom",
srcs = ["pom_template.xml"],
outs = ["version_pom.xml"],
cmd = "sed 's/BUILD_TAG/$(BUILD_TAG)/g' $< > $@",
)
What's the reason, and how to fix it please?
The cmd attribute of genrule undergoes variable expansion for Bazel build variables before the command is executed. The $< and $@ variables, for the input file and the output file, are among the predefined variables. Custom variables can be defined with --define, e.g.:
$ cat BUILD
genrule(
name = "genfoo",
outs = ["foo"],
cmd = "echo $(bar) > $@",
)
$ bazel build foo --define=bar=123
INFO: Analyzed target //:foo (5 packages loaded, 8 targets configured).
INFO: Found 1 target...
Target //:foo up-to-date:
bazel-bin/foo
INFO: Elapsed time: 0.310s, Critical Path: 0.01s
INFO: 2 processes: 1 internal, 1 linux-sandbox.
INFO: Build completed successfully, 2 total actions
$ cat bazel-bin/foo
123
So to have $(BUILD_TAG) work in the genrule, you'll want to pass
--define=BUILD_TAG=the_build_tag
Unless you want BUILD_TAG replaced with the literal text $(BUILD_TAG), in which case the $ needs to be escaped with another $: $$(BUILD_TAG).
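To see what the expansion amounts to, here is the genrule's sed command run outside Bazel, with $(BUILD_TAG) already expanded as if --define=BUILD_TAG=v42 had been passed (the one-line pom_template.xml is a made-up stand-in):

```shell
# Stand-in template; a real pom_template.xml would be a full POM file.
echo '<version>BUILD_TAG</version>' > pom_template.xml
# After Bazel expands $(BUILD_TAG) to v42, the genrule effectively runs:
sed 's/BUILD_TAG/v42/g' pom_template.xml > version_pom.xml
cat version_pom.xml   # <version>v42</version>
```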
See
https://docs.bazel.build/versions/main/be/general.html#genrule.cmd
https://docs.bazel.build/versions/main/be/make-variables.html
Note that Bazel also has a mechanism for "build stamping" for bringing information like build time and version numbers into the build:
https://docs.bazel.build/versions/main/user-manual.html#workspace_status
https://docs.bazel.build/versions/main/command-line-reference.html#flag--embed_label
Using --workspace_status_command and --embed_label is a little more complicated, though.
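For a sense of what that involves: a workspace status command is just a script that prints KEY VALUE pairs, and keys prefixed with STABLE_ go into stable-status.txt, re-triggering stamped actions when they change. A hypothetical sketch (the script name and tag fallback are made up):

```shell
# Hypothetical status script; wire it up with:
#   bazel build --stamp --workspace_status_command=./workspace_status.sh //...
cat > workspace_status.sh <<'EOF'
#!/bin/sh
# One KEY VALUE pair per line; STABLE_ keys land in stable-status.txt.
echo "STABLE_BUILD_TAG $(git describe --always 2>/dev/null || echo v0.0.0-dev)"
EOF
chmod +x workspace_status.sh
./workspace_status.sh
```

Rules that opt into stamping (or a genrule reading stable-status.txt) can then consume the value.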
How do you view stdout of bazel build as it happens?
I want to see all the logs written to stdout during a bazel build.
None of these flags shows the output of the ls command before the action has failed:
$ bazel build --show_progress --worker_verbose --verbose_failures --verbose_explanations=true -s --test_output=streamed :build
genrule(
    name = "build",
    outs = ["out.txt"],  # outs is a mandatory genrule attribute
    cmd = "ls && sleep 60 && exit 1",
)
$ bazel build --show_progress --worker_verbose --verbose_failures --verbose_explanations=true -s --test_output=streamed :build
WARNING: --verbose_explanations has no effect when --explain=<file> is not enabled
INFO: Analyzed target //:build (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
SUBCOMMAND: # //:build [action 'Executing genrule //:build']
(cd /private/var/tmp/_bazel_kevinsimper/f9e6a72c146c5ad83b84a8ebf539f8b2/execroot/__main__ && \
exec env - \
PATH=/usr/local/sbin \
/bin/bash -c 'source external/bazel_tools/tools/genrule/genrule-setup.sh; ls && sleep 60 && exit 1')
ERROR: /Users/kevinsimper/testproject/BUILD:1:1: Executing genrule //:build failed (Exit 1)
BUILD
TESTFILE
Target //:build failed to build
INFO: Elapsed time: 60.256s, Critical Path: 60.04s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
There's no way to stream an action's stdout/stderr while it's executing, unless the action is a test run with the --test_output=streamed flag.
As a workaround: if your build includes a small number of lengthy actions, you can snoop on their output in bazel-out/_tmp/actions/std{err,out}-* as it happens. This works for me with Bazel 3.1.0.
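A sketch of that snooping; the directory layout is as observed in Bazel 3.1.0 and may differ in other versions, and the file below is created by hand to stand in for what a running action would write:

```shell
# While a build runs in another terminal, follow the per-action logs with:
#   tail -F bazel-out/_tmp/actions/stdout-* bazel-out/_tmp/actions/stderr-*
# Simulated here with a stand-in directory and file:
mkdir -p bazel-out/_tmp/actions
echo 'hello from a running action' > bazel-out/_tmp/actions/stdout-42
cat bazel-out/_tmp/actions/stdout-*
```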