How do you view stdout of bazel build as it happens?
I want to see all the logs written to stdout during a bazel build.
None of the flags below shows the output of the ls command until after the action has failed. Given this genrule:
genrule(
    name = "build",
    outs = ["build.log"],  # a genrule must declare at least one output
    cmd = "ls && sleep 60 && exit 1",
)
$ bazel build --show_progress --worker_verbose --verbose_failures --verbose_explanations=true -s --test_output=streamed :build
WARNING: --verbose_explanations has no effect when --explain=<file> is not enabled
INFO: Analyzed target //:build (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
SUBCOMMAND: # //:build [action 'Executing genrule //:build']
(cd /private/var/tmp/_bazel_kevinsimper/f9e6a72c146c5ad83b84a8ebf539f8b2/execroot/__main__ && \
exec env - \
PATH=/usr/local/sbin \
/bin/bash -c 'source external/bazel_tools/tools/genrule/genrule-setup.sh; ls && sleep 60 && exit 1')
ERROR: /Users/kevinsimper/testproject/BUILD:1:1: Executing genrule //:build failed (Exit 1)
BUILD
TESTFILE
Target //:build failed to build
INFO: Elapsed time: 60.256s, Critical Path: 60.04s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
There's no way to stream an action's stdout/stderr while it's executing, unless the action is a test run with the --test_output=streamed flag.
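For the streaming route, the command has to live in a test rule; a minimal sketch, assuming the genrule's commands are moved into a wrapper script build_test.sh (both the rule and the script name are my own, not from the question):
$ cat BUILD
sh_test(
    name = "build_test",
    srcs = ["build_test.sh"],
)
$ bazel test --test_output=streamed :build_test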
Alternatively, you can follow an underhanded approach: if your build includes a small number of long-running actions, it is feasible to snoop on their output in bazel-out/_tmp/actions/std{err,out}-* as it happens. This works for me with Bazel 3.1.0.
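For example, from a second terminal while the action is running (a sketch; the exact temp-file suffixes vary from run to run):
$ tail -f bazel-out/_tmp/actions/stdout-*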
Related
I am trying to build electron (master) on Ubuntu 22.04 using the script appended below, and it's throwing the following error. I am using the latest depot_tools, gn and node.js. Please help:
root@acs-x86-node1-ghatwala-rhel:/electron/src# gn gen out/Release --args="import(\"//electron/build/args/release.gn\")"
ERROR at //electron/BUILD.gn:110:20: Script returned non-zero exit code.
electron_version = exec_script("script/print-version.py",
^----------
Current dir: /electron/src/out/Release/
Command: python3 /electron/src/electron/script/print-version.py
Returned 1 and printed out:
/electron/src/electron/script/lib/get-version.js:19
      throw new Error('Failed to get current electron version');
      ^

Error: Failed to get current electron version
    at module.exports.getElectronVersion (/electron/src/electron/script/lib/get-version.js:19:11)
    at [eval]:1:37
    at Script.runInThisContext (node:vm:129:12)
    at Object.runInThisContext (node:vm:307:38)
    at node:internal/process/execution:83:21
    at [eval]-wrapper:6:24
    at runScript (node:internal/process/execution:82:62)
    at evalScript (node:internal/process/execution:104:10)
    at node:internal/main/eval_string:50:3

Node.js v19.3.0
File "/usr/lib/python3.8/subprocess.py", line 516, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['node', '-p', 'require("./script/lib/get-version").getElectronVersion()']' returned non-zero exit status 1.
See //electron/build/args/all.gn:2:21: which caused the file to be included.
root_extra_deps = [ "//electron" ]
^-----------
mkdir electron && cd electron
gclient config --name "src/electron" --unmanaged https://github.com/electron/electron
gclient sync --with_branch_heads --with_tags --no-history
cd src
export CHROMIUM_BUILDTOOLS_PATH=`pwd`/buildtools
gn gen out/Release --args="import(\"//electron/build/args/release.gn\")"
ninja -C out/Release electron
I am following the instructions here: https://drake.mit.edu/from_source.html. I already ran
./setup/mac/install_prereqs.sh
in my python virtualenv (drake-venv) and it succeeded. I then managed to build and run the inclined plane example with Bazel. But trying to build some of the other examples results in errors involving YAML like this:
(drake-venv) benq:acrobot % bazel build acrobot_input --subcommands --verbose_failures --sandbox_debug
INFO: Analyzed target //examples/acrobot:acrobot_input (0 packages loaded, 0 targets configured).
INFO: Found 1 target...
SUBCOMMAND: # //examples/acrobot:acrobot_input_codegen [action 'Action examples/acrobot/gen/acrobot_input.cc', configuration: f8bba554e4e3784a5a24e83c682b75e9b6104059526c94f74d854527a53436a6, execution platform: @local_config_platform//:host]
(cd /private/var/tmp/_bazel_benq/a35a7fa5c4830c980dbc52ab349cb0bc/execroot/drake && \
exec env - \
bazel-out/host/bin/tools/vector_gen/lcm_vector_gen '--src=examples/acrobot/acrobot_input_named_vector.yaml' '--out=bazel-out/darwin-opt/bin/examples/acrobot/gen/acrobot_input.cc' '--out=bazel-out/darwin-opt/bin/examples/acrobot/gen/acrobot_input.h' '--include_prefix=drake')
# Configuration: f8bba554e4e3784a5a24e83c682b75e9b6104059526c94f74d854527a53436a6
# Execution platform: @local_config_platform//:host
ERROR: /Users/benq/Documents/drake/examples/acrobot/BUILD.bazel:30:28: Action examples/acrobot/gen/acrobot_input.cc failed: (Exit 1): sandbox-exec failed: error executing command
(cd /private/var/tmp/_bazel_benq/a35a7fa5c4830c980dbc52ab349cb0bc/sandbox/darwin-sandbox/176/execroot/drake && \
exec env - \
TMPDIR=/var/folders/s0/tfqtn2s54135x0qzt5kxnzs00000gn/T/ \
/usr/bin/sandbox-exec -f /private/var/tmp/_bazel_benq/a35a7fa5c4830c980dbc52ab349cb0bc/sandbox/darwin-sandbox/176/sandbox.sb /var/tmp/_bazel_benq/install/ebbb2540c6000feeb8873385c487a79c/process-wrapper '--timeout=0' '--kill_delay=15' bazel-out/host/bin/tools/vector_gen/lcm_vector_gen '--src=examples/acrobot/acrobot_input_named_vector.yaml' '--out=bazel-out/darwin-opt/bin/examples/acrobot/gen/acrobot_input.cc' '--out=bazel-out/darwin-opt/bin/examples/acrobot/gen/acrobot_input.h' '--include_prefix=drake')
Traceback (most recent call last):
File "/private/var/tmp/_bazel_benq/a35a7fa5c4830c980dbc52ab349cb0bc/sandbox/darwin-sandbox/176/execroot/drake/bazel-out/host/bin/tools/vector_gen/lcm_vector_gen.runfiles/drake/tools/vector_gen/lcm_vector_gen.py", line 10, in <module>
import yaml
ModuleNotFoundError: No module named 'yaml'
Target //examples/acrobot:acrobot_input failed to build
INFO: Elapsed time: 1.059s, Critical Path: 0.58s
INFO: 5 processes: 5 internal.
FAILED: Build did NOT complete successfully
But I'm not sure why this is happening considering that importing yaml in Terminal works:
(drake-venv) benq:acrobot % which python
/Users/benq/Documents/drake/drake-venv/bin/python
(drake-venv) benq:acrobot % python --version
Python 3.9.10
(drake-venv) benq:acrobot % python -c 'import yaml'
(drake-venv) benq:acrobot %
I've already tried reinstalling PyYAML, but that didn't help.
Relevant Info:
Operating System: macOS Monterey (12.3)
Architecture: x86_64
Python: Python 3.9.10
Bazel version:
% which bazel; bazel version
/usr/local/bin/bazel
Build label: 5.0.0-homebrew
Build target: bazel-out/darwin-opt/bin/src/main/java/com/google/devtools/build/lib/bazel/BazelServer_deploy.jar
Build time: Tue Jan 1 00:00:00 1980 (315532800)
Build timestamp: 315532800
Build timestamp as int: 315532800
Bazel C++ compiler: Apple clang version 13.1.6 (clang-1316.0.21.2)
Git revision: 06dd087b40
The lcm_vector_gen in the error message is a code-generation tool that's run as part of the build.
It's probably not obeying your which python, but instead using the hard-coded /usr/local/bin/python3.9 from https://github.com/RobotLocomotion/drake/blob/master/tools/py_toolchain/interpreter_paths.bzl.
We don't run or test our builds within a virtual environment, so you've stumbled into a novel situation.
Editing that .bzl file linked above (interpreter_paths.bzl) to point MACOS_I386_INTERPRETER_PATH at your venv's python (/Users/benq/Documents/drake/drake-venv/bin/python) would possibly fix the error.
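For instance, something like this one-liner might apply that edit locally (an untested sketch; it assumes the hard-coded value in interpreter_paths.bzl is exactly /usr/local/bin/python3.9):
$ sed -i '' 's|/usr/local/bin/python3.9|/Users/benq/Documents/drake/drake-venv/bin/python|' \
    tools/py_toolchain/interpreter_paths.bzl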
I am trying to add some personal Python libraries to Buildroot for an embedded device. Here is one of my makefiles:
LXML_VERSION = 4.6.3
LXML_SITE = /home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/package/CNES/lxml/lxml
LXML_SITE_METHOD = local
LXML_SETUP_TYPE = setuptools
LXML_DEPENDENCIES = libxml2 libxslt zlib
HOST_LXML_DEPENDENCIES = host-libxml2 host-libxslt host-zlib
LXML_BUILD_OPTS = \
    --xslt-config=$(STAGING_DIR)/usr/bin/xslt-config \
    --xml2-config=$(STAGING_DIR)/usr/bin/xml2-config
HOST_LXML_BUILD_OPTS = \
    --xslt-config=$(HOST_DIR)/bin/xslt-config \
    --xml2-config=$(HOST_DIR)/bin/xml2-config
$(eval $(python-package))
$(eval $(host-python-package))
As you can see, the makefile is almost exactly the default one for python-lxml, and the same goes for my other libraries. I only changed the source location and method so that Buildroot picks up my local copy of the library.
Here's the error I get:
>>> Executing post-image script board/zynq/post-image.sh
[...]
INFO: vfat(boot.vfat): adding file 'u-boot.img' as 'u-boot.img' ...
INFO: vfat(boot.vfat): cmd: "MTOOLS_SKIP_CHECK=1 mcopy -bsp -i '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/boot.vfat' '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/u-boot.img' '::'" (stderr):
INFO: vfat(boot.vfat): adding file 'devicetree.dtb' as 'devicetree.dtb' ...
INFO: vfat(boot.vfat): cmd: "MTOOLS_SKIP_CHECK=1 mcopy -bsp -i '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/boot.vfat' '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/devicetree.dtb' '::'" (stderr):
INFO: vfat(boot.vfat): adding file 'uImage' as 'uImage' ...
INFO: vfat(boot.vfat): cmd: "MTOOLS_SKIP_CHECK=1 mcopy -bsp -i '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/boot.vfat' '/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/uImage' '::'" (stderr):
Disk full
INFO: vfat(boot.vfat): cmd: "rm -f "/home/mso-aerosat/Desktop/cecilia_workspace/Buildroot/buildroot-2021.02.7/output/images/boot.vfat"" (stderr):
ERROR: vfat(boot.vfat): failed to generate boot.vfat
make[1]: *** [Makefile:836: target-post-image] Error 1
make: *** [Makefile:84: _all] Error 2
And finally, here is the post-image.sh script:
#!/bin/sh
# By default U-Boot loads DTB from a file named "devicetree.dtb", so
# let's use a symlink with that name that points to the *first*
# devicetree listed in the config.
FIRST_DT=$(sed -n \
    's/^BR2_LINUX_KERNEL_INTREE_DTS_NAME="\([a-z0-9\-]*\).*"$/\1/p' \
    ${BR2_CONFIG})
[ -z "${FIRST_DT}" ] || ln -fs ${FIRST_DT}.dtb ${BINARIES_DIR}/devicetree.dtb
support/scripts/genimage.sh -c board/zynq/genimage.cfg
Here is the link to the source code from github, in case you'd need to see other scripts : https://github.com/buildroot/buildroot/blob/master/board/zynq/post-image.sh
How can I solve this problem? If I configure menuconfig without my libraries, everything builds without error, but as soon as I add them it fails. Any advice, please?
Thank you!
So I did make clean and then make again, and the error went away, but I still don't have my customized libraries in output/target. They are correctly added in output/build, though. I don't know why Buildroot doesn't carry them through to the end of the process...
I had this genrule in a BUILD file, but bazel build failed with the error:
in cmd attribute of genrule rule //example:create_version_pom: $(BUILD_TAG) not defined
genrule(
    name = "create_version_pom",
    srcs = ["pom_template.xml"],
    outs = ["version_pom.xml"],
    cmd = "sed 's/BUILD_TAG/$(BUILD_TAG)/g' $< > $@",
)
What's the reason, and how can I fix it?
The cmd attribute of genrule undergoes expansion of Bazel build variables before the command is executed. $< (the single input file) and $@ (the output file) are among the pre-defined variables. Custom variables can be defined with --define, e.g.:
$ cat BUILD
genrule(
    name = "genfoo",
    outs = ["foo"],
    cmd = "echo $(bar) > $@",
)
$ bazel build foo --define=bar=123
INFO: Analyzed target //:foo (5 packages loaded, 8 targets configured).
INFO: Found 1 target...
Target //:foo up-to-date:
bazel-bin/foo
INFO: Elapsed time: 0.310s, Critical Path: 0.01s
INFO: 2 processes: 1 internal, 1 linux-sandbox.
INFO: Build completed successfully, 2 total actions
$ cat bazel-bin/foo
123
So to have $(BUILD_TAG) work in the genrule, you'll want to pass
--define=BUILD_TAG=the_build_tag
If instead you want BUILD_TAG replaced with the literal text $(BUILD_TAG), the $ needs to be escaped with another $: $$(BUILD_TAG).
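For example, reusing the genfoo rule from above with the escaped form (a sketch; the single quotes keep the shell from treating the expanded $(bar) as command substitution):
$ cat BUILD
genrule(
    name = "genfoo",
    outs = ["foo"],
    cmd = "echo '$$(bar)' > $@",
)
$ bazel build foo
...
$ cat bazel-bin/foo
$(bar)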
See
https://docs.bazel.build/versions/main/be/general.html#genrule.cmd
https://docs.bazel.build/versions/main/be/make-variables.html
Note that Bazel also has a mechanism for "build stamping" for bringing information like build time and version numbers into the build:
https://docs.bazel.build/versions/main/user-manual.html#workspace_status
https://docs.bazel.build/versions/main/command-line-reference.html#flag--embed_label
Using --workspace_status_command and --embed_label is a little more complicated, though.
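A minimal sketch of the workspace-status mechanism (the script name and the STABLE_BUILD_TAG key are my own choices; stamp-aware rules can then read the value from bazel-out/stable-status.txt):
$ cat status.sh
#!/bin/sh
# Each "KEY value" line printed here becomes a workspace status entry;
# keys prefixed with STABLE_ are written to bazel-out/stable-status.txt.
echo "STABLE_BUILD_TAG $(git describe --tags --always)"
$ bazel build --stamp --workspace_status_command=./status.sh //some:target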
I'm trying to send the code coverage report that is created by Slather to Codacy using Fastlane. This is the lane:
desc "Do A Slather and send to Codacy"
lane :code_coverage do |options|
  slather(
    output_directory: "SlatherReports",
    scheme: "MyApp",
    configuration: "Debug",
    workspace: "MyApp.xcworkspace",
    proj: "MyApp.xcodeproj",
    cobertura_xml: true,
    use_bundle_exec: true
  )
  ENV["CODACY_PROJECT_TOKEN"] = options[:codacy_token]
  sh "bash <(curl -Ls https://coverage.codacy.com/get.sh -r SlatherReports/cobertura.xml)"
end
The slather step worked, but the shell command didn't. It returned this error:
[06:40:19]: Exit status of command 'bash <(curl -Ls https://coverage.codacy.com/get.sh -r SlatherReports/cobertura.xml)' was 2 instead of 0.
sh: -c: line 0: syntax error near unexpected token `('
sh: -c: line 0: `bash <(curl -Ls https://coverage.codacy.com/get.sh -r SlatherReports/cobertura.xml)'
So, how do I fix this? Thanks.
I've found the answer myself, and it's a very simple one (why didn't I think of this earlier?). I just put the whole bash <(curl -Ls https://coverage.codacy.com/get.sh -r SlatherReports/cobertura.xml) line into a separate bash script file and call that instead. It works because the script file is executed by bash, whereas fastlane's sh action runs its command string through plain sh, which doesn't support process substitution.
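For reference, a minimal sketch of that workaround (the file name codacy_upload.sh is my own choice; the command is copied verbatim from the lane):
$ cat codacy_upload.sh
#!/bin/bash
# Process substitution <( ) is a bash feature; it works here because this
# file is executed by bash rather than through `sh -c`.
bash <(curl -Ls https://coverage.codacy.com/get.sh -r SlatherReports/cobertura.xml)
Then in the lane, replace the failing line with something like sh "./codacy_upload.sh" (after chmod +x codacy_upload.sh).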