When I build my project for coverage testing with "--coverage -fprofile-arcs -ftest-coverage" and then move the build and source to another user's directory to run the tests, I get many errors such as "xxx/cc/cc/getopt_log.c: cannot open source file".
The details are as follows:
Processing cs/CMakeFiles/cfa/__/src/base/fault_injection.c.gcda
/home/cov/build/xfcq/src/base/fault_injection.c:cannot open source file
The path /home/cov/build/xfcq/src/base/fault_injection.c is the path from the build environment. How can I change it to a relative path, or to a path that I specify?
I tried to use GCOV_PREFIX and GCOV_PREFIX_STRIP, but these did not work well for me.
I also tried adding the -b option to lcov, but that did not work well either.
e.g., lcov --gcov-tool=/bin/gcov -d . -b xx/src -t "xfcq" -o test_cov.info
Do you have any idea how to resolve this?
For the gcov coverage process you should never move the files after building your project; instead, modify your automated build scripts to build everything in the desired location.
When you compile your project with the specified options, a *.gcno file is generated for each source file; it is essentially a flow-graph-like description of that source file.
The object files are instrumented in such a way that they call functions (added by the compiler to generate coverage info) whenever a statement is executed, producing *.gcda files with all the execution information.
Note: you have specified three options in the question (--coverage -fprofile-arcs -ftest-coverage), which is redundant, as --coverage is a replacement for the other two.
If you specify only --coverage, it covers both compilation and linking (remember to use it in both places, though).
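As a minimal sketch of the whole cycle (file and target names here are placeholders), with everything built, run, and captured in the same directory:
# --coverage already implies -fprofile-arcs -ftest-coverage and links in libgcov.
gcc --coverage -O0 -c foo.c -o foo.o
gcc --coverage foo.o -o foo_test
# Running the instrumented binary writes foo.gcda next to the foo.gcno from the build.
./foo_test
# Capture the results from the original build directory rather than from a copy of it.
lcov --capture --directory . --output-file test_cov.info
genhtml test_cov.info --output-directory coverage_html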
I have a genrule in Bazel that is supposed to manipulate some files. I think I'm not accessing these files by the correct path, so I want to look at the directory structure that Bazel is creating so I can debug.
I added some echo statements to my genrule and I can see that Bazel is working in the directory /home/lyft/.cache/bazel/_bazel_lyft/8de0a1069de8d166c668173ca21c04ae/sandbox/linux-sandbox/1/execroot/. However, after Bazel finishes running, this directory is gone, so I can't look at the directory structure.
How can I prevent Bazel from deleting its temporary files so that I can debug what's happening?
Since this question is a top Google result for "keep sandbox files after build bazel" and the solution wasn't obvious to me from the accepted answer, I feel the need to write this answer.
Short answer
Use --sandbox_debug. If this flag is passed, Bazel will not delete the files inside the sandbox folder after the build finishes.
Longer answer
Run bazel build with --sandbox_debug option:
$ bazel build mypackage:mytarget --sandbox_debug
Then you can inspect the contents of the sandbox folder for the project.
To get the location of the sandbox folder for the current project, navigate to the project and run:
$ bazel info output_base
/home/johnsmith/.cache/bazel/_bazel_johnsmith/d949417420413f64a0b619cb69f1db69 # output will be something like this
Inside that directory there will be a sandbox folder.
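For example (a sketch; the hash in the path will differ on your machine):
$ ls "$(bazel info output_base)/sandbox"
# on Linux you should see a linux-sandbox/ subfolder like the one in the question above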
Possible caveat (I'm not sure about this): some of the files may be missing from the sandbox folder if you previously ran a build without the --sandbox_debug flag and it partially succeeded. The reason is that Bazel won't rerun the parts of the build that already succeeded, so the files corresponding to those parts might not end up in the sandbox.
If you want to make sure all the sandbox files are there, clean the project first using either bazel clean or bazel clean --expunge.
You can use --spawn_strategy=standalone.
You can also use --sandbox_debug to see which directories are mounted to the sandbox.
You can also set the genrule's cmd to find . > $@ to debug what's available to the genrule.
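For example, a throwaway genrule along these lines (the names here are made up) dumps the directory tree the genrule actually sees into its output file:
genrule(
    name = "debug_tree",
    srcs = ["//foo:input1.txt"],  # whichever inputs you want to inspect
    outs = ["tree.txt"],
    cmd = "find . > $@",  # $@ expands to the path of the single output file
)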
Important: declare all srcs/outs/tools that the genrule will read/write/use, and use $(location //label/of:target) to look up their path. Example:
genrule(
    name = "x1",
    srcs = ["//foo:input1.txt", "//bar:generated_file"],
    outs = ["x1out.txt", "x1err.txt"],
    tools = ["//util:bin1"],
    cmd = "$(location //util:bin1) --input1=$(location //foo:input1.txt) --input2=$(location //bar:generated_file) --some_flag --other_flag >$(location x1out.txt) 2>$(location x1err.txt)",
)
I am running xcodebuild to build my project via command line and the strange thing is that the intermediate build files and the object files from the build folder contain hardcoded absolute paths from my machine. I think that xcodebuild does that automatically.
Is there a way to make them relative? I searched for quite a while but without success.
I need this because I want to transfer the entire project to another machine and run some Xcode unit tests via xcodebuild with the test parameter without rebuilding the project, so I need to transfer the build files to that machine as well. The problem is that the paths from the previous machine (on which the build was made) are present in the build files and .o files, and they don't match the paths on the current machine.
Example:
Build machine project path: /Users/MyBuildUser/BuildFolder/XcodeProject
Test machine project path (the transfer location): /Users/MyTestUser/TestFolder/XcodeProject
Paths such as: /Users/MyBuildUser/BuildFolder/XcodeProject/Sources/Source.h
The paths can be set in Xcode > Preferences > Locations via the Advanced... button:
From there select Custom > Relative to Derived Data or Relative to Workspace.
The problem was with the way I generated the project. The generation is done using CMake. By default, CMake uses absolute paths everywhere, and this prevents the generated content from being moved from one workstation to another, as described here:
How to tell CMake to use relative paths
As part of our efforts to create a Bazel-Maven transition interop tool (which creates Maven-sized jars from more granular Bazel jars),
we have written an aspect that runs on bazel build of the entire Bazel repo and writes important information to txt file outputs (e.g. jar file paths, compile-deps targets, runtime-deps targets, etc.).
We ran across an issue where the repo's code was changed such that some of the txt files were no longer written, but the old txt files from previous runs (before the code change) remained!
Is there a way to know that these txt files are no longer relevant?
You should be able to run with --build_event_json_file=file.json and use it to locate the generated artifacts. For example, we use it on ci.bazel.io to locate the actual test.xml files that were generated: https://github.com/bazelbuild/continuous-integration/blob/09975cbb487a84a62ca1e43aa43e7c6fe078f058/jenkins/lib/src/build/bazel/ci/BazelUtils.groovy#L218
The definition of the protocol can be found in build_event_stream.proto
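For instance, a rough sketch (field names assumed from build_event_stream.proto; the jq filter assumes output files are reported in namedSetOfFiles events, and //my:target is a placeholder):
bazel build //my:target --build_event_json_file=bep.json
# The file contains one JSON event per line; print the URIs of all reported output files.
jq -r 'select(.namedSetOfFiles != null) | .namedSetOfFiles.files[].uri' bep.json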
I have a TFS build in a Git team project that uses the default template. It builds a .proj file containing a single target that executes a .PS1 file in Powershell.exe.
The .PS1 generates its own log file. I have been trying to figure out how to get this file copied into the \logs folder of the drop directory. From what I can tell, TFS only copies specific files to this output directory:
ActivityLog.AgentScope.[id].xml
ActivityLog.xml
build.log
Has anyone tried getting custom logging info into this directory? I tried writing to build.log, but that failed with errors.
I like @MrHinsh's answer better than mine, but I found that you can write to a file at $(TF_BUILD_DROPLOCATION)\logs during the build.
I assumed that since the path doesn't exist until the log files are copied, it would not work. But it does: the TFS/MSBuild log files are simply merged in. It even seemed to work with a name conflict; for example, if your file is named build.log, MSBuild's will be renamed to build.01.log.
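In the .PS1 itself that might look something like this (the log file name is just a placeholder; TF_BUILD_DROPLOCATION is available to the script as an environment variable):
# Copy the script's own log into the \logs folder of the drop location.
$dropLogs = Join-Path $env:TF_BUILD_DROPLOCATION "logs"
New-Item -ItemType Directory -Force -Path $dropLogs | Out-Null
Copy-Item -Path ".\MyScript.log" -Destination $dropLogs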
In your PowerShell script you can simply call Write-Host to write to the build log. All of the standard output methods are captured, although you need to use the -Verbose flag to get the text to always be written.
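For example (the messages are just placeholders):
Write-Host "Packaging finished"                       # captured in the build log
Write-Verbose "Copying symbols to the drop" -Verbose  # -Verbose forces the message to be emitted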
With Dart's "pub" tool and its "deploy" command, you can create a deployable version of your Dart web application. The output is written to the "deploy" directory. Is there a way to specify a different output directory?
I searched the internet for a command-line option, but found no mention of it. Running "pub help deploy" shows no options for the deploy command.
If "pub deploy" has no output directory option, I want to find the pub.dart source code. I'll create a customized version that accepts an output directory option. Unfortunately, I can't find pub.dart in the SDK. I found the pub shell script. It calls pub.dart.snapshot, which is 100 thousand lines of unintelligible Dart bytecode. Is there a human readable pub.dart file? Is it in the SDK?
Your help is appreciated.
For now, this is not configurable. The source folder is always /web and the output folder is always /deploy.
The source is available at http://code.google.com/p/dart/source/browse/trunk/dart/sdk/lib/_internal/pub/lib/src/command/deploy.dart#33 .
A simple workaround is just to rename the deploy directory once it has been generated.
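Something like this (the target directory name is arbitrary):
pub deploy             # always writes to ./deploy
mv deploy public_html  # move/rename it to wherever you actually want it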