How to copy OpenCV DLL files into the CMake Debug/Release folders?

I need to copy
C:\opencv-3.4.0.-opencl\bin\Debug\*.dll =>
myproj\build\bin\Debug\*.dll
and also
C:\opencv-3.4.0.-opencl\bin\Release\*.dll =>
myproj\build\bin\Release\*.dll
I'd like to do it in one command for Debug/Release if possible.

You can copy the files with a post-build command. A step-by-step tutorial can be found here.
The basic concept is that you can use batch-file commands as a post-build step in Visual Studio to do basically anything you want as part of the build.
A further tutorial can be found here.
For CMake
The easiest way is to follow the advice above, but instead of putting it in the post-build options in VS, just add a custom command (see the sketch below).
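A minimal sketch, assuming the OpenCV build from the question lives in C:\opencv-3.4.0.-opencl\bin and the executable target is named myproj (adjust both to your project). The $<CONFIG> generator expression resolves to Debug or Release at build time, so one command covers both configurations, and copy_directory copies the whole matching bin folder next to the built executable:
set(OPENCV_BIN_DIR "C:/opencv-3.4.0.-opencl/bin")
add_custom_command(
    TARGET myproj POST_BUILD
    # copy everything from the matching configuration folder next to the built exe
    COMMAND ${CMAKE_COMMAND} -E copy_directory
            "${OPENCV_BIN_DIR}/$<CONFIG>"
            "$<TARGET_FILE_DIR:myproj>"
    COMMENT "Copying OpenCV DLLs for the $<CONFIG> configuration"
)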

You can try using CPack to handle multiple configurations in one go. See the example in the following tutorial:
https://cmake.org/cmake/help/latest/guide/tutorial/index.html#packaging-debug-and-release-step-12
By default, CMake’s model is that a build directory only contains a
single configuration, be it Debug, Release, MinSizeRel, or
RelWithDebInfo. It is possible, however, to set up CPack to bundle
multiple build directories and construct a package that contains
multiple configurations of the same project.
Then, for each configuration, you will need to use one of the following methods to copy the files you need:
configure_file
https://cmake.org/cmake/help/latest/command/configure_file.html
or
add_custom_command
https://cmake.org/cmake/help/latest/command/add_custom_command.html
Here is an example from Reddit:
https://www.reddit.com/r/cmake/comments/gmewhu/copy_one_file_in_src_directory_to_build_directory/
# Copy <filename> to the build directory for the active configuration
set(copy_source_dir "${CMAKE_SOURCE_DIR}/src/<path>")
set(copy_dest_dir "${CMAKE_BINARY_DIR}/Build/<path>/$<CONFIG>")
set(copy_file_name "<filename>")

add_custom_command(
    TARGET ${PROJECT_NAME} POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E make_directory "${copy_dest_dir}"
)
add_custom_command(
    TARGET ${PROJECT_NAME} POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy "${copy_source_dir}/${copy_file_name}" "${copy_dest_dir}/${copy_file_name}"
    COMMENT "Copying ${copy_file_name} to build directory"
)

Related

Atmel Studio 7 command line build - define symbols on command line

I am using the following command to build an Atmel Studio project on the command line:
AtmelStudio.exe C:\blahblah\solution.atsln /rebuild DEBUG /project PROJNAME /out output.txt
I would like to define symbols and override the values defined in the project's .cproj file, so that I do not need to define many different configurations. Is this possible, or do I need to use some other tool?
You need to use "/projectconfig".
/projectconfig Specifies the project configuration to build or deploy.
Must be used with /project
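For example (a sketch only; switch placement may vary between Atmel Studio versions), extending the command from the question to build the DEBUG project configuration:
AtmelStudio.exe C:\blahblah\solution.atsln /rebuild DEBUG /project PROJNAME /projectconfig DEBUG /out output.txt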

How to avoid deleting cached files after build in Bazel

I have a genrule in Bazel that is supposed to manipulate some files. I think I'm not accessing these files by the correct path, so I want to look at the directory structure that Bazel is creating so I can debug.
I added some echo statements to my genrule and I can see that Bazel is working in the directory /home/lyft/.cache/bazel/_bazel_lyft/8de0a1069de8d166c668173ca21c04ae/sandbox/linux-sandbox/1/execroot/. However, after Bazel finishes running, this directory is gone, so I can't look at the directory structure.
How can I prevent Bazel from deleting its temporary files so that I can debug what's happening?
Since this question is a top result for the "keep sandbox files after build bazel" Google search, and it wasn't obvious to me from the accepted answer, I feel the need to write this answer.
Short answer
Use --sandbox_debug. If this flag is passed, Bazel will not delete the files inside the sandbox folder after the build finishes.
Longer answer
Run bazel build with --sandbox_debug option:
$ bazel build mypackage:mytarget --sandbox_debug
Then you can inspect the contents of the sandbox folder for the project.
To get the location of the sandbox folder for the current project, navigate to the project and run:
$ bazel info output_base
/home/johnsmith/.cache/bazel/_bazel_johnsmith/d949417420413f64a0b619cb69f1db69 # output will be something like this
Inside that directory there will be a sandbox folder.
Possible caveat (I'm NOT sure about this): it's possible that some files are missing from the sandbox folder if you previously ran a build without the --sandbox_debug flag and it partially succeeded. The reason is that Bazel won't rerun the parts of the build that already succeeded, so the files corresponding to those successful parts might not end up in the sandbox.
If you want to make sure all the sandbox files are there, clean the project first using either bazel clean or bazel clean --expunge.
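For example, a quick way to locate and list the leftover sandbox directories after a --sandbox_debug build (the exact layout under output_base varies by platform and Bazel version):
$ ls "$(bazel info output_base)/sandbox"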
You can use --spawn_strategy=standalone.
You can also use --sandbox_debug to see which directories are mounted to the sandbox.
You can also set the genrule's cmd to find . > $@ to debug what's available to the genrule.
Important: declare all srcs/outs/tools that the genrule will read/write/use, and use $(location //label/of:target) to look up their path. Example:
genrule(
    name = "x1",
    srcs = ["//foo:input1.txt", "//bar:generated_file"],
    outs = ["x1out.txt", "x1err.txt"],
    tools = ["//util:bin1"],
    cmd = "$(location //util:bin1) --input1=$(location //foo:input1.txt) --input2=$(location //bar:generated_file) --some_flag --other_flag >$(location x1out.txt) 2>$(location x1err.txt)",
)

Bazel- How to recursively glob deleted_packages to ignore maven outputs?

I have a multi-module project which I'm migrating from Maven to Bazel. During this migration people will need to be able to work with both build systems.
After an mvn clean install Maven copies some of the BUILD files into the target folder.
When I later try to run bazel build //... it thinks the BUILD files under the various target folders are valid packages and fails due to some mismatch.
I've seen deleted_packages but AFAICT it requires I specify the list of folders to "delete" while I can't do that for 200+ modules.
I'm looking for the ability to say bazel build //... --deleted_packages=**/target.
Is this supported? (my experimentation says it's not but I might be wrong). If it's not supported is there an existing hack for it?
Can you use your shell to find the list of packages to ignore?
deleted=$(find . -name target -type d)
bazel build //... --deleted_packages="$deleted"
@Laurent's answer gave me the lead, but Bazel didn't accept relative paths and required that I add both the classes and test-classes folders under target to delete the package, so I decided to answer with the complete solution:
#!/bin/bash
# find all the target folders under the current working dir
target_folders=$(find . -name target -type d)
# find the repo root (currently assuming it's git based)
repo_root=$(git rev-parse --show-toplevel)
repo_root_length=${#repo_root}
# the current bazel package prefix is the PWD minus the repo root, plus a slash
current_bazel_package="/${PWD:repo_root_length}"
deleted_packages=""
for target in $target_folders
do
    # canonicalize the package path
    full_package_path="$current_bazel_package${target:1}"
    classes_full="${full_package_path}/classes"
    test_classes_full="${full_package_path}/test-classes"
    deleted_packages="$deleted_packages,$classes_full,$test_classes_full"
done
# remove the leading comma and call bazel-real with the other args
bazel-real "$@" --deleted_packages=${deleted_packages:1}
This script was checked in under tools/bazel which is why it calls bazel-real at the end.
I'm sorry, I don't think this is supported. Some brainstorming:
Is it an option to point the Maven outputs somewhere else?
Is it an option not to use //... but explicit target(s)?
Maybe just remove the bad BUILD files before running Bazel?

How to change the path of the source files referred to by the gcda files?

When I build my project for coverage testing with "--coverage -fprofile-arcs -ftest-coverage" and then move the build and source to another user's directory to run the tests, I get many errors such as "xxx/cc/cc/getopt_log.c:cannot open source file".
The details are as below:
Processing cs/CMakeFiles/cfa/__/src/base/fault_injection.c.gcda
/home/cov/build/xfcq/src/base/fault_injection.c:cannot open source file
The path "/home/cov/build/xfcq/src/base/fault_injection.c" is the path from the build environment; how can I change it to a relative path or to a path I specify?
I tried to use GCOV_PREFIX and GCOV_PREFIX_STRIP, but these don't work for me.
I also tried adding the -b option to lcov, but that doesn't work for me either.
e.g., lcov --gcov-tool=/bin/gcov -d . -b xx/src -t "xfcq" -o test_cov.info
Do you have any idea how to resolve this?
For the gcov coverage process you should never move the files after building your project; instead, modify your automated build scripts to build everything in the desired location.
When you compile your project with the specified options, a *.gcno file is generated for each source file; it essentially holds flow-chart-like details of that source file.
The object files are instrumented in such a way that they call a function (added by the compiler to generate coverage info) whenever a statement is executed, producing *.gcda files with all the execution information.
Note: you have specified three options in the question (--coverage -fprofile-arcs -ftest-coverage), which is redundant, as --coverage works as a replacement for the other two.
If you specify only --coverage, it takes care of both compilation and linking (remember to use it in both places, though).
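As a minimal sketch of the whole flow with just --coverage (file names here are hypothetical):
gcc --coverage -O0 -c foo.c -o foo.o   # compiling emits foo.gcno next to the object file
gcc --coverage foo.o -o test_foo       # the link step needs --coverage as well
./test_foo                             # running the tests writes foo.gcda next to foo.gcno
gcov foo.c                             # produces the annotated foo.c.gcov report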

Get sourcecode into Jenkins WORKSPACE subdirectory

Is it possible to configure Jenkins to get the source code into a subdirectory of %WORKSPACE%? Right now the source gets pulled into %WORKSPACE%, and for the build output I explicitly specify a directory outside the %WORKSPACE%.
Ideally I would like to have something similar to this:
%WORKSPACE%\source for source code and %WORKSPACE%\artifacts for build outputs. Is it possible to have this configuration?
Create a 'run batch command' build step and use xcopy; this presumes Jenkins is running on a Windows machine. If it's a deployment directory, make it a post-build step instead.
cd c:/
xcopy /Y "c:/program files 86/junkies/workspace/app" "c:/path to new directory"
This is just a guess at your directories; replace them with the correct ones. The /Y forces the files to be overwritten every time they are copied.
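A variant of the same idea driven by the WORKSPACE environment variable Jenkins sets for the job (the app and artifacts directory names here are made up):
cd /d "%WORKSPACE%"
xcopy /E /I /Y "%WORKSPACE%\app" "%WORKSPACE%\artifacts"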
