I am using UseLATEX.cmake to build the documentation folder of my project.
My project is organized as follows:
.
├── CMakeLists.txt
├── bin
├── build
├── cmake
│   ├── CMakeCompilerFlags.cmake
│   ├── CMakeDefaults.cmake
│   ├── MacroEnsureOutOfSourceBuilds.cmake
│   └── UseLATEX.cmake
├── doc
│   ├── Doc.tex
│   ├── CMakeLists.txt
│   └── images
│       ├── img1.png
│       ├── img2.png
│       ├── img3.png
│       └── img4.jpeg
............
└── src
    ├── CMakeLists.txt
    ├── file1.cpp
    ├── file2.cpp
    └── file3.cpp
My root-level CMakeLists.txt looks like this:
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)
# Set path for CMake
set(CMAKE_MODULE_PATH
    "${CMAKE_SOURCE_DIR}/cmake"
    ${CMAKE_MODULE_PATH}
)
# Define project settings
project(proj)
set(APPLICATION_NAME ${PROJECT_NAME})
include(CMakeDefaults)
# Compile Program and Docs
include_directories(inc)
add_subdirectory(src)
add_subdirectory(doc)
And the CMakeLists.txt file in the doc folder is:
include(UseLATEX)
ADD_LATEX_DOCUMENT(Doc.tex
  #BIBFILES mybib.bib
  IMAGE_DIRS images
  DEFAULT_PDF
)
Now I compile my project in the build folder. Is there any way to copy the Doc.pdf file created in the build/doc folder back to my top-level build folder?
Since ADD_LATEX_DOCUMENT adds a CMake target named pdf here, you should be able to make use of add_custom_command. Try adding the following to your /doc/CMakeLists.txt after the ADD_LATEX_DOCUMENT call:
add_custom_command(TARGET pdf POST_BUILD
  COMMAND ${CMAKE_COMMAND} -E copy
          ${CMAKE_CURRENT_BINARY_DIR}/Doc.pdf
          ${CMAKE_BINARY_DIR}/Doc.pdf)
This custom command invokes the cmake executable (held in the variable ${CMAKE_COMMAND}) along with the -E copy arguments every time the pdf target is built.
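Put together, the doc/CMakeLists.txt would then look something like this (just the snippet above merged into the file you already have):
include(UseLATEX)

ADD_LATEX_DOCUMENT(Doc.tex
  #BIBFILES mybib.bib
  IMAGE_DIRS images
  DEFAULT_PDF
)

# Copy the generated PDF up to the top-level build directory
# every time the pdf target is rebuilt.
add_custom_command(TARGET pdf POST_BUILD
  COMMAND ${CMAKE_COMMAND} -E copy
          ${CMAKE_CURRENT_BINARY_DIR}/Doc.pdf
          ${CMAKE_BINARY_DIR}/Doc.pdf)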
Suppose I have a file tree like the following:
workspace
├── project
│   ├── a
│   ├── b
│   └── c (c is not included in project.tar)
└── project.tar
and project.tar was created by tar -cf project.tar project-new, where project-new has the following structure:
project-new
├── a
└── b
So I was wondering if there is any way that, when I unarchive project.tar in workspace, I can completely overwrite the structure of the directory project. In other words, after unarchiving, is it possible to make the sub-directory c disappear?
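One straightforward approach, as a rough sketch: tar itself never deletes files that are absent from an archive, so the old directory has to be removed (or renamed) first, and the extracted tree renamed afterwards.
cd workspace
rm -rf project           # drop the old tree, including c
tar -xf project.tar      # extracts project-new, which contains only a and b
mv project-new project   # put it back under the original name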
I'm trying to build a Docker image using a Go-compiled binary as the ENTRYPOINT, but I'm unable to compile the binary because go mod cannot find one of the required packages.
The project structure looks like this:
editor/
├── container
│   ├── Dockerfile
│   └── src
│       ├── install-browsers.sh
│       ├── selenium-server-standalone-3.141.59.jar
│       └── webCopy
│           ├── go.mod
│           ├── go.sum
│           └── main.go
├── copier
│   ├── copier.go
│   ├── internal
│   │   └── utils.go
│   └── scripts
│       └── load.go
└── resource
    └── handler.go
The file I'm trying to compile is webCopy/main.go
Inside that file I need to import the module editor/copier
The path to the editor module is:
bitbucket.org/backend/editor
which is inside the GOPATH
The error go mod tidy gives me is:
go: finding module for package bitbucket.org/mvps/backend/editor/copier
bitbucket.org/backend/editor/container/src/webCopy imports
bitbucket.org/backend/editor/copier: cannot find module providing package bitbucket.org/mvps/backend/editor/copier: reading https://api.bitbucket.org/2.0/repositories/mvps/backend?fields=scm: 404 Not Found
I really don't want to mix the copier package into the container's src; I feel that the sub-packages should be kept separate from main, yet stay inside the editor module.
Furthermore, I'm using go.mod as a way to get a clean image: I compile main.go and use the binary to create a new, clean artifact, so I would like to keep the go.mod and go.sum files inside editor/container/src/webCopy/.
By the way, I have checked the package names and everything is properly named.
FYI: if you are using a Go modules build, you are no longer using GOPATH, so that is not the issue.
If you want a custom build without having to set up laborious git key access to repositories from within a Docker build, you can leverage the replace directive in go.mod.
So add the following line to .../webCopy/go.mod:
replace bitbucket.org/backend/editor/copier => ../../../copier/
This will instruct the Go build to use that relative path instead of a direct HTTPS download.
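For reference, a minimal sketch of what webCopy/go.mod could end up looking like with that directive in place. The go version and the placeholder version in the require line are assumptions, and a filesystem replace only works if copier/ has (or is given) its own go.mod declaring module bitbucket.org/backend/editor/copier.
module bitbucket.org/backend/editor/container/src/webCopy

go 1.14

// placeholder version; the replace below makes the source come from disk
require bitbucket.org/backend/editor/copier v0.0.0

// use the local checkout instead of a direct https download
replace bitbucket.org/backend/editor/copier => ../../../copier/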
I'm trying to make Bazel build a jar and an .so file for a Flutter project, but every time I type 'bazel build' into the command prompt I keep getting:
ERROR: The 'build' command is only supported from within a workspace (below a directory having a WORKSPACE file).
See documentation at https://docs.bazel.build/versions/master/build-ref.html#workspace
I've read some documentation and it seems the solution is to create a blank file called 'WORKSPACE', but I don't understand where this file is supposed to be stored. Here's a link to the documentation I read: https://docs.bazel.build/versions/2.0.0/tutorial/java.html
Thanks in advance!
The WORKSPACE file goes in the root of your workspace (source). That is the top directory for all your build packages and the start of the absolute paths you refer to with //. For instance, if you had a tree like this:
.
├── BUILD
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
You would construct your workspace where all your packages converge (the root they share):
.
├── BUILD
├── WORKSPACE
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
And your build targets could then be, for instance, //:a_build_target or //package2:another_target.
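As a minimal sketch of the commands involved (the paths and target name here are just the placeholders from the tree above):
cd /path/to/your/project               # the directory that should become the workspace root
touch WORKSPACE                        # an empty marker file is enough to get started
bazel build //package2:another_target  # targets are addressed relative to that root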
I have an existing project which is built with Maven. It typically defines several modules. I want to migrate this project to Bazel.
In a first attempt, I use the following layout:
└── project
    ├── moduleA
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           ├── data
    │           └── java
    ├── moduleB
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           └── java
    ├── pom.xml
    └── WORKSPACE
It was not too hard to make the project build with Bazel. My problem now is that the tests fail to find their test data.
Indeed, with Maven (or Ant), the working directory is the one that contains the pom.xml (or build.xml). So, in that case, a test in moduleA can do:
new File("src/test/data/foo.txt");
However, when the test runs in Bazel, the working directory is the sandboxed runfiles tree, which is rooted like the workspace, i.e. the test must now open:
new File("moduleA/src/test/data/foo.txt");
This is all fine after migration, but how do you handle this situation during the migration, i.e. how do you make the tests pass both in Maven and in Bazel?
Is there any facility offered by the Bazel test runner to adapt the paths to legacy behaviour?
The current workaround I have is to check whether new File(".").getAbsoluteFile().getParentFile() is named __main__.
See TestFileUtil.java
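For illustration, a minimal sketch of such a helper along the lines of that workaround. The class and method names are made up, "moduleA" is only an example, and the __main__ check assumes Bazel's default workspace name, as in the workaround above.
import java.io.File;

// Hypothetical helper that resolves test data paths both under Maven
// (working directory = module directory) and under Bazel (working
// directory = runfiles tree rooted like the workspace).
public final class TestPaths {

    private TestPaths() {}

    public static File testFile(String moduleName, String relativePath) {
        File workingDir = new File(".").getAbsoluteFile().getParentFile();
        // Under Bazel, the runfiles directory of a workspace without an
        // explicit name is called "__main__"; paths are then workspace-relative.
        if ("__main__".equals(workingDir.getName())) {
            return new File(moduleName, relativePath);
        }
        // Under Maven/Ant the working directory is already the module directory.
        return new File(relativePath);
    }
}
A test would then call TestPaths.testFile("moduleA", "src/test/data/foo.txt") and get a path that resolves in both build systems.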
I am using the NDK to implement an image-processing method, and I want to add some extra modules, like xphoto and matlab, to OpenCV so that I can include the extra libraries in my cpp file.
I have already downloaded the extra modules. Those modules look like this:
├── CONTRIBUTING.md
├── doc
├── LICENSE
├── modules
├── README.md
└── samples
and the modules folder looks like this:
modules
├── aruco
├── bgsegm
├── bioinspired
├── ccalib
├── cnn_3dobj
├── contrib_world
├── cvv
├── datasets
├── dnn
├── dnns_easily_fooled
├── dpm
├── face
├── freetype
├── fuzzy
├── hdf
├── line_descriptor
├── matlab
├── optflow
├── phase_unwrapping
├── plot
├── README.md
├── reg
├── rgbd
├── saliency
├── sfm
├── stereo
├── structured_light
├── surface_matching
├── text
├── tracking
├── xfeatures2d
├── ximgproc
├── xobjdetect
└── xphoto
Since I am not very familiar with configuring the CMake settings, please give detailed steps.
I do not think that you can just add/include the contrib modules to OpenCV4Android's pre-built libraries. I would say that you should compile OpenCV+contrib from source. To do so, you have two options:
Option 1: Following these steps you can compile OpenCV+contrib for a target ABI.
Option 2: You can use OpenCV's Python cross-compilation script, which lets you compile OpenCV+contrib for all the available ABIs (armeabi-v7a, armeabi, arm64-v8a, x86_64, x86, mips64 and mips), with the following steps.
- Download/clone the OpenCV and opencv_contrib source code.
- Then run this command line to compile OpenCV for all the mentioned ABIs, passing the OpenCV contrib modules as an input parameter to the Python script:
python ../opencv/platforms/android/build_sdk.py --extra_modules_path ../opencv_contrib/modules <dir-to-store-result> ../opencv
If you have not added the paths to the Android SDK and NDK to your environment variables, so that the build system knows where they are, you can also pass them as input parameters to the Python script:
python ../opencv/platforms/android/build_sdk.py --extra_modules_path ../opencv_contrib/modules --ndk_path <your-path-to-ndk-top-level-folder> --sdk_path <your-path-to-sdk-top-level-folder> <dir-to-store-result> ../opencv
This way you will have a complete OpenCV+contrib build inside <dir-to-store-result>, with the same structure as the OpenCV4Android SDK.
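Once the build finishes, here is a minimal sketch of how the result might be consumed from an NDK CMake project; the exact OpenCV_DIR path and the my_native_lib target are assumptions about your setup.
# Point CMake at the OpenCVConfig.cmake inside the generated SDK
# (adjust the path to wherever you stored the build result).
set(OpenCV_DIR "<dir-to-store-result>/OpenCV-android-sdk/sdk/native/jni")
find_package(OpenCV REQUIRED)

# my_native_lib is a placeholder for your existing NDK library target.
target_include_directories(my_native_lib PRIVATE ${OpenCV_INCLUDE_DIRS})
target_link_libraries(my_native_lib ${OpenCV_LIBS})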