Bazel working directory differs from Maven. How to migrate? - bazel

I have an existing project which is built with Maven. It typically defines several modules. I want to migrate this project to Bazel.
In a first attempt, I use the following layout:
└── project
    ├── moduleA
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           ├── data
    │           └── java
    ├── moduleB
    │   ├── BUILD
    │   ├── pom.xml
    │   └── src
    │       ├── main
    │       │   └── java
    │       └── test
    │           └── java
    ├── pom.xml
    └── WORKSPACE
It was not too hard to make the project build with Bazel. My problem now is that the tests fail to find their test data.
Indeed, with Maven (or Ant), the working directory is the one that contains the pom.xml (or build.xml). So in that case, a test in moduleA can do:
new File("src/test/data/foo.txt");
However, when the test runs under Bazel, the working directory is the sandboxed runfiles tree, which is rooted at the workspace, i.e. the test must now open:
new File("moduleA/src/test/data/foo.txt");
This is all fine after the migration, but how do you handle this situation during the migration, i.e. how do you make the test pass both in Maven and in Bazel?
Is there any facility offered by the Bazel test runner to adapt the paths to the legacy behaviour?

The current workaround I have is to check whether new File(".").getAbsoluteFile().getParentFile() is named __main__.
See TestFileUtil.java
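A more robust alternative to inspecting the working directory's name is to branch on the environment variables that the Bazel test runner exports: TEST_SRCDIR (the runfiles root) and TEST_WORKSPACE, neither of which Maven sets. A minimal sketch of such a helper; the class and method names here are hypothetical, not part of either tool:

```java
import java.io.File;

public final class TestFileUtil {
    private TestFileUtil() {}

    // Hypothetical helper: resolves a test-data path so the same test works
    // under both Maven and Bazel. Bazel's test runner exports TEST_SRCDIR
    // (the runfiles root) and TEST_WORKSPACE; Maven sets neither, and its
    // working directory is already the module directory.
    public static File testFile(String moduleName, String relativePath) {
        String srcDir = System.getenv("TEST_SRCDIR");
        if (srcDir != null) {
            String workspace = System.getenv("TEST_WORKSPACE"); // e.g. "__main__"
            return new File(srcDir + "/" + workspace + "/" + moduleName + "/" + relativePath);
        }
        // Maven/Ant: the working directory is the module root.
        return new File(relativePath);
    }
}
```

A test in moduleA would then call TestFileUtil.testFile("moduleA", "src/test/data/foo.txt") and get a usable path under either build tool.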


Jenkins automatically add job in gui based on Jenkinsfile in repo

I have my pipelines in one bitbucket repo.
├── shared_libs
(...)
├── pipelines
│   ├── group1
│   │   ├── pipelineA
│   │   │   └── Jenkinsfile
│   │   └── pipelineB
│   │       └── Jenkinsfile
│   └── group2
│       ├── pipelineD
│       │   └── Jenkinsfile
│       └── pipelineC
│           └── Jenkinsfile
So far I have been manually creating jobs in the GUI, configuring all the parameters: repo, paths to the Jenkinsfile. Now I'm having a hard time finding a plugin or a way to automate it: some sort of auto-discovery that, after I set up repo access, will create a job based on the Jenkinsfile, put it in the correct place in the folder structure, and set up all the parameters. Kind of like with shared libraries, where I specify repo, path, and key, and it's ready to use.
As those pipelines are not just for build and deploy but for all sorts of admin/cleanup work, I don't want to run them automatically. I just want an element to be added in the GUI automatically once it's pushed to the repo/merged to master.
Yes, you can automate the process. It's difficult to give a complete solution without knowing all the bits and pieces, but here is how you can do it.
You can create a new Jenkins job that is triggered by a webhook from the Bitbucket repo. The webhook request will contain the latest changed files, etc. From the request, you can extract the files that were added or changed. Based on this information, you can use the following Groovy script to create the new job and move it to the desired folder.
def createJenkinsJob(def jobName, def folderName) {
    echo "Creating the job ${jobName}"
    // Here I'm using a shared library in the pipeline, so I have loaded my shared library here.
    // You can simply have the entire pipeline syntax here instead.
    def jobDSL = "@Library('ycr@master') _\n" +
                 "Pipeline()"
    def flowDefinition = new org.jenkinsci.plugins.workflow.cps.CpsFlowDefinition(jobDSL, true)
    def jenkins = Jenkins.instance
    def job = new org.jenkinsci.plugins.workflow.job.WorkflowJob(jenkins, jobName)
    job.definition = flowDefinition
    job.setConcurrentBuild(false)
    job.save()
    jenkins.reload()
    // After creating the job, move it to the target folder.
    if (folderName != null && folderName != "") {
        def folder = jenkins.getItemByFullName(folderName)
        Items.move(job, folder)
    }
}

how should I get bazel workspace set up?

I'm trying to make Bazel build a jar and an .so file for a Flutter project, but every time I type bazel build into the command prompt I keep getting:
ERROR: The 'build' command is only supported from within a workspace (below a directory having a WORKSPACE file).
See documentation at https://docs.bazel.build/versions/master/build-ref.html#workspace
I've read some documentation and it seems like the solution is to create a blank file called WORKSPACE, but I don't understand where this file is supposed to be stored. Here's a link to the documentation I read: https://docs.bazel.build/versions/2.0.0/tutorial/java.html
Thanks in advance!
The WORKSPACE file goes in the root of your workspace (source tree). It's the top directory for all your build packages and the start of the absolute paths you refer to with //. For instance, if you had a tree like this:
.
├── BUILD
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
You would place the WORKSPACE file at the root that all your packages share:
.
├── BUILD
├── WORKSPACE
├── a_source_file
├── package1
│   ├── BUILD
│   └── other_source
└── package2
    ├── BUILD
    └── another_source
And your build targets could then be for instance: //:a_build_target or //package2:another_target.
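To make those target labels concrete, each package's BUILD file declares the targets it contains. A minimal illustrative sketch using filegroup (the names come from the tree above; the rule kind is just an example, any rule works):

```starlark
# BUILD at the workspace root -- declares //:a_build_target
filegroup(
    name = "a_build_target",
    srcs = ["a_source_file"],
)

# package2/BUILD -- declares //package2:another_target
filegroup(
    name = "another_target",
    srcs = ["another_source"],
)
```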

How to run Jenkins Pipeline defined in a single repo navigating through folder?

Problem: I have a single repository where I have to walk through the repo to find a specific Jenkinsfile to run the pipeline. Note that I want to define the path to this Jenkinsfile explicitly, so I thought about having a jenkinsfilePath.yml in the root directory of the repo, reading the YAML, changing directory, and running the Jenkinsfile from that path. The folder structure is as follows:
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkinsfile
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml
I am having an issue running the Jenkinsfile inside feature_flagging from the root Jenkinsfile in testingSingleRepo. I was successful in changing directory to the folder feature_flagging by using dir. After googling a lot of similar questions, I came across the build step, but I could not make it work. Any suggestions/solutions?
To call a Jenkinsfile from a main pipeline, you can use load:
load 'feature_flagging/Jenkinsfile'
So after looking around, I have decided to go with another approach. I will have a master Jenkinsfile in the root with a generic pipeline setup. It will read the YAML file, change directory, and execute shell scripts inside a Jenkins/ folder accordingly. The folder will consist of generic scripts that map to the stages of the root Jenkins pipeline, such as setup.sh, test.sh, deploy.sh, etc. The folder structure will look something like this:
testingSingleRepo
├── Jenkinsfile
├── feature_flagging
│   ├── Jenkins/
│   ├── __init__.py
│   ├── src
│   └── tests
└── jenkinsfilePath.yml

How to add extra modules to OpenCV4Android?

I am using the NDK to implement an image processing method, and I want to add some extra modules, like xphoto and matlab, to OpenCV so that I can include the extra libraries in my cpp file.
I have already downloaded the extra modules. Those modules look like this:
├── CONTRIBUTING.md
├── doc
├── LICENSE
├── modules
├── README.md
└── samples
and the modules folder looks like this:
modules
├── aruco
├── bgsegm
├── bioinspired
├── ccalib
├── cnn_3dobj
├── contrib_world
├── cvv
├── datasets
├── dnn
├── dnns_easily_fooled
├── dpm
├── face
├── freetype
├── fuzzy
├── hdf
├── line_descriptor
├── matlab
├── optflow
├── phase_unwrapping
├── plot
├── README.md
├── reg
├── rgbd
├── saliency
├── sfm
├── stereo
├── structured_light
├── surface_matching
├── text
├── tracking
├── xfeatures2d
├── ximgproc
├── xobjdetect
└── xphoto
Since I am not quite familiar with configuring the CMake settings, please give detailed steps.
I do not think you can just add/include the contrib modules to OpenCV4Android's pre-built libraries. You should compile OpenCV+contrib from source. To do so, you have two options:
Option 1: Following these steps, you can compile OpenCV+contrib for a single target ABI.
Option 2: You can use OpenCV's Python cross-compilation script, which enables you to compile OpenCV+contrib for many available ABIs (armeabi-v7a, armeabi, arm64-v8a, x86_64, x86, mips64, and mips), with the following steps.
- Download/clone the OpenCV and extra contrib modules source code.
- Then run this command line to compile OpenCV for all the mentioned ABIs, passing the OpenCV contrib modules as an input parameter of the Python script.
python ../opencv/platforms/android/build_sdk.py --extra_modules_path ../opencv_contrib/modules <dir-to-store-result> ../opencv
If you have not added the paths to the Android SDK and NDK to your environment variables, so that the build system is aware of their location, you can also pass them as input parameters to the Python script.
python ../opencv/platforms/android/build_sdk.py --extra_modules_path ../opencv_contrib/modules --ndk_path <your-path-to-ndk-top-level-folder> --sdk_path <your-path-to-sdk-top-level-folder> <dir-to-store-result> ../opencv
This way you will have a complete OpenCV+contrib compilation inside <dir-to-store-result>, with the same structure as OpenCV4Android.

CMake and Latex

I am using UseLATEX.cmake to compile my project's documentation folder.
My project is organized as follows:
.
├── CMakeLists.txt
├── bin
├── build
├── cmake
│   ├── CMakeCompilerFlags.cmake
│   ├── CMakeDefaults.cmake
│   ├── MacroEnsureOutOfSourceBuilds.cmake
│   └── UseLATEX.cmake
├── doc
│   ├── Doc.tex
│   ├── CMakeLists.txt
│   └── images
│       ├── img1.png
│       ├── img2.png
│       ├── img3.png
│       └── img4.jpeg
............
└── src
    ├── CMakeLists.txt
    ├── file1.cpp
    ├── file2.cpp
    └── file3.cpp
My root-level CMake file is like this:
cmake_minimum_required(VERSION 2.8 FATAL_ERROR)
# Set path for CMake
set(CMAKE_MODULE_PATH
    "${CMAKE_SOURCE_DIR}/cmake"
    ${CMAKE_MODULE_PATH}
)
# Define project settings
project(proj)
set(APPLICATION_NAME ${PROJECT_NAME})
include(CMakeDefaults)
# Compile Program and Docs
include_directories(inc)
add_subdirectory(src)
add_subdirectory(doc)
And the CMakeLists file in the doc folder is:
include(UseLATEX)
ADD_LATEX_DOCUMENT(Doc.tex
    #BIBFILES mybib.bib
    IMAGE_DIRS images
    DEFAULT_PDF
)
Now I compile my project in the build folder. Is there any way I can copy back the Doc.pdf file created in the build/doc folder back to my original build folder?
Since ADD_LATEX_DOCUMENT adds a CMake target named pdf here, you should be able to make use of add_custom_command. Try adding the following to your /doc/CMakeLists.txt after the ADD_LATEX_DOCUMENT call:
add_custom_command(TARGET pdf POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy
        ${CMAKE_CURRENT_BINARY_DIR}/Doc.pdf
        ${CMAKE_BINARY_DIR}/Doc.pdf)
This custom command invokes the cmake executable (held in the variable ${CMAKE_COMMAND}) along with the -E copy arguments every time the pdf target is built.
