How to run a specific goal on a multi-module project? - maven-3

I have multiple modules, and I would like to run the clean install goals against a set of modules, followed by clean tomcat7:deploy against one specific module. This has to be done in an incremental manner.
I know this can be done with two command-line invocations, like this:
mvn -pl moduleA,moduleB,moduleC clean install
mvn -pl moduleD clean tomcat7:deploy
Is it possible to consolidate the two statements above into one?
Or, since in my case moduleD depends on moduleA, moduleB and moduleC,
I would rather use the -am (also-make) flag:
mvn -am -pl moduleD clean tomcat7:deploy
but tomcat7:deploy is not applicable to the upstream modules A, B, and C.
How can I accomplish this?
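One pragmatic way to get a single command is to chain the two invocations at the shell level; this is a sketch, not a single Maven reactor run, and && makes the deploy run only if the install succeeds:

mvn -pl moduleA,moduleB,moduleC clean install && mvn -pl moduleD clean tomcat7:deploy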

Related

Access hash of input files in genrule to pass to command in Bazel

I am looking for a way to run a command in a genrule with a hash of the input files.
I want to start replacing Maven with Bazel in my projects. It is a multi-repo setup that builds a selected product from source across different repositories.
ProjectA
- moduleA1
- moduleA2
ProjectB
- moduleB1
- moduleB2
Maven builds can be executed like this:
cd ProjectA
mvn versions:set -DnewVersion=A_HASH
mvn clean install
cd ../ProjectB
mvn versions:set -DnewVersion=B_HASH
mvn clean install -DprojectA-version=A_HASH
I use versions:set so that I do not rely on snapshots and get reliable builds even locally. I could use a hash from Git, but that is not enough because 1) I want the build to work locally without committing changes, and 2) B_HASH should change whenever ProjectA changes.
Bazel will let me re-run Maven only when files change, but that is not enough to integrate it with the Maven repository.
Is there a way to implement a genrule calling "mvn versions:set -DnewVersion=HASH" with a hash of the input files? Bazel calculates hashes of the input files, but I cannot find a way to expose this hash to the genrule.
With Bazel, you can forget about the hacky hash you used with Maven. Bazel maintains hashes for you, and will recompile everything that is needed.
That's the reliable part of {reliable, fast}: Choose two
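For reference, Bazel does not expose its internal input digests to a genrule's cmd. If a content hash is still wanted for the local Maven flow described above, it can be computed directly from the files; a minimal shell sketch (the file patterns and paths here are assumptions, and sha256sum is GNU coreutils):

cd ProjectA
# hash the concatenated content of the files that should drive the version
A_HASH=$(find . -name pom.xml -o -name '*.java' | sort | xargs cat | sha256sum | cut -d' ' -f1)
mvn versions:set -DnewVersion="$A_HASH"
mvn clean install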

Maven build with multiple exec plugin executions, ant tasks

I have to perform following steps in my maven build, in the specific order mentioned below:
1. exec-maven-plugin
2. maven-antrun-plugin
3. exec-maven-plugin
4. maven-antrun-plugin
5. maven-remote-resources-plugin, jaxb2-maven-plugin
6. maven-javadoc-plugin
7. exec-maven-plugin
I have to use JDK 6, so I am using Maven 3.2.1.
In the pom file, I have defined 5 different profiles for #1, #2, #3, #4, #7 above (profile ids: p1, p2, p3, p4, p5).
I am building my project using multiple commands:
mvn exec:exec -Dp1 (for #1 above)
mvn antrun:run -Dp2 exec:exec -Dp3 (for #2 and #3 above)
mvn antrun:run -Dp4 (for #4 above)
mvn clean install (for #5, #6 above and compile classes)
mvn exec:exec -Dp5 (for #7 above)
The build works okay with multiple commands, but is it possible to execute all the steps using one command, i.e. mvn clean install?
As I understand it, it is not possible to have multiple executions of the exec-maven-plugin run in a non-consecutive order, hence I used profiles and then execute each step via its profile id. Reference:
Maven maven-exec-plugin multiple execution configurations
The “profile trick” mentioned elsewhere is only needed when performing explicit goal invocation from the command line.
From what I gather, however, you would rather have your goals executed as part of a normal mvn clean install. In that case, you are in luck: simply bind each goal to an appropriate phase of the default lifecycle. Depending on what your steps do, you may, e.g., bind the first <execution> of exec:exec (step 1) to the generate-sources phase. If the first <execution> of antrun:run (step 2) is then bound to, say, the process-sources phase, it will be called after step 1, because mvn install executes the goals bound to every phase up to install in order.
Building a project like this with a single mvn install is what Maven was designed to do; having to call mvn five times to build one project is definitely not the Maven Way.
That being said, you may run out of phases if all your steps logically belong to, say, the package phase. In that case, the steps are executed in the order in which their <execution> elements are listed in the pom.xml.
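For illustration, a pom sketch of the first two bindings could look like the following (the chosen phases and the empty configuration bodies are assumptions; fill in whatever each step actually runs):

<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>step-1</id>
      <!-- phase chosen as an example -->
      <phase>generate-sources</phase>
      <goals>
        <goal>exec</goal>
      </goals>
      <configuration>
        <!-- executable and arguments for step 1 -->
      </configuration>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-antrun-plugin</artifactId>
  <executions>
    <execution>
      <id>step-2</id>
      <!-- phase chosen as an example -->
      <phase>process-sources</phase>
      <goals>
        <goal>run</goal>
      </goals>
      <configuration>
        <!-- Ant tasks for step 2 -->
      </configuration>
    </execution>
  </executions>
</plugin>

The remaining steps get their own <execution> entries bound to later phases, and the whole sequence then runs from a single mvn clean install.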

Bazel - How to recursively glob deleted_packages to ignore Maven outputs?

I have a multi-module project which I'm migrating from Maven to Bazel. During this migration people will need to be able to work with both build systems.
After an mvn clean install, Maven copies some of the BUILD files into the target folders.
When I later try to run bazel build //..., it thinks the BUILD files under the various target folders are valid packages and fails due to some mismatch.
I've seen deleted_packages, but AFAICT it requires me to specify the list of folders to "delete", and I can't do that for 200+ modules.
I'm looking for the ability to say bazel build //... --deleted_packages=**/target.
Is this supported? (my experimentation says it's not but I might be wrong). If it's not supported is there an existing hack for it?
Can you use your shell to find the list of packages to ignore?
deleted=$(find . -name target -type d)
bazel build //... --deleted_packages="$deleted"
Laurent's answer gave me the lead, but Bazel didn't accept relative paths and required that I add both the classes and test-classes folders under target in order to delete the package, so here is the complete solution:
#!/bin/bash
# find all the target folders under the current working dir
target_folders=$(find . -name target -type d)
# find the repo root (currently assuming it's git based)
repo_root=$(git rev-parse --show-toplevel)
repo_root_length=${#repo_root}
# the current bazel package prefix is the PWD minus the repo root, plus a leading slash
current_bazel_package="/${PWD:repo_root_length}"
deleted_packages=""
for target in $target_folders
do
    # canonicalize the package path
    full_package_path="$current_bazel_package${target:1}"
    classes_full="${full_package_path}/classes"
    test_classes_full="${full_package_path}/test-classes"
    deleted_packages="$deleted_packages,$classes_full,$test_classes_full"
done
# remove the leading comma and call bazel-real with the other args
bazel-real "$@" --deleted_packages=${deleted_packages:1}
This script was checked in under tools/bazel which is why it calls bazel-real at the end.
I'm sorry, I don't think this is supported. Some brainstorming:
- Is it an option to point the Maven outputs somewhere else?
- Is it an option not to use //... but explicit target(s)? (See the sketch below.)
- Maybe just remove the bad BUILD files before running bazel?
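A minimal sketch of the explicit-targets idea (the labels below are placeholders; use whatever your BUILD files actually define):

# build named targets instead of the recursive wildcard
bazel build //moduleA:moduleA //moduleB:moduleB

Since only the packages needed for the requested labels are loaded, the stray BUILD files under target/ are never evaluated.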

rebar dependency without repository

I have a rebar project with dependencies, so after a clean, when I run rebar compile, it downloads the dependencies (for git it apparently runs git clone), runs configure for them and then compiles everything. Can I somehow make those dependencies local? I mean, skip downloading them and run configure there directly?
Try using the rsync option and specify the file path:
{rsync, "file:///foo/bar/baz"}
That is the shape of it, as far as I remember.

How to execute package for one submodule only on Jenkins?

I have an sbt project with 4 modules: module-a, module-b, module-c, module-d.
Each module can be packaged as a WAR. I want to set up a deployment on Jenkins that builds only one of the 4 modules and deploys it to a container.
In detail, I want to have 4 Jenkins jobs - job-a, job-b, job-c, job-d - each building only its own module (a to d).
For now, I am using clean update test package as the command for the Jenkins sbt build, but this packages all 4 modules, which is not necessary.
I already tried project -module-a clean update test package but with no luck.
You may also like to execute project-scoped clean and test tasks as follows:
sbt module-a/clean module-a/test
This solution is slightly shorter and makes it clearer which project the subsequent commands apply to.
You don't need to execute the update task, since it is executed implicitly by test (as inspect tree test shows).
There's a way to make it cleaner with an alias. Use the following in the build.sbt:
addCommandAlias("jenkinsJob4ModuleA", "; module-a/clean; module-a/test")
With the alias, execute jenkinsJob4ModuleA to have the same effect as the above solution.
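With the alias defined, the Jenkins build step reduces to a single command (a usage sketch):

sbt jenkinsJob4ModuleA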
Quote the argument to project, i.e. project module-a, and don't use a dash before the name of the submodule.
The entire command line for the Jenkins job would then be as follows:
./sbt "project module-a" clean update test
