I have a Grails app whose build runs under Maven. When running the build, the Grails integration-test phase fails, which fails the build. Since I don't have any integration tests, but do have unit tests, I'd like to skip just the Grails integration-test part of the build while still running my unit tests. Is there a way to do this?
I've looked at the following links, but they don't help in my case. Ideally, I want to be able to just run mvn package without any additional command-line parameters.
http://grails.1312388.n4.nabble.com/skipping-integration-testing-td1377155.html
http://jira.grails.org/browse/MAVEN-94
I've thought about breaking things up into multiple executions and using grails:exec (as mentioned in the nabble post) to execute the test phase, but I'm not sure how I'd set "args" in that case.
I'm using Maven 2.0.9 and Grails 1.3.7, and I cannot change either version. My current plugin configuration:
<plugin>
  <groupId>org.grails</groupId>
  <artifactId>grails-maven-plugin</artifactId>
  <version>1.3.7</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <goals>
        <goal>init</goal>
        <goal>maven-clean</goal>
        <goal>validate</goal>
        <goal>config-directories</goal>
        <goal>maven-compile</goal>
        <goal>maven-test</goal>
        <goal>maven-war</goal>
        <goal>maven-functional-test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
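My best guess at the grails:exec approach (replacing the maven-test goal above) would be something like the following, though I'm not sure this is how command and args are meant to be set:
<execution>
  <id>unit-tests-only</id>
  <phase>test</phase>
  <goals>
    <goal>exec</goal>
  </goals>
  <configuration>
    <!-- guess: "unit:" should limit test-app to the unit phase in Grails 1.3 -->
    <command>test-app</command>
    <args>unit:</args>
  </configuration>
</execution>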
Related
I have a Java application that uses Apache POI and other jars as dependencies, all managed by Maven, and it is deployed on a Tomcat server that has an old version of Apache POI and its dependencies. My problem is: when I test the application locally, it uses the jars inside the application (the new versions); when I run it in a Docker container, it uses the jars on the Tomcat server (the old versions). How can I make the application use its internal jars, not the external ones, when running in a Docker container? I already tried the maven-shade-plugin, but it's hard because there are many old jars on the server, and I can't shade everything.
I changed the plugin that generates the jar from maven-shade to maven-assembly:
<plugin>
  <artifactId>maven-assembly-plugin</artifactId>
  <version>3.3.0</version>
  <configuration>
    <descriptorRefs>
      <descriptorRef>jar-with-dependencies</descriptorRef>
    </descriptorRefs>
  </configuration>
  <executions>
    <execution>
      <id>make-assembly</id> <!-- this is used for inheritance merges -->
      <phase>package</phase> <!-- bind to the packaging phase -->
      <goals>
        <goal>single</goal>
      </goals>
    </execution>
  </executions>
</plugin>
This way the application has everything it needs inside the jar and doesn't use old classes from the server. It also doesn't need to shade all the jars.
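For what it's worth, with the execution bound to the package phase, a plain mvn package should also produce target/${project.build.finalName}-jar-with-dependencies.jar alongside the regular artifact (the assembly id is appended to the final name by default).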
I am using surefire reports plugin to log the SoapUI test results in Jenkins.
<plugin>
  <groupId>com.smartbear.soapui</groupId>
  <artifactId>soapui-pro-maven-plugin</artifactId>
  <version>5.1.2</version>
  <executions>
    <execution>
      <phase>test</phase>
      <goals>
        <goal>test</goal>
      </goals>
      <configuration>
        <projectFile>${basedir}/i-services-bkt.xml</projectFile>
        <outputFolder>${basedir}/target/surefire-reports</outputFolder>
        <junitReport>true</junitReport>
        <junitHtmlReport>false</junitHtmlReport>
        <exportAll>true</exportAll>
        <printReport>true</printReport>
        <testFailIgnore>true</testFailIgnore>
      </configuration>
    </execution>
  </executions>
</plugin>
After the job runs on Jenkins, I see two entries for each failure in the report. I have a post-build action that publishes the JUnit report.
The real issue is with Jenkins. Jenkins post-build actions have a "Publish JUnit Test Results Report" action, which takes the path of the report files as input. If you use
target/surefire-reports/*.xml
the results will be reported twice. You need to pass target/surefire-reports/TEST-*.xml to get results reported just once.
I use surefire-testng to run JUnit tests, but don't have any TestNG tests in the same run.
I extract the reports using:
**/surefire-reports/*.xml
You can configure surefire-testng to include or exclude any tests; perhaps you can exclude the JUnit tests by pattern and have the JUnit provider run those.
surefire-reports can also be configured to collect JUnit and/or TestNG reports.
Later JUnit versions use annotations rather than naming conventions. TestNG seems to recognise this, but the default behaviour of surefire-reports seems to be to collect data only by naming convention.
So, you need to pass TEST-*.xml for unit tests and IT-*.xml for integration tests to get results reported just once. This lets you control Surefire and Failsafe reports separately.
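To sketch the include/exclude idea mentioned above (the pattern below is made up; adjust it to your own naming convention):
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludes>
      <!-- hypothetical pattern for JUnit tests to be run by another provider -->
      <exclude>**/*JUnitTest.java</exclude>
    </excludes>
  </configuration>
</plugin>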
I have several Java projects that run on several Jenkins jobs with the Gradle and JaCoCo plugins.
Let's say I have two jobs:
core
app
The core job pulls from SVN to its workspace, /Jenkins_workspace/core/, then builds and creates its JaCoCo exec file. It works fine and I can see the code coverage.
The app job pulls from SVN to its workspace, /Jenkins_workspace/app/, then builds and creates its JaCoCo exec file. It works fine and I can see the code coverage.
However, inside the app job there were some tests that actually covered part of the core project, so the code coverage of the core job should be updated. I guess the core project would then need access to the JaCoCo exec file of the app job, but they are in two different workspaces.
Question: How can the core job access the JaCoCo exec file of the app job to update the core code coverage?
You can try using both the "destFile" and "append" configuration options.
If you use maven:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>jacoco-ut</id>
      <phase>process-test-classes</phase>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>/path/to/jacoco.exec</destFile>
        <append>true</append>
      </configuration>
    </execution>
  </executions>
</plugin>
or gradle:
test {
    jacoco {
        append = true
        destinationFile = file("/path/to/jacoco.exec")
    }
}
Somehow the JaCoCo plugin does not allow searching for exec files outside of the current job's workspace.
So what I did is: for every app build, its JaCoCo exec file is copied to the core workspace. The core job then has no problem finding the JaCoCo exec file of the app job.
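On the Maven side, once the app job's exec file has been copied into the core workspace, something like the jacoco-maven-plugin merge goal can combine the exec files before the report is generated (the paths below are examples, not the poster's actual layout):
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>merge-exec-files</id>
      <phase>verify</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <fileSets>
          <fileSet>
            <!-- example: pick up both the core and the copied app exec files -->
            <directory>${project.build.directory}</directory>
            <includes>
              <include>*.exec</include>
            </includes>
          </fileSet>
        </fileSets>
        <destFile>${project.build.directory}/merged.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>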
I have a WAR project that also produces some extra artefacts that I want to deploy to my artifact repo, so I have configured executions under the deploy plugin to deploy each of the extra artefacts:
<execution>
  <id>deploy-exe</id>
  <phase>deploy</phase>
  <goals>
    <goal>deploy-file</goal>
  </goals>
  <configuration>
    <file>target/${project.build.finalName}.exe</file>
    <packaging>exe</packaging>
    <!-- pom, sources and javadoc already deployed by project. Release repo will fail redeployment -->
    <generatePom>false</generatePom>
    <sources/>
    <javadoc/>
  </configuration>
</execution>
But each execution tries to deploy the javadoc and sources for the project, even though I have tried to explicitly switch them off for the execution. NB I want javadoc and sources for the project, but I only want them deployed once (by the deploy mojo).
This isn't a big deal until it comes to release time at which point my build breaks because it tries to deploy the javadoc and source for the deploy mojo as well as each of the deploy-file mojo executions to a release repo that doesn't allow redeploy of artifacts.
Is it possible to configure the maven-deploy-plugin to not deploy source & javadoc for the deploy-file mojo?
I am currently working on a multi-module project with the following structure.
root
-module A
-module B
What I want to do is execute module B (the main method of the module) after module B has been compiled (module B depends on module A). But I need to do this with a customized command, e.g.:
mvn runb
I know that the exec-maven-plugin can be used to run a project using Maven. What I don't understand is how to create a custom command (phase) in Maven. Is there any way to achieve this without writing a Maven plugin?
I referred to various guides such as https://community.jboss.org/wiki/CreatingACustomLifecycleInMaven trying to achieve this, but they require creating components.xml and lifecycle.xml files under src/resources/META-INF. I don't understand how to apply that file structure to my project, since it is a multi-module project where each module has separate src directories.
(I'm using maven 3)
You cannot create a custom lifecycle without writing a Maven plugin.
And without hacking Maven itself, at least as of Maven 3.0.5, it is not possible to add a custom phase to Maven through a plugin. The phases are loaded up by the core of Maven from its configuration before any plugins are processed.
If you really have your heart set on using one command to do what you want, writing a plugin is the only way. With some pluginGroup mappings in your settings.xml, this can be made simpler (you can specify mvn my:plugin rather than mvn com.mygroupid:plugin).
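The pluginGroup mapping mentioned above lives in settings.xml and looks like this (reusing the example group id from above):
<settings>
  <pluginGroups>
    <pluginGroup>com.mygroupid</pluginGroup>
  </pluginGroups>
</settings>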
But if you are willing to have a slightly more verbose syntax on the command line, what you want could be achieved through profiles and the exec maven plugin.
Add a profile to module B that uses the exec plugin to run itself.
Something like this:
<project>
  ...
  <profiles>
    <!-- Profile that runs module B's main class via the exec plugin -->
    <profile>
      <id>execb</id>
      <build>
        <plugins>
          <plugin>
            <groupId>org.codehaus.mojo</groupId>
            <artifactId>exec-maven-plugin</artifactId>
            <version>1.2.1</version>
            <executions>
              <execution>
                <id>runb</id>
                <goals>
                  <goal>java</goal>
                </goals>
                <phase>verify</phase> <!-- anything after the package phase -->
                <configuration>
                  <!-- Exec plugin configuration goes here -->
                </configuration>
              </execution>
            </executions>
          </plugin>
        </plugins>
      </build>
    </profile>
  </profiles>
</project>
You'll need to configure the exec plugin depending on how you run your JAR, more info here.
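For the common case of running a main class, the empty configuration block above might be filled in like this (the class name is a placeholder for module B's entry point):
<configuration>
  <!-- placeholder: module B's main class -->
  <mainClass>com.mygroupid.moduleb.Main</mainClass>
</configuration>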
What this does is run the exec plugin as part of module B's build, but only if the execb profile is activated.
Now, when you just want to build your project (without any exec), build like normal (e.g. mvn install).
When you want to build and run, use the command line:
mvn install -Pexecb
and it will do the exec as part of the build.