I am using the surefire reports mechanism to log SoapUI test results in Jenkins. My configuration:
<plugin>
  <groupId>com.smartbear.soapui</groupId>
  <artifactId>soapui-pro-maven-plugin</artifactId>
  <version>5.1.2</version>
  <executions>
    <execution>
      <phase>test</phase>
      <goals>
        <goal>test</goal>
      </goals>
      <configuration>
        <projectFile>${basedir}/i-services-bkt.xml</projectFile>
        <outputFolder>${basedir}/target/surefire-reports</outputFolder>
        <junitReport>true</junitReport>
        <junitHtmlReport>false</junitHtmlReport>
        <exportAll>true</exportAll>
        <printReport>true</printReport>
        <testFailIgnore>true</testFailIgnore>
      </configuration>
    </execution>
  </executions>
</plugin>
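With this configuration, running mvn test should execute the SoapUI project and write JUnit-style TEST-*.xml reports into target/surefire-reports, alongside the extra files produced by exportAll (this assumes the junitReport option follows the standard surefire naming; verify against your plugin version).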
After the job runs on Jenkins, I see two entries for each failure in the report. I have a post-build action that publishes the JUnit report.
The real issue is with Jenkins. Jenkins post-build actions have a "Publish JUnit Test Results Report" action, which takes the path of the report XMLs as input. If you use
target/surefire-reports/*.xml
the results will be reported twice. You need to pass **/TEST-*.xml to get the results reported just once.
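For example, the value of the post-build action's report path field would be something like (the exact field name varies by Jenkins version):
target/surefire-reports/TEST-*.xml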
I use surefire-testng to run JUnit tests, but I don't have any TestNG tests in the same run.
I extract the reports using:
**/surefire-reports/*.xml
You can configure surefire-testng to include or exclude any tests; perhaps you can exclude the JUnit tests by pattern and then have the JUnit provider run those (see the sketch below).
surefire-reports can also be configured to collect JUnit and/or TestNG reports.
Newer JUnit tests use annotations rather than naming conventions. TestNG seems to recognise this, but the default behaviour of surefire report collection seems to be to pick up tests only by naming convention.
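A minimal sketch of such an exclusion. The surefire includes/excludes mechanism is standard; the *JUnitTest naming pattern is hypothetical and would need to match your own conventions:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <excludes>
      <!-- hypothetical pattern: keep the TestNG provider from picking up the JUnit tests -->
      <exclude>**/*JUnitTest.java</exclude>
    </excludes>
  </configuration>
</plugin>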
So you need to pass **/TEST-*.xml for unit tests and **/IT-*.xml for integration tests to get the results reported just once. This also lets you control surefire and failsafe separately.
Related
I have several Java projects that run on several Jenkins jobs with the Gradle and JaCoCo plugins.
Let's say I have two jobs:
core
app
The core job pulls from SVN into its workspace, /Jenkins_workspace/core/, then builds and creates its JaCoCo exec file. It works fine and I can see the code coverage.
The app job pulls from SVN into its workspace, /Jenkins_workspace/app/, then builds and creates its JaCoCo exec file. It works fine and I can see the code coverage.
However, some tests inside the app job actually covered part of the core project, so the code coverage of the core job should be updated. I guess the core project should then have access to the JaCoCo exec file of the app job, but they are in two different workspaces.
Question: how can the core job access the JaCoCo exec file of the app job to update the core code coverage?
You can try using both the "destFile" and "append" configuration options.
If you use Maven:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>jacoco-ut</id>
      <phase>process-test-classes</phase>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <configuration>
        <destFile>/path/to/jacoco.exec</destFile>
        <append>true</append>
      </configuration>
    </execution>
  </executions>
</plugin>
or Gradle:
test {
    jacoco {
        append = true
        destinationFile = file("/path/to/jacoco.exec")
    }
}
Somehow the JaCoCo plugin does not allow searching for exec files outside of the current job's workspace.
So what I did is, for every app build, copy its JaCoCo exec file to the core workspace. The core job then has no problem finding the JaCoCo exec file of the app job.
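Once both exec files are in one place, the jacoco-maven-plugin's merge goal can combine them into a single file for reporting. A minimal sketch, where the coverage directory and merged file name are hypothetical:
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>merge-coverage</id>
      <phase>verify</phase>
      <goals>
        <goal>merge</goal>
      </goals>
      <configuration>
        <fileSets>
          <fileSet>
            <!-- hypothetical location where the copied exec files land -->
            <directory>${project.build.directory}/coverage</directory>
            <includes>
              <include>*.exec</include>
            </includes>
          </fileSet>
        </fileSets>
        <destFile>${project.build.directory}/jacoco-merged.exec</destFile>
      </configuration>
    </execution>
  </executions>
</plugin>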
I have a WAR project that also produces some extra artefacts that I want to deploy to my artifact repo, so I have configured executions under the deploy plugin to deploy each of the extra artefacts:
<execution>
  <id>deploy-exe</id>
  <phase>deploy</phase>
  <goals>
    <goal>deploy-file</goal>
  </goals>
  <configuration>
    <file>target/${project.build.finalName}.exe</file>
    <packaging>exe</packaging>
    <!-- pom, sources and javadoc already deployed by project. Release repo will fail redeployment -->
    <generatePom>false</generatePom>
    <sources/>
    <javadoc/>
  </configuration>
</execution>
But each execution also tries to deploy the javadoc and sources for the project, even though I have tried to explicitly switch them off for the execution. NB I do want javadoc and sources for the project, but I only want them deployed once (by the deploy mojo).
This isn't a big deal until release time, at which point my build breaks: it tries to deploy the javadoc and sources for the deploy mojo as well as for each of the deploy-file executions, to a release repo that doesn't allow redeploying artifacts.
Is it possible to configure the maven-deploy-plugin to not deploy sources & javadoc for the deploy-file mojo?
After switching from Maven 2 to Maven 3, I found that cobertura reports 0% test coverage. I asked a question about which versions of cobertura and surefire to use: What versions of cobertura and surefire plugins work together under maven3?
I have, however, investigated the problem more deeply and found out which part of the configuration was not working:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <version>${surefire.plugin.version}</version>
  <configuration>
    <forkMode>never</forkMode>
    <redirectTestOutputToFile>true</redirectTestOutputToFile>
    <argLine>-Xmx512m -XX:MaxPermSize=256m</argLine>
  </configuration>
</plugin>
After changing forkMode from never to once, the test coverage started being generated. So the problem was not a version incompatibility between the plugins, but a problem with cobertura's support for the various surefire fork modes under Maven 3.
So, my question is: is this a bug, or is the cobertura plugin designed in such a way that it won't work with forkMode=never?
Cobertura is designed to output the coverage results when the JVM exits.
<forkMode>never</forkMode> instructs Maven not to fork a JVM for running the tests but to re-use the current JVM.
In this case, the coverage results may not be output until Maven completes execution.
In Maven 2, I am not 100% certain, but I think the forked lifecycle (evilly) used by the cobertura plugin either caused a JVM fork for that lifecycle, or a classloader unload effectively had the same result.
Thus, in my opinion, it was a bug of Maven 2 that coverage happened to work with <forkMode>never</forkMode>.
Note: <forkMode>never</forkMode> is considered quite dangerous because System properties are not scoped per classloader, among other issues. <forkMode>once</forkMode> is generally the best option (unless you have tests that abuse memory: some versions of JUnit keep all the test class instances in memory until reporting at the end of the run, so if each test class holds on to heavy objects, GC will be unable to clear them out, as they stay live until the end of the test run. In such cases <forkMode>always</forkMode>, which forks one JVM per test class, will be required).
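For reference, a minimal sketch of the safer setting; this mirrors the question's snippet with only the forkMode changed:
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <!-- fork one JVM for the whole test run, so cobertura can flush its results on JVM exit -->
    <forkMode>once</forkMode>
  </configuration>
</plugin>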
I have a Grails app whose build runs under Maven. When running the build, the Grails integration-test phase fails, which fails the build. Since I don't have any integration tests, but do have unit tests, I'd like to skip just the Grails integration-test part of the build while still running my unit tests. Is there a way to do this?
I've looked at the following links, but they don't help in my case. Ideally, I want to be able to just run mvn package without any additional command-line parameters.
http://grails.1312388.n4.nabble.com/skipping-integration-testing-td1377155.html
http://jira.grails.org/browse/MAVEN-94
I've thought about breaking things up into multiple executions and using grails:exec (as mentioned in the nabble post) to execute the test phase, but I'm not sure how I'd set "args" in that case (a sketch of this idea follows the configuration below).
I'm using Maven 2.0.9 and Grails 1.3.7. I cannot change the version of Maven or Grails I'm using. My current plugin configuration:
<plugin>
  <groupId>org.grails</groupId>
  <artifactId>grails-maven-plugin</artifactId>
  <version>1.3.7</version>
  <extensions>true</extensions>
  <executions>
    <execution>
      <goals>
        <goal>init</goal>
        <goal>maven-clean</goal>
        <goal>validate</goal>
        <goal>config-directories</goal>
        <goal>maven-compile</goal>
        <goal>maven-test</goal>
        <goal>maven-war</goal>
        <goal>maven-functional-test</goal>
      </goals>
    </execution>
  </executions>
</plugin>
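One possible shape for the grails:exec idea, replacing the maven-test goal with an exec execution bound to the test phase. This is a sketch only: it assumes the exec goal accepts command and args parameters and that test-app -unit limits Grails 1.3.x to unit tests; verify both against your plugin version:
<execution>
  <id>unit-tests-only</id>
  <phase>test</phase>
  <goals>
    <goal>exec</goal>
  </goals>
  <configuration>
    <!-- assumption: command/args are the exec goal's parameters; -unit restricts the run to unit tests -->
    <command>test-app</command>
    <args>-unit</args>
  </configuration>
</execution>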
I have an Ant build that is currently being converted to Maven. However, the Ant build has 2 build targets: one that builds the entire app, and one that builds a JAR from some of those files (only a few). In Ant it's easy to have multiple build targets to handle this, but I'm trying to determine the best way to do the same in Maven.
I could split the subset of files into a second project and it will have its own POM. Then the first project could depend on this one. However, since the subset of files is so small (less than 10), it seems like it might be overkill to have an entirely new project for that.
Are there other ways to handle this?
You could do this with profiles...
If you really wanted to use two separate profiles and customize the JAR plugin to include and exclude patterns of class and package names, you could easily do this by putting something like this in your POM:
<profiles>
  <profile>
    <id>everything</id>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <configuration>
            <classifier>everything</classifier>
            <includes>
              <include>**/*</include>
            </includes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>only-library</id>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <configuration>
            <classifier>only-library</classifier>
            <excludes>
              <exclude>**/Main*</exclude>
            </excludes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Aside: If that seems like a lot of configuration, polyglot Maven's support for Groovy POMs is just about ready. It will cut the line count down considerably.
You would put this at the end of your pom.xml (within the project element), and it adds two profiles. The first profile, "everything", is really just there to demonstrate the configuration; it is unnecessary because it simply duplicates the behavior of the default JAR plugin jar goal execution. The second profile, "only-library", excludes any class in any package whose name starts with "Main". To invoke these profiles:
mvn package -Peverything
mvn package -Ponly-library
I tested this against the sample application that ships with Chapter 6 of Maven by Example, and running either of these commands will produce a JAR file in ${basedir}/target that has a classifier. Since the JAR plugin's jar goal is bound to the package phase in the default maven lifecycle, these two profiles are going to modify the configuration for this plugin.
Or, you could do this with two JAR plugin executions...
If you need to create two JARs without using profiles, you can bind the JAR plugin's jar goal to the package lifecycle phase multiple times and use a different configuration for each execution. If you configure two separate executions, each has an execution-specific configuration block, so you can supply a unique identifier and include/exclude patterns for each.
Here is the build element you would use to add both custom JARs to the lifecycle phase "package". Doing this on a project with packaging "jar" results in the jar goal being run three times: once as the default lifecycle binding, and then twice for the two custom, classified JARs.
<build>
  <plugins>
    <plugin>
      <artifactId>maven-jar-plugin</artifactId>
      <executions>
        <execution>
          <id>only-library</id>
          <goals><goal>jar</goal></goals>
          <phase>package</phase>
          <configuration>
            <classifier>only-library</classifier>
            <excludes>
              <exclude>**/Main*</exclude>
            </excludes>
          </configuration>
        </execution>
        <execution>
          <id>everything</id>
          <goals><goal>jar</goal></goals>
          <phase>package</phase>
          <configuration>
            <classifier>everything</classifier>
            <includes>
              <include>**/*</include>
            </includes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
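With this in place, running mvn package on a project with (hypothetically) artifactId app and version 1.0 would leave three JARs in target/: app-1.0.jar, app-1.0-only-library.jar, and app-1.0-everything.jar.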
If you are not talking about including a different set of classes in each artifact, you'll want to use Maven Assemblies. If you want to know the details of assemblies, there is a chapter listed at the end of this answer from Maven: The Complete Reference. Frankly, I don't think that this particular chapter is a good introductory reference; in fact, I've had numerous reports that this chapter is nearly unreadable (and we're working to fix that). If you are looking to use assemblies, I'd recommend the Maven Assembly Plugin's documentation. In the left-hand nav menu you'll see a list of sample assembly descriptors.
Disclaimer: (Please) don't do this. If you are creating two different JARs with two different sets of classes, I strongly recommend that you split the project up into two interdependent modules.
While you can do this with profiles, it is going to be easier for you to split the project into two (actually three). Longer term, you are going to face challenges as your application scales: you will be responsible for maintaining the manual list of classes and packages to be included in each of your classified JARs.
There is minimal overhead to having a simple parent project that references two separate modules. If you look at the free Maven by Example book, we show how to make the transition between a single-module and a multi-module project. Chapters 3-5 focus on single module projects, and Chapter 6 shows you how you would combine these single module components into a larger multi-module project.
For more information:
Your question involves the following topics; here are some links that will provide more details for each:
The Maven JAR Plugin: http://maven.apache.org/plugins/maven-jar-plugin/jar-mojo.html
Multi-module Maven Projects: Chapter 6 of Maven by Example and Section 3.6.2 of Maven: The Complete Reference.
The Maven Lifecycle (jar is bound to package if your packaging is "jar"): Section 3.5.2 of Maven by Example "Core Concepts" and Chapter 4 of Maven: The Complete Reference
Maven Assemblies: First, the Maven Assembly Plugin site, then Chapter 8 of Maven: The Complete Reference for some heavy (almost too heavy) details.
Your first thought is the correct one. Split the 2 pieces into 2 projects.
The Maven philosophy is that each project should build one and only one artifact (jar, war, whatever).
You could probably hack something together so that you only have one Maven project building 2 artifacts, but it would be a hack.
You can call Ant from Maven, so if you really want to do this, I suggest you start looking at the Maven AntRun plugin. The artifact id is "maven-antrun-plugin".
You've got 2 choices:
Profiles
Assemblies
If the subset is only a collection of resources, then I wouldn't make it a separate module.
If the project is always dependent on the subset which is packaged in a uniform way, then the subset is a good candidate to become a module.
If the subset is repackaged in multiple different "flavours", then I would define assemblies for each "flavour" and qualify the artifact names with a "classifier" (see Maven coordinates).
Finally, you could use profiles to determine which assemblies get produced; your default profile might only create the initial artifact "flavour" required during development.
A "full" profile may produce all "flavour" variants of the artifact for final deployment once development is complete.