Nexus Repository Manager's Remove Snapshots scheduled task is not cleaning out timestamped artifacts' associated GPG signature files - maven-3

I am running a dockerized Nexus Repository Manager v2.13.0-01. I have artifacts in the Snapshot repository that I want to remove using the Remove Snapshots scheduled task. My parameters for this scheduled task are as follows.
Repository/Group : Snapshots (Repo)
Minimum snapshot count: 1
Snapshot retention (days): 1
Remove if released : (unchecked)
Grace period after release (days): 1
Delete immediately: (checked)
When I run this task, I am expecting at least 1 snapshot to be kept and all other snapshots older than 1 day to be removed. What I am noticing on the Browse Storage tab, however, is that all the .jar and .pom files are being removed, including the associated .md5 and .sha1 files. For example, the following files are removed.
my-artifact-0.0.1-20160705-020817-5-javadoc.jar
my-artifact-0.0.1-20160705-020817-5-javadoc.jar.md5
my-artifact-0.0.1-20160705-020817-5-javadoc.jar.sha1
my-artifact-0.0.1-20160705-020817-5-sources.jar
my-artifact-0.0.1-20160705-020817-5-sources.jar.md5
my-artifact-0.0.1-20160705-020817-5-sources.jar.sha1
my-artifact-0.0.1-20160705-020817-5.pom
my-artifact-0.0.1-20160705-020817-5.pom.md5
my-artifact-0.0.1-20160705-020817-5.pom.sha1
my-artifact-0.0.1-20160705-020817-5.jar
my-artifact-0.0.1-20160705-020817-5.jar.md5
my-artifact-0.0.1-20160705-020817-5.jar.sha1
However, the associated .asc signature files and their .asc.md5 and .asc.sha1 checksums are NOT being removed. For example,
my-artifact-0.0.1-20160705-020817-5.jar.asc
my-artifact-0.0.1-20160705-020817-5.jar.asc.md5
my-artifact-0.0.1-20160705-020817-5.jar.asc.sha1
The following are the two Maven plugins in my pom.xml that I use to publish to my SNAPSHOT repository.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-gpg-plugin</artifactId>
  <version>1.6</version>
  <executions>
    <execution>
      <id>sign-artifacts</id>
      <phase>verify</phase>
      <goals>
        <goal>sign</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.sonatype.plugins</groupId>
  <artifactId>nexus-staging-maven-plugin</artifactId>
  <version>1.6.7</version>
  <extensions>true</extensions>
  <configuration>
    <serverId>mycompanynexus</serverId>
    <nexusUrl>http://nexus.mycompanynexus.io/</nexusUrl>
    <autoReleaseAfterClose>true</autoReleaseAfterClose>
  </configuration>
</plugin>
My distribution management section in the pom.xml looks like the following.
<distributionManagement>
  <snapshotRepository>
    <id>mycompanynexus</id>
    <url>http://nexus.mycompanynexus.io/content/repositories/snapshots/</url>
  </snapshotRepository>
  <repository>
    <id>mycompanynexus</id>
    <url>http://nexus.mycompanynexus.io/content/repositories/releases/</url>
  </repository>
</distributionManagement>
Not shown is my settings.xml where I supply the credentials for publishing to these repositories.
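It contains a server entry along these lines (a minimal sketch; the credentials shown here are placeholders):
<settings>
  <servers>
    <server>
      <!-- id must match the <id> in distributionManagement and the nexus-staging serverId -->
      <id>mycompanynexus</id>
      <!-- placeholder credentials -->
      <username>deployment</username>
      <password>deployment123</password>
    </server>
  </servers>
</settings>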
When I deploy, I simply type in mvn clean deploy with Maven v3.3.9.
As I was searching for solutions, I came across this blog post http://blog.sonatype.com/2010/01/how-to-generate-pgp-signatures-with-maven/; however, I don't know if I agree with not signing my SNAPSHOT artifacts (if I didn't sign them, the GPG signatures and checksums would not be produced in the first place and I wouldn't have to worry about deleting them with the scheduled task). Moreover, OSSRH's guidelines illustrate signing SNAPSHOT artifacts. Maybe it is standard practice NOT to sign SNAPSHOT artifacts?
Any help is appreciated.

For internal usage of Nexus Repository Manager it is probably not standard practice to sign artifacts with GPG - neither releases nor snapshots. However, for distribution to the Central Repository via OSSRH it is pretty common, although probably still not standard.
Typically the GPG plugin usage is part of a release profile, and that profile is often not used for snapshot deployments. So depending on your internal needs, you should be okay either not using GPG at all, or just not using it for snapshot builds.
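For example, a minimal sketch of moving the GPG plugin into a release profile, so that a plain mvn clean deploy of a snapshot does not produce .asc files while mvn clean deploy -P release still signs:
<profiles>
  <profile>
    <id>release</id>
    <build>
      <plugins>
        <!-- same plugin configuration as before, just scoped to the release profile -->
        <plugin>
          <groupId>org.apache.maven.plugins</groupId>
          <artifactId>maven-gpg-plugin</artifactId>
          <version>1.6</version>
          <executions>
            <execution>
              <id>sign-artifacts</id>
              <phase>verify</phase>
              <goals>
                <goal>sign</goal>
              </goals>
            </execution>
          </executions>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>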
On the other hand, the snapshot removal scheduled task should take care of that deletion. There were some changes with regard to performance and behaviour in the last few releases, though. I just tested this, and it turns out there is a regression in 2.11+. We created an issue at https://issues.sonatype.org/browse/NEXUS-10460 - please follow that for updates. Hopefully we will include a fix in the next release (2.14).
Update 2016-07-13: A patch is now available and a fixed jar is attached to the linked issue.

Related

Q: How can I save an artifact into a Nexus Repository using a Groovy pipeline?

My question is about saving artifacts into a repository. Specifically, I am trying to upload artifacts and release versions into the Nexus Repository after the execution of a build pipeline for a Maven project (through Jenkins).
The only way I want to do this is by using a pipeline written in Groovy, so that it integrates with Jenkins.
Note: I want the artifact version number to always be the same, and the deployment to happen dynamically (not manually).
Is there a command, or code in general, that enables me to do that?
You are on the wrong level; this should happen in Maven. In pom.xml you need the following (more here):
<distributionManagement>
  <snapshotRepository>
    <id>nexus-snapshots</id>
    <url>http://localhost:8081/nexus/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>
and then in the plugins section:
<plugin>
  <artifactId>maven-deploy-plugin</artifactId>
  <version>2.8.2</version>
  <executions>
    <execution>
      <id>default-deploy</id>
      <phase>deploy</phase>
      <goals>
        <goal>deploy</goal>
      </goals>
    </execution>
  </executions>
</plugin>
and you should be able to just do mvn clean deploy from your pipeline.
EDIT
There is another way, with the Nexus Artifact Uploader plugin:
nexusArtifactUploader {
  nexusVersion('nexus2')
  protocol('http')
  nexusUrl('localhost:8080/nexus')
  groupId('sp.sd')
  version("2.4.${env.BUILD_NUMBER}")
  repository('NexusArtifactUploader')
  credentialsId('44620c50-1589-4617-a677-7563985e46e1')
  artifact {
    artifactId('nexus-artifact-uploader')
    type('jar')
    classifier('debug')
    file('nexus-artifact-uploader.jar')
  }
  artifact {
    artifactId('nexus-artifact-uploader')
    type('hpi')
    classifier('debug')
    file('nexus-artifact-uploader.hpi')
  }
}
As @hakamairi already said, it is not recommended to re-upload artifacts with the same version to a Nexus repository; Maven is built around the idea that an artifact's GAV always corresponds to a unique artifact.
However, if you want to allow re-deployment, you need to set the deployment policy of the release repository to "allow redeploy"; then you can redeploy the same version. You cannot do that without allowing it on the repository side.
And for deploying to a Nexus repo, you can use either the Nexus Platform Plugin or the Nexus Artifact Uploader.
ADDITIONAL SOLUTION THAT ALSO WORKS
I executed it manually and exported the result of the Nexus call. The result was the following command, which needs to be inserted inside the Jenkins pipeline as Groovy code:
nexusPublisher nexusInstanceId: 'nexus', nexusRepositoryId: 'maven-play-ground', packages: [[$class: 'MavenPackage', mavenAssetList: [[classifier: '', extension: '', filePath: '**PATH_NAME_OF_THE_ARTIFACT**.jar']], mavenCoordinate: [artifactId: '**YOUR_CUSTOM_ARTIFACT_ID**', groupId: 'maven-play-ground', packaging: 'jar', version: '1.0']]], tagName: '**NAME_OF_THE_FILE_IN_THE_REPOSITORY**'
In the filePath field we need to insert the path and the name of the artifact .jar file.
In the artifactId field we need to insert the custom artifact id (in this case, the one for my artifact).
In the tagName field we need to insert the custom name of the directory inside the Nexus Repository.
This is a solution that runs automatically, without manual changes and edits. Once we have created the directory in the Nexus repository, this executes without any issue and without the need to change the version number.
Note: we also need to enable the re-deploy feature in the Nexus Repository settings.

travis-ci failing to deploy to sonatype

I've started using travis-ci to automate my builds. I have several open source projects and they all deploy to nexus sonatype from where they go to maven central. They're all Java projects that use Maven to build and github as a repo.
I've been doing this manually for years and I have appropriate keys and logins and my pom is compatible etc.
Implementing the first one was easy enough, it is a single module project and it builds and deploys just fine. Then I did a second one, a multi module project and got that working in much the same way. My third, however, is baffling me.
The maven build on this thing is a bit tricky but it does run fine locally and I even have it running the actual build on travis successfully. But the deploy doesn't work.
The problem is that when it tries to connect to nexus sonatype I get an authorisation error:
Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project madura-bundles:
Failed to deploy artifacts: Could not transfer artifact nz.co.senanque:madura-bundles:pom:4.5.6 from/to sonatype-nexus-staging (https://oss.sonatype.org/service/local/staging/deploy/maven2/):
Failed to transfer file: https://oss.sonatype.org/service/local/staging/deploy/maven2/nz/co/senanque/madura-bundles/4.5.6/madura-bundles-4.5.6.pom.
Return code is: 401, ReasonPhrase: Unauthorized.
It looks like I have not set up my Sonatype credentials correctly, but I have set them up the same way as I did for the other two projects. Specifically, I go into Nexus Sonatype, get my Access User Token, and add the values to my environment as SONATYPE_USERNAME and SONATYPE_PASSWORD (I deleted both of these and re-entered them in case it was a typo). I also add references to those in my local Maven settings file:
...
<server>
  <id>ossrh</id>
  <username>${env.SONATYPE_USERNAME}</username>
  <password>${env.SONATYPE_PASSWORD}</password>
</server>
...
The local Maven settings file is a file in my project, and the Maven commands in .travis.yml refer to it. The .travis.yml file has a deploy section identical to the other two (working) projects, except that I have been adding extra bits to try to make it work; none of the differences there look relevant. The working deploys look like this:
deploy:
  provider: script
  script: "mvn versions:set -DnewVersion=${TRAVIS_TAG} && mvn clean deploy -B -U -P release --settings travis/settings.xml"
  on:
    tags: true
so this is only going to kick off if the repo has been tagged and it uses the tag as the version number. In the other projects this works fine, but not in the one I'm trying to get working. The tag does trigger the deploy as it should, but the deploy fails.
Does anyone know why I get the deploy on one project but not another? Thanks for any help.
Okay, I figured it out. The problem is that the parent pom of the failing project does not have a release profile, while the parent pom of the working project does have one. The release profile in both cases looks like this:
<profile>
  <id>release</id>
  <build>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-gpg-plugin</artifactId>
        <version>1.5</version>
        <executions>
          <execution>
            <id>sign-artifacts</id>
            <phase>verify</phase>
            <goals>
              <goal>sign</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.sonatype.plugins</groupId>
        <artifactId>nexus-staging-maven-plugin</artifactId>
        <version>1.6.3</version>
        <extensions>true</extensions>
        <configuration>
          <serverId>ossrh</serverId>
          <nexusUrl>https://oss.sonatype.org/</nexusUrl>
          <autoReleaseAfterClose>true</autoReleaseAfterClose>
        </configuration>
      </plugin>
    </plugins>
  </build>
</profile>
The profile is needed to sign the generated artifacts (jar files, javadoc files, etc.) with the GPG plugin and to deploy them to Nexus. The deploy to Nexus is still attempted without it, but because there is no reference to the serverId ossrh, it doesn't pick up the credentials from the Maven settings file, and therefore I get an authorization error.
The release profile needs to be on the parent project and all the module projects. I had added it to the modules but forgot the parent.

rpm-maven-plugin how to control the name under which rpm is stored in m2 repository?

Our organization has a convention for naming rpms. Typically, the rpm will have a shorter base name than the Maven project. There is also a convention around how releases are named. So we want a name like
${shortname}-${project.version}-${release}.noarch.rpm.
I want to build such rpms using the rpm-maven-plugin rather than older nmake technology.
And this is easily accomplished using the plugin's parameters. The rpm generated in the target directory has the desired name.
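A sketch of the naming-related part of that configuration (the rpm.name and rpm.release properties here stand in for our short-name and release convention):
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>rpm-maven-plugin</artifactId>
  <configuration>
    <!-- override the default artifactId-based naming of the generated rpm -->
    <name>${rpm.name}</name>
    <release>${rpm.release}</release>
  </configuration>
</plugin>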
However, when mvn install installs this rpm into the maven repository, it insists on storing it the "maven way": ${project.artifactId}-${project.version}.rpm
I want the rpm stored in the standard maven repository directory using the name that is initially created on package.
How may I accomplish this?
Update:
I tried using the maven-install-plugin (install-file goal) and did not get the results I was after, but this was partly because I wasn't invoking it properly - it wasn't being invoked at all. If you define an install-file goal, it must be explicitly tied to the install phase. Doing so, i.e. adding <phase>install</phase> to the execution, at least invoked the install I wanted, but it still did not allow me to name the rpm as desired.
According to Karl Heinz Marbaise, a committer on the Apache Maven Project, what I am trying to do is impossible, and should not be attempted.
However, I need what I need, and have found a compromise that gives me most of that. The only thing I had to sacrifice was the assumption that the repository RPM must live in the same repository directory as the rest of the project. This is a very minor sacrifice. Once I make it, I can store the rpm, named as I want it to be, in a directory of the Maven repository named with the short name.
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-install-plugin</artifactId>
  <version>2.5.2</version>
  <executions>
    <execution>
      <id>install-rpm</id>
      <goals>
        <goal>install-file</goal>
      </goals>
      <phase>install</phase>
      <configuration>
        <file>${project.build.directory}/rpm/${rpm.name}/RPMS/noarch/${rpm.name}-${project.version}-${rpm.release}.noarch.rpm</file>
        <groupId>${project.groupId}.rpms</groupId>
        <artifactId>${rpm.name}</artifactId>
        <version>${project.version}-${rpm.release}</version>
        <classifier>noarch</classifier>
        <packaging>rpm</packaging>
      </configuration>
    </execution>
  </executions>
</plugin>
Using a groupId of ${project.groupId}.rpms rather than just ${project.groupId} allows all rpms built this way to live outside of the main branch of the repository, which is useful in our situation.
Using a version of ${project.version}-${rpm.release} rather than just ${project.version} allows the release to be incorporated into the name.
And the noarch classifier gets that into the name as well.

How to stop maven-deploy-plugin:deploy-file deploying source?

I have a WAR project that also produces some extra artefacts that I want to deploy to my artifact repo, so I have configured executions under the deploy plugin to deploy each of the extra artefacts:
<execution>
  <id>deploy-exe</id>
  <phase>deploy</phase>
  <goals>
    <goal>deploy-file</goal>
  </goals>
  <configuration>
    <file>target/${project.build.finalName}.exe</file>
    <packaging>exe</packaging>
    <!-- pom, sources and javadoc already deployed by project. Release repo will fail redeployment -->
    <generatePom>false</generatePom>
    <sources/>
    <javadoc/>
  </configuration>
</execution>
But each execution will try to deploy the javadoc and sources for the project, even though I have tried to explicitly switch them off for the execution. NB I want javadoc and sources for the project, but I only want them deployed once (by the deploy mojo).
This isn't a big deal until release time, at which point my build breaks because it tries to deploy the javadoc and sources for the deploy mojo as well as for each of the deploy-file mojo executions, to a release repo that doesn't allow redeployment of artifacts.
Is it possible to configure the maven-deploy-plugin to not deploy source & javadoc for the deploy-file mojo?

Ant to Maven - multiple build targets

I have an Ant build that is currently being converted to Maven. However, the Ant build has 2 build targets - one that builds the entire app, and one that builds a JAR from some of those files (only a few). In Ant, it's easy to have multiple build targets to handle this, but I'm trying to determine the best way to handle this in Maven.
I could split the subset of files into a second project and it will have its own POM. Then the first project could depend on this one. However, since the subset of files is so small (less than 10), it seems like it might be overkill to have an entirely new project for that.
Are there other ways to handle this?
You could do this with profiles...
If you really wanted to use two separate profiles and customize the JAR plugin to include and exclude patterns of class and package names, you could easily do this by putting something like this in your POM:
<profiles>
  <profile>
    <id>everything</id>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <configuration>
            <classifier>everything</classifier>
            <includes>
              <include>**/*</include>
            </includes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
  <profile>
    <id>only-library</id>
    <build>
      <plugins>
        <plugin>
          <artifactId>maven-jar-plugin</artifactId>
          <configuration>
            <classifier>only-library</classifier>
            <excludes>
              <exclude>**/Main*</exclude>
            </excludes>
          </configuration>
        </plugin>
      </plugins>
    </build>
  </profile>
</profiles>
Aside: If that seems like a lot of configuration, polyglot Maven's support for Groovy POMs is just about ready. It will cut the line count down considerably.
You would put this at the end of your pom.xml (within the project element), and it adds two profiles. The first profile, "everything", is really just there to demonstrate the configuration; it is unnecessary because it simply duplicates the behavior of the default JAR plugin jar goal execution. The second profile, "only-library", excludes any class whose name starts with "Main", in any package. To invoke these profiles:
mvn package -Peverything
mvn package -Ponly-library
I tested this against the sample application that ships with Chapter 6 of Maven by Example, and running either of these commands will produce a JAR file in ${basedir}/target that has a classifier. Since the JAR plugin's jar goal is bound to the package phase in the default maven lifecycle, these two profiles are going to modify the configuration for this plugin.
Or, you could do this with two JAR plugin executions...
If you need to create two JARs without using profiles, you can bind the JAR plugin's jar goal to the package lifecycle phase multiple times and use a different configuration for each configured execution. If you configure two separate executions, each execution has an execution-specific configuration block, so you can supply a unique identifier and include/exclude pattern for each execution.
Here is the build element you would use to add both custom JARs to the lifecycle phase "package". Doing this on a project with packaging "jar" would result in the jar goal being run three times. Once as the default lifecycle binding, and then twice for two custom, classified JARs.
<build>
  <plugins>
    <plugin>
      <artifactId>maven-jar-plugin</artifactId>
      <executions>
        <execution>
          <id>only-library</id>
          <goals><goal>jar</goal></goals>
          <phase>package</phase>
          <configuration>
            <classifier>only-library</classifier>
            <excludes>
              <exclude>**/Main*</exclude>
            </excludes>
          </configuration>
        </execution>
        <execution>
          <id>everything</id>
          <goals><goal>jar</goal></goals>
          <phase>package</phase>
          <configuration>
            <classifier>everything</classifier>
            <includes>
              <include>**/*</include>
            </includes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
If you are not talking about including a different set of classes in each artifact, you'll want to use Maven Assemblies. If you want to know the details of assemblies, there is a chapter listed at the end of this answer from Maven: The Complete Reference. Frankly, I don't think that this particular chapter is a good introductory reference; in fact, I've had numerous reports that this chapter is nearly unreadable (and we're working to fix that). If you are looking to use assemblies, I'd recommend the Maven Assembly Plugin's documentation. In the left-hand nav menu you'll see a list of sample assembly descriptors.
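If you go that route, a rough sketch of a classified assembly descriptor could look like this (the id and exclude pattern below are purely illustrative):
<assembly>
  <id>only-library</id>
  <formats>
    <format>jar</format>
  </formats>
  <includeBaseDirectory>false</includeBaseDirectory>
  <fileSets>
    <fileSet>
      <!-- package the compiled classes, minus the excluded ones -->
      <directory>${project.build.outputDirectory}</directory>
      <outputDirectory>/</outputDirectory>
      <excludes>
        <exclude>**/Main*</exclude>
      </excludes>
    </fileSet>
  </fileSets>
</assembly>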
Disclaimer: (Please) don't do this. If you are creating two different JARs with two different sets of classes, I strongly recommend that you split the project up into two interdependent modules.
While you can do this with profiles, it is going to be easier for you to split the project into two (actually three). Longer term, there are challenges you are going to face as your application scales, and you will be responsible for maintaining the manual list of classes and packages to be included in each of your classified JARs.
There is minimal overhead to having a simple parent project that references two separate modules. If you look at the free Maven by Example book, we show how to make the transition between a single-module and a multi-module project. Chapters 3-5 focus on single module projects, and Chapter 6 shows you how you would combine these single module components into a larger multi-module project.
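A minimal parent POM for such a setup is only a few lines (the coordinates and module names here are placeholders):
<project>
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>app-parent</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <modules>
    <!-- placeholder module names -->
    <module>app-library</module>
    <module>app-main</module>
  </modules>
</project>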
For more information:
You question involves the following topics, here are some links that will provide more details for each:
The Maven JAR Plugin: http://maven.apache.org/plugins/maven-jar-plugin/jar-mojo.html
Multi-module Maven Projects: Chapter 6 of Maven by Example and Section 3.6.2 of Maven: The Complete Reference.
The Maven Lifecycle (jar is bound to package if your packaging is "jar"): Section 3.5.2 of Maven by Example "Core Concepts" and Chapter 4 of Maven: The Complete Reference
Maven Assemblies: First, the Maven Assembly Plugin site, then Chapter 8 of Maven: The Complete Reference for some heavy (almost too heavy) details.
Your first thought is the correct one. Split the 2 pieces into 2 projects.
The Maven philosophy is that each project should build one and only one artifact (jar, war, whatever).
You could probably hack something together so that you only have one Maven project building 2 artifacts, but it would be a hack.
You can call Ant from Maven, so if you really want to do this, then I suggest you start looking at the Maven AntRun plugin. The artifact id is "maven-antrun-plugin".
You've got 2 choices:
Profiles
Assemblies
If the subset is only a collection of resources, then I wouldn't make it a separate module.
If the project is always dependent on the subset which is packaged in a uniform way, then the subset is a good candidate to become a module.
If the subset is repackaged in multiple different "flavours", then I would define assemblies for each "flavour" and qualify the artifact names with a "classifier" (see Maven coordinates).
Finally, you could use profiles to determine which assemblies get produced, your default profile may only create the initial artifact "flavour" that is required during development.
A "full" profile may produce all "flavour" variants of the artifact for final deployment once development is complete.
