I am trying to include a dependency in my Maven module, but it doesn't appear to be working. The following is what I have:
<dependencySet>
    <useProjectArtifact>false</useProjectArtifact>
    <includes>
        <include>com.sam:myWebApp</include>
    </includes>
    <outputDirectory>/mywebapp/files</outputDirectory>
    <unpack>false</unpack>
</dependencySet>
The full code is in my github repo:
https://github.com/darkcloudi/PuppetExample/tree/master/puppet/exampleWebApp
When I build my parent pom it builds 2 modules:
1. the webapp, which was created using a Maven archetype (a simple app)
2. a Puppet module
What I want is for the webapp to appear in the files directory when the Puppet Maven project is built, so including the dependencySet and specifying the output directory should work, shouldn't it? I cannot see where I am going wrong; it should be as simple as copying the war dependency into the files folder.
Thanks in advance for any responses.
Not sure why, but it worked with moduleSets (presumably because a dependencySet only pulls in artifacts that are declared as dependencies and already resolvable from a repository, whereas a moduleSet with useAllReactorProjects set to true picks up sibling modules straight from the current reactor build):
<moduleSets>
    <moduleSet>
        <useAllReactorProjects>true</useAllReactorProjects>
        <includes>
            <include>com.sam:myWebApp:*:*</include>
        </includes>
        <binaries>
            <outputDirectory>myWebApp/files</outputDirectory>
            <unpack>false</unpack>
        </binaries>
    </moduleSet>
</moduleSets>
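For completeness, the descriptor containing this moduleSet still needs to be wired into the Puppet module's pom.xml via the maven-assembly-plugin. A minimal sketch of that wiring, assuming the descriptor lives at src/main/assembly/assembly.xml (the path and plugin version here are assumptions, not taken from the repo):

<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-assembly-plugin</artifactId>
    <version>2.4</version> <!-- assumed version; use whatever the build already uses -->
    <configuration>
        <descriptors>
            <!-- assumed descriptor location -->
            <descriptor>src/main/assembly/assembly.xml</descriptor>
        </descriptors>
    </configuration>
    <executions>
        <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals>
                <goal>single</goal>
            </goals>
        </execution>
    </executions>
</plugin>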
I've started using travis-ci to automate my builds. I have several open source projects and they all deploy to nexus sonatype from where they go to maven central. They're all Java projects that use Maven to build and github as a repo.
I've been doing this manually for years and I have appropriate keys and logins and my pom is compatible etc.
Implementing the first one was easy enough; it is a single-module project and it builds and deploys just fine. Then I did a second one, a multi-module project, and got that working in much the same way. My third, however, is baffling me.
The maven build on this thing is a bit tricky but it does run fine locally and I even have it running the actual build on travis successfully. But the deploy doesn't work.
The problem is that when it tries to connect to nexus sonatype I get an authorisation error:
Failed to execute goal org.apache.maven.plugins:maven-deploy-plugin:2.7:deploy (default-deploy) on project madura-bundles:
Failed to deploy artifacts: Could not transfer artifact nz.co.senanque:madura-bundles:pom:4.5.6 from/to sonatype-nexus-staging (https://oss.sonatype.org/service/local/staging/deploy/maven2/):
Failed to transfer file: https://oss.sonatype.org/service/local/staging/deploy/maven2/nz/co/senanque/madura-bundles/4.5.6/madura-bundles-4.5.6.pom.
Return code is: 401, ReasonPhrase: Unauthorized.
It looks like I have not set up my sonatype credentials correctly. But I have set it up the same way as I did for the other two projects. Specifically I go into Nexus Sonatype and get my Access User Token and add those to my environment (SONATYPE_USERNAME and SONATYPE_PASSWORD, I deleted both of these and re-entered them in case it was a typo). I also add references to those in my local maven settings file:
...
<server>
    <id>ossrh</id>
    <username>${env.SONATYPE_USERNAME}</username>
    <password>${env.SONATYPE_PASSWORD}</password>
</server>
...
The local maven settings file is a file in my project and the .travis.yml maven commands refer to it. The travis.yml file has a deploy section identical to the other two (working) projects, except I have been adding extra bits to try and make it work. But none of the differences there look relevant. The working deploys look like this:
deploy:
  provider: script
  script: "mvn versions:set -DnewVersion=${TRAVIS_TAG} && mvn clean deploy -B -U -P release --settings travis/settings.xml"
  on:
    tags: true
so this is only going to kick off if the repo has been tagged and it uses the tag as the version number. In the other projects this works fine, but not in the one I'm trying to get working. The tag does trigger the deploy as it should, but the deploy fails.
Does anyone know why I get the deploy on one project but not another? Thanks for any help.
Okay, I figured it out. The problem is that the parent pom of the failing project does not have a release profile, the parent pom of the working project does have one. The release profile in both cases looks like this:
<profile>
    <id>release</id>
    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-gpg-plugin</artifactId>
                <version>1.5</version>
                <executions>
                    <execution>
                        <id>sign-artifacts</id>
                        <phase>verify</phase>
                        <goals>
                            <goal>sign</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            <plugin>
                <groupId>org.sonatype.plugins</groupId>
                <artifactId>nexus-staging-maven-plugin</artifactId>
                <version>1.6.3</version>
                <extensions>true</extensions>
                <configuration>
                    <serverId>ossrh</serverId>
                    <nexusUrl>https://oss.sonatype.org/</nexusUrl>
                    <autoReleaseAfterClose>true</autoReleaseAfterClose>
                </configuration>
            </plugin>
        </plugins>
    </build>
</profile>
This profile is needed to sign the generated artifacts (jar files, javadoc files, etc.) with the gpg plugin and to deploy them to Nexus. Without it the deploy to Nexus is still attempted, but because there is no reference to the serverId ossrh, the build doesn't pick up the credentials from the Maven settings file, and therefore I get an authorization error.
The release profile needs to be on the parent project and all the module projects. I had added it to the modules but forgot the parent.
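For anyone reproducing this setup: the ossrh server id also has to match the repository ids declared in the pom's distributionManagement section, otherwise the credentials still won't be found. A minimal sketch of the usual OSSRH arrangement (URLs as per the standard Sonatype documentation; adjust if your account is on a different host):

<distributionManagement>
    <snapshotRepository>
        <!-- the id must match the <server> id in settings.xml -->
        <id>ossrh</id>
        <url>https://oss.sonatype.org/content/repositories/snapshots</url>
    </snapshotRepository>
    <repository>
        <id>ossrh</id>
        <url>https://oss.sonatype.org/service/local/staging/deploy/maven2/</url>
    </repository>
</distributionManagement>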
I have developed my own Jenkins plugin for the first time. The main purpose of the plugin is to publish a message to Google Cloud Platform. All the code works fine in the local environment when run from Eclipse, but when I use the same code in Jenkins it causes some dependency errors. Any help is really appreciated.
Thank you.
Note: Jenkins and Eclipse are on the same machine
How Jenkins resolves its dependencies is the real concern for me here.
Eclipse uses the M2eclipse plugin to add your dependencies to the classpath when running your plugin from Eclipse.
Jenkins only resolves dependencies between plugins. Furthermore, Jenkins expects the .hpi packages to be self-contained, i.e. to contain all the JAR dependencies you need. mvn package should copy the jars of all your dependencies into the WEB-INF/lib folder inside the .hpi file.
In your specific case it seems that the Google Cloud implementation expects some implementation of a channel service provider on the classpath, so you should add a dependency on grpc-okhttp or grpc-netty, so they get packaged into the .hpi file as well.
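A sketch of what that might look like in the plugin's pom.xml (the version number is an assumption; use whichever release matches your other grpc artifacts):

<dependency>
    <groupId>io.grpc</groupId>
    <artifactId>grpc-okhttp</artifactId>
    <version>1.0.3</version> <!-- assumed version -->
</dependency>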
Sometimes there can also be a class loader selection issue, so add the following line of code before calling the Google classes:
// use the plugin's own class loader as the thread context class loader
Thread.currentThread().setContextClassLoader(getClass().getClassLoader());
Also add the following to the Jenkins plugin's pom.xml to tell Jenkins that the dependencies declared in the pom.xml should be loaded before Jenkins' own dependencies:
<pluginManagement>
    <plugins>
        <plugin>
            <groupId>org.jenkins-ci.tools</groupId>
            <artifactId>maven-hpi-plugin</artifactId>
            <configuration>
                <pluginFirstClassLoader>true</pluginFirstClassLoader>
            </configuration>
        </plugin>
    </plugins>
</pluginManagement>
I am currently working on a multi-module project with the following structure.
root
-module A
-module B
What I want to do is execute module B (the main method of the module) after it has been compiled (module B depends on module A). But I need to do this with a custom command, e.g.:
mvn runb
I know that the exec maven plugin can be used to run a project using Maven. What I don't understand is how to create a custom command (phase) in Maven. Is there any way to achieve this without writing a Maven plugin?
I referred to various guides such as https://community.jboss.org/wiki/CreatingACustomLifecycleInMaven trying to achieve this. But they require creating components.xml and lifecycle.xml files under src/resources/META-INF. I don't understand how to apply that file structure to my project since it is a multi-module project where each module has separate src directories.
(I'm using maven 3)
You cannot create a custom lifecycle without writing a Maven plugin.
And without hacking Maven itself, at least as of Maven 3.0.5, it is not possible to add a custom phase to Maven through a plugin. The phases are loaded up by the core of Maven from its configuration before any plugins are processed.
If you really have your heart set on using one command to do what you want, writing a plugin is the only way. With some pluginGroup mappings in your settings.xml, this can be made simpler (you can specify mvn my:plugin rather than mvn com.mygroupid:plugin).
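For example, a sketch of that settings.xml entry, reusing the placeholder group id from above:

<pluginGroups>
    <pluginGroup>com.mygroupid</pluginGroup>
</pluginGroups>

With that in place, Maven will also search com.mygroupid when resolving plugin prefixes on the command line.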
But if you are willing to have a slightly more verbose syntax on the command line, what you want could be achieved through profiles and the exec maven plugin.
Add a profile to module B that uses the exec plugin to run itself.
Something like this:
<project>
    ...
    <profiles>
        <!-- This profile runs module B's main class, useful for local testing -->
        <profile>
            <id>execb</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.codehaus.mojo</groupId>
                        <artifactId>exec-maven-plugin</artifactId>
                        <version>1.2.1</version>
                        <executions>
                            <execution>
                                <id>runb</id>
                                <goals>
                                    <goal>java</goal>
                                </goals>
                                <phase>verify</phase> <!-- Anything after package phase -->
                                <configuration>
                                    <!-- Exec plugin configuration goes here -->
                                </configuration>
                            </execution>
                        </executions>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>
</project>
You'll need to configure the exec plugin depending on how you run your JAR; see the exec-maven-plugin documentation for details.
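As a sketch, if module B is started via a main class, the configuration element above might contain something like this (the class name is a made-up placeholder):

<configuration>
    <mainClass>com.example.moduleb.Main</mainClass> <!-- placeholder class name -->
</configuration>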
What this does is run the exec plugin as part of module B's build, but only if the execb profile is activated.
Now, when you just want to build your project (without any exec), build like normal (e.g. mvn install).
When you want to build and run, use the command line:
mvn install -Pexecb
and it will do the exec as part of the build.
I have a multi-module Maven project. The project layout is described below:
PARENT
|-CHILD1
|-CHILD2
The PARENT project has pom packaging and declares the CHILD1 and CHILD2 projects as modules. The PARENT project also declares a dev profile, which declares some property.
The CHILD1 project has jar packaging and "overrides" the PARENT dev profile by adding some dependency (a dependency on commons-collections, for example).
The CHILD2 project has war packaging and depends on the CHILD1 project. CHILD2 also "overrides" the parent dev profile by adding another dependency (a dependency on commons-io, for example; I mean a dependency that is not related to the one in project CHILD1).
Then when I run mvn clean install -Pdev, Maven doesn't put commons-collections.jar (the dependency declared in the CHILD1 project) into WEB-INF/lib of the CHILD2 project, but commons-io.jar is there.
So the question is: why doesn't Maven include dependencies from profiles declared in the projects the target project depends on, when the target project declares another set of dependencies in a profile of the same name?
Actually I have many more projects and many more dependencies that vary between profiles. And I want to declare project-specific dependencies in each project's pom.xml (supposing that declaring the profile in a project will "override" the parent profile declaration).
I am assuming that you want to be able to test locally when developing, test your changes against a staging environment and finally deploy to production.
The critical thing to keep in mind is that when an artifact gets deployed to the local/remote repository, the active profiles are not part of what gets deployed. So when you add dependencies via profiles, things become very dangerous: you have no way of knowing whether the webapp was built with the DEV profile or the PROD profile active, and when that built artifact gets deployed into production you could be royally screwed over.
So the short of this is that you ensure that your artifacts are independent of deployment environment.
This means that, for example, you will pick up configuration from:
files on the classpath
system properties
JNDI entries
So, for example, if deploying to Tomcat, you might put a configuration.properties into $CATALINA_HOME/lib.
Your webapp on startup will use getClass().getResource("/configuration.properties") to resolve the properties file and fail to start up if the file is missing (fail-fast).
You can let your unit/integration tests use a different config by putting a test version of configuration.properties in src/test/resources.
You use the same principle for the <scope>provided</scope> style dependencies of your application. In other words, a dependency that the container is contracted with providing should be provided by the container. So you might build the production version of Tomcat/Jetty for yourself using Maven also, and add the required dependencies into that assembly. This would be things like: the production version uses a MySQL database, so you need to add the mysql-jdbc driver into $CATALINA_HOME/lib. It is relatively easy to do this with the assembly plugin, as you are really just repacking a zip with some bits included and others excluded.
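A rough sketch of such an assembly descriptor, purely to illustrate the idea (the descriptor id and the choice of included artifacts are assumptions, not a tested recipe):

<assembly xmlns="http://maven.apache.org/plugins/maven-assembly-plugin/assembly/1.1.2">
    <id>prod-tomcat</id>
    <formats>
        <format>zip</format>
    </formats>
    <dependencySets>
        <!-- drop the JDBC driver into the container's lib directory -->
        <dependencySet>
            <includes>
                <include>mysql:mysql-connector-java</include>
            </includes>
            <outputDirectory>lib</outputDirectory>
        </dependencySet>
    </dependencySets>
</assembly>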
When testing locally you will want to make use of the helper plugins' run goals such as jetty:run and tomcat:run. The solution here is that there is nothing wrong with giving these plugins dependencies via profiles because you are not affecting the dependencies of the artifact you are only affecting the plugin's classpath.
e.g.
<project>
    <!-- ... some stuff .. -->
    <profiles>
        <profile>
            <id>DEV</id>
            <build>
                <plugins>
                    <plugin>
                        <groupId>org.mortbay.jetty</groupId>
                        <artifactId>jetty-maven-plugin</artifactId>
                        <dependencies>
                            <dependency>
                                <groupId>commons-dbcp</groupId>
                                <artifactId>commons-dbcp</artifactId>
                                <version>1.4</version>
                            </dependency>
                            <dependency>
                                <groupId>mysql</groupId>
                                <artifactId>mysql-connector-java</artifactId>
                                <version>5.1.18</version>
                            </dependency>
                        </dependencies>
                    </plugin>
                </plugins>
            </build>
        </profile>
    </profiles>
</project>
You can also configure system properties or classpath additions to pull in the required configuration file.
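For instance, a sketch of passing a system property through the jetty plugin so the webapp can locate a local configuration file (the property name and path are made-up placeholders):

<plugin>
    <groupId>org.mortbay.jetty</groupId>
    <artifactId>jetty-maven-plugin</artifactId>
    <configuration>
        <systemProperties>
            <systemProperty>
                <name>config.location</name> <!-- placeholder property name -->
                <value>${basedir}/src/test/config/configuration.properties</value> <!-- placeholder path -->
            </systemProperty>
        </systemProperties>
    </configuration>
</plugin>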
The net result of all this is that the artifact remains environment-independent and you can test easily against the various environments.
Hope this answers your question (even if sideways).
I'm using IntelliJ IDEA 10.0.2 (with groovy/grails support), maven 2.2.1 and grails 1.3.6.
We have a big maven project, which depends on many other maven projects. Let's say the workspace structure looks as follows:
backend-project (Java project, without further project dependencies)
output-project (Java project, without further project dependencies)
frontend-project (Grails project, which depends on both the backend and output projects)
That means that within my frontend-project's pom.xml I have defined 2 project dependencies:
e.g.
<dependency>
    <groupId>com.company.project</groupId>
    <artifactId>backend-project</artifactId>
    <version>${project.version}</version>
</dependency>
<dependency>
    <groupId>com.company.project</groupId>
    <artifactId>output-project</artifactId>
    <version>${project.version}</version>
</dependency>
Let's assume that I change some Java source within the output or backend project. When I run the Grails application now, it won't pick up the changes. I have to publish the changed artifact locally and have the Grails project resolve it again before running the application for the changes to take effect.
This tells me that the grails project just depends on the project dependency jars within the maven repository and does not care about any existing project dependency "sources" within the workspace.
Does it have to be that complicated and if so, why?
Note that if my frontend project was a spring web project, the changes will be seen in IDEA and tomcat will even reload the change dynamically.
Note that when IDEA recognizes a mavenized Grails project, it won't run the Grails project with "grails run-app" anymore, but with a more complicated version: "mvn grails:exec -Dcommand=run-app". Don't know if this is of any relevance.
Thanks!
Mr. Slash
Maven always picks up the jar files from the repositories (local and then remote, etc., depending on your pom.xml configuration).
Think about it: How would your main project know where the backend-project or the output-project files are located?
If you want a direct dependency then remove it from the pom.xml and add the projects' outputs directly to your main project's build path. In IntelliJ IDEA, open Project Structure => Modules => Dependencies and add the backend and output modules as module dependencies (in Eclipse the equivalent is the main project's properties page => build path => projects => add).