Downloading maven metadata connection refused causing slow build - maven-3

I'm using Maven 3.1.1 and am working on a big Java project in which we have a large number of dependencies on different components of the Spring Framework. Whenever I build via mvn clean install, the build takes forever. Looking at the terminal output, the build is essentially slow because Maven is trying to download metadata from a source to which it can't establish a connection:
It basically blocks here:
Downloading: http://source.mysema.com/maven2/releases/org/springframework/spring-beans/maven-metadata.xml
After a while, I'll get a connection refused before the client can even proceed any further. I have no direct dependency on anything from source.mysema.com, so I'm not exactly sure why Maven would even attempt to download from there.
A few questions:
1) How does metadata actually work? When does Maven actually attempt to pull this information? Does metadata exist for every dependency, or is this repository-specific?
2) Is there a way to force a timeout on the client so it stops trying to pull this metadata if it can't establish a connection after, say, 2 seconds?
3) I've looked into our own internal repository, on which this project depends via the <repository> tag. In trying to debug this issue, I've looked directly at our Nexus repository and saw that the metadata.xml file contains a huge list of versions for this specific Spring dependency. Why should my build always download ALL the versions for this dependency?
My suspicion was that my repository definitions in my pom.xml were causing Maven to download from source.mysema.com transitively (via some remote repository I've defined). So I commented out all my repository definitions in the hope that Maven wouldn't talk to any remote repositories and would instead pull dependencies from my local .m2, but somehow it keeps trying to download from source.mysema.com.
Thoughts? Thank you.

Answering your questions
1) How does metadata actually work? When does Maven actually attempt to pull this information? Does metadata exist for every dependency, or is this repository-specific?
Answer : maven-metadata is downloaded because you have specified a RELEASE-style version in one of your version tags, e.g.
<spring.version>3.2.4.RELEASE</spring.version>
So with a daily update policy, you will see the metadata being downloaded as Maven checks for the latest versions of this release. You will see something like this on your screen:
Downloading: http://nexus.mynexuslocation/nexus/content/repositories/public/org/springframework/spring-tx/maven-metadata.xml
Downloading: http://central.maven.org/maven2/org/springframework/spring-tx/maven-metadata.xml
Also check the updatePolicy in your pom.xml to see whether it has been set to daily.
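As an illustration, here is a sketch of a repository definition with the update policy relaxed (the id and URL are placeholders, not from the question):

```xml
<repository>
  <id>internal-nexus</id>
  <url>http://nexus.example.com/content/groups/public</url>
  <releases>
    <!-- "never" stops Maven from re-checking maven-metadata.xml for releases;
         the default policy is "daily" -->
    <updatePolicy>never</updatePolicy>
  </releases>
</repository>
```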
2) Is there a way to force a timeout on the client to not pull this metadata if it can't establish a connection after let's say, 2 seconds?
Answer : Not sure of this one.
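One workaround (not from the original answer; the Nexus URL below is an assumption) is to declare a mirror in ~/.m2/settings.xml so that every repository request goes to the internal Nexus and the unreachable host is never contacted directly:

```xml
<!-- ~/.m2/settings.xml -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-nexus</id>
      <!-- route ALL repository traffic through the internal proxy -->
      <mirrorOf>*</mirrorOf>
      <url>http://nexus.example.com/nexus/content/groups/public</url>
    </mirror>
  </mirrors>
</settings>
```

This does not shorten the timeout itself, but it removes the direct requests to source.mysema.com, since Nexus answers (or fails fast) on the local network.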
3) I've looked into our own internal repository, on which this project depends via the <repository> tag. In trying to debug this issue, I've looked directly at our Nexus repository and saw that the metadata.xml file contains a huge list of versions for this specific Spring dependency. Why should my build always download ALL the versions for this dependency?
Answer : You have to move the Nexus repository up in the hierarchy. Maven will look at Nexus first and will only go outside Nexus when it cannot find the artifact there.

Related

Pointing Jenkins to Use Another Plugin Repository

Good afternoon,
As I understand Jenkins, if I need to install a plugin, it goes to Jenkins Plugins
The problem I have is Jenkins is installed on a closed network, it cannot access the internet. Is there a way I can download all of the plugins, place them on a web server on my local LAN, and have Jenkins reach out and download plugins as necessary? I could download everything and install one plugin at a time, but that seems a little tedious.
You could follow some or all of the instructions for setting up an artifactory mirror for the plugin repo.
It will need to be an HTTP/HTTPS server, and you will find that many plugins have a multitude of dependencies.
The closed network problem:
You can take a cue from the Jenkins Docker install-plugins.sh approach ...
This script takes as input a list of plugins, and optionally versions (e.g.: $0 workflow-aggregator:2.6 pipeline-maven:3.6.5 job-dsl:1.70), and will download all the plugins and dependencies into a working directory.
Our approach is to create a file (under version control) and redirect that to the command line input (i.e.: install-plugins.sh $(< plugins.lst)).
You can download from where you do have internet access and then place on your network, manually copying them to your ${JENKINS_HOME}/plugins directory and restart the instance.
The tedious list problem:
If you only specify top-level plugins (i.e. what you need), every time you run the script it will resolve the latest dependencies. That makes for a short list, but the resolved dependencies drift whenever they get updated at https://updates.jenkins.io. You can use a two-step approach to address this: use the short list to download the required plugins and dependencies, then store the generated explicit list for future reference or repeatability.
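A minimal sketch of that two-step pinning (the file names and plugins/ layout are assumptions; a real run would let install-plugins.sh fill the directory, and would read versions from the plugin manifests rather than just names):

```shell
# Step 1 (normally): ./install-plugins.sh $(< plugins.lst) downloads the
# short list plus all transitive dependencies into ./plugins as .jpi files.
# Stand-in for that download step so this sketch is self-contained:
mkdir -p plugins
touch plugins/workflow-aggregator.jpi plugins/pipeline-maven.jpi plugins/job-dsl.jpi

# Step 2: freeze the fully-resolved set into an explicit list that can be
# committed to version control and replayed later for repeatable builds.
for f in plugins/*.jpi; do
  basename "$f" .jpi
done | sort > plugins-resolved.lst

cat plugins-resolved.lst
```

On a later run, feeding plugins-resolved.lst back to the script reproduces the exact plugin set instead of whatever the update site currently resolves.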

Correct usage of Nexus IQ for Javascript based projects

I have just started out trying to use Nexus IQ server to scan a Javascript based project of mine which uses libraries from npm and bower.
I am using the Jenkins Nexus Platfom Plugin and have configured a build step to connect to our Nexus IQ server instance. As part of the plugin I have configured it to scan for Javascript files within locations of the built project where the npm and bower dependencies are installed to.
The final report that gets generated on our Nexus IQ server is huge, in fact it reaches a limit of results (10000 rows) it can display and so is unable to display everything it finds.
I'm not 100% sure if I am doing things right here, and wondered whether anyone else out there has any experience of how to get sensible results from Nexus when scanning npm/bower installed dependencies.
I'm looking at the Licence Analysis section now and can see over 3000 rows of various 'Not supported' licence threats coming from libraries that haven't been directly included in the project, i.e. not listed in my project's package.json file, but I guess these are child dependencies of libraries I have specified to be installed.
Can anyone offer any advice on the best approach to getting Nexus IQ to handle Javascript projects that rely on npm/bower dependencies?

Maven repositories

We are using maven in the development process. Maven provides a nice feature of configuring the repositories. Using this feature I have created a remote internal repository and I can download the dependencies from that repository.
The development machines are pointing to this remote internal repository. Each development machine has its own local repository (~/.m2/repository/), and hence the dependencies of the project are downloaded from the remote internal repository to the local repository (~/.m2/repository/) on each developer machine.
Is there any way that the local repository (~/.m2/repository/) on developer machines can be set to the internal remote repository that we have created, and which is used for downloading the dependencies?
If you take a look at the Maven Introduction to Repositories, the first paragraph says:
There are strictly only two types of repositories: local and remote.
There is no way to change this behavior.
Handling it differently would cause many problems: e.g. builds would take much longer because of downloading all files over the network, the IDE would not work properly (project dependencies would not be stored locally), etc.
May I suggest another approach to sharing dependencies and artifacts. In our projects we use Nexus as a proxy and as a repository for our artifacts. It works well with no issues. A basic configuration I already posted here.
Once Nexus is running, you could also set up continuous integration using Jenkins and enjoy a fully automated environment.
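For the publishing side of that setup, here is a sketch of what the project's pom.xml could contain (the repository ids and URLs are assumptions, not from the original post):

```xml
<!-- deploy built artifacts to the internal Nexus via mvn deploy -->
<distributionManagement>
  <repository>
    <id>releases</id>
    <url>http://nexus.example.com/content/repositories/releases</url>
  </repository>
  <snapshotRepository>
    <id>snapshots</id>
    <url>http://nexus.example.com/content/repositories/snapshots</url>
  </snapshotRepository>
</distributionManagement>
```

The matching credentials go in each developer's settings.xml under server entries with the same ids.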
Is your requirement to avoid each developer from having to download all dependencies to his local repository?
Assuming your remote internal repository has the same format as a Maven local repository, you can achieve this by adding the following line to the settings.xml of all your developers:
<localRepository>shared-drive-location-of-remote-repository</localRepository>

Grails: working offline with snapshot dependencies

I am using some plugins which depend on snapshot versions of other plugins.
As I understand it, Ivy tries to fetch the newest version of these plugins every time you start Grails. If Ivy does not succeed, Grails will not start :-(
As I like to develop offline, I am now looking for a way to avoid this behaviour...
You could pull them down and store them on your machine using a local repository, and comment out any remote repositories. Here is some documentation; scroll down to "local resolvers".
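A minimal sketch of what that could look like in a Grails 1.3-style BuildConfig.groovy (the repository entries shown are the standard dependency-resolution DSL helpers; adapt to your actual setup):

```groovy
// grails-app/conf/BuildConfig.groovy
grails.project.dependency.resolution = {
    repositories {
        // resolve from local, on-disk sources first
        grailsHome()
        mavenLocal()
        // remote repositories commented out so Ivy never goes online:
        // grailsCentral()
        // mavenCentral()
    }
}
```

Once the snapshot plugins have been cached locally (by running a build while online), Ivy can resolve them from disk and Grails starts without network access.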

Resolving snapshot versions of Grails plugins (deployed using the maven-publisher plugin)

Good day,
I'm trying to integrate our company's Grails plugins into our Maven repositories (our repositories are named 'snapshots' and 'releases').
To do that, I installed the maven-publisher plugin in all of our plugins, and I'm deploying them using the "grails maven-deploy" command. This works well.
However, if I deploy a SNAPSHOT version of a plugin (say, version 1.0.0-SNAPSHOT), it gets properly deployed in our repository, but I can't install it in our applications (using version "latest.integration").
I'm using Grails 1.3.7.
First of all, when deployed, the actual artifact name has a timestamp added to it ("blablabla-1.0.0-20110421.122823-1.zip"). However, the version is still 1.0.0-SNAPSHOT. I'm guessing that it's Maven that does that transformation.
However, Ivy doesn't seem to understand the transformation, or to handle SNAPSHOT versions. I get errors like:
==== http://myRepo/repository/snapshots: tried
-- artifact myOrg#blablabla;latest.integration!blablabla.zip:
http://myRepo/repository/snapshots/myOrg/blablabla/[revision]/blablabla-[revision].zip
Initial research has revealed that I could create a resolver pattern, but that seems a little bit complicated for something that should work out of the box, and my initial tests were not conclusive anyway (I tried a few patterns, none of which worked).
I should note that deploying my plugins locally using the "maven-install" command works, because the script creates an artifact with the proper version (blablabla-1.0.0-SNAPSHOT.zip) alongside the one with timestamps.
Does anybody have a solution?
Thanks!
Guillaume.
I resolved this by modifying the Artifactory snapshot repository configuration:
<snapshotVersionBehavior>non-unique</snapshotVersionBehavior>
Now when you have foo-plugin-1.0-SNAPSHOT.zip and upload it, the name stays the same.
