I am attempting to refactor a maven build process, and I am trying to populate a local maven repository to help with this refactor.
My build depends on obsolete versions of jar files that exist only in a maven repo on my network (not in maven central). Example: org.foo:example:1.7:jar
I have been attempting to run my maven build in a Docker image, with the hope that I could identify all of the obsolete components being pulled from my maven repository.
My goal is to explicitly pull down dependencies from my maven repo and then build the application using only maven central as an external repository.
I have a Dockerfile to run the build:
FROM maven:3-jdk-8 as build
WORKDIR /build
# This pom.xml file only references maven central.
COPY pom.xml .
# Explicitly download artifacts into /root/.m2/...
RUN mvn dependency:get -Dartifact=org.foo:example:1.7:jar \
-DrepoUrl=https://my.maven.repo
# Run the build making use of the dependencies loaded into the local repo
RUN mvn install
Unfortunately, I see an error: could not resolve dependencies for project ... The following artifacts could not be resolved: org.foo:example:jar:1.7.
I presume there might be some metadata in my local org.foo:example:1.7:pom that records its origin repository. I had hoped I could satisfy this dependency simply by pulling it into my local repository.
I also attempted to add the following flag
RUN mvn install --no-snapshot-updates
After further investigation, I discovered that the downloads in my .m2/repository contained files named _remote.repositories.
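For illustration, these marker files record which remote repository each downloaded file came from. You can inspect one directly (the repository id shown in the comment is a placeholder; yours will match the id from your configuration, and the exact layout may vary by Maven version):
cat /root/.m2/repository/org/foo/example/1.7/_remote.repositories
# typical contents, one entry per downloaded file:
#   example-1.7.jar>my-repo=
#   example-1.7.pom>my-repo=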
Solution: set -Dmaven.legacyLocalRepo=true when running dependency:get
With this flag, the _remote.repositories files are not created.
RUN mvn dependency:get -Dmaven.legacyLocalRepo=true \
-Dartifact=org.foo:example:1.7:jar \
-DrepoUrl=https://my.maven.repo
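An alternative workaround (not needed once the flag is in place) is to delete the marker files after downloading; this has the same effect of making the artifacts look locally installed:
RUN find /root/.m2/repository -name _remote.repositories -delete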
This question helped to provide a solution: _remote.repositories prevents maven from resolving remote parent
Related
I have a setup where a Quarkus microservice Docker image is based on the Docker image of the mavencache. The mavencache image is created by
RUN mvn dependency:go-offline
and it fetches most of the dependencies from the pom.xml; I'd say almost all of them.
But when the Quarkus microservice Docker container starts, it runs:
CMD ["mvn", "quarkus:dev"]
and this command fetches some "additional dependencies", which takes extra time that I would like to save. That's actually why I created the mavencache base Docker image.
Can anyone from the Quarkus team help me understand why mvn quarkus:dev fetches additional Maven dependencies instead of fetching them during mvn dependency:go-offline?
UPD1
The following picture describes how I expect the whole caching schema to work. Regardless of whether I run mvn package or mvn quarkus:dev, it should NOT fetch any additional dependencies when some file in the src/ directory changes.
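For reference, this is a minimal sketch of the layering I have in mind (image and path names are illustrative): the dependency download lives in a layer that is only invalidated when pom.xml changes, so editing files under src/ should not trigger any fetching:
# mavencache stage: cache dependencies in a layer keyed only on pom.xml
FROM maven:3-jdk-8 as mavencache
WORKDIR /app
COPY pom.xml .
RUN mvn -B dependency:go-offline
# microservice stage: reuse the populated ~/.m2, then add the sources
FROM mavencache
COPY src ./src
CMD ["mvn", "quarkus:dev"]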
How do I configure Docker-specific artifact dependencies which are managed in a different source code repository? My Docker image depends on jar files (say project-auth) and configuration (say project-theme) which are maintained in a different repository than the Docker image.
What would be the best way to copy these dependencies into a Docker image (say in a project-deploy repo) prior to building the image? I.e. in the above case project-deploy needs the jar files and configuration, which currently have to be mounted as a volume from the current folder.
I don't want these to be committed, as the dependencies tend to get stale, and I want the Docker image creation to be part of the build process itself.
You can use Docker multi-stage builds for this purpose.
With multi-stage builds, you use multiple FROM statements in your Dockerfile. Each FROM instruction can use a different base, and each of them begins a new stage of the build. You can selectively copy artifacts from one stage to another, leaving behind everything you don’t want in the final image.
For example:
Suppose that the source code for the dependencies is present in the repo "https://github.com/demo/demo.git".
Using multi-stage builds, you can create a stage in which you clone the git repo and build the dependency jar (or anything else that you need) at image build time.
At the end, you copy the jar into your final image.
# Use any base image. I took centos here.
FROM centos:7 as builder
# Install only those packages which are required.
RUN yum install -y maven git \
    && git clone <YOUR_GIT_REPO_URL> /myfolder
# Work inside the folder the repo was cloned into.
WORKDIR /myfolder
# Create the jar at image build time. You can update this step according to your project requirements.
RUN mvn clean package
# From here our normal Dockerfile steps start.
FROM centos:7
# Add all the necessary steps required to build your image
.
.
.
# This is how you can copy the jar which was created in the builder stage above into your final docker image.
COPY --from=builder SOURCE_PATH DESTINATION_PATH
Please refer to this to get a better understanding of multi-stage builds in Docker.
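As a side note, you can build and inspect just the first stage while debugging, because Docker can stop at a named stage (the tag demo-builder is a placeholder):
docker build --target builder -t demo-builder .
docker run --rm -it demo-builder ls /myfolder/target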
I have a Jenkins job that builds XML beans jar files from an internal GitLab project and puts them on Artifactory. During a build, these XML beans jar files are downloaded into the local .m2 Maven repository. However, if the jar file already exists in the .m2 repository, Maven does not bother to download it from Artifactory again. That said, when there is a GitLab change, the job does rebuild the jar and put it on Artifactory; but because an old jar file already exists in the .m2 repository, it is not replaced with the new one. We ended up shipping the wrong dependency to the customer in a release.
The question is: what am I doing wrong here?
mvn clean install -U
-U means Maven will force-update snapshot dependencies. Release dependencies can't be updated this way.
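If the jar is a release version rather than a -SNAPSHOT, -U will not refresh it; a pragmatic workaround is to evict the stale copy from the local repository before building (the coordinates below are placeholders; the maven-dependency-plugin's purge-local-repository goal can do the same more selectively):
# evict the stale artifact, then rebuild
rm -rf ~/.m2/repository/com/example/xmlbeans-stubs
mvn clean install -U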
I have a question related to Maven and generating a war. Please see below.
- In one of my projects (a war), I am using a 3rd-party jar (a -SNAPSHOT version) which I have added as an entry in my project pom.xml. So far it gets bundled correctly into the project war.
- But we encountered an issue in one of the java files inside this jar, for which my developer took the source code for the jar, modified and recompiled it, and updated the jar file in the local maven_repo directory.
- But whenever I build the project using the mvn clean install command, my updated jar gets deleted from my local maven-repo dir and a fresh copy is downloaded from the remote maven repo (where the actual 3rd-party jar resides).
Can someone please help with how I can make maven use my modified jar and not replace it with the old jar during the build process?
I am using maven-3.2.5.
You can run maven offline with the "-o" argument.
Example:
mvn clean install -o
Keep in mind that this will affect all your other dependencies, and you need to have all of the dependencies in your local .m2 repository.
Here is another thread taking up the issue of running maven offline:
How do I configure Maven for offline development?
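A more durable alternative to offline mode is to install the patched jar under a distinct version, so Maven never tries to replace it with the remote copy; all coordinates below are placeholders for your 3rd-party jar:
mvn install:install-file \
    -Dfile=patched-library.jar \
    -DgroupId=com.thirdparty \
    -DartifactId=library \
    -Dversion=1.4.0-PATCHED \
    -Dpackaging=jar
Then point the dependency entry in your pom.xml at version 1.4.0-PATCHED.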
I have a project that is building fine on my laptop. Today I started to set up the build for this on our Bamboo server. Everything is checked in. Both my laptop and the build server are using Maven 3.0.4.
I have a top-level aggregator pom that specifies several modules, but this pom is not the parent of any module. I do use parent poms, but those parents are in peer submodules of the submodules that depend on them, and I have blank "relativePath" elements in all poms.
In the Bamboo build of the top-level aggregator POM, I see several errors like this:
[ERROR] The project com.example.cde:java-project-parent:1.0.1 (/volatile/bamboo/bamboo3.4.3_data/xml-data/build-dir/FOO-BUILD-JOB1/java-project-parent/pom.xml) has 1 error
18-Dec-2012 16:40:21 [ERROR] Non-resolvable parent POM: Failure to find com.example.cde:project-parent:pom:1.0.0 in http://hostname.net:8081/nexus/content/groups/stuff was cached in the local repository, resolution will not be reattempted until the update interval of nexus has elapsed or updates are forced and 'parent.relativePath' points at no local POM # line 6, column 11 -> [Help 2]
The "java-project-parent" is one of the poms in the parent hierarchy.
What I've discovered is that running "mvn install" in the top-level aggregator pom isn't actually installing the artifacts in the submodules. When I looked in the local repo, the only thing in each directory in the local repo was a file like "...pom.lastUpdated". The actual POM wasn't there.
When I had the admin manually run "mvn install" in the first submodule, that actually installed the POM into the local repo. I have a feeling if he manually installs the other two parent poms, the build of the project that depends on all three of them will build fine.
I must be misunderstanding an important detail of how a build with submodules works. What am I missing?
Run maven clean install with the force-update option, as below:
mvn clean install -U
Your hierarchy is probably broken. You can test that by building it on your local machine after wiping the local repository. Most likely you will find the same failures as on the build machine.
To fix it, I would suggest removing all the relativePath elements and adjusting the structure so that a build will work fine. Ideally you would even break the pure parent projects out into separate projects and release them into your repository manager, so that any other builds get them from there.
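For example, to reproduce the build server's state locally (assuming your own artifacts live under the com.example.cde group, as in the error above):
# wipe only your own group from the local repo, then rebuild from the aggregator
rm -rf ~/.m2/repository/com/example/cde
mvn -U -e clean install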
Try doing a mvn -U install to force a mvn trip to your nexus repo for updated artifacts.
Also run mvn with the -e switch to see detailed error messages