Use multiple tags when creating Docker image using Spring Boot Maven Plugin - docker

I am using the Spring Boot Maven Plugin to create Docker images. They are tagged with latest, but I would like to have two tags added to them.
This is my current config:
<plugin>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-maven-plugin</artifactId>
    <configuration>
        <image>
            <name>myacr.azurecr.io/${project.artifactId}</name>
        </image>
    </configuration>
</plugin>
I would like to have latest and a certain build number (which will come from the Azure DevOps pipeline).
Is this possible with the Maven plugin? I could not find any info in the docs about it.

Is this possible with the Maven plugin? I could not find any info in the docs about it.
Indeed, as the user pointed out, the Spring Boot Maven Plugin does not support multiple tags when creating a Docker image at the moment:
Add option to create tags for the built image
As a workaround, we could use another plugin, like the jib-maven-plugin:
-Djib.to.tags=a,b,c
You could check this thread for some more details.
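For illustration, a jib-maven-plugin configuration along those lines could look like the sketch below; the build.number property is a placeholder that the Azure DevOps pipeline would pass in, for example with -Dbuild.number=$(Build.BuildId), and the plugin version is taken from the demo further down:
<plugin>
    <groupId>com.google.cloud.tools</groupId>
    <artifactId>jib-maven-plugin</artifactId>
    <version>2.5.0</version>
    <configuration>
        <to>
            <!-- image name taken from the question; build.number is a placeholder
                 property supplied by the pipeline, e.g. -Dbuild.number=$(Build.BuildId) -->
            <image>myacr.azurecr.io/${project.artifactId}</image>
            <tags>
                <tag>latest</tag>
                <tag>${build.number}</tag>
            </tags>
        </to>
    </configuration>
</plugin>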

Related

Option to auto-generate Dockerfile and other deployment tooling in IntelliJ?

I'm exploring the initial steps of containerising a Tomcat-based Java project using Docker. With IntelliJ as my preferred IDE, I have successfully:
written a proof-of-concept Servlet;
set up a build artefact to create the resulting WAR;
with the IntelliJ Docker plugin and one of the official Tomcat Docker images, set up a container configuration that includes the WAR contents as one of its mount points;
deployed the container to Docker locally through IntelliJ and confirmed that I can successfully hit the Servlet through my local browser.
So in terms of the basic development cycle, I'm up and running.
But when I eventually come to external deployment (and even at some point during the development process), I will need to add libraries and resources and generate a truly self-contained container: in other words, I will need to go from the simple deployment that the IntelliJ plugin is currently doing of an "image with mount points" to having a full-fledged Dockerfile with all the relevant configuration specified, including my mounts effectively being translated into instructions to copy in the relevant content.
Now my question: how do people generally achieve this? Is there tooling built into IntelliJ that will assist with this? In the container deployment configuration settings in IntelliJ (where the mount points, base image etc are specified), there doesn't seem to be an option to configure resources to copy, for example (or an option to "copy into standalone container rather than mount from host FS"). Am I missing a tool/option somewhere, or is the scripting of the Docker file essentially a manual process? Or am I just barking up the wrong tree with my whole approach? I'd appreciate any advice on the process that people generally use for this!
Jib by Google
I think Jib would provide what you need. It also provides plugins for both Maven and Gradle, and the respective plugin can be triggered in IntelliJ via a Run/Debug Configuration (see the example at the very bottom).
What is Jib?
Jib builds optimized Docker and OCI images for your Java applications without a Docker daemon - and without deep mastery of Docker best-practices. It is available as plugins for Maven and Gradle and as a Java library.
What does Jib do?
Jib handles all steps of packaging your application into a container image. You don't need to know best practices for creating Dockerfiles or have Docker installed.
Jib organizes your application into distinct layers; dependencies, resources, and classes; and utilizes Docker image layer caching to keep builds fast by only rebuilding changes. Jib's layer organization and small base image keeps overall image size small, which improves performance and portability.
Configuration
You can check the documentation. It contains a lot of information about the different configuration options for creating and deploying a Docker image, where you can also simply make use of environment variables.
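Most of these options can also be passed as system properties on the command line, for instance -Djib.to.image for the target image or -Djib.to.tags for additional tags.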
For projects with Maven:
https://github.com/GoogleContainerTools/jib/tree/master/jib-maven-plugin
Build your image
Build to Docker daemon
Build an image tarball
For projects with Gradle:
https://github.com/GoogleContainerTools/jib/tree/master/jib-gradle-plugin
Same options as for Maven
Regarding your question, check this part for example: adding Arbitrary Files to the Image
In the container deployment configuration settings in IntelliJ (where the mount points, base image etc are specified), there doesn't seem to be an option to configure resources to copy, for example (or an option to "copy into standalone container rather than mount from host FS").
Demo
For demonstration purposes, I've created a simple project with Maven, where I used the base image tomcat:9.0.36-jdk8-openjdk, which is optional by the way - see Jib WAR Projects:
Servlet:
import java.io.IOException;
import java.io.PrintWriter;
import javax.servlet.annotation.WebServlet;
import javax.servlet.http.*;

@WebServlet(urlPatterns = {"/hello-world"})
public class HelloWorld extends HttpServlet {
    public void doGet(HttpServletRequest request, HttpServletResponse response) throws IOException {
        response.setContentType("text/html");
        PrintWriter out = response.getWriter();
        out.println("Hello World");
    }
}
pom.xml:
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <groupId>org.example</groupId>
    <artifactId>mvn-jib-example</artifactId>
    <version>1.0</version>
    <packaging>war</packaging>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <failOnMissingWebXml>false</failOnMissingWebXml>
    </properties>

    <dependencies>
        <dependency>
            <groupId>javax.servlet</groupId>
            <artifactId>javax.servlet-api</artifactId>
            <version>4.0.1</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>

    <build>
        <finalName>servlet-hello-world</finalName>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>com.google.cloud.tools</groupId>
                <artifactId>jib-maven-plugin</artifactId>
                <version>2.5.0</version>
                <configuration>
                    <allowInsecureRegistries>true</allowInsecureRegistries>
                    <from>
                        <image>tomcat:9.0.36-jdk8-openjdk</image>
                    </from>
                    <to>
                        <image>registry.localhost/hello-world</image>
                        <auth>
                            <username>registry_username</username>
                            <password>registry_password</password>
                        </auth>
                        <tags>
                            <tag>latest</tag>
                        </tags>
                    </to>
                    <container>
                        <appRoot>/usr/local/tomcat/webapps/ROOT</appRoot>
                    </container>
                    <extraDirectories>
                        <paths>
                            <path>
                                <from>./src/main/resources/extra-stuff</from>
                                <into>/path/in/docker/image/extra-stuff</into>
                            </path>
                            <path>
                                <from>/absolute/path/to/other/stuff</from>
                                <into>/path/in/docker/image/other-stuff</into>
                            </path>
                        </paths>
                    </extraDirectories>
                </configuration>
            </plugin>
        </plugins>
    </build>
</project>
Executing the following goals will create the docker image on the fly:
mvn clean package jib:dockerBuild
Confirm that the image was created:
docker image ls
Starting a container from the image:
docker run --rm -p 8082:8080 registry.localhost/hello-world:latest
Result: the container serves the servlet, and opening http://localhost:8082/hello-world in the browser returns "Hello World".
Deployment
To deploy the image to an external docker registry, you can check the sections below:
Authentication Methods
Using Specific Credentials
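As an alternative to the inline <auth> block shown in the demo above, Jib can also read registry credentials from the Maven settings.xml, where the <server> id matches the registry host; a minimal sketch with the placeholder credentials from the demo:
<!-- ~/.m2/settings.xml -->
<settings>
    <servers>
        <server>
            <!-- the id must match the registry used in <to><image> -->
            <id>registry.localhost</id>
            <username>registry_username</username>
            <password>registry_password</password>
        </server>
    </servers>
</settings>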
IDE
Last but not least, since you are working with IntelliJ IDEA, you can simply create Run/Debug configurations to automate the image creation and deployment at the push of a button, e.g. one configuration for building the image, one for deploying it to localhost, one for deploying to an external registry, and so on.
Here is an example for Maven: a Run/Debug configuration that runs the clean package jib:dockerBuild goals from above.
The project that I am working on right now uses Spring Boot, which actually has Tomcat embedded. I use the Docker Gradle plugin (https://plugins.gradle.org/plugin/com.bmuschko.docker-spring-boot-application) to build and push the Docker image to a registry, which can be Docker Hub or AWS ECR. The combination plays well with IntelliJ, as it is a Gradle task anyway.
Because it is Spring Boot, the plugin can build an image based on any basic JRE image (I use https://hub.docker.com/_/adoptopenjdk) with minimal configuration. You do not need to write your own Dockerfile at all.
docker {
    def registryHost = 'xxx.dkr.ecr.us-west-2.amazonaws.com'
    springBootApplication {
        baseImage = "${registryHost}/caelus:springboot-jdk14-openj9"
        images = ["${registryHost}/caelus:app"]
        ports = [8080, 8081]
        jvmArgs = ['-Djdk.httpclient.allowRestrictedHeaders=content-length']
    }
}
I suggest these questions:
How to use docker in the development phase of a devops life cycle?
How to deploy java application in a cloud instance from the scratch to an advanced architecture?
What code-repository should the Dockerfile get committed to?
In summary, IntelliJ, Eclipse, and Visual Studio are just IDEs, so they are not an option for deployment in real environments.
If you are talking about external deployment, you need somewhere to store your Docker images and, at minimum, a continuous integration server (Jenkins, Travis, Bamboo, CircleCI, buddy.works).
Basic Flow
An architect, sysadmin, senior developer, or someone with infrastructure skills must create the Dockerfile and other required files.
The developer does not need to worry about Docker, volumes, ports, etc. The developer only needs to develop code (Java in your case).
The developer performs a git push.
Your continuous integration server detects this event and starts the docker build.
After the docker build, the continuous integration server pushes the newly created Docker image to your Docker Hub repository.
Using some configuration, your continuous integration server knows where the deployment is required (external deployment, as you say). An example could be the next classic environment: testing or staging. In this case deployment is just the download of the requested Docker image.
If the Quality Assurance team and the automated tests confirm that everything is fine, your continuous integration server performs the last step: deploying the Docker image to your production environment, which again is just a Docker image download.
Your questions
is the scripting of the Docker file essentially a manual process?
As I explained, the Dockerfile is the cornerstone of all of this. Its creation is manual, and it gets fun or challenging if you need fine-grained control or you are an artisan, for example:
Tomcat user configuration at container start
advanced Tomcat variable configuration
any advanced Tomcat configuration in which a human is required, but which you want to automate.
A Java WAR inside tomcat/webapps is a very common requirement, so you will find a lot of Dockerfiles, or you could use the one generated by IntelliJ if it meets your requirements.
Feel free to contact me if you don't find a Dockerfile for your Java app.

Q: How can I save an artifact into Nexus Repository using a groovy pipeline?

My question is about saving artifacts into a repository. Specifically, I am trying to upload artifacts and release versions into the Nexus Repository after the execution of a build pipeline for a Maven project (through Jenkins).
I want to do this only by using a pipeline written in Groovy, so that it integrates with Jenkins.
Note: I want the artifact version number to always stay the same, and the version number to change dynamically (not manually).
Is there a command, or code in general, that enables me to do that?
You are on the wrong level; this should happen in Maven.
In pom.xml you need the following (more here):
<distributionManagement>
    <snapshotRepository>
        <id>nexus-snapshots</id>
        <url>http://localhost:8081/nexus/content/repositories/snapshots</url>
    </snapshotRepository>
</distributionManagement>
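Since the question also mentions release versions, a <repository> entry for releases can sit next to the snapshot repository; the id and URL below are assumptions following the same Nexus layout:
<distributionManagement>
    <repository>
        <id>nexus-releases</id>
        <url>http://localhost:8081/nexus/content/repositories/releases</url>
    </repository>
    <snapshotRepository>
        <id>nexus-snapshots</id>
        <url>http://localhost:8081/nexus/content/repositories/snapshots</url>
    </snapshotRepository>
</distributionManagement>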
and then, in the plugins section:
<plugin>
    <artifactId>maven-deploy-plugin</artifactId>
    <version>2.8.2</version>
    <executions>
        <execution>
            <id>default-deploy</id>
            <phase>deploy</phase>
            <goals>
                <goal>deploy</goal>
            </goals>
        </execution>
    </executions>
</plugin>
and you should be able to just do mvn clean deploy from your pipeline.
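For the deploy to authenticate against Nexus, the repository id from distributionManagement has to match a <server> entry in the settings.xml of the machine running the pipeline; a minimal sketch with placeholder credentials:
<settings>
    <servers>
        <server>
            <!-- must match the id used in <snapshotRepository> -->
            <id>nexus-snapshots</id>
            <username>nexus_user</username>
            <password>nexus_password</password>
        </server>
    </servers>
</settings>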
EDIT
There is another way, with the Nexus Artifact Uploader plugin:
nexusArtifactUploader {
    nexusVersion('nexus2')
    protocol('http')
    nexusUrl('localhost:8080/nexus')
    groupId('sp.sd')
    version("2.4.${env.BUILD_NUMBER}")
    repository('NexusArtifactUploader')
    credentialsId('44620c50-1589-4617-a677-7563985e46e1')
    artifact {
        artifactId('nexus-artifact-uploader')
        type('jar')
        classifier('debug')
        file('nexus-artifact-uploader.jar')
    }
    artifact {
        artifactId('nexus-artifact-uploader')
        type('hpi')
        classifier('debug')
        file('nexus-artifact-uploader.hpi')
    }
}
As @hakamairi already said, it is not recommended to re-upload artifacts with the same version to a Nexus repository; Maven is built around the idea that an artifact's GAV always corresponds to a unique artifact.
However, if you want to allow re-deployment, you need to set the deployment policy of a release repository to "allow redeploy"; then you can redeploy the same version. You cannot do that without allowing it on the repository side.
And for deploying to a Nexus repo, you can use either the Nexus Platform Plugin or the Nexus Artifact Uploader.
ADDITIONAL SOLUTION THAT ALSO WORKS
I executed it manually and exported the result of the Nexus call. The result was the following command, which needs to be inserted into the Jenkins pipeline as Groovy code:
nexusPublisher nexusInstanceId: 'nexus', nexusRepositoryId: 'maven-play-ground', packages: [[$class: 'MavenPackage', mavenAssetList: [[classifier: '', extension: '', filePath: '**PATH_NAME_OF_THE_ARTIFACT**.jar']], mavenCoordinate: [artifactId: '**YOUR_CUSTOM_ARTIFACT_ID**', groupId: 'maven-play-ground', packaging: 'jar', version: '1.0']]], tagName: '**NAME_OF_THE_FILE_IN_THE_REPOSITORY**'
In the filePath field we need to insert the path and the name of the artifact .jar file.
In the artifactId field we need to insert the custom artifact id (in this case, of my artifact).
In the tagName field we need to insert the custom name of the directory inside the Nexus Repository.
This is a solution that runs automatically, without manual changes and edits. Once we have created the directory in the Nexus repository, this is executed without any issue and without the need to change the version number.
Note: we also need to enable the redeploy feature in the Nexus Repository settings.

How to use Confluence markup language with Maven 3 site Plugin?

I'd like to use Confluence markup language to generate my site using the Maven site plugin.
As stated in the Doxia documentation, it seems to be possible.
Here is my file structure (as required by the site plugin documentation):
src
+- site
   +- confluence
      +- index.confluence
But just like that, nothing is generated. After looking at the FAQ, I tried to include the doxia-module-confluence dependency in the plugin build:
<build>
    <plugins>
        <plugin>
            <artifactId>maven-site-plugin</artifactId>
            <version>3.1</version>
            <dependencies>
                <dependency>
                    <groupId>org.apache.maven.doxia</groupId>
                    <artifactId>doxia-module-confluence</artifactId>
                    <version>1.3</version>
                </dependency>
            </dependencies>
        </plugin>
    </plugins>
</build>
But it's still the same: my index page is not rendered (I get the default one).
Does somebody know how to do this?
Actually, the file structure + the usage of that dependency (doxia-module-confluence) is OK.
The problem was that I first generated the page without the dependency (so it didn't use my Confluence file to generate the index), and then added the dependency. But I didn't run mvn clean before the second mvn site, so the previously generated index.html was used directly!
Generated Maven sites do not match the Confluence style very well.
If you need to update Confluence pages at each deploy or release, I can suggest a small open-source plugin I wrote called confluence-maven-plugin.
That way, you can use the Maven site for other purposes...

Maven: dynamic specification of dependencies

I have started to learn Maven and have the following question:
I would like to dynamically specify a dependency when building a Maven project instead of using the dependency specified in the POMs - is there a way to do that?
So although I have the following dependencies specified in the POM,
...
<dependencies>
    <dependency>
        <groupId>group</groupId>
        <artifactId>ProjectComponent</artifactId>
        <version>1.0</version>
    </dependency>
...
I would like to specify in the build command that I want to use a different version.
Is there a way to specify this?
The idea is that I want to have an integration build in Jenkins with a dependency on the latest available snapshot of the system for a particular branch. That snapshot is not released to the Maven repository yet, so I would like to fetch it in Jenkins and specify it as a dependency for the mvn build.
Thanks!
POSSIBLE SOLUTION: What I ended up with is using the ${my.lib.version} construction and specifying it with -Dmy.lib.version=1.0-SNAPSHOT when calling mvn. Thus I can use it for Jenkins integration builds by fetching arbitrary snapshot versions of dependencies from SVN and feeding their snapshot versions to the integration build POM.
Maven may use a "dynamically" specified property (e.g. group.ProjectComponent.version) with the help of profiles.
<dependencies>
    <dependency>
        <groupId>group</groupId>
        <artifactId>ProjectComponent</artifactId>
        <version>${group.ProjectComponent.version}</version>
    </dependency>
</dependencies>
So if you create some profiles, you may switch between them (see the Maven references).
Example:
<profile>
    <id>stable-builds</id>
    <properties>
        <group.ProjectComponent.version>1.0</group.ProjectComponent.version>
    </properties>
    <activation>
        <activeByDefault>true</activeByDefault>
    </activation>
</profile>
<profile>
    <id>beta-builds</id>
    <properties>
        <group.ProjectComponent.version>2.0.Beta1</group.ProjectComponent.version>
    </properties>
    <activation>
        <activeByDefault>false</activeByDefault>
    </activation>
</profile>
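A specific profile can then be selected on the command line for a given build, e.g. mvn clean install -Pbeta-builds.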
What I ended up with is using the ${my.lib.version} construction and specifying it with -Dmy.lib.version=1.0-SNAPSHOT when calling mvn. Thus I can use it for Jenkins integration builds by fetching arbitrary snapshot versions of dependencies from SVN and feeding their snapshot versions to the integration build POM.
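A minimal sketch of that approach, reusing the dependency from the question (the property name and its default value are only illustrative):
<properties>
    <!-- default for local builds; Jenkins overrides it with -Dmy.lib.version=1.0-SNAPSHOT -->
    <my.lib.version>1.0</my.lib.version>
</properties>

<dependencies>
    <dependency>
        <groupId>group</groupId>
        <artifactId>ProjectComponent</artifactId>
        <version>${my.lib.version}</version>
    </dependency>
</dependencies>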
Just came across this as I was looking for something similar. In my case, the same application code is being reused on different stacks, which means using different "drivers" for accessing data. Although the drivers implement the same interface, they come from different artifacts.
No, you can't change the dependencies dynamically. Furthermore, it does not make sense, because you would want to have a reproducible build.

Using Maven API download a JAR

In a simple Java program, how can I download a JAR from a Maven repository?
The repository can be local as well as remote. I am using Maven 3.
As noted by @amit, your use of Maven 3 is not relevant. Your application is interested in accessing a JAR at runtime. It just so happens that this JAR is available at a Maven repository. Maven is a build-time tool. It cannot help you at runtime.
So if we have interpreted your question correctly, the issue is one of formulating the URL and making an HTTP request. Since you say the JAR is hosted by a Maven repository, you know the format of the URL:
http://repository.url/group_id_segments_separated_with_slashes/artifact_id/version/artifact_id-version.jar
You can take advantage of this in your program if you need to access more than one JAR in this fashion.
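For example, applying that scheme to the artifact com.sun.jersey:jersey-server:1.11 mentioned in the answer below gives http://repository.url/com/sun/jersey/jersey-server/1.11/jersey-server-1.11.jar, with repository.url replaced by the actual repository host (for Maven Central that would be https://repo1.maven.org/maven2).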
Define the necessary mapping for dependency tags in the pom.xml and then provide the repository information inside repository tags, for example:
<dependency>
    <groupId>com.sun.jersey</groupId>
    <artifactId>jersey-server</artifactId>
    <version>1.11</version>
</dependency>
and
<repository>
    <id>snapshot-repository.java.net</id>
    <name>Java.net Snapshot Repository for Maven</name>
    <url>https://maven.java.net/content/repositories/snapshots/</url>
    <layout>default</layout>
</repository>
see more about this here
