How to version artifacts on Artifactory without overwriting - Jenkins

I'm trying to extend our Jenkins job (which builds the entire project) to deploy the built artifacts to our Artifactory instance, but I ran into some problems related to the versioning of the artifacts. If I try to redeploy an artifact whose version didn't change (not a snapshot), I get a 403 error (user 'foo' needs DELETE permission), which is understandable: I should not replace an already released artifact. If the artifact version contains -SNAPSHOT, there is no problem; it is always uploaded. My question is: how should we approach this scenario, where Artifactory locks overwriting?
Shouldn't the Artifactory plugin for Jenkins just skip deploying the artifact if it is already deployed, instead of failing the job?
Or should we always use -SNAPSHOT (during development), even if the artifact has not changed?
Do we increase the version on every release even if the artifact has not changed?

Shouldn't the Artifactory plugin for Jenkins just skip deploying the artifact if it is already deployed, instead of failing the job?
The job should fail if the artifact is already deployed with a fixed (non-SNAPSHOT) version. For instance, on a manual job trigger, I would like to know if I tried to build and deploy using a version name that has already been published (maybe by someone else on the team).
Or should we always use -SNAPSHOT (during development), even if the artifact has not changed?
-SNAPSHOT is made for development. Yes, we usually push the artifact at the end of the build even if it did not change, for example because you only updated a README and the job was triggered.
Usually SNAPSHOT artifacts have a lifetime that depends on how your binary repository (here Artifactory) is configured; snapshots can be cleaned up every two weeks, for instance.
The link shared by Manuel has other interesting definitions, like:
Usually, only the most recently deployed SNAPSHOT for a particular version of an artifact is kept in the artifact repository, although the repository can be configured to maintain a rolling archive with a number of the most recent deployments of a given artifact.
https://docs.oracle.com/middleware/1212/core/MAVEN/maven_version.htm#MAVEN401
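To make the workflow concrete, here is a minimal Jenkins declarative pipeline sketch of the approach described above: snapshots are deployed on every build, while fixed versions are left to an explicit release job. The stage names and the readMavenPom step (from the Pipeline Utility Steps plugin) are assumptions, not part of the original setup.

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
        stage('Deploy snapshot') {
            steps {
                script {
                    // Read the project version from pom.xml
                    def pom = readMavenPom file: 'pom.xml'
                    if (pom.version.endsWith('-SNAPSHOT')) {
                        // Snapshots may be redeployed freely; Artifactory keeps
                        // timestamped copies instead of rejecting the upload.
                        sh 'mvn -B deploy -DskipTests'
                    } else {
                        // A fixed version is deployed only by the release job;
                        // redeploying it here would rightly be rejected with a 403.
                        echo "Version ${pom.version} is a release, skipping automatic deploy."
                    }
                }
            }
        }
    }
}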
Do we increase the version on every release even if the artifact has not changed?
Yes, we increase the version number at every release. I call a release what the customer will get. Barring exceptional circumstances, you won't go through the release process if the artifact didn't change; a release usually involves a lot of people in an organization, even people who are not from development. A popular standard is semantic versioning (https://semver.org/). Some people prefer to version with the date instead. My advice is to use semver and have a file in the artifact with the date of the build; this file can then be used by the artifact itself to report its version at runtime.
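As an illustration of that last point, here is a rough Gradle (Groovy) sketch that embeds the version and build date in a properties file inside the artifact; it assumes the java plugin is applied, and the task and property names are made up for the example.

// build.gradle fragment (sketch): embed version and build date in the jar
version = '1.4.2'   // semver: MAJOR.MINOR.PATCH

def generatedResources = layout.buildDirectory.dir('generated-resources')

tasks.register('generateBuildInfo') {
    outputs.dir(generatedResources)
    doLast {
        def dir = generatedResources.get().asFile
        dir.mkdirs()
        def buildDate = new Date().format("yyyy-MM-dd'T'HH:mm:ssZ")
        new File(dir, 'build-info.properties').text =
            "version=${project.version}\nbuildDate=${buildDate}\n"
    }
}

// Ship the generated file inside the jar so the application can read
// its own version and build date at runtime.
sourceSets.main.resources.srcDir generatedResources
processResources.dependsOn 'generateBuildInfo'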

You could work with build numbers, and then you wouldn't overwrite existing versions. Instead, a new build number could include bug fixes or security fixes.
https://docs.oracle.com/middleware/1212/core/MAVEN/maven_version.htm#A1000661
If you're consuming the dependency, you can handle the versions with expressions: either an exact version or an expression that covers the build number.
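For example, a Gradle (Groovy) dependency declaration could pin an exact build or use a version expression; the coordinates below are made up for illustration.

dependencies {
    // Exact version, pinned to a specific build number:
    implementation 'com.example:serial-framework:1.2.0-3'

    // Prefix expression: any version in the 1.2 line
    // (the highest available match is selected):
    // implementation 'com.example:serial-framework:1.2.+'

    // Range expression: from 1.2.0 (inclusive) up to 1.3.0 (exclusive):
    // implementation 'com.example:serial-framework:[1.2.0,1.3.0)'
}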

Related

Jenkins Multibranch Pipeline can't find Jenkinsfile in subdirectory using svn

I'm trying to set up a build using Multibranch. I'm basically having the same problem as stated here, but our SCM is Subversion. The bug in the Bitbucket Branch Source Plugin described here can therefore be ruled out, especially since our Jenkins has the newest version installed anyway.
I tried to find a similar ticket regarding my problem, but couldn't find one, so here I am.
As this particular project is configured so that configuration files (including things like the Jenkinsfile) are stored in a subfolder, I don't know what else to try, apart from configuring individual jobs. I'd rather stick to using multibranch pipelines, however, as they help keep the build jobs tidy.

Versioning of artifacts - CI/CD

The build process for our Java code currently generates artifacts whose names contain no version number.
For example: serial-framework-SNAPSHOT.jar
We are currently in the build phase of the CI/CD pipeline.
None of the artifacts generated through the Maven build for the back-end services carry a version number in the build phase of the CI/CD pipeline.
Only the dependent artifacts that are required to build a specific jar are stored in JFrog Artifactory.
1) Do the artifacts need to be versioned for the Build/QA/Prod phases of the CI/CD pipeline?
2) Does every artifact need to be stored in JFrog Artifactory? Currently, only the dependent artifacts required by pom.xml during the Maven build are stored in JFrog.
Versioning was well explained in the first answer by snukone. Note the points below, which might be helpful regarding versioning:
1) For development, always use a version of the form "versionNumber-SNAPSHOT" (capital letters)
E.g. 1.0-SNAPSHOT
2) For the test/prod branches, use a version of the form "versionNumber-RELEASE"
E.g. 1.0-RELEASE
a) Snapshots are mutable, so they are used for development purposes.
b) Releases are immutable. Once committed, the artifact cannot be overwritten in the Artifactory, so releases are used for higher environments.
c) Snapshots capture a work in progress and are used during development. A snapshot artifact has both a version number such as "1.3.0" or "1.3" and a timestamp. For example, a snapshot artifact for commons-lang 1.3.0 might have the name commons-lang-1.3.0-20090314.182342-1.jar
So in your case, if you are using "serial-framework-SNAPSHOT", it will be stored as "serial-framework-version-timestamp.jar" in your Artifactory.
Similarly, if you are using "serial-framework-RELEASE", it will be stored as "serial-framework-version.jar" in your Artifactory.
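As a rough sketch of how this split can be wired up on the build side, a Gradle (Groovy) maven-publish configuration might route snapshots and releases to different Artifactory repositories based on the version suffix; the repository URLs and credential property names below are placeholders, not your actual setup.

plugins {
    id 'java'
    id 'maven-publish'
}

group = 'com.example'
version = '1.0-SNAPSHOT'   // or a fixed version such as '1.0' for a release build

publishing {
    publications {
        mavenJava(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven {
            // Snapshots go to a mutable snapshot repo, releases to an
            // immutable release repo that rejects overwrites.
            def repoName = version.endsWith('-SNAPSHOT') ? 'libs-snapshot-local' : 'libs-release-local'
            url = "https://artifactory.example.com/artifactory/${repoName}"
            credentials {
                username = findProperty('artifactoryUser')
                password = findProperty('artifactoryPassword')
            }
        }
    }
}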
How versioning helps:
Versioning helps when you want to restore an older version of your application (for example, due to bugs that heavily decrease performance in production).
If you are running integration tests at the API or UI level, you can specify which versions fit together (e.g. via contract testing: https://github.com/pact-foundation/pact_broker).
Default cleanup processes help prevent your Artifactory from using huge amounts of storage.
Storing every artifact or not?
My personal experience: just store the artifacts that other artifacts depend on, such as libraries. If you are working with Docker containers, you should also think about versioning the Docker images you produce on every build, as sketched below.
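A minimal sketch of that idea, as a Jenkins pipeline stage (Groovy); the registry, image name and version source are assumptions for the example:

stage('Build and push image') {
    steps {
        script {
            def appVersion = '1.4.2'   // e.g. read from the build file
            def imageTag   = "${appVersion}-${env.BUILD_NUMBER}"
            // Tag with version + build number so every build stays traceable
            // and no existing tag is overwritten.
            sh "docker build -t registry.example.com/serial-framework:${imageTag} ."
            sh "docker push registry.example.com/serial-framework:${imageTag}"
        }
    }
}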

Staging of Repository within profile ID='X' is not yet started

Trying to deploy to a staging repository leads, on the Maven side, to
400 , ReasonPhrase:Bad Request
and the server log contains
Staging of Repository within profile ID='X' is not yet started!
It makes no difference whether I use the maven-deploy-plugin or the maven-release-plugin; all attempts lead to the error above.
My deployment user has (admin) rights to deploy to every staging profile.
maven-release-plugin:
mvn release:stage -DstagingRepository=nexus::default::http://localhost:8081/nexus/service/local/staging/deploy/maven2
If I don't use versions with a SNAPSHOT qualifier (or similar) with the maven-release-plugin, the nexus-staging-plugin works fine.
What did I miss?
Staging of snapshot versions is not allowed, you need to use release versions.
At first glance you might think that this could be done by having Nexus rewrite the pom files and rename the artifacts. But it's not that simple: the version number is often embedded in the artifacts themselves. This is particularly true of assembled artifacts such as war/ear files; you'll find the version numbers inside contained artifacts and inside configuration files within the artifact. Even if these could be rewritten by Nexus, changing the version numbers potentially changes the behavior of the artifacts. In any case, Nexus will not change staged artifacts; any changes made could potentially lead to regressions. Staged artifacts (like all artifacts deployed to Nexus) are immutable.
Consequently, you need to use a release version when staging.
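A hedged sketch of how a Jenkins pipeline stage (Groovy) could switch to a release version before staging; the versions-maven-plugin goal is standard, but the version value and stage name are just examples:

stage('Stage release') {
    steps {
        // Replace the -SNAPSHOT version with a fixed release version,
        // e.g. 1.0.0-SNAPSHOT -> 1.0.0 (the value here is an example).
        sh 'mvn -B versions:set -DnewVersion=1.0.0 -DgenerateBackupPoms=false'
        // Deploy/stage with the release version; staging a SNAPSHOT would be
        // rejected, as described in the answer above.
        sh 'mvn -B clean deploy'
    }
}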

Jenkins putting '$' characters in file/folder names, breaking automatic backups

I'm using Jenkins v1.546, hosted on a Windows Server 2008 R2 SP1 machine.
I've set up a fairly simple job for building a Maven Java project. It polls the SCM with no schedule and picks up remote build triggers, requiring an authentication token. It uses Subversion and performs clean checkouts with svn update. Additionally, it has a post-build step that archives some build artifacts (i.e., the resulting WAR and WSDLs).
The issue I'm experiencing is that the builds it stores on the filesystem contain invalid characters in their file names. This causes our automatic backup process to blow up, since it is unable to alter or remove the directories/files containing the '$'. I myself cannot move or delete those folders or files either, but if I rename them and remove the $, then things work fine. Also, if I try to follow one of the links with the $ in it, it doesn't resolve. None of the other jobs seem to do this - just my job, of course. Does anyone know why this may be occurring and what I can do to resolve it?
I've attached multiple screenshots that show the bad filename and my Jenkins job setup. I had to white out some company information. If I can provide any additional information to help troubleshoot this, just let me know.
Also, as an update, I did some additional research, looking through the changelogs for each released version of Jenkins since my version (latest is 1.557). I saw three possible issues in the changelogs that could be related, but it's hard for me to tell. I cannot simply upgrade our Jenkins to test out this theory, since I'll need to provide a reason for upgrading beyond a hunch.
https://issues.jenkins-ci.org/browse/JENKINS-21023
https://issues.jenkins-ci.org/browse/JENKINS-20534
https://issues.jenkins-ci.org/browse/JENKINS-21958
The $ is a perfectly valid character in a Windows directory name. You can manually make a folder with it and delete it without any problems.
The com.company$moduleName syntax is used by Jenkins Maven-style job to separate modules of your build. If you don't see this structure for other people's jobs, it is because they are either not building a Maven job, or they don't have multiple modules in a single job.
What is strange, though, is that these are symlinks (I don't see that in my environment). It is possible that the location referenced by the symlink was deleted but the link remains. In that case, you would not be able to navigate to that location through the link (which is what you are experiencing).
Is it possible that your backup software is deleting the target directories before deleting the links?
In any case, do a simple dir on the directory with the links to see what they link to, and then verify that those target locations exist. If they don't, you need to figure out who or what is deleting the links' targets.
Edit:
This seems to be more related to the issue that you are facing. Unfortunately, it's marked as "unresolved"
https://issues.jenkins-ci.org/browse/JENKINS-20725
The issue stems from the fact that the symlinks reference targets with / instead of \
My Maven plugin (not Maven version) is 2.6. See if upgrading your Maven plugin in Jenkins will help you. Also, I am running Maven 3.2.2 from the automatic installers. Try with that, as I don't see symlinks in my modules.

How do I do release versioning with Gradle and Jenkins?

We're building a continuous integration pipeline for the project I'm working on. We have a number of build artifacts (both JAR and WAR files) which we have versioned and deployed to an Artifactory server.
All our JARs start at version 0.0.1-SNAPSHOT. As we develop, we'd like to mark milestones by setting a particular point in the codebase as 0.0.1, and starting development on 0.0.2-SNAPSHOT. Eventually, a particular version will get accepted by QA, and promoted to 0.1.0, and we will start working on 0.1.1-SNAPSHOT. The same process will happen with a release to Production, when we reach 1.0.0.
I can't seem to find a plugin for Jenkins that supports this kind of versioning. Ideally, it would track the current version of each WAR and JAR, and once it hit a particular point (after running acceptance tests) it would automatically increment the version. Does such a thing exist?
You can make use of the gradle-release plugin. The different approaches are documented here.
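For instance, a minimal Gradle (Groovy) sketch using the net.researchgate release plugin, assuming maven-publish is already configured for your Artifactory; the plugin version and tag template are just examples:

plugins {
    id 'java'
    id 'maven-publish'
    id 'net.researchgate.release' version '3.0.2'   // example version
}

// The plugin manages the version in gradle.properties, e.g. version=0.0.2-SNAPSHOT.
// Running `./gradlew release` checks for uncommitted changes, strips -SNAPSHOT,
// tags the release, and bumps to the next snapshot (0.0.3-SNAPSHOT by default).

release {
    tagTemplate = 'v$version'
}

// Publish the release artifacts (with the fixed version) as part of the release.
afterReleaseBuild.dependsOn 'publish'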
