Project Versioning - CI/CD - Jenkins - AEM

An AEM multi-module project has a pom.xml at each module level. The version can be changed at that level, and a new build then produces the artifacts with the updated version.
With versioning, a new AEM package is created for each version. We would want to uninstall/delete the old package before installing the new version.
The question is: how can we handle uninstalling/deleting the old package (the new package to be installed has a different version) in a CI/CD job? We could think of the following solutions:
Query the list of packages to get the version and use that to uninstall.
Pass the version as a parameter to the Jenkins job via "Build with Parameters". But an automatic build triggered on code check-in could have issues with this.
Has anyone faced a similar situation? Please share if there is a best practice for handling different release versions in AEM CI/CD jobs.

Querying the list of packages is your best option, I think. That way you can find any previous version of your package, no matter when you last deployed that specific package to the server.
An alternative option might be to delete the package immediately after installing it, but I fear there might be lots of problems waiting for you then: some packages need a restart, you must not delete the package too early, ...
As I said, I would go for the querying solution.
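A sketch of the querying approach, assuming the stock CRX Package Manager HTTP API (GET /crx/packmgr/list.jsp for the listing, POST with ?cmd=uninstall / ?cmd=delete against the service endpoint). The group/name/version/downloadName field names and the my-co/site-content identifiers are illustrative; verify them against your AEM instance before relying on this:

```python
import urllib.request

def old_versions(listing, group, name, new_version):
    """From a Package Manager listing (a dict with a 'results' list), pick the
    packages matching group/name whose version differs from the one we are
    about to install."""
    return [
        p for p in listing.get("results", [])
        if p.get("group") == group
        and p.get("name") == name
        and p.get("version") != new_version
    ]

def uninstall_and_delete(base_url, opener, pkg):
    """POST the uninstall and delete commands for one stale package.
    `opener` is assumed to carry your AEM credentials."""
    path = (f"{base_url}/crx/packmgr/service/.json"
            f"/etc/packages/{pkg['group']}/{pkg['downloadName']}")
    for cmd in ("uninstall", "delete"):
        opener.open(urllib.request.Request(f"{path}?cmd={cmd}", method="POST"))

# A listing shaped like the JSON returned by GET /crx/packmgr/list.jsp:
listing = {"results": [
    {"group": "my-co", "name": "site-content", "version": "1.4.0",
     "downloadName": "site-content-1.4.0.zip"},
    {"group": "my-co", "name": "site-content", "version": "1.5.0",
     "downloadName": "site-content-1.5.0.zip"},
]}
stale = old_versions(listing, "my-co", "site-content", "1.5.0")
print([p["version"] for p in stale])  # ['1.4.0']
```

The selection logic is deliberately separate from the HTTP calls, so the CI job can log which packages it is about to remove before touching the server.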

Related

Jenkins plugins for old versions

For some reasons, I have to use Jenkins 2.32 and I need to install some plugins there. The machine has no internet access, so I can only upload a plugin file and install it from the file.
So, the problem: is there an easy way to obtain the required plugin for the required Jenkins version with all its dependencies?
p.s.
I can't update Jenkins - it's out of my power.
p.p.s.
I found only a way to download old versions of a plugin, but that way I can't check the dependencies and the required Jenkins version before downloading.
I had such an environment before.
Warning: it's an annoying process.
Because there was no internet, we uploaded all plugins manually, i.e. looking at the plugin page (e.g. https://plugins.jenkins.io/git/) and then downloading the .hpi file from the archive (e.g. https://updates.jenkins.io/download/plugins/git). As you have to use a relatively old version of Jenkins, you may want to check the plugin's changelog to see whether you need an older plugin version.
In addition on each plugin page the dependencies are listed and you have to repeat the above steps for each dependency.
The only good thing is that Jenkins usually gives you hints about which dependencies are missing after you upload a plugin.
You can probably extract the information out of the plugin-versions.json in the Jenkins Update Center.
For more information about the layout of update center, see this document.
You may also find my previous response on jenkins failed to install plugins - docker image (with groovy scripts) helpful
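The lookup against plugin-versions.json can be scripted. A sketch of the idea, using hand-made data; the exact schema (per-plugin {version: {requiredCore, dependencies}} maps) is an assumption to check against the actual file, and the versions shown are illustrative:

```python
def parse(v):
    """Compare dotted version strings as tuples of integers."""
    return tuple(int(x) for x in v.split(".") if x.isdigit())

def pick_version(entries, core):
    """From a {version: metadata} map, pick the newest plugin version whose
    requiredCore is satisfied by the given Jenkins core version."""
    ok = [v for v, meta in entries.items()
          if parse(meta["requiredCore"]) <= parse(core)]
    return max(ok, key=parse) if ok else None

def collect(plugins, name, core, chosen=None):
    """Resolve a plugin and, recursively, its non-optional dependencies."""
    chosen = {} if chosen is None else chosen
    if name in chosen:
        return chosen
    version = pick_version(plugins[name], core)
    chosen[name] = version
    for dep in plugins[name][version].get("dependencies", []):
        if not dep.get("optional"):
            collect(plugins, dep["name"], core, chosen)
    return chosen

# Hand-made data in the assumed shape of plugin-versions.json:
plugins = {
    "git": {
        "2.6.0": {"requiredCore": "1.609",
                  "dependencies": [{"name": "scm-api", "optional": False}]},
        "4.0.0": {"requiredCore": "2.138",
                  "dependencies": [{"name": "scm-api", "optional": False}]},
    },
    "scm-api": {"1.3": {"requiredCore": "1.580", "dependencies": []}},
}
print(collect(plugins, "git", "2.32"))  # {'git': '2.6.0', 'scm-api': '1.3'}
```

The output is exactly the list of .hpi files to fetch from the archive and upload, so the manual dependency walk described above only has to be done once, by the script.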

Promoting NuGet Packages to release versions

We have several assemblies that we share across all our projects. Since last year we have used NuSpec files to create packages and share them all in an internal feed. The packaging is done as part of the build process (TFS 2015). Versioning is set to automatic, using date and time. The build is a CI build, triggered when merging from the Development branch to the CI branch.
When one wants to use the packages, one has to enable "Include prerelease" in the NuGet Package Manager to get them. This is fine for the period while a package is not yet completely tested but is on its way to release.
Question
What I am looking for now is a straightforward way to promote such packages, once they have been created and tested, to a release version: keeping the original Major.Minor.Revision but removing the date portion of the prerelease version, and sharing that new version in a feed, ideally the same one.
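The version rewrite itself is mechanical: with SemVer-style versions, the prerelease portion is everything after the first hyphen, so promoting just strips it. A sketch; the exact prerelease format your build stamps (shown here as a ci-date-time tag) is an assumption:

```python
def promote(prerelease_version):
    """Drop the SemVer prerelease suffix (everything from the first '-'),
    keeping the original Major.Minor.Revision."""
    return prerelease_version.split("-", 1)[0]

print(promote("2.3.1-ci-20160412-1015"))  # 2.3.1
```

The remaining work is repacking the tested .nupkg under the promoted version and pushing it back to the feed; some feed servers offer a promote operation for exactly this, otherwise re-running nuget pack/push with the stripped version does it.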

How to set NuGet restore in TFS to take the highest version available?

Let's say my solution has two package dependencies:
3rdParty.CompanyA.ProductA version 4.0.1
and 3rdParty.CompanyB.ProductB version 2.0.1
Both packages reference other package dependency "C".
The dependency on "C" for the first one is set to "higher than 1.2" (>1.2), and for the other one to "higher than 1.8" (>1.8).
When I compile the solution locally, everything works fine and the "C" version in the bin directory is the newest, 1.8, as expected.
But when I compile this solution in TFS, referencing the same .Nuget\NuGet.config file, I get the old 1.2 version of "C" in the bin directory, and therefore my product breaks.
Is there a special configuration to notify TFS restore to take the version that will match all dependencies?
Update
I think I know where my issue is coming from, and yet I still don't quite understand it.
In TFS I'm compiling and restoring packages for 8 projects.
After the compilation is over, I run the unit tests with the "Test Assemblies" step.
I found out that with the default settings my unit tests break because they reference an old DLL, but when I turn on "Run In Parallel" they work as expected.
I'm not sure I totally got your point. In my opinion, the two packages should be in the same project. If not, there will be two bin folders, one for 1.2 and one for 1.8.
It's the cousin dependencies concept in NuGet restore:
When different package versions are referred to at the same distance in the graph from the application, NuGet uses the lowest version that satisfies all version requirements (as with the lowest applicable version and floating versions rules). In the image below, for example, version 2.0 of Package B will satisfy the other >=1.0 constraint, and will thus be used:
Source: How NuGet resolves package dependencies
The TFS NuGet restore task also uses NuGet.exe, so there shouldn't be any additional settings or parameters. I suggest you enable debug mode in TFS and go through the more detailed logs for troubleshooting. Otherwise, just use the simple workaround of changing the dependency for the first package to "higher than 1.8" (>1.8) as well.
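The quoted rule can be sketched as "the lowest version that clears every lower bound". This is a simplification: real NuGet version ranges are richer than the plain exclusive > bounds assumed here:

```python
def parse(v):
    """Compare dotted version strings as tuples of integers."""
    return tuple(int(x) for x in v.split("."))

def lowest_applicable(available, lower_bounds):
    """NuGet-style resolution for cousin dependencies: the lowest available
    version that is strictly greater than every '>' lower bound."""
    ok = [v for v in available
          if all(parse(v) > parse(b) for b in lower_bounds)]
    return min(ok, key=parse) if ok else None

# Package "C" under the constraints from the question, >1.2 and >1.8;
# the set of versions actually available on the feed is assumed:
print(lowest_applicable(["1.2", "1.5", "1.8", "2.0"], ["1.2", "1.8"]))  # 2.0
```

Note that under a strict >1.8 bound, 1.8 itself is not applicable; the resolver moves up to the next available version rather than the highest one.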
Update
I suggest you just turn on "Run In Parallel" in the VS Test task; tests will then run in parallel, leveraging the available cores of the machine. For how tests are run in parallel, please refer to Parallel Test Execution. It's a good choice in this situation.

How to force maven release plugin to use previous release dependencies?

I need some help with the Maven release plugin!
FIRST
I encountered an issue that is obstructing my development on Jenkins.
After installing the maven-release-plugin on Jenkins I've tested it.
The fact is that for a project without any dependencies, the plugin works
well.
But since I have some project dependencies it doesn't. Indeed after
selecting the "Default versioning mode" as "Specify version(s)" ,when I
save he configuration and try to perform maven release build, the "Default
versioning mode" parameter keeps its state, whatever I change it to and
simply goes back to ''none'' state. There is no way to modify it !!!
My main goal here is to specify my release version of a dependency and
this is what the "Default versioning mode" is done for because
maven-release-plugin can't work with SNAPSHOT dependencies !!
SECONDLY
Finally, to solve this problem, I downgraded the Maven release plugin to version 0.7.1, and now I can select the default versioning mode :D
But the second issue I face is that I want to release a job which depends on another one.
Project B ---------> Project A
depends on
The fact is, I released Project A as a 0.0.X version.
Now I would like to use this Project A release during the release of Project B, because I don't want any SNAPSHOT dependencies in my released projects!
I also tried using the versions-maven-plugin, but those two plugins don't seem to work together... Am I wrong?
Everything I want to do works when using the command line!
Has anyone encountered this problem yet? Is there a solution to it?
JENKINS 1.596 & maven-release-plugin 0.7.1
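For getting rid of SNAPSHOT dependencies before releasing, one commonly used route (a sketch, assuming Project A's release 0.0.X is already available in your repository) is the versions-maven-plugin's use-releases goal, run as a separate step before the release rather than combined with it:

```shell
# Rewrite any -SNAPSHOT dependencies in the POM (e.g. Project A's 0.0.X-SNAPSHOT)
# to the matching released versions, where those releases exist in the repository:
mvn versions:use-releases

# Inspect the modified pom.xml, commit it, then release as usual:
mvn release:prepare release:perform
```

Running the two plugins in separate invocations like this sidesteps the interplay problems described above, since the release plugin then only ever sees a POM that already has release dependencies.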

Jenkins putting '$' characters in file/folder names, breaking automatic backups

I'm using Jenkins v1.546, hosted on a Windows Server 2008 R2 SP1 machine.
I've set up a fairly simple job for building a Maven Java project. It polls the SCM with no schedule and picks up remote build triggers, requiring an authentication token. It uses Subversion and performs clean checkouts with svn update. Additionally, it has a post-build step that archives some build artifacts (i.e., the resulting WAR and WSDLs).
The issue I'm experiencing is that the builds it stores on the filesystem contain invalid characters in their filenames. This causes our automatic backup process to blow up, since it is unable to alter or remove the directories/files containing the '$'. I myself cannot move or delete those folders or files either, but if I rename one and remove the $, then things work fine. Oh, and if I try to follow one of these links with the $ in it, it doesn't resolve. None of the other jobs seem to do this - just my job, of course. Does anyone know why this may be occurring and what I can do to resolve it?
I've attached multiple screenshots that show the bad filename and my Jenkins job setup. I had to white out some company information. If I can provide any additional information to help troubleshoot this, just let me know.
Also, as an update, I did some additional research, looking through the changelogs for each released version of Jenkins since my version (latest is 1.557). I saw three possible issues in the changelogs that could be related, but it's hard for me to tell. I cannot simply upgrade our Jenkins to test out this theory, since I'll need to provide a reason for upgrading beyond a hunch.
https://issues.jenkins-ci.org/browse/JENKINS-21023
https://issues.jenkins-ci.org/browse/JENKINS-20534
https://issues.jenkins-ci.org/browse/JENKINS-21958
The $ is a perfectly valid character in a Windows directory name. You can manually make a folder with it, and delete it without any problems.
The com.company$moduleName syntax is used by Jenkins Maven-style job to separate modules of your build. If you don't see this structure for other people's jobs, it is because they are either not building a Maven job, or they don't have multiple modules in a single job.
What is strange, though, is that these are symlinks (I don't see that in my environment). It is possible that the location referenced by a symlink was deleted but the link remains. In that case, you would not be able to navigate to the location through the link (which is what you are experiencing).
Is it possible that your backup software is deleting the target directories before deleting the links?
In any case, do a simple dir on the directory with the links to see what they link to, and then verify that those target locations exist. If they don't, you need to figure out who/what is deleting the links' targets.
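That target check can be scripted. A small sketch (Python here for portability; point it at the job's builds directory) that flags links whose targets no longer exist, which is the dangling-link state described above:

```python
import os
import tempfile

def dangling_links(directory):
    """List entries in `directory` that are symlinks whose target is gone."""
    broken = []
    for entry in os.listdir(directory):
        path = os.path.join(directory, entry)
        # islink() is True for a symlink even when its target is gone;
        # exists() follows the link and is False when the target is gone.
        if os.path.islink(path) and not os.path.exists(path):
            broken.append(entry)
    return broken

# Tiny demo: one healthy link, and one whose target build was deleted
# (the lastSuccessful/lastStable names mirror Jenkins' build links):
workdir = tempfile.mkdtemp()
open(os.path.join(workdir, "build-1"), "w").close()
os.symlink(os.path.join(workdir, "build-1"),
           os.path.join(workdir, "lastSuccessful"))
os.symlink(os.path.join(workdir, "build-2"),
           os.path.join(workdir, "lastStable"))
print(dangling_links(workdir))  # ['lastStable']
```

Anything this reports is a link whose target someone deleted, and those are the entries worth excluding from (or fixing before) the backup run.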
Edit:
This seems to be more closely related to the issue you are facing. Unfortunately, it's marked as "unresolved":
https://issues.jenkins-ci.org/browse/JENKINS-20725
The issue stems from the fact that the symlinks reference targets with / instead of \
My Maven plugin (not Maven version) is 2.6. See if upgrading your Maven plugin in Jenkins helps. Also, I am running Maven 3.2.2 from the automatic installers; try that, as I don't see symlinks in my modules.
