How to archive all the build versions (Artifacts) in target folder - jenkins

Each time I generate my build through Jenkins, the existing jar file in the target folder is overwritten by Maven. For example: I have an existing version 1.0 in the Jenkins target folder; if I now create a new build with version 1.1, the previous version in my target folder gets overwritten.
I don't want that to happen; I want to archive all the versions (because we might provide some of the old features to a certain set of customers). I am just trying to understand whether there is a way to do this in a Jenkins pipeline. I'd prefer not to use plugins; it would be nice to do it the declarative way using a Jenkinsfile.

First of all, keeping your artifacts only in the target folder, without copying them anywhere else, is not the best solution. Usually all needed build artifacts are stored in a Nexus or Artifactory repository (of course, you can also copy them to some local directory). You can do that from a pipeline Jenkinsfile as well, but you still need to install the corresponding plugin. For example, for publishing artifacts to a Nexus repo you can use the Nexus Platform Plugin; see this answer for details.
As for your target folder being overwritten, I'm not sure it's cleaned by Jenkins by default. To have the workspace cleaned, you need to enable the Discard old builds option in the job configuration first.
More likely you are simply executing an mvn clean ... command, which is why the target folder is wiped, so I would recommend checking that first.
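If all you need is to keep every version without extra plugins, the built-in archiveArtifacts step (plus, if you like, a copy to a per-build location) is usually enough. A minimal declarative Jenkinsfile sketch; the archive directory and jar name below are only placeholders, not taken from your setup:

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // mvn clean wipes target/, which is why the previous jar disappears
                    sh 'mvn clean package'
                }
            }
            stage('Archive') {
                steps {
                    // Built-in step, no extra plugin: stores the jar with this build in
                    // Jenkins, so every version stays downloadable from its build page.
                    archiveArtifacts artifacts: 'target/*.jar', fingerprint: true
                    // Optionally also keep a copy outside target/, named per build number,
                    // so later builds cannot overwrite it (the path is just an example).
                    sh 'mkdir -p /var/build-archive && cp target/*.jar "/var/build-archive/myapp-${BUILD_NUMBER}.jar"'
                }
            }
        }
    }

Because each run archives its own copy, nothing is lost even though target/ itself is recreated on every build.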

Related

TFS - How to set build definition to build on workspace changes, but NOT download files to workspace?

I have our Solution on CI build. That works.
When dev's check in changes, the solution builds, but only for changes to that solution.
How do I get the build definition to build on changes to OTHER folders outside of the solution?
Yes, I can add a workspace in the workspace sources tab. But that means all that code is downloaded on every build.
Our solution has over a dozen dependencies. I would like to trigger a build if any of those dependencies change. We don't need the dependency source code to download into the build workspace at all. That's just pointless.
Yes, we have a folder in TFS for our Nuget Packages. We check them in automatically on build (Thank you TFS).
I could just add the NugetPackage workspace to the solution's workspace list, BUT that would result in every version of every dependency getting downloaded into the build workspace.
How can I trigger a build on a change that I do NOT list in the workspaces list of the build definition?
Btw, we are using TFS 2012
I was hoping the Cloaking feature would allow for this, but if the folder is cloaked, the automated build does not trigger. The automated build only fires if the workspace folder is set to active, which also means downloading every NuGet package in that same folder!
It's not possible to trigger a CI build on a change to a path that is not listed in the workspace mappings of the build definition.
A few other things to know
Make sure the folders you include in your trigger are also included in your mappings on the Repository tab (the same as the workspace mappings).
Source Link
As a workaround you could set Clean workspace to false, so the build does not re-download unchanged files every time.
If your build process does not require a clean workspace or repository, you can significantly reduce the time that is required to run the build by setting this parameter value to False.

How to keep artifacts directory during the multi-configuration build in TFS?

I have a solution with multiple projects, some of which are web apps. I have set up a multi-configuration build in TFS vNext that builds each single app, creates an MSDeploy package, gets the proper staging configuration files and adds or replaces the files in the package archive file.
I'd like to use the deployment files created as artifacts in a Release Management pipeline. The problem is that the artifacts directory is purged before each build (i.e. the build of a single web application). At the end, only the artifacts of the last app that was built are left there.
I can certainly configure the step to copy the artifacts somewhere else, but then the question is how to clear that location only at the very start of the build (and by that I mean the build of all projects).
Is there a way to disable purging of the artifacts directory, or to perform an operation only at the beginning of the whole build? Does anyone have similar experience?
Use the "Publish Artifacts" task to store the artifacts in a UNC location or in TFS itself so they're available for release.

Jenkins switch build folder

I'm not even sure if I'm thinking about this correctly so I'm having difficulty googling it. I've got Jenkins set up and building a site and correctly sending the build artifacts over SSH to the live server.
My ideal workflow would be to ssh into the server, drop the new assets into a build folder, copy the old build files to a backup directory and drop all the new build files where the old build files used to be.
Not sure if that even makes sense or if there is a better way to do this. To be clear, I'm not talking about a single .war file or anything. I'm talking about a package of PHP files, images, CSS and other stuff.
I'm new to Jenkins in general so any help pointing me in the right direction is greatly appreciated.
See the ArtifactDeployer Plugin:
ArtifactDeployer plugin enables you to archive build artifacts to any remote locations such as to a separate file server.
...
ArtifactDeployer is a complete alternative to the built-in Jenkins feature "Archiving artifacts" and it is aimed at providing a uniform deployment mechanism.
Add it to your project's config with Post-build Actions → Add post-build action → [ArtifactDeployer] - Deploy the artifacts from build workspace to remote locations.
or the Flexible Publish Plugin:
...
[Send build artifacts over SSH]
...
Add it to your project's config with Post-build Actions → Add post-build action → Flexible publish.
or an idea I haven't tried myself yet, so no guarantees:
Configure your live server to be a Jenkins slave node, create a project that is bound to this slave and use the Copy Artifact Plugin therein:
Adds a build step to copy artifacts from another project.
Add it to this project's config with Build → Add build step → Copy artifacts from another project.
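If you script the deployment yourself, the backup-and-swap workflow from the question can also be expressed directly in a pipeline. A rough sketch assuming key-based SSH access from the Jenkins agent; the host name and the /var/www/site and /var/www/backups layout are made up for illustration:

    pipeline {
        agent any
        stages {
            stage('Package') {
                steps {
                    // bundle the built PHP/CSS/image files (assumed to be in dist/)
                    sh 'tar czf site.tgz -C dist .'
                }
            }
            stage('Deploy') {
                steps {
                    // copy the new package up, move the current release to a timestamped
                    // backup, then unpack the new files where the old ones used to be
                    sh '''
                        scp site.tgz deploy@live-server:/tmp/site.tgz
                        ssh deploy@live-server '
                            ts=$(date +%Y%m%d%H%M%S)
                            mkdir -p /var/www/backups
                            mv /var/www/site /var/www/backups/site-$ts
                            mkdir -p /var/www/site
                            tar xzf /tmp/site.tgz -C /var/www/site
                        '
                    '''
                }
            }
        }
    }

rsync with its --backup-dir option is another way to get a similar old-files-to-backup effect in a single step.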

Understanding Jenkins' Archive The Artifacts and the use of it

I'm trying to understand what it does. Currently, this is the value that I see: dist/*.tgz
From what I understand, our grunt script makes a tgz file. However, I don't know what Jenkins does with it.
I got this error when I didn't specify any pattern:
ERROR: No artifacts are configured for archiving.
You probably forgot to set the file pattern, so please go back to the configuration and specify it.
If you really did mean to archive all the files in the workspace, please specify "**"
Build step 'Archive the artifacts' changed build result to FAILURE
Most importantly, it allows you to archive items from your job's workspace in a persistent and accessible way, linked to the specific build number.
E.g. if you have a job Build that compiles your sources into program.exe, archiving that file keeps it linked to the build that produced it and accessible to developers or other jobs, which can come in very handy.
Additionally, archived artifacts are transferred to your Jenkins master, so your job can run on any slave, but your archived files will always be accessible, even when that particular slave is offline.
Also, with the right configuration and plugins, other projects can access archived artifacts from other projects. E.g. a job Deploy that uploads your program.exe to some location becomes as trivial as copying the archived artifact of the last successful build into its workspace for the upload.
There's quite some information on SO already, e.g. here.
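For that Deploy example, here is a sketch of such a downstream pipeline, assuming the Copy Artifact plugin is installed; the job name Build, the artifact path and the upload destination are placeholders:

    pipeline {
        agent any
        stages {
            stage('Fetch artifact') {
                steps {
                    // copyArtifacts comes from the Copy Artifact plugin: pull the
                    // archived file from the last successful run of the Build job
                    copyArtifacts projectName: 'Build',
                                  selector: lastSuccessful(),
                                  filter: 'target/program.exe'
                }
            }
            stage('Upload') {
                steps {
                    // destination is only an example
                    sh 'scp target/program.exe deploy@example-host:/opt/app/'
                }
            }
        }
    }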

Build multiproject Gradle on Jenkins

I have a Gradle multiproject hosted in a Mercurial repo. I would like to set up Jenkins in such a way that if I commit changes to only one subproject, then only that subproject is built and published to my Nexus repo.
Can somebody give me a hint? Or is it at all possible?
We sort of have this working.
We create a project in Jenkins for each Gradle subproject, and in the Jenkins configuration we build only that subproject by doing something like:
gradle clean :<subproject>:build
We still have the problem that the job fires for all check-ins to the entire project. I would like to configure Jenkins to build only when there is a check-in to the subproject, but I don't know how to specify this.
Leaving our final solution here for future reference.
We created a separate Jenkins job for each subproject. Jenkins' Mercurial plugin allows you to specify "modules":
Reduce unnecessary builds by specifying a comma or space delimited list of "modules" within the repository. A module is a directory name within the repository that this project lives in. If this field is set, changes outside the specified modules will not trigger a build (even though the whole repository is checked out anyway due to the Mercurial limitation.)
This way our jobs are triggered only when a change occurs in the monitored subproject.
I guess you need to create a project in Jenkins for each subproject.
Another option would be to find out whether there is a way to intercept the repo sync, see which subprojects have changed, and do the build dynamically.
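If you would rather keep a single pipeline job than one job per subproject, the declarative changeset condition can approximate that dynamic behaviour: each stage runs only when files under its subproject changed. A sketch with made-up subproject names, and a publish task that assumes the maven-publish plugin is applied:

    pipeline {
        agent any
        stages {
            stage('Build serviceA') {
                // run only if the triggering changesets touched serviceA/
                when { changeset 'serviceA/**' }
                steps {
                    sh './gradlew clean :serviceA:build :serviceA:publish'
                }
            }
            stage('Build serviceB') {
                when { changeset 'serviceB/**' }
                steps {
                    sh './gradlew clean :serviceB:build :serviceB:publish'
                }
            }
        }
    }

Note that changeset looks at the changelog of the current build, so the very first run of such a pipeline (with nothing to diff against) will typically skip these stages.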
