Publishing/copying artifacts from one Ivy repository to another one - ant

We are using Ant with Ivy through Jenkins to do our daily builds. Initially we want to publish our artifacts to our test Ivy repository, and once testing is done, we want to copy the same artifacts to our released Ivy repository without rebuilding them. Does Ivy or Jenkins have any tasks that help with this? Also, when we publish to the test repository we plan to use the integration status; is there a way to change the status to release when we copy to the released Ivy repository? We build multiple modules, but only selected ones need to be moved to the released Ivy repository. Any help is greatly appreciated.

You can use Ivy's install task to copy a module's artifacts (and, optionally, its transitive dependencies) from one resolver to another.
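A minimal sketch of what that could look like in the Ant build, assuming the ivy namespace is bound to antlib:org.apache.ivy.ant and that "test-repo" and "release-repo" are resolver names defined in your ivysettings.xml (all names here are illustrative):

    <!-- copy one module from the test resolver to the release resolver -->
    <ivy:install organisation="com.example" module="my-module" revision="1.2.3"
                 from="test-repo" to="release-repo"
                 transitive="false" overwrite="true"/>

Note that install copies the module descriptor as-is, so the integration status travels with it; promoting a module to release status typically means re-delivering/publishing it with status="release" (or rewriting the delivered ivy.xml) rather than relying on install alone.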
A better solution to consider is the staging suite, a feature of Nexus Professional. It works by creating a temporary repository for each release candidate, which can be promoted once it is deemed worthy of release. Other repository managers have similar features, which are worth considering rather than building your own.

Related

Does Jenkins support incremental pipeline builds?

I have been searching far and wide to see if I can find information on Jenkins incremental pipeline builds that does not involve Maven.
The general idea is that I want to build a generic project and run specific steps of the pipeline if the underlying code has changed. If the code did not change, I want to re-use the results from a previous build.
The reason why I want to do this, is to drastically reduce build times for huge projects.
Imagine that you only need to fix one line in a SCSS file, but the whole project needs to be rebuilt, repackaged, etc. because of it. In the meantime, the site is live and broken, waiting 15 minutes for the fix.
Can someone give a basic example of how such a build can be created or where I can find more information on incremental building?
The only thing I have been able to find is incremental building for Maven projects, but this is not applicable for me.
The standard solution is to create modules that depend on each other.
Publish the built artifacts of your modules to a binary repository like Sonatype Nexus (you can easily create a private npm repo as well as a proxy npm repo).
During the build, download the dependencies instead of building them.
If this is not the route you want to take, you will have a hard time hacking a solution. To persist the state of your steps, an easy approach is to create files in the job workspace and read them at the next build (see the sketch below).
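Since the question is about Jenkins pipelines, treat this only as a sketch of that last idea in an Ant-flavoured build (matching the rest of this page); every target, directory, and property name here is made up:

    <target name="compute-state">
        <mkdir dir="build-state/sums"/>
        <!-- aggregate checksum of the source tree -->
        <checksum totalproperty="src.checksum" todir="build-state/sums">
            <fileset dir="src"/>
        </checksum>
        <!-- checksum recorded by the previous build, if any -->
        <loadfile property="previous.checksum" srcFile="build-state/last.checksum"
                  failonerror="false"/>
        <condition property="src.unchanged">
            <equals arg1="${src.checksum}" arg2="${previous.checksum}" trim="true"/>
        </condition>
    </target>

    <target name="package" depends="compute-state" unless="src.unchanged">
        <!-- expensive build/package steps go here -->
        <echo file="build-state/last.checksum" message="${src.checksum}"/>
    </target>

The build-state directory lives in the job workspace, so the recorded checksum survives between builds as long as the workspace is not wiped.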

Versioning of artifacts - CI/CD

Our Java build process currently generates artifacts whose names carry no version number.
For example: serial-framework-SNAPSHOT.jar
We are currently in the build phase of our CI/CD pipeline.
All the artifacts generated by the Maven build for the back-end services have no version number in the build phase of the CI/CD pipeline.
Only the dependent artifacts that are required to build a specific jar are stored in JFrog Artifactory.
1) Do the artifacts need to be versioned for the build/QA/prod phases of the CI/CD pipeline?
2) Does every artifact need to be stored in JFrog Artifactory? Currently only the dependent artifacts required by pom.xml during the Maven build are stored in JFrog.
Versioning was well explained in the first answer by snukone. The points below might also be helpful regarding versioning (a small pom.xml sketch follows them):
1) For development, always use the version format "versionNumber-SNAPSHOT" (capital letters),
e.g. 1.0-SNAPSHOT
2) For the test/prod branch, use the version format "versionNumber-RELEASE",
e.g. 1.0-RELEASE
a) Snapshots are mutable, so they are used for development purposes.
b) Releases are immutable. Once published, an artifact cannot be overwritten in the
artifactory, so releases are used for higher environments.
c) Snapshots capture work in progress during development. A snapshot artifact has both a version number, such as "1.3.0" or "1.3", and a timestamp. For example, a snapshot artifact for commons-lang 1.3.0 might have the name commons-lang-1.3.0-20090314.182342-1.jar.
So in your case, if you use "serial-framework-SNAPSHOT" it will be stored as "serial-framework-version-timestamp.jar" in your artifactory.
Similarly, if you use "serial-framework-RELEASE" it will be stored as "serial-framework-version.jar" in your artifactory.
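A minimal pom.xml fragment illustrating that convention (the groupId here is an assumption):

    <groupId>com.example</groupId>
    <artifactId>serial-framework</artifactId>
    <!-- development builds: mutable, deployed snapshots get a timestamp -->
    <version>1.0-SNAPSHOT</version>
    <!-- for test/prod, switch to a fixed release version instead, e.g.: -->
    <!-- <version>1.0-RELEASE</version> -->

Maven itself treats any version without the -SNAPSHOT suffix as a release; the -RELEASE suffix is simply the naming convention suggested above to make the intent explicit. The switch is usually automated, for example with the versions-maven-plugin (mvn versions:set -DnewVersion=1.0-RELEASE) or the maven-release-plugin.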
How versioning helps:
Versioning lets you restore an older version of your application (for example, when a bug is heavily degrading performance in production).
If you are running integration tests at the API or UI level, you can specify which versions fit together (e.g. via contract testing: https://github.com/pact-foundation/pact_broker).
Default clean-up policies help keep your artifactory's storage usage under control.
Storing every artifact or not?
My personal experience: only store the artifacts that other artifacts depend on, such as libraries. If you are working with Docker containers, you should also consider versioning the Docker images you produce on every build.

How do you organize ant macrodefs used by multiple projects?

I would like each of my Git repositories to have their own build.xml file, but avoid having to copy paste a lot of macrodefs used by the different build scripts.
What would be the best way to organize this?
Adding the Ant macrodefs to a separate Git repository and making them available to all the build projects on my Jenkins server?
Adding them for instance to a directory of the Ant installation folder?
Does anybody have some experience with this kind of setup?
I do the same. I feel strongly that every project should be stand-alone and not depend on another source code repository. To achieve this, I package my common macrodefs as an antlib. These are simply jar files that can be imported into the Ant build like other 3rd-party tasks.
The following answer explains how to create an antlib:
How to manage a common ant build script across multiple project build jobs on jenkins?
The really big advantage of this approach comes when you store the antlib jar in a versioned repository like Nexus. I use Ivy to manage my dependencies, which allows my common build logic to be versioned, enabling me to safely change logic without breaking older builds. When you think about it, this is the same kind of workflow Maven implements (which also downloads its plugins). A sketch of such an antlib and how a build consumes it follows below.
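For illustration, a minimal antlib could look like this, assuming the shared macrodefs are packaged in a jar under the resource path com/example/build/antlib.xml (every name here is hypothetical):

    <!-- com/example/build/antlib.xml inside build-common-1.0.jar -->
    <antlib>
        <macrodef name="compile-module">
            <attribute name="srcdir"/>
            <attribute name="destdir"/>
            <sequential>
                <mkdir dir="@{destdir}"/>
                <javac srcdir="@{srcdir}" destdir="@{destdir}" includeantruntime="false"/>
            </sequential>
        </macrodef>
    </antlib>

A consuming build.xml then loads it from the jar (which Ivy can pull down from Nexus) and calls the macro through its namespace:

    <project name="my-module" xmlns:common="antlib:com.example.build">
        <taskdef uri="antlib:com.example.build"
                 resource="com/example/build/antlib.xml"
                 classpath="lib/build-common-1.0.jar"/>
        <common:compile-module srcdir="src" destdir="build/classes"/>
    </project>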

Jenkins + Tycho: propagating update sites

I'm wondering if there is an easy way to "publish" p2 update sites from Jenkins (built with Tycho) so that they can easily be accessed in downstream jobs. Currently I'm doing it semi-manually: using Jenkins' support for copying artifacts between jobs, and then specifying a repository-mirror element in a job-specific settings.xml that refers to the artifacts copied into the job. But this is all a little tricky and requires configuring jobs and build settings in a number of different places.
Is there any nicer way short of using an external solution such as Artifactory?
The only solution involving a repository manager that I am aware of is to use a Nexus and the Unzip Plug-in. (Disclaimer: The Unzip Plug-in is provided by the Tycho project, of which I am a committer.)
With such a setup, you could have one job deploy an update site to Nexus, and the next job use the update site via the unzip URL of the deployed site. Example: If the site was deployed under the GAV project.abc:site:1.0.0-SNAPSHOT, you could then access it via http://<nexus>/content/repositories/<unzip-repo-name>/project/abc/site/1.0.0-SNAPSHOT/site-1.0.0-SNAPSHOT-unzip/.
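In the downstream Tycho build, that unzip URL can then be declared as a p2 repository in the pom.xml, roughly like this (nexus.example.com and unzip-releases are assumptions standing in for the <nexus> and <unzip-repo-name> placeholders above):

    <repositories>
        <repository>
            <id>upstream-update-site</id>
            <layout>p2</layout>
            <url>http://nexus.example.com/content/repositories/unzip-releases/project/abc/site/1.0.0-SNAPSHOT/site-1.0.0-SNAPSHOT-unzip/</url>
        </repository>
    </repositories>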
Note that you are slightly less flexible with such a setup than with what you have now: you need a version number for what your upstream project is building, so this may become tricky if you have multiple feature branches developing towards the same release version.
If you don't need that flexibility, you get the benefit of a portable build of your downstream project, i.e. developers build the project in the same way as your Jenkins does.

creating a local repository in Artifactory

I've just started installing Jenkins along with Artifactory and Ivy at our company.
Jenkins will be calling our ant build scripts and these scripts will then turn to Ivy to retrieve jars from the local repo (only).
We would like to have 2 local repositories : 1 for our local artifacts and 1 for 3rd party jars.
Our intent is to make sure developers retrieve only 3rd party jars that have been "approved" by admin in Artifactory.
The build scripts would then fetch from the local repository.
So my questions are:
How do we setup a local repository to include the built artifacts?
How are 3rd party jars retrieved by Artifactory if it has no access to internet?
Is there a better way of doing this?
I'm not sure if I'm missing something, but setting up the local repository for your build artifacts should be straightforward; just follow the instructions in the Artifactory docs.
You would need to configure your build system to publish new artifacts to Artifactory. This is also straightforward if you're using a standard dependency management tool like Maven or Ivy, and is covered in the Artifactory docs.
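With Ivy, the publish step could look roughly like this; the resolver name, artifact pattern, and revision are assumptions that would come from your ivysettings.xml and build:

    <!-- assumes <ivy:resolve/> has already run earlier in the build -->
    <ivy:publish resolver="artifactory-publish"
                 pubrevision="1.0.0"
                 status="release"
                 overwrite="true">
        <artifacts pattern="dist/[artifact]-[revision].[ext]"/>
    </ivy:publish>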
Keeping a separate repo for 3rd party JARs is also not too hard; at my company we do this by submitting the JARs to source control, which triggers a build in our CI system that publishes the JARs to Artifactory.
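And for the two local repositories (your own artifacts plus the admin-approved 3rd-party jars), an ivysettings.xml along these lines would let the build resolve from both; the host and repository names are illustrative and assume Maven-layout repositories in Artifactory:

    <ivysettings>
        <settings defaultResolver="main"/>
        <resolvers>
            <chain name="main">
                <!-- in-house artifacts published by the CI builds -->
                <ibiblio name="local-releases" m2compatible="true"
                         root="http://artifactory.example.com/artifactory/libs-release-local"/>
                <!-- 3rd-party jars approved by the Artifactory admin -->
                <ibiblio name="third-party" m2compatible="true"
                         root="http://artifactory.example.com/artifactory/ext-release-local"/>
            </chain>
        </resolvers>
    </ivysettings>

Since the server has no internet access, the 3rd-party repository would be populated by the admin deploying approved jars into it, for example via the source-control-triggered CI build described above.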
If you want to give more details of your build and CI setup I can add additional guidelines.
