How to rename an existing folder while uploading generic artifacts to JFrog Artifactory in a Jenkins pipeline

I am using a Jenkins pipeline. The folder which needs to be uploaded to Artifactory is generated in the *.tar.gz format. After every Jenkins build the folder name stays the same; it never changes.
For the generic Artifactory integration I don't want to overwrite the previously uploaded *.tar.gz. I want to know if the tar.gz can be uploaded under an incrementing name based on a number, date, or time.

Basically, you can change the path under which you upload your artifacts (*.tar.gz). For example, give the target path as below:
$BRANCH_NAME/$BUILD_ID/*.tar.gz
It will upload the artifacts with the same file name, but they will be separated by $BRANCH_NAME/$BUILD_ID.
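For illustration, a minimal scripted-pipeline sketch using the Artifactory plugin's generic upload file spec; the server ID 'my-artifactory', the repository 'generic-local' and the build/ source directory are placeholders, not values from the question:
// Minimal sketch, assuming the Artifactory plugin is installed and a server
// with ID 'my-artifactory' is configured in Jenkins (both are assumptions).
node {
    def server = Artifactory.server 'my-artifactory'

    // The target path embeds the branch and build ID, so earlier uploads are
    // never overwritten. BRANCH_NAME is only set for multibranch pipelines;
    // single-branch jobs can hard-code the branch or drop that segment.
    def uploadSpec = """{
      "files": [
        {
          "pattern": "build/*.tar.gz",
          "target": "generic-local/${env.BRANCH_NAME}/${env.BUILD_ID}/"
        }
      ]
    }"""

    server.upload spec: uploadSpec
}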

Related

Check for path existence in Artifactory through Jenkins

I have a Jenkins job that uploads multiple files to Artifactory, and I'd like it to check each file's path before the actual upload, with the following requirements:
If the path does not exist for one file, I don't want that file to be uploaded; the rest of the files should still be checked, and the job should not fail.
If the path is already created, I want my job to continue and upload the files.
Do you have any idea how I should implement this?
Any idea/approach will help.
Thanks!!
The flow you mentioned is already implemented in the Jenkins Artifactory plugin. The plugin has an internal checksum-based upload optimization. This feature is supported out of the box and already enabled in all generic upload job types:
Scripted Pipeline, Declarative Pipeline, and Freestyle job generic upload.
Before uploading a file using one of the above methods, the Jenkins Artifactory plugin:
Calculates the checksum of the file.
Sends a PUT request to Artifactory with the checksum, but without the content of the file.
If the empty PUT request returns 200, the new path is added to the artifact in Artifactory, so we don't have to upload the file again.
If the empty PUT request returns 404, we do a regular file upload.
This feature is not related to the target path of the file in Artifactory. Even if the file already exists under a different path, its checksum is enough to skip uploading it again.
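To see roughly what that optimization does under the hood, here is a plain-Groovy sketch of Artifactory's checksum-deploy mechanism. The X-Checksum-Deploy and X-Checksum-Sha1 headers are part of Artifactory's REST API, but the URL, file path, and omission of authentication are assumptions for illustration only; this is not the plugin's actual code.
import java.security.MessageDigest

// Hypothetical target; real code would also send credentials.
def targetUrl = 'https://artifactory.example.com/artifactory/generic-local/path/file.tar.gz'
def artifact  = new File('build/file.tar.gz')

// 1. Calculate the SHA-1 checksum of the file.
def sha1 = MessageDigest.getInstance('SHA-1').digest(artifact.bytes)
        .collect { String.format('%02x', it) }.join()

// 2. Send a PUT with the checksum but without the file content.
def probe = new URL(targetUrl).openConnection()
probe.requestMethod = 'PUT'
probe.doOutput = true
probe.setRequestProperty('X-Checksum-Deploy', 'true')
probe.setRequestProperty('X-Checksum-Sha1', sha1)
probe.outputStream.close()

if (probe.responseCode == 404) {
    // 3. Artifactory has never seen this checksum: do a regular upload.
    def upload = new URL(targetUrl).openConnection()
    upload.requestMethod = 'PUT'
    upload.doOutput = true
    artifact.withInputStream { ins -> upload.outputStream.withStream { it << ins } }
    println "Full upload finished with HTTP ${upload.responseCode}"
} else {
    // Success: the existing binary was simply linked to the new path.
    println "Checksum deploy finished with HTTP ${probe.responseCode}"
}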

Is there any way to get Jenkins artifacts in Gitlab pipeline?

Basically the opposite of this question: Is there any way to get Gitlab pipeline artifacts in Jenkins?
My project is built on Jenkins. It generates JavaDoc.
I want to publish this JavaDoc as GitLab Pages.
So I would need my .gitlab-ci.yml to be able to retrieve the produced JavaDoc from Jenkins.
Alternatively, my artifacts are stored on a network drive, so access to this network drive would work too.
I think that this plugin could be helpful to you...
There is a plugin called Archived Artifact Url Viewer. It seems to be what you need.
"Jenkins plugin to view contents of a file inside a zip or jar file under a subdirectory of the artifacts directory of a build. The URL to access a file inside a zip or jar archive within the artifact folder of a build is as follows:"
/archivedArtifacts/artifact/<job_name>/<build_number>/<relative location of zip or jar file within artifact folder>/<location of file within archive>
Ex:
http://<jenkins_url>/archivedArtifacts/artifact/Build%20-%20Dev/10526/junit-logs.zip/junit.log
https://plugins.jenkins.io/archived-artifact-url-viewer/

How to customize file name of Jenkins archive artifact plugin post build action?

The Jenkins archive artifact plugin compresses files into an "archive.zip" file. It always has the same file name. Moreover, Jenkins doesn't actually create an archive (there is no "archive.zip" file in the "builds" directories). Jenkins just maps the URL
https://www.my-jenkins-server.com/jenkins/job/$job_name/$job_number/artifact/*zip*/archive.zip
and always returns everything in the job directory that matches the pattern configured in the archive artifact post-build action.
The problem is that the job itself generates a ZIP archive, so I need to publish this archive under its original name. This is important, since the archive's name identifies the owner of the job, the data inside, and the parameters used to run the job. Let's say users run the job 10 times with different parameters and don't wait for each run to finish before starting the next. Later a user starts downloading the results and gets
archive.zip
archive(1).zip
archive(2).zip
...
archive(10).zip
Now they need to extract the archives from those downloaded archives to get 10 other archives with qualified names, then delete the downloaded archives, and finally identify by qualified archive name the ones they actually need and delete the rest. It is easy to make a mistake here and delete or miss an archive file.
Solutions for me are:
Publish the archive generated by the job under its original name.
Generate my files and form the file name of the archive under which it should be served, skipping the zipping inside the job. As a final step, pass this file name as a parameter into the archive artifact post-build action, so Jenkins will serve the archive under the special name configured by the job itself.
The name of the zip file is determined from the directory that contains the artifacts (see Jenkins source).
Internally, the top-most artifact directory has the name archive, which is why you will always see archive.zip.
Conversely, this means that you can get a custom zip file xyz.zip by putting the artifacts in a (sub-)directory xyz.
There are no other options to change the name.
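For instance, here is a minimal pipeline sketch of that subdirectory trick; the directory name reports-<BUILD_NUMBER> and the build/output source path are hypothetical:
// Minimal sketch: stage the artifacts under a named subdirectory so the
// on-the-fly zip download is served as reports-<BUILD_NUMBER>.zip instead
// of archive.zip. The source path 'build/output' is an assumption.
node {
    def subDir = "reports-${env.BUILD_NUMBER}"
    sh "mkdir -p ${subDir} && cp build/output/*.zip ${subDir}/"
    archiveArtifacts artifacts: "${subDir}/**"
    // The zip of that directory can then be downloaded from:
    //   <jenkins_url>/job/<job_name>/<build_number>/artifact/reports-<BUILD_NUMBER>/*zip*/reports-<BUILD_NUMBER>.zip
}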
You can run any post-build script (shell/batch/PowerShell) after the archive step and rename archive.zip to archive_${BUILD_NUMBER}.zip, so that you can easily keep track of the archive by the last successful build number of the job. But to do this, you first need to clean the workspace so the archive files can be tracked by build number.

Resolving Artifacts using Jenkins job with the parent directory

I am looking to download artifacts using a Jenkins job that resolves them from Artifactory. Specifying the file type and the path to the artifact works; however, I am unable to resolve all artifacts from the root directory.
Actual Artifactory Path:
repo_key:Group/Artifact/Version/path/to/artifact1/file.zip
repo_key:Group/Artifact/Version/path/to/artifact2/file.zip
The configuration below in the Jenkins job's Resolved Artifacts field doesn't work:
repo_key:Group/Artifact/*=>Output
How do I download all files under the Artifact directory to the Output directory?
You need to use the format JBaruch mentioned and add the build metadata as matrix params, to support wildcard resolution for multiple files.
For instance:
repo_key:Group/Artifact/**/*#publishing_build_name#LATEST
will get you the latest artifacts published by the job "publishing_build_name".
There's some helpful information and examples when clicking on the Question Mark next to the "Resolved Artifacts" field.
Artifact/* will resolve files directly located under the Artifact directory (and there are none). What you need is Artifact/**/*.
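If you are doing the same thing from a pipeline rather than the job's Resolved Artifacts field, the equivalent wildcard resolution can be expressed as a download file spec. A minimal sketch, assuming the Artifactory plugin and a configured server with ID 'my-artifactory' (the server ID is hypothetical):
node {
    def server = Artifactory.server 'my-artifactory'

    // '**/*' matches files at any depth under the Artifact directory,
    // unlike '*', which only matches files directly beneath it.
    def downloadSpec = """{
      "files": [
        {
          "pattern": "repo_key/Group/Artifact/**/*",
          "target": "Output/"
        }
      ]
    }"""

    server.download spec: downloadSpec
}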

Can a Jenkins build access the archived artifacts from itself?

I'm using Jenkins and have the "Archive the Artifacts" step at the end of my builds to archive them into a zip file.
Instead of using this step, I'd like to use a script to push the artifacts to a remote server at the end of the build. The server I'm pushing to uses a REST API / HTTP PUT request in a script to upload files.
Note that I'm looking to access the artifact created in the same build. So if I'm on build #5, I want the artifacts from build #5, not build #4.
Is there any way to access this zip file with a script, in the same build that it was created in?
I need to upload this zip remotely and don't want to create another job to do so.
You can install one of the "Publish Over..." plugins to upload your artifacts at the end of a build.
The goal of the Publish Over plugins is to provide a consistent set of
features and behaviours when sending build artifacts ... somewhere.
See also the full list of "upload" plugins for other methods of publishing your artifacts.
Like @Christopher said, you can use any of the Publish Over plugins on the Jenkins Plugins page to upload the artifact to any of the supported destinations.
If you want to access the archived zip file from within the build itself, you can use the following link to access it:
http://<server>/job/${JOB_NAME}/lastSuccessfulBuild/artifact/<artifact name w/folder>
For example:
server = myserver.com
job name = myproject
artifact = del/project.zip
Your URL would be:
http://myserver.com/job/myproject/lastSuccessfulBuild/artifact/del/project.zip
EDIT: Question was changed. In any case, this would work for accessing the artifact of the previous build in the current one.
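As a concrete illustration of that URL approach from inside a pipeline, a small sketch that downloads the previous successful build's artifact; del/project.zip is the example path from above, JOB_URL is the standard env var expanding to <server>/job/<job_name>/, and anonymous read access is assumed (add credentials otherwise):
node {
    // Reconstruct the lastSuccessfulBuild artifact URL; 'del/project.zip'
    // is the example artifact path used above.
    def artifactUrl = "${env.JOB_URL}lastSuccessfulBuild/artifact/del/project.zip"

    // Assumes anonymous read access to the Jenkins instance; otherwise pass
    // credentials to curl (for example via withCredentials).
    sh "curl -sSfL -o previous-project.zip '${artifactUrl}'"
}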
There is no way that I have found to access the "Archive the Artifacts" package of the build that generates it. This step always occurs last in the build. Accessing the URL prior to the build ending (during the build via script for example) results in a blank zip file. To get around this limitation, I'm making a second linked build job to grab the zip and run my script to deploy it.
