Check for path existence in Artifactory through Jenkins

I have a Jenkins job that uploads multiple files to Artifactory, and I'd like it to check each file's path before the actual upload, with the following requirements:
If the path does not exist for one file, I don't want that file to be uploaded. The rest of the files should still be checked, and the job should not fail.
If the path already exists, I want my job to continue and upload the files.
Do you have any idea how I should implement this?
Any idea/approach will help.
Thanks!!

The flow you mentioned is already implemented in the Jenkins Artifactory plugin. The plugin has an internal checksum-based upload optimization. This feature is supported out of the box and already enabled in all generic upload job types:
Scripted Pipeline, Declarative Pipeline, and Freestyle Generic upload.
Before uploading a file using one of the above methods, the Jenkins Artifactory plugin:
Calculates the checksum of the file.
Sends a PUT request to Artifactory with the checksum, but without the content of the file.
If the empty PUT request returns 200, the path is added to the existing artifact in Artifactory, so the file doesn't have to be uploaded again.
If the empty PUT request returns 404, a regular file upload is performed.
This feature is not tied to the target path of the file in Artifactory. Even if the file already exists under a different path, its checksum is enough to skip uploading it again.
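For reference, a minimal scripted-pipeline sketch of a generic upload that benefits from this optimization automatically; the server ID 'my-artifactory', the repository 'generic-local', and the file pattern are assumptions, not values from the question:

def server = Artifactory.server 'my-artifactory' // configured Artifactory server ID (assumed)
def uploadSpec = '''{
  "files": [
    {
      "pattern": "build/*.tar.gz",
      "target": "generic-local/myapp/"
    }
  ]
}'''
// The plugin runs the checksum PUT described above for each matched file
// before transferring any content.
server.upload spec: uploadSpec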

Related

Is there any way to get Jenkins artifacts in Gitlab pipeline?

Basically the opposite of this question: Is there any way to get Gitlab pipeline artifacts in Jenkins?
My project is built on Jenkins. It generates a Javadoc.
I want to publish this Javadoc as GitLab Pages.
So I would need my .gitlab-ci.yml to be able to retrieve the produced Javadoc from Jenkins.
Alternatively, my artifacts are stored on a network drive, so access to this network drive would work too.
I think this plugin could be helpful to you: there is a plugin called Archived Artifact Url Viewer. It seems to match your needs.
"Jenkins plugin to view contents of a file inside a zip or jar file under a subdirectory of the artifacts directory of a build. The URL to access a file inside a zip or jar archive within the artifact folder of a build is as follows:"
/archivedArtifacts/artifact/<job_name>/<build_number>/<relative location of zip or jar file within artifact folder>/<location of file within archive>
Ex:
http://<jenkins_url>/archivedArtifacts/artifact/Build%20-%20Dev/10526/junit-logs.zip/junit.log
https://plugins.jenkins.io/archived-artifact-url-viewer/

How to rename existing folder while uploading generic artifacts to JFrog Artifactory in jenkins pipeline

I am using a Jenkins pipeline. The folder which needs to be uploaded to Artifactory is generated in *.tar.gz format. After every Jenkins build the folder name remains the same; there is no change in the folder name.
For the generic Artifactory integration I don't want to overwrite the previously uploaded *.tar.gz. I want to know whether the tar.gz can be uploaded under an incrementing path based on a number/date/time.
Basically, you can change the path to which you upload your artifacts (*.tar.gz). Let's say you give the path as below:
$BRANCH_NAME/$BUILD_ID/*.tar.gz
It will upload the artifacts with the same file name, but they will be kept separate under $BRANCH_NAME/$BUILD_ID.
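As a minimal sketch of the corresponding upload spec in a scripted pipeline (assuming 'server' is an Artifactory server instance obtained via Artifactory.server, and 'generic-local' is the target repository):

def uploadSpec = """{
  "files": [
    {
      "pattern": "*.tar.gz",
      "target": "generic-local/${env.BRANCH_NAME}/${env.BUILD_ID}/"
    }
  ]
}"""
// Every build writes to a new $BRANCH_NAME/$BUILD_ID path, so earlier
// tar.gz uploads are never overwritten.
server.upload spec: uploadSpec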

Jenkins Pipeline S3 Upload: missing file

I'm in the process of migrating all our Jenkins jobs into pipelines, using a Jenkinsfile for better control (committed to CodeCommit, AWS's Git service).
One of the steps in our jobs is the post-build action that uploads files to S3, which works correctly in the Jenkins jobs, but I haven't been able to replicate it correctly in the Jenkinsfile. I think I've tried every possible combination provided in the documentation but, although the process says it "worked", no file appears in the S3 console.
Since our target file gets named based on the version number extracted from pom.xml, I need to use wildcards to get its name, using the following syntax:
s3Upload(bucket:"myBucket", path:'path/to/targetFolder/', includePathPattern:'**/*.war', workingDir:'target')
The 'path/to/targetFolder/' gets created and the log shows:
Uploading
file:/var/lib/jenkins/workspace/mailer%20pipeline/target/mailer%23%231.3.2.war to s3://myBucket/path/to/targetFolder/
Finished: Uploading to myBucket/path/to/targetFolder/
Upload complete
But no file gets into the target folder.
What could I be missing?
There was indeed an error in the plugin. I reported it on GitHub, and a contributor released a fix.
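For anyone replicating this setup, a minimal sketch of how the step is typically wrapped with credentials in a Jenkinsfile; the region, the credentials ID 'aws-creds', and the bucket name are assumptions:

withAWS(region: 'us-east-1', credentials: 'aws-creds') { // both values assumed
    // includePathPattern matches the version-numbered .war under target/
    s3Upload(bucket: 'myBucket',
             path: 'path/to/targetFolder/',
             includePathPattern: '**/*.war',
             workingDir: 'target')
}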

How to get the list of the file names from build artifacts using jenkins rest api?

I am trying to get the list of file names from the build artifacts using the Jenkins REST API.
The URL http://your.jenkins.server/job/your.job/lastStableBuild/artifact/relativePath would download the artifact. Is it possible to get the names of the files contained in the artifacts without downloading them, using the Jenkins API?
Suppose the build artifacts contain around 5 text files with names ending in .config.
Is it possible to retrieve only the file names ending in .config using the Jenkins API?
I found this Jenkins API useful for getting the list of artifacts: http://your.jenkins.server/job/your.job/lastStableBuild/api/json?tree=artifacts%5BrelativePath%5D (ref: Is there any Jenkins API to get artifacts name and download it?). The artifact names can then be retrieved by parsing the JSON object in my application.
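A minimal Groovy sketch of that parsing step, assuming anonymous read access to the Jenkins instance (add authentication headers otherwise):

import groovy.json.JsonSlurper

// tree=artifacts[relativePath] limits the response to the artifact paths
def api = new URL('http://your.jenkins.server/job/your.job/lastStableBuild/api/json?tree=artifacts%5BrelativePath%5D')
def build = new JsonSlurper().parseText(api.text)
// keep only the names ending in .config
def configFiles = build.artifacts*.relativePath.findAll { it.endsWith('.config') }
configFiles.each { println it }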

Can a Jenkins build access the archived artifacts from itself?

I'm using Jenkins and have the "Archive the Artifacts" step at the end of my builds to archive them into a zip file.
Instead of using this step, I'd like to use a script to push the artifacts to a remote server at the end of the build. The server I'm pushing to uses a REST API / HTTP PUT request in a script to upload files.
Note that I'm looking to access the artifact created in the same build. So if I'm on build #5, I want the artifacts from build #5, not build #4.
Is there any way to access this zip file with a script, in the same build that it was created in?
I need to upload this zip remotely and don't want to create another job to do so.
You can install one of the "Publish Over..." plugins to upload your artifacts at the end of a build.
The goal of the Publish Over plugins is to provide a consistent set of
features and behaviours when sending build artifacts ... somewhere.
See also the full list of "upload" plugins for other methods of publishing your artifacts.
Like @Christopher said, you can use any of the Publish Over plugins on the Jenkins Plugins page to upload the artifact to any of the supported destinations.
If you want to access the archived zip file from within the build itself, you can use the following link to access it:
http://<server>/job/${JOB_NAME}/lastSuccessfulBuild/artifact/<artifact name w/folder>
For example:
server = myserver.com
job name = myproject
artifact = del/project.zip
Your URL would be:
http://myserver.com/job/myproject/lastSuccessfulBuild/artifact/del/project.zip
EDIT: Question was changed. In any case, this would work for accessing the artifact of the previous build in the current one.
There is no way that I have found to access the "Archive the Artifacts" package within the build that generates it. This step always occurs last in the build; accessing the URL before the build ends (for example, from a script during the build) results in a blank zip file. To get around this limitation, I'm using a second, linked build job to grab the zip and run my script to deploy it.
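In a pipeline job, one way around this limitation is to upload the workspace copy of the zip instead of the archived copy, since both are the same file; a minimal scripted-pipeline sketch, where https://example.com/upload is a hypothetical stand-in for the real REST endpoint:

node {
    // ... build steps that produce output under build/ (assumed layout) ...
    sh 'zip -r artifacts.zip build/'            // create the zip in the workspace
    archiveArtifacts artifacts: 'artifacts.zip' // archive it as usual
    // PUT the same workspace file to the remote server within the same build
    sh 'curl -f -T artifacts.zip https://example.com/upload/artifacts.zip'
}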
