Jenkins Pipeline S3 Upload: missing file

I'm in the process of migrating all our Jenkins jobs to pipelines, using a Jenkinsfile for better control (committed to CodeCommit, AWS's Git service).
One of the steps in our jobs is the post-build action that uploads files to S3. It works correctly in the existing Jenkins jobs, but I haven't been able to replicate it in the Jenkinsfile. I think I've tried every combination provided in the documentation but, although the process reports success, no file appears in the S3 console.
Since our target file gets named based on the version number extracted from pom.xml, I need to use wildcards to match its name, using the following syntax:
s3Upload(bucket:"myBucket", path:'path/to/targetFolder/', includePathPattern:'**/*.war', workingDir:'target')
The 'path/to/targetFolder/' gets created and the log shows:
Uploading
file:/var/lib/jenkins/workspace/mailer%20pipeline/target/mailer%23%231.3.2.war to s3://myBucket/path/to/targetFolder/
Finished: Uploading to myBucket/path/to/targetFolder/
Upload complete
But no file gets into the target folder.
What could I be missing?

There was indeed a bug in the plugin. I reported it on GitHub and a contributor released a fix.

Related

Check for path existence in Artifactory through Jenkins

I have a Jenkins job that uploads multiple files to Artifactory, and I'd like it to check each file's path before the actual upload, with the following requirements:
If the path does not exist for one file, I don't want that file to be uploaded; the rest of the files should still be checked, and the job should not fail.
If the path already exists, I want my job to continue and upload the files.
Do you have any idea how I should implement this?
Any idea/approach will help.
Thanks!!
The flow you mentioned is already implemented in the Jenkins Artifactory plugin, which has an internal checksum-based upload optimization. This feature is supported out of the box and is already enabled in all generic upload job types:
Scripted Pipeline, Declarative Pipeline, and Freestyle Generic upload.
Before uploading a file using one of the above methods, the Jenkins Artifactory plugin:
Calculates the checksum of the file.
Sends a PUT request to Artifactory with the checksum, but without the content of the file.
If the empty PUT request returns 200, the new path is added to the existing artifact in Artifactory, so the file does not have to be uploaded again.
If the empty PUT request returns 404, a regular file upload is performed.
This feature does not depend on the target path of the file in Artifactory: if the file's checksum already exists anywhere in Artifactory, even under a different path, that is enough to skip uploading the content again.
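For reference, here is a minimal scripted-pipeline sketch of a generic upload where this optimization applies automatically; the server ID 'my-artifactory', the file pattern, and the target repository are placeholders for your own setup:

node {
    // Artifactory server as configured in Manage Jenkins; the ID is a placeholder
    def server = Artifactory.server 'my-artifactory'
    // Generic upload spec; pattern and target repository are placeholders
    def uploadSpec = '''{
      "files": [{
        "pattern": "target/*.war",
        "target": "libs-release-local/myapp/"
      }]
    }'''
    // The plugin checksums each file and sends the content-free PUT first,
    // so the actual upload only happens when Artifactory answers 404
    server.upload spec: uploadSpec
}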

How to use a GitLab link to apply a jenkins.yml file for Jenkins Configuration as Code

I have a local instance of Jenkins. I previously tried storing the jenkins.yml on my system and giving its path at http://localhost:8080/configuration-as-code. This worked, but I want to use a GitLab repository to store the jenkins.yml file.
I have already tried giving the GitLab link to my jenkins.yml in the path or URL textbox. Some weird things happened:
1. Jenkins broke, with a huge error console.
2. It reapplied the previous configuration (from the system path).
jenkins:
  systemMessage: "Hello, world"
Your problem as described: you want the job configuration to be saved in Git and, when a build is triggered, the job should fetch the current state of its configuration from there and then run the build.
Maybe there is a kind of plug-in that does this for you, but I am not aware of any. Anyone?
My suggestion is to define a pipeline job and use a declarative pipeline. It is a file, normally named Jenkinsfile, that can be stored in Git. In the job, you define the Git address, and when you trigger a build, the file is fetched from Git and executed.
There are several flaws in this: the pipeline learning curve is not small, you are confronted with Groovy (not XML!), and your current XML configuration is barely useful.
Maybe someone will show up and tell us about a (new to me) plugin that solves your problem using the configuration XML file. On the other hand, pipelines are such a beautiful feature that I encourage you to give them a try.
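To give a feeling for it, a minimal declarative Jenkinsfile could look like this; the Maven call is only an example build step:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // any build command goes here; Maven is just an example
                sh 'mvn -B clean package'
            }
        }
    }
}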

Creating artifacts in jenkins

I have been tasked with looking into using Jenkins as a build server. So far I have managed to pull a project from Git, restore the NuGet packages, build the project, and run the unit tests. However, I am struggling to find out how to generate the artifact.
The business would like the build server to generate a zip file in a directory on the build server or on a remote server, which the systems team would then pick up and deploy to the relevant location. E.g. for a Windows service project, the built bin directory would be zipped up and put in the relevant artifact directory.
I thought that in order to do this I could add an 'Archive the artifacts' post-build action. However, I am getting the below error:
‘Watchdog.WinService.Monitor/bin/Release/*.zip’ doesn’t match anything:
‘Watchdog.WinService.Monitor’ exists but not
‘Watchdog.WinService.Monitor/bin/Release/*.zip’
If I look in the workspace for this project I can browse to the bin directory and see all the files, so I am unsure what I have done wrong.
Can someone please let me know if what I am trying to accomplish is possible, and also whether our approach to using Jenkins is correct?
The problem is that you are trying to create the artifact using the archive artifacts step.
But that step only collects existing artifacts and shows them on the job page.
That means you need to create the artifact first, e.g. using a shell or batch script.
You can combine this with the Flexible Publish Plugin.
When you select it as a post-build step, you can create a conditional action that runs the artifact archiving task and, as its condition, executes the script that creates the zip file.
So if the script fails, the archiving task won't be executed. It may also cause your job to be marked as 'failed', though that may not be the case in your setup.
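As a sketch, the same order of operations (create the artifact first, then archive it) in a pipeline job could look like this, assuming the Pipeline Utility Steps plugin provides the zip step; the paths follow the question's project layout:

pipeline {
    agent any
    stages {
        stage('Package') {
            steps {
                // create the artifact first (zip step from the Pipeline Utility Steps plugin)
                zip zipFile: 'artifacts/Watchdog.WinService.Monitor.zip',
                    dir: 'Watchdog.WinService.Monitor/bin/Release'
            }
        }
    }
    post {
        success {
            // only then collect what the previous step actually created
            archiveArtifacts artifacts: 'artifacts/*.zip'
        }
    }
}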

Jenkins JUnit Attachments Plugin throws 404 for attached files

Currently I'm running a Maven 3 build with Selenium/WebDriver tests. Whenever a test fails, it will snap a screenshot and save it to the correct folder (for the plugin). When the job finishes, I can see all the attachments listed. I can also copy the attachments off of the Jenkins server and view them.
However, whenever I try to view them in Jenkins, I'm seeing a 404 file not found exception. I've double checked the permissions of all the files involved, tried using both .jpg and .png extensions. I've commented on the JUnit Attachments Plugin wiki page. I'm thinking that it might be a bug at this point, but wanted to see if anyone has had it work with Maven builds.
n.b. I'm using version 1.3 of the plugin, version 1.540 of Jenkins, and the correct dependency for JQuery.
Edit: This is not specific to images. I attempted this with a text file, and still get a 404.

Can a Jenkins build access the archived artifacts from itself?

I'm using Jenkins and have the "Archive the Artifacts" step at the end of my builds to archive them into a zip file.
Instead of using this step, I'd like to use a script to push the artifacts to a remote server at the end of the build. The server I'm pushing to uses a REST API / HTTP PUT request in a script to upload files.
Note that I'm looking to access the artifact created in the same build. So if I'm on build #5, I want the artifacts from build #5, not build #4.
Is there any way to access this zip file with a script, in the same build that it was created in?
I need to upload this zip remotely and don't want to create another job to do so.
You can install one of the "Publish Over..." plugins to upload your artifacts at the end of a build.
The goal of the Publish Over plugins is to provide a consistent set of
features and behaviours when sending build artifacts ... somewhere.
See also the full list of "upload" plugins for other methods of publishing your artifacts.
As @Christopher said, you can use any of the Publish Over plugins from the Jenkins Plugins page to upload the artifact to any of the supported destinations.
If you want to access the archived zip file from within the build itself, you can use the following link to access it:
http://<server>/job/${JOB_NAME}/lastSuccessfulBuild/artifact/<artifact name w/folder>
For example:
server = myserver.com
job name = myproject
artifact = del/project.zip
Your URL would be:
http://myserver.com/job/myproject/lastSuccessfulBuild/artifact/del/project.zip
EDIT: Question was changed. In any case, this would work for accessing the artifact of the previous build in the current one.
There is no way that I have found to access the "Archive the Artifacts" package of the build that generates it. This step always occurs last in the build. Accessing the URL before the build ends (during the build, via a script, for example) results in a blank zip file. To get around this limitation, I'm making a second linked build job to grab the zip and run my script to deploy it.
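For what it's worth, in a pipeline job this limitation is easier to sidestep, because you create the zip yourself and it sits in the workspace before anything is archived. A sketch, assuming the Pipeline Utility Steps plugin for the zip step, curl on the agent, and a hypothetical PUT endpoint:

node {
    // ... build steps that produce files under output/ ...

    // create the zip in the workspace so this same build can use it directly
    // (zip step from the Pipeline Utility Steps plugin)
    zip zipFile: 'artifacts/build.zip', dir: 'output'

    // push this build's own artifact via HTTP PUT; the endpoint is hypothetical
    sh 'curl -fT artifacts/build.zip https://deploy.example.com/api/upload/build.zip'

    // optionally still archive it on the job page afterwards
    archiveArtifacts artifacts: 'artifacts/*.zip'
}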
