While using the Artifactory Plugin's "Generic-Artifactory Integration", someone changed the artifact name without updating the Jenkins job, and the upload no longer worked. Unfortunately, the Jenkins build never failed or warned us.
Is there an option in the file spec that I have yet to find which would fail the build in this case?
I'm sure there's an obvious answer here somewhere, but I'm missing it. I don't want to write a script that checks for the artifact and exits if it isn't there, although that would work; I'm looking for the right way to do this.
https://www.jfrog.com/confluence/display/RTF/Using+File+Specs
{
  "files": [
    {
      "pattern": "$WORKSPACE/foobar.jar",
      "target": "libs-release-local/com/mycompany/foo-1.1.jar"
    }
  ]
}
The functionality you are looking for is the failNoOp flag, which fails the build if no files were affected (uploaded or downloaded) during the operation.
Note that the failNoOp flag is only available in pipeline jobs, in both declarative and scripted syntax.
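For example, in a scripted pipeline the flag can be passed alongside the upload spec. This is a minimal sketch; the server ID 'my-artifactory' is an assumption, and the spec mirrors the one above:

```groovy
// Sketch, assuming an Artifactory server configured in Jenkins with ID 'my-artifactory'.
def server = Artifactory.server 'my-artifactory'
def uploadSpec = """{
  "files": [
    {
      "pattern": "${env.WORKSPACE}/foobar.jar",
      "target": "libs-release-local/com/mycompany/foo-1.1.jar"
    }
  ]
}"""
// failNoOp: true makes the build fail if the spec matched no files.
server.upload spec: uploadSpec, failNoOp: true
```

With this, renaming the artifact without updating the spec turns the silent no-op into a failed build.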
Related
I am using Pipelines to build our projects in Jenkins.
If the build of a pipeline fails I'd like to automatically start a new build after a predefined period of time.
Thus it is not possible for me to use the retry(3) command in the Jenkinsfile because, the way I understand it, there would be no possibility for a delay.
Sleep or something similar won't do, because it would block an executor.
The Naginator Plugin seems to do exactly what I need but it doesn't seem to work with pipelines.
I tried implementing it in the Jenkinsfile like:
post {
    always {
        echo '-------post called-------'
        retryBuild {
            rerunIfUnstable(true)
            retryLimit(2)
            fixedDelay(60)
        }
        echo '-------post finished-------'
    }
}
This does not throw any errors, and both echoes show up in the pipeline build log. However, it doesn't actually retry anything.
Does anyone have any experience with a similar problem or is there potentially even a way to use Naginator (or other plugins) with Jenkins pipelines?
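One workaround worth sketching (not a Naginator feature, just an idea using the standard build step): re-trigger the job from the post section with wait: false, and use the quietPeriod parameter for the delay, so no executor is blocked while waiting. The RETRY_COUNT parameter here is a hypothetical job parameter used to cap the retries:

```groovy
// Sketch: re-schedule this job after a failure with a 60-second quiet period.
// wait: false returns immediately, so no executor is blocked during the delay.
post {
    failure {
        script {
            def retries = (params.RETRY_COUNT ?: '0') as int   // hypothetical counter parameter
            if (retries < 2) {
                build job: env.JOB_NAME,
                      wait: false,
                      quietPeriod: 60,
                      parameters: [string(name: 'RETRY_COUNT', value: "${retries + 1}")]
            }
        }
    }
}
```

The quiet period is handled by the Jenkins queue rather than by a sleeping step, which is what makes this non-blocking.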
I am using Jenkins Pipeline to build, install and deploy a maven project, deploying to artifactory.
I use the Artifactory plugin for this, and mostly it is working really well.
However, in the scripted pipeline I need to get a hold of the URLs that the artifacts were deployed to. I would expect them to be placed inside the resulting buildInfo, but "artifacts" is always empty.
I have tried creating a new buildInfo manually and then using it with the rtMaven.run function.
def rtBuildInfo = Artifactory.newBuildInfo()
rtBuildInfo.name = "Something"
rtBuildInfo.number = env.BUILD_NUMBER
I have tried creating no manual buildInfo and just running rtMaven.run. Empty.
I have tried manually running rtServer.publishBuildInfo rtBuildInfo before and after deploy. Empty.
I have tried using append for the different buildInfo, and giving it manually as an argument to the different steps. Empty.
The files are definitely uploaded. I can see:
[pool-9-thread-3] INFO org.jfrog.build.extractor.maven.BuildInfoClientBuilder - [pool-9-thread-3] Deploying artifact: https://...
in the log, and I can see the files inside Artifactory.
However, the log also says: [main] INFO org.jfrog.build.extractor.maven.BuildDeploymentHelper - Artifactory Build Info Recorder: publish build info set to false, build info will not be published...
But I have found no way to enable this other than calling rtServer.publishBuildInfo as a separate step.
But when I print the buildInfo, it looks like this:
{
  "name": "Something",
  "artifacts": [],
  "number": "68"
}
What do I have to do to get a buildInfo with "artifacts" filled with a list of entries with "localPath" and "remotePath" set?
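For reference, this is a sketch of the documented scripted-pipeline flow that collects build info from a Maven run; the server ID, tool name, and repository names are assumptions. The "publish build info set to false" log line is expected with this flow, because publishing is deferred to the explicit publishBuildInfo step:

```groovy
// Sketch, assuming an Artifactory server ID of 'my-artifactory' and a Maven
// installation named 'maven-3' configured in Jenkins.
def server = Artifactory.server 'my-artifactory'
def rtMaven = Artifactory.newMavenBuild()
rtMaven.deployer server: server,
                 releaseRepo: 'libs-release-local',
                 snapshotRepo: 'libs-snapshot-local'
rtMaven.tool = 'maven-3'
// rtMaven.run returns the buildInfo collected during the Maven execution.
def buildInfo = rtMaven.run pom: 'pom.xml', goals: 'clean install'
server.publishBuildInfo buildInfo
```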
In a nutshell:
How can I access the location of the produced artifacts within a shell script started in a build or post-build action?
The longer story:
I'm trying to setup a jenkins job to automate the building and propagation of debian packages.
So far, I have already been successful in using the debian-pbuilder plugin to perform the build process, so that Jenkins presents the final artifacts after the job finishes successfully:
mypackage_1+020200224114528.NOREV.4_all.deb
mypackage_1+020200224114528.NOREV.4_amd64.buildinfo
mypackage_1+020200224114528.NOREV.4_amd64.changes
mypackage_1+020200224114528.NOREV.4.dsc
mypackage_1+020200224114528.NOREV.4.tar.xz
Now I would like to also automate the deployment into the local reprepro repository, which would actually just require invoking a simple shell script I've put together.
My problem: I find no way to determine the artifact location for that deployment script to operate on. The debian-pbuilder plugin generates the artifacts in a temporary directory ($WORKSPACE/binaries.tmp15567690749093469649), which changes with every build.
Since the artifacts are listed properly in the finished job's status view, I would expect the artifact details to be provided to the script (e.g. via environment variables). But that is obviously not the case.
I've already searched extensively for a solution but didn't find anything helpful.
Or is it me (still somewhat of a rookie in Jenkins) following a wrong approach here?
You can use archiveArtifacts. The binaries.tmp* directory is inside the workspace, so you can archive from it; just clean the workspace first with deleteDir() so that only the current build's output is matched.
Pipeline example:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                deleteDir()
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'binaries*/**', fingerprint: true
        }
    }
}
You can also check https://plugins.jenkins.io/copyartifact/
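Alternatively, since the temporary directory always matches binaries.tmp*, a deployment shell step can resolve its name with a glob instead of relying on an environment variable. This is a sketch; the reprepro invocation is illustrative only:

```shell
#!/bin/sh
# Resolve the per-build binaries.tmp<random> directory inside the workspace
# (newest first, in case stale ones survive), then deploy its .deb files.
ARTIFACT_DIR=$(ls -dt "${WORKSPACE}"/binaries.tmp* | head -n 1)
echo "Deploying artifacts from ${ARTIFACT_DIR}"
# reprepro -b /srv/reprepro includedeb stable "${ARTIFACT_DIR}"/*.deb  # illustrative
```

Combined with the deleteDir() at the start of the build, the glob can only match the directory produced by the current run.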
I deliberately left out a semicolon, expecting to see a red (failed) build in Jenkins after pushing the change via Git:
public class SpringbootController {
    public void callSerivce() {
        System.out.println("to check se changes");
        System.out.println("to check se changes")   // semicolon intentionally missing
    }
}
But Jenkins still shows the build as successful.
I don't know what exactly I missed; please help (I'm a newbie in Jenkins).
Is there anything I need to add in the shell step to make it work? Right now it is empty.
To sum up the conversation in the comments: in order for Jenkins to build your code you need to tell it how to build your code. By default Jenkins doesn't do anything beyond the SCM checkout on its own.
Since you're building a Java project with Maven you should add a Maven build step after your SCM checkout. You will need the Maven Project Plugin.
In your post-build steps you should add an "Archive the artifacts" step to gather up anything built during the job that you want to keep, since your workspace may get wiped out on the next run.
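The pipeline equivalent of those two steps would look roughly like the sketch below; the Maven tool name and artifact path are assumptions. Once the project is actually compiled, the missing semicolon becomes a compile error and the build goes red:

```groovy
// Sketch, assuming a Maven installation named 'maven-3' is configured in Jenkins.
pipeline {
    agent any
    tools {
        maven 'maven-3'
    }
    stages {
        stage('Build') {
            steps {
                // Compiling the sources turns the missing semicolon into a failed build.
                sh 'mvn -B clean verify'
            }
        }
    }
    post {
        success {
            archiveArtifacts artifacts: 'target/*.jar', allowEmptyArchive: true
        }
    }
}
```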
I am running a Jenkins job that builds with an Ant script and uploads the result to Artifactory. My problem: when the build changes, it should be stored as a new version in Artifactory. How can I do this?
Thank you
The question is missing a lot of details, but assuming you are using a Jenkinsfile together with the Artifactory Jenkins plugin, you would need to use something like ${env.BUILD_NUMBER} when defining the target in the Jenkinsfile.
It should be something similar to:
{
  "files": [
    {
      "pattern": "my-build-directory/*.tar.gz",
      "target": "my-repo/${env.BUILD_NUMBER}/"
    }
  ]
}
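Note that ${env.BUILD_NUMBER} is Groovy string interpolation, so the spec has to be defined in a double-quoted Groovy string inside the Jenkinsfile. A sketch, with the server ID as an assumption:

```groovy
// Sketch, assuming an Artifactory server ID of 'my-artifactory'.
def server = Artifactory.server 'my-artifactory'
def uploadSpec = """{
  "files": [
    {
      "pattern": "my-build-directory/*.tar.gz",
      "target": "my-repo/${env.BUILD_NUMBER}/"
    }
  ]
}"""
server.upload spec: uploadSpec
```

Each build then uploads into its own numbered folder, giving you a new version per build.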