We have Artifactory Pro and we need to know whether we can promote an artifact from Jenkins through the plugin. Currently we deploy to Artifactory, and when we need to promote, we first download the artifact from Artifactory to Jenkins and then publish it back to Artifactory with another tag.
Is there some way to promote from a Jenkins pipeline without calling the API directly? We need a solution that does not expose our credentials.
Thanks,
You can promote a build inside a Jenkins pipeline script by using artifactory.promote (with Artifactory.addInteractivePromotion as a manual fallback), as shown:
@NonCPS
def promote(artifactory, buildInfo) {
    echo "currentBuild.result is ${currentBuild.result}"
    def QAPromotionConfig = [
        'buildName'   : buildInfo.name,
        'buildNumber' : buildInfo.number,
        'targetRepo'  : 'debian-local-qa',
        'sourceRepo'  : 'debian-local-debug',
        'copy'        : true
    ]
    def RelPromotionConfig = [
        'buildName'   : buildInfo.name,
        'buildNumber' : buildInfo.number,
        'targetRepo'  : 'debian-local-release',
        'sourceRepo'  : 'debian-local-debug',
        'copy'        : true
    ]
    if (currentBuild.result == "SUCCESS") {
        artifactory.promote(QAPromotionConfig)
    } else {
        Artifactory.addInteractivePromotion(server: artifactory, promotionConfig: QAPromotionConfig, displayName: "Promote to QA")
    }
    Artifactory.addInteractivePromotion(server: artifactory, promotionConfig: RelPromotionConfig, displayName: "Promote to Release")
}
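For context, a minimal sketch of how this function might be wired into the rest of the pipeline; the server ID and the upload spec below are placeholders, and the build must already exist in Artifactory (published build info) before it can be promoted:

def server = Artifactory.server 'my-artifactory'     // placeholder server ID
def buildInfo = Artifactory.newBuildInfo()
// placeholder upload spec; the pattern and target repo are assumptions
def uploadSpec = '''{
    "files": [{ "pattern": "build/*.deb", "target": "debian-local-debug/" }]
}'''
server.upload spec: uploadSpec, buildInfo: buildInfo
server.publishBuildInfo buildInfo                    // the build must be published before it can be promoted
promote(server, buildInfo)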
I use Jenkins to build my project. Once that is done, I upload the directory with all artifact files, or just a single file, to JFrog.
Pipeline-code example:
def server = Artifactory.server "jfrog1"
sh "touch uploadconfig.json"
writeFile(file: "uploadconfig.json", text: "{ \"files\": [ { \"pattern\": \"./uploadconfig.json\", \"target\": \"test-generic-local/uploadconfig1.json\", \"flat\" : \"true\" }]}")
uploadSpec = readFile 'uploadconfig.json'
uploadInfo = server.upload spec: uploadSpec
When the file is uploaded, it has the build.name and build.number properties.
But my issue is that it will not show up under Artifactory -> Builds.
This is an issue because I want to have the option to download the latest file from this build using:
"files": [
{
"pattern": "test-generic-local/uploadconfig*",
"target": "./",
"build" : "nameOfMyJenkinsJob/LATEST"
}
]
When I try to run it now, without the artifacts being listed under Builds, I get the following error message:
The build name nameOfMyJenkinsJob could not be found.
I see that you are not publishing the build info to Artifactory. Kindly refer to this JFrog wiki on how to publish the build info. I would also recommend referring to this GitHub Jenkinsfile, where the build info is pushed in the build-publish stage.
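A minimal sketch of what that publish step could look like with the scripted pipeline API, assuming the same server ID and upload spec as in the question:

def server = Artifactory.server "jfrog1"
def uploadSpec = readFile 'uploadconfig.json'
// upload returns a build-info object describing the uploaded files
def buildInfo = server.upload spec: uploadSpec
// publishing the build info is what makes the build appear under Artifactory -> Builds,
// which in turn lets the "build": "nameOfMyJenkinsJob/LATEST" download spec resolve
server.publishBuildInfo buildInfo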
I am trying to write a pipeline script to publish a *.war/*.jar file to JFrog Artifactory, but I can't find any syntax for this.
Can anyone help me out with this?
Please share a sample script.
JFrog has a dedicated GitHub repository with many examples for such cases.
There are Jenkins pipeline examples there.
First, you must install the Artifactory plugin and configure it on the Jenkins server.
Refer: https://www.jfrog.com/confluence/display/JFROG/Configuring+Jenkins+Artifactory+Plug-in
Then try adding the script below to your Jenkinsfile:
script {
    def server = Artifactory.server '<artifactory id>'
    def uploadSpec = '''{
        "files": [{
            "pattern": "<name of war or jar file>",
            "target": "<artifactory repo>/path-to/war-or-jar/file/in-Artifactory"
        }]
    }'''
    server.upload(uploadSpec)
}
Don't forget to replace <artifactory id>, <name of war or jar file>, and <artifactory repo>/path-to/war-or-jar/file/in-Artifactory.
More information: https://www.jfrog.com/confluence/display/JFROG/Declarative+Pipeline+Syntax
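For comparison, a rough declarative sketch using the plugin's rtUpload / rtPublishBuildInfo steps; the server ID, pattern, and target below are placeholders you must adapt to your setup:

pipeline {
    agent any
    stages {
        stage('Upload to Artifactory') {
            steps {
                rtUpload(
                    serverId: '<artifactory id>',      // the ID configured for the Artifactory plugin in Jenkins
                    spec: '''{
                        "files": [{
                            "pattern": "target/*.war",
                            "target": "<artifactory repo>/path-to/war-or-jar/"
                        }]
                    }'''
                )
                rtPublishBuildInfo(serverId: '<artifactory id>')
            }
        }
    }
}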
The scripted pipeline syntax for deploying war files to JFrog Artifactory is:
env.ARTIFACTORY = 'True'

if (env.ARTIFACTORY == 'True') {
    stage('Deploying to Artifactory') {
        FAILED_STAGE = env.STAGE_NAME
        bat 'mvn deploy'
    }
}
Note:
1.) The 'bat' command is for Windows batch files. If you're using Linux, replace 'bat' with 'sh'.
2.) env.ARTIFACTORY is used to give you control over whether or not you want to execute this particular stage in your pipeline job. If you don't want this stage to execute, simply set env.ARTIFACTORY = 'False'.
3.) Also note, you have to configure JFrog in: Manage Jenkins -> Configure System -> JFrog Platform Instances.
4.) Include JFrog in your pom.xml file under the distributionManagement tag (see the sketch after this list).
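For note 4, a minimal pom.xml sketch; the repository IDs and URLs are placeholders and must match your Artifactory repositories, with credentials for the matching <server> entries kept in settings.xml:

<distributionManagement>
    <!-- placeholder IDs/URLs; the IDs must match <server> entries in settings.xml -->
    <repository>
        <id>artifactory-releases</id>
        <url>https://yourcompany.jfrog.io/artifactory/libs-release-local</url>
    </repository>
    <snapshotRepository>
        <id>artifactory-snapshots</id>
        <url>https://yourcompany.jfrog.io/artifactory/libs-snapshot-local</url>
    </snapshotRepository>
</distributionManagement>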
I use the Jenkins Artifactory Plug-in for promotion. I am trying to create a separate job to promote the artifact. Here is the promotion stage:
stage('promote artifact') {
    steps {
        script {
            String buildName = "${BUILD_NAME}"
            String buildNumber = "${BUILD_NUMBER}"
            def promotionConfig = [
                // Mandatory parameters
                'buildName'          : buildName,
                'buildNumber'        : buildNumber,
                'targetRepo'         : "tst",
                // Optional parameters
                'comment'            : "this is the promotion comment",
                'sourceRepo'         : "dev",
                'status'             : "Released",
                'includeDependencies': true,
                'failFast'           : true,
                'copy'               : true
            ]
            // Promote build
            server.promote promotionConfig
        }
    }
}
As BUILD_NAME I have tried the name of the commit job and the name of the artifact in the snapshot repository.
As BUILD_NUMBER I use the build number, like 1, without a version number like 1.0.0.
But after all that I get:
Performing dry run promotion (no changes are made during dry run) ...
ERROR: Promotion failed during dry run (no change in Artifactory was done): HTTP/1.1 404 Not Found
{
    "errors" : [ {
        "status" : 404,
        "message" : "Cannot find builds by the name 'artifact-name' and the number '1'."
    } ]
}
Do you have any idea how to run it successfully? Or is there a flag to get more information about this error?
The reason it failed for me is that I skipped the part that uploads the build info to Artifactory. You should create the upload spec during the build task and upload it to Artifactory; the build info will be stored separately and contains the service information. After that, the promotion works fine.
def uploadSpec = """{
    "files": [
        {
            "pattern": "bazinga/*froggy*.zip",
            "target": "bazinga-repo/froggy-files/"
        }
    ]
}"""
server.upload(uploadSpec)
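A minimal sketch of that missing build-info step, assuming the same server object as in the promotion job; the name and number set here are the values the promotionConfig has to reference:

def buildInfo = Artifactory.newBuildInfo()
buildInfo.name = "${BUILD_NAME}"        // must match 'buildName' in the promotionConfig
buildInfo.number = "${BUILD_NUMBER}"    // must match 'buildNumber' in the promotionConfig
server.upload spec: uploadSpec, buildInfo: buildInfo
server.publishBuildInfo buildInfo       // without this, promotion fails with "Cannot find builds by the name ..."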
I'm trying to set up a Jenkins pipeline for publishing a zip file to JFrog Artifactory.
I am using the com.jfrog.artifactory plugin to do so. This works great from command-line Gradle, where I can run the artifactoryPublish task to publish the artifacts and tie them back to the module, which then has a tie back to the artifacts.
The artifacts show up with the properties:
build.name = `projectname`
build.number = `some large number`
And I can click from them to the build/module and back to the artifact.
However, when I run this from a Jenkinsfile pipeline, the artifacts get published and tied back to the module, but the module does not successfully tie back to the artifacts.
The artifacts do not receive the build.name and build.number properties, and I cannot click from the module back to the artifacts, as the module cannot find or resolve the paths back to the artifacts (a zip file and a generated pom).
I am passing the params from Jenkins like:
ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}, which seems to work on other projects... but for whatever reason I cannot shake it.
I can include more of the Jenkinsfile if that would help debug, but I'm really just checking out a repository and trying to publish it.
I have been reading the documentation here heavily:
https://www.jfrog.com/confluence/display/RTF/Gradle+Artifactory+Plugin
and haven't been able to make it work through the -P project properties.
Does anyone have any idea what else I can try? I don't really want to use the Jenkins Pipeline Artifactory plugin directly, because it's so nice to be able to deploy from the command line too.
build.gradle:
publishing {
    publications {
        ManualUpdaterPackage(MavenPublication) {
            artifact assembleManualUpdaterPackage
        }
    }
}

artifactory {
    contextUrl = "${artifactoryUrl}" // The base Artifactory URL if not overridden by the publisher/resolver
    publish {
        defaults {
            publications('ManualUpdaterPackage')
        }
        repository {
            repoKey = project.version.endsWith('-SNAPSHOT') ? snapshotRepo : releaseRepo
            username = "${artifactory_user}"
            password = "${artifactory_password}"
            maven = true
        }
    }
}

task assembleManualUpdaterPackage(type: Zip) {
    dependsOn anotherTask
    from(packageDir + "/")
    include '**'
    // archiveName "manualUpdaterPackage-${version}.zip"
    destinationDir(file(manualUpdaterZipDir))
}
Jenkinsfile snippet:
withCredentials(
    [
        [
            $class          : 'UsernamePasswordMultiBinding',
            credentialsId   : 'validcreds',
            passwordVariable: 'ORG_GRADLE_PROJECT_artifactory_password',
            usernameVariable: 'ORG_GRADLE_PROJECT_artifactory_user'
        ]
    ]
) {
    withEnv(
        [
            "ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}",
            "ORG_GRADLE_PROJECT_buildInfo.build.name=${artifactName}",
            "ORG_GRADLE_PROJECT_buildInfo.build.url=${env.JOB_URL}"
        ]
    ) {
        sh 'chmod +x gradlew'
        sh "./gradlew --no-daemon clean artifactoryPublish"
    }
}
https://www.jfrog.com/confluence/display/RTF/Working+With+Pipeline+Jobs+in+Jenkins#WorkingWithPipelineJobsinJenkins-GradleBuildswithArtifactory
Eventually my coworker recommended looking into the Artifactory Pipeline Gradle plugin instead. It is very nice to work with, and we've had much quicker success with it.
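Presumably that refers to the Gradle integration in the Jenkins Artifactory plugin; a rough sketch under that assumption, with the server ID and target repo as placeholders:

def server = Artifactory.server 'my-server-id'                   // placeholder server ID
def rtGradle = Artifactory.newGradleBuild()
rtGradle.useWrapper = true
rtGradle.usesPlugin = true                                       // build.gradle already applies com.jfrog.artifactory
rtGradle.deployer repo: 'libs-snapshot-local', server: server    // placeholder target repo
def buildInfo = rtGradle.run rootDir: '.', buildFile: 'build.gradle', tasks: 'clean artifactoryPublish'
server.publishBuildInfo buildInfo                                // ties build.name/build.number to the deployed artifacts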
I'm looking to create and upload an artifact to Archiva through a Jenkins pipeline node. I've found plenty of documentation for doing this with Artifactory, but I am having trouble getting a foothold on how to handle this with Archiva.
For reference, the Artifactory equivalent of what I'm trying to do is something on the order of:
node {
    def server = Artifactory.server 'my-server-id'
    stage('Build') {
        // ...
    }
    stage('Test') {
        // ...
    }
    // ...
    stage('Archive') {
        def uploadSpec = """{
            "files": [
                {
                    "pattern": "build/files",
                    "target": "repo/path/"
                }
            ]
        }"""
        server.upload(uploadSpec)
    }
}
But now I want to handle this in Archiva instead (or with a generic Maven repository in general). For what it's worth, I'm using a Gradle build system, if getting Jenkins to tell Gradle to upload to Archiva would be an easier prospect.
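In case it's useful, one common approach under that last assumption is to declare the Archiva repository in the maven-publish block and have Jenkins simply run the publish task; the URL, repository name, and credential property names below are hypothetical:

// build.gradle, with the maven-publish and java plugins applied
publishing {
    publications {
        maven(MavenPublication) {
            from components.java
        }
    }
    repositories {
        maven {
            name = 'archiva'
            url = 'https://archiva.example.com/repository/internal/'   // hypothetical Archiva repo URL
            credentials {
                username = findProperty('archivaUser')                  // hypothetical property names,
                password = findProperty('archivaPassword')              // e.g. supplied by Jenkins withCredentials
            }
        }
    }
}

The Jenkins 'Archive' stage would then run something like sh './gradlew publish' (or the generated publishMavenPublicationToArchivaRepository task) instead of server.upload.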