I am trying to write a pipeline script to publish *.war/*.jar files to JFrog Artifactory, but I cannot find any syntax for this.
Can anyone help me out with a sample script?
JFrog has a dedicated GitHub repository with many examples for such cases, including Jenkins pipeline examples.
First, you must install the Artifactory plugin and configure it on your Jenkins server.
Refer to: https://www.jfrog.com/confluence/display/JFROG/Configuring+Jenkins+Artifactory+Plug-in
Then add a script like the following to your Jenkinsfile:
script {
    def server = Artifactory.server '<artifactory id>'
    def uploadSpec = '''{
        "files": [{
            "pattern": "<name of war or jar file>",
            "target": "<artifactory repo>/path-to/war-or-jar/file/in-Artifactory"
        }]
    }'''
    server.upload(uploadSpec)
}
Don't forget to replace `<artifactory id>`, `<name of war or jar file>`, and `<artifactory repo>/path-to/war-or-jar/file/in-Artifactory` with your own values.
More information: https://www.jfrog.com/confluence/display/JFROG/Declarative+Pipeline+Syntax
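As a minimal filled-in sketch (the server ID my-artifactory, the repository libs-release-local, and the target path are placeholders, not values from the question), the step could look like this, optionally publishing build info as well:
script {
    def server = Artifactory.server 'my-artifactory'   // ID configured in Manage Jenkins
    def uploadSpec = '''{
        "files": [{
            "pattern": "target/*.war",
            "target": "libs-release-local/com/example/myapp/"
        }]
    }'''
    // upload() returns a buildInfo object that can optionally be published
    def buildInfo = server.upload spec: uploadSpec
    server.publishBuildInfo buildInfo
}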
The scripted pipeline syntax for deploying war files to JFrog Artifactory is:
env.ARTIFACTORY = 'True'
if (env.ARTIFACTORY == 'True') {
    stage('Deploying to Artifactory') {
        FAILED_STAGE = env.STAGE_NAME
        bat 'mvn deploy'
    }
}
Note:
1.) The 'bat' step is for Windows batch commands. If you're running on Linux, replace 'bat' with 'sh'.
2.) env.ARTIFACTORY gives you control over whether or not this particular stage executes in your pipeline job. If you don't want this stage to run, simply set env.ARTIFACTORY = 'False' (see the parameter sketch after these notes).
3.) Also note, you have to configure JFrog under: Manage Jenkins -> Configure System -> JFrog Platform Instances.
4.) Include your JFrog repository in your pom.xml file under the distributionManagement tag.
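A minimal sketch of that toggle exposed as a build parameter instead of a hard-coded environment variable (the parameter name DEPLOY_TO_ARTIFACTORY is illustrative, not from the original answer):
properties([parameters([
    booleanParam(name: 'DEPLOY_TO_ARTIFACTORY', defaultValue: true,
                 description: 'Run the mvn deploy stage?')
])])
node {
    if (params.DEPLOY_TO_ARTIFACTORY) {
        stage('Deploying to Artifactory') {
            bat 'mvn deploy'   // use sh 'mvn deploy' on Linux agents
        }
    }
}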
I use Jenkins to build my project; once that is done, I upload the directory with all artifact files, or just a single file, to JFrog.
Pipeline-code example:
def server = Artifactory.server "jfrog1"
sh "touch uploadconfig.json"
writeFile(file: "uploadconfig.json", text: "{ \"files\": [ { \"pattern\": \"./uploadconfig.json\", \"target\": \"test-generic-local/uploadconfig1.json\", \"flat\" : \"true\" }]}")
uploadSpec = readFile 'uploadconfig.json'
uploadInfo = server.upload spec: uploadSpec
When the file is uploaded, it has the build.name and build.number properties.
But my issue is that it does not show up under Artifactory -> Builds.
This is an issue since I want to have the option to download the latest file from this build using:
"files": [
{
"pattern": "test-generic-local/uploadconfig*",
"target": "./",
"build" : "nameOfMyJenkinsJob/LATEST"
}
]
When I try to run it now, without the artifacts being listed under Builds, I get the following error message:
The build name nameOfMyJenkinsJob could not be found.
I see that you are not publishing the build info to Artifactory. Refer to the JFrog wiki page on publishing build info. I would also recommend looking at the example Jenkinsfile on GitHub, where the build-publish stage shows how the build info is pushed.
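A minimal sketch of that publish step, building on the upload code in the question (the server ID jfrog1 comes from the question; the rest follows the standard Artifactory plugin API):
def server = Artifactory.server "jfrog1"
def buildInfo = Artifactory.newBuildInfo()
// attach the uploaded artifacts to this build, then publish the build info
server.upload spec: uploadSpec, buildInfo: buildInfo
server.publishBuildInfo buildInfo
Once the build info is published, the build appears under Artifactory -> Builds and the "build" : "nameOfMyJenkinsJob/LATEST" download spec can resolve it.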
I am trying to use a Jenkinsfile to execute a build.ps1 file. However, when I scan the multibranch pipeline, I see a "Does not meet criteria" message in the log. Why can't Jenkins find the file? My repo URL is this.
Jenkins version: 2.138.3
Jenkinsfile is:
#!groovy
node {
stage ('Checkout') {
checkout scm
}
stage('Check Env Parameters'){
echo "Branch Name : ${env.GIT_BRANCH}"
echo "Octo Server Address : ${env.octoServer}"
}
stage('Run Cake') {
powershell "./build.ps1 -projectName Jenkins_PowerShell_Cake_Tutorial -branchName ${env.GIT_BRANCH} -octoServer ${env.octoServer} -octoApiKey ${env.octoApiKey}"
}
}
The Jenkinsfile does not have a .txt extension.
(Jenkins log and Jenkins configuration screenshots omitted.)
The Jenkinsfile in your repo is named .Jenkinsfile (with a dot as the first character). Either rename the file or configure the script path with the dot.
Check the Jenkinsfile path; if it is not at the top level of the repository, provide a relative path in the job configuration. This worked for me.
There is also a bug in Jenkins related to this issue:
https://issues.jenkins-ci.org/browse/JENKINS-54126
I'm trying to set up a Jenkins pipeline for publishing a zip file to JFrog Artifactory.
I am using the com.jfrog.artifactory plugin to do so. This works great from command-line Gradle: I can run the artifactoryPublish task to publish the artifacts and tie them back to the module, which then has a tie back to the artifacts.
The artifacts show up with the properties:
build.name = `projectname`
build.number = `some large number`
And I can click from them to the build/module and back to the artifact.
However, when I run this from a Jenkinsfile pipeline, the artifacts get published and tied back to the module, but the module does not successfully tie back to the artifacts.
The artifacts do not receive the build.name and build.number properties, and I cannot click from the module back to the artifacts, as the module cannot find or resolve the paths back to the artifacts (a zip file and a generated pom).
I am passing the params from Jenkins like:
ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}, which seems to work on other projects... but for whatever reason I cannot shake it.
I can include more of the Jenkinsfile if that would help debug, but I'm really just checking out a repository and trying to publish it.
I have been reading the documentation here heavily:
https://www.jfrog.com/confluence/display/RTF/Gradle+Artifactory+Plugin
and haven't been able to make it work through the -P project properties.
Does anyone have any idea what else I can try? I don't really want to use the Jenkins Pipeline Artifactory plugin directly, because it's so nice to be able to deploy from the command line too.
build.gradle:
publishing {
publications {
ManualUpdaterPackage(MavenPublication){
artifact assembleManualUpdaterPackage
}
}
}
artifactory {
contextUrl = "${artifactoryUrl}" //The base Artifactory URL if not overridden by the publisher/resolver
publish {
defaults {
publications('ManualUpdaterPackage')
}
repository {
repoKey = project.version.endsWith('-SNAPSHOT') ? snapshotRepo : releaseRepo
username = "${artifactory_user}"
password = "${artifactory_password}"
maven = true
}
}
}
task assembleManualUpdaterPackage (type: Zip){
dependsOn anotherTask
from (packageDir + "/")
include '**'
// archiveName "manualUpdaterPackage-${version}.zip"
destinationDir(file(manualUpdaterZipDir))
}
Jenkinsfile snippet:
withCredentials(
[
[
$class : 'UsernamePasswordMultiBinding',
credentialsId : 'validcreds',
passwordVariable: 'ORG_GRADLE_PROJECT_artifactory_password',
usernameVariable: 'ORG_GRADLE_PROJECT_artifactory_user'
]
]
) {
withEnv(
[
"ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}",
"ORG_GRADLE_PROJECT_buildInfo.build.name=${artifactName}",
"ORG_GRADLE_PROJECT_buildInfo.build.url=${env.JOB_URL}"
]
) {
sh 'chmod +x gradlew'
sh "./gradlew --no-daemon clean artifactoryPublish"
}
}
https://www.jfrog.com/confluence/display/RTF/Working+With+Pipeline+Jobs+in+Jenkins#WorkingWithPipelineJobsinJenkins-GradleBuildswithArtifactory
Eventually my coworker recommended looking into the Artifactory Pipeline Gradle plugin instead. It is very nice to work with and we've had much quicker success with it.
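For reference, a minimal sketch of that approach using the Artifactory plugin's Gradle support (the server ID, repository key, and paths below are placeholders, not values confirmed by the answer):
def server = Artifactory.server 'artifactory_dev01'           // placeholder server ID
def rtGradle = Artifactory.newGradleBuild()
rtGradle.useWrapper = true                                     // use ./gradlew from the repo
rtGradle.usesPlugin = true                                     // build.gradle already applies com.jfrog.artifactory
rtGradle.deployer repo: 'ext-release-local', server: server    // placeholder repo key
def buildInfo = rtGradle.run rootDir: '.', buildFile: 'build.gradle', tasks: 'clean artifactoryPublish'
server.publishBuildInfo buildInfo
With this, the plugin collects and publishes the build info itself, so the artifacts end up linked to the build and module in Artifactory.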
I have created a Jenkins pipeline job. In this job I want to do the build using Ant. I have configured the Ant tool in **Manage Jenkins > Global Tool Configuration** as Ant1.9.1 = D:\path_to_hybris\hybris\bin\platform\apache-ant-1.9.1.
In a freestyle Jenkins job, I know that the build.xml location can be specified in the job configuration (screenshot omitted),
but I am unable to understand how to go beyond this point in the Ant Groovy script, especially where to mention the path to the build.xml file:
def antHome = tool 'Ant1.9.1'
????
????
You can use the Ant wrapper in a Jenkins pipeline Groovy script:
withAnt(installation: 'LocalAnt') {
    // on Linux agents
    sh "ant build"
    // on Windows agents
    bat "ant build"
}
Remember to configure the Ant tool in the Jenkins "Global Tool Configuration" with the same name, "LocalAnt".
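If your build.xml is not at the workspace root, you can point Ant at it explicitly with -buildfile (the hybris platform path and targets below are only an illustration based on the question):
withAnt(installation: 'LocalAnt') {
    // -buildfile (or -f) selects a build.xml outside the workspace root
    sh "ant -buildfile hybris/bin/platform/build.xml clean all"
}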
You can try this:
def antVersion = 'Ant1.9.1'
withEnv( ["ANT_HOME=${tool antVersion}"] ) {
sh '$ANT_HOME/bin/ant target1 target2'
}
Under Windows this would look like this (I didn't test it though):
def antVersion = 'Ant1.9.1'
withEnv( ["ANT_HOME=${tool antVersion}"] ) {
bat '%ANT_HOME%/bin/ant.bat target1 target2'
}
This assumes that you have Ant configured in Jenkins with name 'Ant1.9.1'.
I needed this multiple times within the same Jenkinsfile, which has to run on both Linux and Windows agents, so I created a method for it.
You can call Ant like this:
callAnt("-v -p")
if you add this method definition to your Jenkinsfile:
def callAnt(String parameters) {
    if (isUnix()) {
        // on Linux/macOS the PATH separator is ':'
        env.PATH = "${tool 'ant'}/bin:${env.PATH}"
        sh "ant ${parameters}"
    }
    else {
        // on Windows the PATH separator is ';'
        env.PATH = "${tool 'ant'}\\bin;${env.PATH}"
        bat "ant ${parameters}"
    }
}
I'm trying to migrate my build pipelines to the Pipeline plugin using Groovy build scripts.
My pipelines are usually:
Test (gradle)
IntegrationTest (gradle)
Build (gradle)
Publish (artifactory)
I would like to use the Gradle variables like version, group, etc. in my Jenkins build script to publish to the correct folders in Artifactory, something the Artifactory plugin would take care of for me in the past. How can this be achieved?
For a single gradle project I use something like this:
node('master')
{
def version = 1.0
def gitUrl = 'some.git'
def projectRoot = ""
def group = "dashboard/frontend/"
def artifactName = "dashboard_ui"
def artifactRepo = "ext-release-local"
stage "git"
git branch: 'develop', poll: true, url: "${gitUrl}"
dir(projectRoot)
{
sh 'chmod +x gradlew'
stage "test"
sh './gradlew clean test'
stage "build"
sh './gradlew build createPom'
stage "artifact"
def server = Artifactory.server('artifactory_dev01')
def uploadSpec = """{
"files": [
{
"pattern": "build/**.jar",
"target": "${artifactRepo}/$group/${artifactName}/${version}/${artifactName}-${version}.jar"
},
{
"pattern": "pom.xml",
"target": "${artifactRepo}/$group/${artifactName}/${version}/${artifactName}.pom"
}
]
}"""
def buildInfo1 = server.upload spec: uploadSpec
server.publishBuildInfo buildInfo1
}
}
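To avoid hard-coding version (and group), a small sketch that derives them from Gradle before building the upload spec; it assumes the Gradle wrapper is committed and uses the standard properties task:
def version = sh(script: "./gradlew properties -q | grep '^version:' | awk '{print \$2}'",
                 returnStdout: true).trim()
def group = sh(script: "./gradlew properties -q | grep '^group:' | awk '{print \$2}'",
               returnStdout: true).trim()
// version and group can now be interpolated into the upload spec as above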
For future reference, here is an example with the more modern declarative pipeline:
pipeline {
agent any
stages {
stage('somestage') {
steps {
script {
def version = sh (
script: "./gradlew properties -q | grep \"version:\" | awk '{print \$2}'",
returnStdout: true
).trim()
sh "echo Building project in version: $version"
}
}
}
}
}
see also:
Gradle plugin project version number
How to do I get the output of a shell command executed using into a variable from Jenkinsfile (groovy)?
I think you actually have two different approaches to tackle this problem:
1. Get version/group from a shell script
Find a way to get the version from the Gradle build tool (e.g. gradle getVersion(); I'm not familiar with Gradle) and then use a shell script step to capture it. If the Gradle command to get the version is gradle getVersion(), you would do this in your pipeline:
def projectVersion = sh script: "gradle getVersion()", returnStdout: true
def projectGroup= sh script: "gradle getGroup()", returnStdout: true
and then just inject your $projectVersion and $projectGroup variables into your current pipeline.
2. Configure your Gradle build script to publish to Artifactory
This is the reverse approach, which I personally prefer: instead of giving Artifactory all your Gradle project information, just give Gradle your Artifactory settings and use a Gradle task to easily publish to Artifactory.
JFrog has good documentation for this solution in their Working with Gradle section. Basically, you will follow these steps:
Generate a compliant Gradle build script from Artifactory using the Gradle Build Script Generator and include it in your project build script
Use the Gradle task artifactoryPublish to simply publish your current artifact to Artifactory
For others who Googled their way here: if you have the Pipeline Utility Steps plugin and store what you need in your gradle.properties file, you can do something like this in the environment block:
MY_PROPS = readProperties file:"${WORKSPACE}/gradle.properties"
MY_VERSION = MY_PROPS['version']
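A minimal sketch of the same idea placed in a script block instead (which sidesteps any restrictions of the declarative environment directive); the file path and property name follow the answer above:
pipeline {
    agent any
    stages {
        stage('Read Gradle properties') {
            steps {
                script {
                    // requires the Pipeline Utility Steps plugin
                    def props = readProperties file: "${env.WORKSPACE}/gradle.properties"
                    env.MY_VERSION = props['version']
                    echo "Publishing version ${env.MY_VERSION}"
                }
            }
        }
    }
}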