I am new to Artifactory, and while there is a lot of documentation on the different ways to get artifacts into Artifactory, none of it feels complete.
For example:
The pipeline plugin used by Jenkins, with this sample code:
node {
    def server = Artifactory.newServer url: SERVER_URL, credentialsId: CREDENTIALS
    def rtMaven = Artifactory.newMavenBuild()

    stage 'Build'
    git url: 'https://github.com/jfrogdev/project-examples.git'

    stage 'Artifactory configuration'
    rtMaven.tool = MAVEN_TOOL // Tool name from Jenkins configuration
    rtMaven.deployer releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local', server: server
    rtMaven.resolver releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot', server: server
    def buildInfo = Artifactory.newBuildInfo()

    stage 'Exec Maven'
    rtMaven.run pom: 'maven-example/pom.xml', goals: 'clean install', buildInfo: buildInfo

    stage 'Publish build info'
    server.publishBuildInfo buildInfo
}
I am not sure how to set up some of these variables, such as CREDENTIALS, especially if I want to use an API key rather than a user ID and password.
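From the plugin documentation it looks like CREDENTIALS is just the ID of a Jenkins credentials entry, so my guess (untested) is something like the following, where 'artifactory-api-key' is a made-up ID of a username/password credential whose password field holds the API key:

def server = Artifactory.newServer url: 'https://artifactory.example.com/artifactory',
                                   credentialsId: 'artifactory-api-key' // API key stored as the credential's password

Is that the right approach?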
Also, if I want to use the Artifactory REST API to upload and promote my project's build (a Maven build), should I be using:
curl -X PUT "http://localhost:8080/artifactory/api/build" -H "Content-Type: application/json" --upload-file build.json
where build.json is the sample JSON shown under Build Upload at https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API.
If I use the REST API, do I still need the Jenkins plugin code above, or can I just use the API?
Where do I pass my credentials (user ID and API key) in the curl command?
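From the REST documentation my guess (untested) is that the credentials go either in basic auth or in an API-key header, along these lines, where myuser and myapikey are placeholders:

curl -u myuser:myapikey -X PUT "http://localhost:8080/artifactory/api/build" -H "Content-Type: application/json" --upload-file build.json
curl -H "X-JFrog-Art-Api: myapikey" -X PUT "http://localhost:8080/artifactory/api/build" -H "Content-Type: application/json" --upload-file build.json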
If any experienced user could guide me through these questions, it would be a great help.
Related
I have configured a GitHub webhook with the below settings:
Payload URL: https:///github-webhook/
Content Type: application/x-www-form-urlencoded
Events : Pushes, Pull Requests
The Jenkins job that I have, is a pipeline job that has the below enabled:
Build Trigger: GitHub hook trigger for GITScm polling
With the above configuration, I see that in response to an event (i.e. a push/PR in GitHub), the Jenkins job gets triggered successfully. In GitHub, under Recent Deliveries for the webhook, I see the details of the payload and a successful response of 200.
I am trying to get the payload in the Jenkins pipeline for further processing. I need some details, e.g. the PR URL/PR number, ref type, branch name, etc., for conditional processing in the Jenkins pipeline.
I tried accessing the "payload" variable (as mentioned in other Stack Overflow posts and the documentation available around this) and printing it as part of the pipeline, but I have had no luck yet.
So my question is: how can I get the payload from the GitHub webhook trigger in my Jenkins pipeline?
You need to select Content type: application/json for your webhook in GitHub. Then you can access any variable from the payload GitHub sends, e.g. $.pull_request.url for the PR URL.
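As a minimal sketch, one plugin that evaluates JSONPath expressions of that form is the Generic Webhook Trigger plugin; assuming that is the mechanism in use, a declarative pipeline could look roughly like this (the variable names and token are made up):

pipeline {
    agent any
    triggers {
        GenericTrigger(
            genericVariables: [
                [key: 'PR_URL', value: '$.pull_request.html_url'],
                [key: 'PR_NUMBER', value: '$.number'],
                [key: 'REF', value: '$.ref']
            ],
            causeString: 'Triggered by GitHub webhook',
            token: 'my-webhook-token', // point the GitHub webhook at <jenkins-url>/generic-webhook-trigger/invoke?token=my-webhook-token
            printContributedVariables: true
        )
    }
    stages {
        stage('Use payload') {
            steps {
                echo "PR URL: ${env.PR_URL}, PR number: ${env.PR_NUMBER}, ref: ${env.REF}"
            }
        }
    }
}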
Unsure if this is possible.
With the GitHub plugin we use (Pipeline: GitHub), the PR number is stored in the variable CHANGE_ID.
The PR URL is easy to construct from the PR number, and the branch name is stored in the variable BRANCH_NAME. In the case of pull requests, the global variable pullRequest is populated with lots of data.
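For example, in a multibranch pipeline with that plugin loaded, something like this should print a few of those values (the exact set of pullRequest properties is worth checking in the plugin documentation):

echo "PR number: ${env.CHANGE_ID}, branch: ${env.BRANCH_NAME}"
echo "PR title: ${pullRequest.title}, base branch: ${pullRequest.base}"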
Missing information can be obtained from GitHub using their API. Here's an example that checks whether a PR is "behind"; you can adapt it to your specific requirements:
def checkPrIsNotBehind(String repo) {
    withCredentials([usernamePassword(credentialsId: "<...>",
                                      passwordVariable: 'TOKEN',
                                      usernameVariable: 'USER')]) {
        // Single-quoted so $TOKEN is expanded by the shell (from withCredentials), not by Groovy
        def headers = ' -H "Content-Type: application/json" -H "Authorization: token $TOKEN" '
        def url = "https://api.github.com/repos/<...>/<...>/pulls/${env.CHANGE_ID}"
        // Look up the SHA of the PR's head commit
        def head_sha = sh (label: "Check PR head SHA",
                           returnStdout: true,
                           script: "curl -s ${url} ${headers} | jq -r .head.sha").trim().toUpperCase()
        println "PR head sha is ${head_sha}"

        headers = ' -H "Accept: application/vnd.github.v3+json" -H "Authorization: token $TOKEN" '
        url = "https://api.github.com/repos/<...>/${repo}/compare/${pullRequest.base}...${head_sha}"
        // Compare the base branch with the PR head to see how many commits the PR is behind
        def behind_by = sh (label: "Check PR commits behind",
                            returnStdout: true,
                            script: "curl -s ${url} ${headers} | jq -r .behind_by").trim().toUpperCase()
        if (behind_by != '0') {
            currentBuild.result = "ABORTED"
            currentBuild.displayName = "#${env.BUILD_NUMBER}-Out of date"
            error("The head ref is out of date. Please update your branch.")
        }
    }
}
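A call from a pipeline stage is then as simple as checkPrIsNotBehind('my-repo'), where the repository name is a placeholder.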
Using the Jenkins DSL I can create and publish build info using Artifactory.newBuildInfo, but I am looking for the complementary method to read the BuildInfo JSON data that is generated in Artifactory. I have trawled through many resources. Any suggestions would be appreciated.
From the Artifactory REST API it sure looks like you can retrieve buildInfo. I'd expect this to be exposed by the Jenkins plugin as well.
Build Info
Description: Build Info
Since: 2.2.0
Security: Requires a privileged user with deploy permissions (can be anonymous)
Usage: GET /api/build/{buildName}/{buildNumber}
Produces: application/vnd.org.jfrog.build.BuildInfo+json
...
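So fetching it should be a single GET, something like this (build name, number, and API key are placeholders):

curl -H "X-JFrog-Art-Api: myapikey" "http://localhost:8080/artifactory/api/build/my-build/42"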
JFrog's project examples on GitHub are a fabulous resource, as is their Jenkins plugin.
From a quick search it looks like you'd define a download spec and then use the server.download method (see Working with Pipeline Jobs in Jenkins):
def buildInfo1 = server.download downloadSpec
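The download spec itself is just a JSON file spec, roughly like this (repository, pattern, and target paths are made up):

def downloadSpec = """{
    "files": [
        {
            "pattern": "libs-release-local/my-app/*.jar",
            "target": "downloaded/"
        }
    ]
}"""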
The previous answer creates a new buildInfo; it does not download the original buildInfo into the pipeline. I've been trying for days to figure out how to do what the original poster wants to do. The best I've succeeded at is downloading the buildInfo into a hashtable, working with that, and then uploading the changes with REST calls:
import groovy.json.JsonOutput
import groovy.json.JsonSlurperClassic

// Fetch the existing buildInfo JSON from Artifactory
def curlstr = "curl -H 'X-JFrog-Art-Api:${password}' ${arturl}api/build/${buildName}/${buildNumber}"
def buildInfoString = sh(
    script: curlstr,
    returnStdout: true
).trim()
// Parse into a map, modify as needed, then PUT the result back
buildInfo = (new JsonSlurperClassic().parseText(buildInfoString))
sh("echo '${JsonOutput.toJson(buildInfo)}' | curl -XPUT -H 'X-JFrog-Art-Api:${password}' -H 'Content-Type: application/json' ${arturl}api/build --upload-file - ")
I was able to modify the buildInfo in the Artifactory repository using this technique. It's not as clean as I would like. I've been unable to get the JFrog CLI to modify existing buildInfo files either.
For whatever it's worth, what I'm ultimately trying to do is promote a Docker artifact and change its name while doing so. I've found no way to express this to Artifactory that doesn't involve pulling the artifact into Docker and pushing it again. I'd love it if someone from JFrog could clue me in on how to do it.
UPDATE: Attention! I got the question wrong. What follows is how you get the local BuildInfo object in a declarative pipeline script.
I managed this by using an internal API from the jenkins-artifactory-plugin.
// found in org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils
/**
 * Get build info as defined in previous rtBuildInfo{...} scope.
 *
 * @param rootWs - Step's root workspace.
 * @param build - Step's build.
 * @param customBuildName - Step's custom build name if it exists.
 * @param customBuildNumber - Step's custom build number if it exists.
 * @return build info object as defined in previous rtBuildInfo{...} scope, or a new build info.
 */
public static BuildInfo getBuildInfo(FilePath rootWs, Run<?, ?> build, String customBuildName, String customBuildNumber, String project) throws IOException, InterruptedException {
    ...
}
With this code you can fetch the BuildInfo inside a declarative pipeline script step:
def buildInfo = org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils.getBuildInfo(new hudson.FilePath(new java.io.File(env.WORKSPACE)), currentBuild.rawBuild, null, null, null);
UPDATE: Beware of custom build names and numbers. If you have defined a custom build name and/or build number, you have to provide it in the getBuildInfo call.
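For instance, if the build name and number were overridden, the call would look roughly like this (the values 'my-app' and '42' are placeholders):

def buildInfo = org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils.getBuildInfo(
    new hudson.FilePath(new java.io.File(env.WORKSPACE)), currentBuild.rawBuild, 'my-app', '42', null)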
I have a Jenkinsfile written in Groovy as follows:
env.MVN_Goals = MVN_Goals
node {
    // Get Artifactory server instance, defined in the Artifactory Plugin administration page.
    def server = Artifactory.newServer url: 'http://localhost:8085/artifactory', username: 'admin', password: 'password'
    // Create an Artifactory Maven instance.
    def rtMaven = Artifactory.newMavenBuild()
    stage ('Clone sources') {
        git url: 'D:/Sample GIT_Maven Repo'
    }
    stage 'Artifactory configuration'
    rtMaven.deployer releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local', server: server
    rtMaven.resolver releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot', server: server
    def buildInfo = Artifactory.newBuildInfo()
    stage('Maven_Build') {
        if (isUnix()) {
            sh "D:/apache-maven-3.3.9/bin/mvn -B -Dmaven ${MVN_Goals}"
        } else {
            bat "D:/apache-maven-3.3.9/bin/mvn -B -Dmaven ${MVN_Goals}"
        }
        step([$class: 'ArtifactArchiver', artifacts: '**/target/*.jar', fingerprint: true])
    }
    stage ('Publish build info') {
        server.publishBuildInfo buildInfo
    }
}
I tried configuring Artifactory in Jenkins by adding the Artifactory plugin for Jenkins. When I test the connection, I get the error "There is either an incompatible or no instance of Artifactory at the provided URL". The same error occurs when I try to build my job in Jenkins. Is there a way to resolve it?
Artifactory plugin version - 2.9.1
Artifactory Version - 4.15.0
def buildInfo = Artifactory.newBuildInfo() is within that particular stage. Modify it as follows:
env.MVN_Goals = MVN_Goals
node {
    // Get Artifactory server instance,
    // defined in the Artifactory Plugin administration page.
    def server = Artifactory.newServer url: 'http://localhost:8085/artifactory', username: 'admin', password: 'password'
    // Create an Artifactory Maven instance.
    def rtMaven = Artifactory.newMavenBuild()
    def buildInfo = Artifactory.newBuildInfo()
    stage ('Clone sources') {
        git url: 'D:/Sample GIT_Maven Repo'
    }
Is it possible to do a gradle.run (see below) without running the artifactoryPublish task? I thought I could accomplish this by specifying the tasks parameter, but the plugin appears to add the task back in. For example, the following:
def server = Artifactory.server('artifactory-primary')
def gradle = Artifactory.newGradleBuild()
gradle.resolver server: server, repo: 'gradle-all-virtual'
gradle.deployer server: server, repo: 'gradle-libs-snapshot-local'
gradle.deployer.mavenCompatible = true
gradle.useWrapper = true
gradle.usesPlugin = true
def buildInfo = gradle.run(
    rootDir: ".",
    buildFile: 'build.gradle',
    tasks: 'build',
    switches: '--no-daemon -x check')
server.publishBuildInfo buildInfo
Results in:
...
gradlew -x check build artifactoryPublish -b build.gradle
...
When what I really want is:
...
gradlew -x check build -b build.gradle
...
Ultimately I want to build in one stage and deploy in another.
The same snippet, but with references to Artifactory removed from my Gradle file and with Tamir's addition added in:
def server = Artifactory.server('artifactory-primary')
def gradle = Artifactory.newGradleBuild()
gradle.resolver server: server, repo: 'gradle-all-virtual'
gradle.deployer server: server, repo: 'gradle-libs-snapshot-local'
gradle.deployer.mavenCompatible = true
gradle.deployer.deployArtifacts = false
gradle.useWrapper = true
gradle.usesPlugin = false
def buildInfo = gradle.run(
    rootDir: ".",
    buildFile: 'build.gradle',
    tasks: 'build',
    switches: '--no-daemon -x check')
server.publishBuildInfo buildInfo
Produces the same result.
The artifactoryPublish task is added by default; you can see that in the Jenkins Artifactory plugin code.
If you prefer not to deploy artifacts to Artifactory, you can configure deployer.deployArtifacts = false.
In your case:
gradle.deployer.deployArtifacts = false
If you want to build your project in two phases, you can build it once with deployArtifacts = false and then a second time with deployArtifacts = true.
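A rough sketch of that two-phase idea in a scripted pipeline (the stage names are illustrative; gradle and server are configured as in the question):

stage('Build') {
    gradle.deployer.deployArtifacts = false  // first pass: compile and test only, nothing is deployed
    gradle.run rootDir: ".", buildFile: 'build.gradle', tasks: 'build', switches: '--no-daemon -x check'
}
stage('Deploy') {
    gradle.deployer.deployArtifacts = true   // second pass: this run actually deploys
    def buildInfo = gradle.run rootDir: ".", buildFile: 'build.gradle', tasks: 'build', switches: '--no-daemon'
    server.publishBuildInfo buildInfo
}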
I'm trying to push my artifacts to Artifactory with a Jenkins pipeline that calls the Gradle tool.
I am following the examples published on GitHub:
Example1
Example2
My Jenkins Pipeline script:
stage('Perform Gradle Release') {
    //ssh-agent required to perform GIT push (when tagging the branch on release)
    sshagent([git_credential]) {
        sh "./gradlew clean release unSnapshotVersion -Prelease.useAutomaticVersion=true -Prelease.releaseVersion=${release_version} -Prelease.newVersion=${development_version}"
    }
    // Create an Artifactory server instance
    def server = Artifactory.server('my-artifactory')
    // Create and set an Artifactory Gradle Build instance:
    def rtGradle = Artifactory.newGradleBuild()
    rtGradle.resolver server: server, repo: 'libs-release'
    rtGradle.deployer server: server, repo: 'libs-release-local'
    //Use Gradle Wrapper
    rtGradle.useWrapper = true
    //Creates buildinfo
    def buildInfo = Artifactory.newBuildInfo()
    buildInfo.env.capture = true
    buildInfo.env.filter.addInclude("*")
    // Run Gradle:
    rtGradle.run rootDir: "./", buildFile: 'build.gradle', tasks: 'clean artifactoryPublish', buildInfo: buildInfo
    // Publish the build-info to Artifactory:
    server.publishBuildInfo buildInfo
}
My Gradle file is very light; I'm just using the Gradle Release Plugin to perform the Gradle release.
When executing the pipeline, it fails with this message:
:artifactoryPublish
BUILD SUCCESSFUL
Total time: 17.451 secs
ERROR: Couldn't read generated build info at : /tmp/generated.build.info4898776990575217114.json
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
hudson.model.Run$RunnerAbortedException
at org.jfrog.hudson.pipeline.Utils.getGeneratedBuildInfo(Utils.java:188)
at org.jfrog.hudson.pipeline.steps.ArtifactoryGradleBuild$Execution.run(ArtifactoryGradleBuild.java:127)
at org.jfrog.hudson.pipeline.steps.ArtifactoryGradleBuild$Execution.run(ArtifactoryGradleBuild.java:96)
at org.jenkinsci.plugins.workflow.steps.AbstractSynchronousStepExecution.start(AbstractSynchronousStepExecution.java:40)
...
Finished: FAILURE
When I check on the server, there is no such file /tmp/generated.build.info4898776990575217114.json (the user has of course permission to write to /tmp).
Thanks for your help.
[EDIT] It is weird, but I found some files named "buildInfo2408849984051060030.properties" containing the information. Neither the name nor the format is the same, and these files are stored on my Jenkins machine, not on the slave executing the pipeline.
Thanks @tamir-hadad, it has indeed been fixed in 2.8.2.