Using the Jenkins DSL I can create and publish build info with Artifactory.newBuildInfo, but I'm looking for the complementary method to read the BuildInfo JSON data that is generated on Artifactory. I have trawled through many resources. Any suggestions would be appreciated.
From the Artifactory REST API it sure looks like you can retrieve buildInfo. I'd expect this to be exposed from the Jenkins plugin as well.
Build Info
Description: Build Info
Since: 2.2.0
Security: Requires a privileged user with deploy permissions (can be anonymous)
Usage: GET /api/build/{buildName}/{buildNumber}
Produces: application/vnd.org.jfrog.build.BuildInfo+json
...
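If you want to poke at that endpoint from a scripted pipeline, here is a minimal sketch; arturl and apiKey are placeholder variables you would supply yourself, and the build name and number are made up. Note that the response nests the payload under a top-level buildInfo key:
import groovy.json.JsonSlurperClassic

// placeholders: arturl like 'https://host/artifactory/', apiKey an Artifactory API key
def json = sh(
    script: "curl -sf -H 'X-JFrog-Art-Api:${apiKey}' ${arturl}api/build/my-build/42",
    returnStdout: true
).trim()
// the endpoint returns { "uri": ..., "buildInfo": { ... } }, so unwrap the 'buildInfo' key
def buildInfo = new JsonSlurperClassic().parseText(json).buildInfo
echo "Fetched buildInfo for ${buildInfo.name} #${buildInfo.number}"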
JFrog's project examples on GitHub are a fabulous resource, as is their Jenkins plugin.
From a quick search it looks like you'd define a download spec and then use the server.download method (see Working with Pipeline Jobs in Jenkins):
def buildInfo1 = server.download downloadSpec
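The download spec itself is just a JSON document; as a sketch (the repository, pattern, and target below are invented examples):
def downloadSpec = """{
    "files": [
        {
            "pattern": "libs-release-local/my-utils/1.2.3/*.zip",
            "target": "downloads/"
        }
    ]
}"""
def buildInfo1 = server.download spec: downloadSpec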
The previous answer creates a new buildInfo; it does not download the original buildInfo. I've been trying for days to figure out how to do what the original poster wants. The best I've managed is downloading the buildInfo into a hashtable, working with that, and then uploading the changes with REST calls.
def curlstr = "curl -H 'X-JFrog-Art-Api:${password}' ${arturl}api/build/${buildName}/${buildNumber}"
def buildInfoString = sh(
script: curlstr,
returnStdout: true
).trim()
buildInfo = (new JsonSlurperClassic().parseText(buildInfoString))
sh("echo '${JsonOutput.toJson(buildInfo)}'|curl -XPUT -H 'X-JFrog-Art-Api:${password}' -H 'Content-Type: application/json' ${arturl}api/build --upload-file - ")
I was able to modify the buildInfo in the Artifactory repository using this technique. Not as clean as I would like. I've also been unable to get the JFrog CLI to modify existing buildInfo files.
For whatever it's worth, the intent of what I'm trying to do is promote a Docker artifact and change its name while doing so. I've found no way to express this to Artifactory that doesn't involve pulling the artifact into Docker and pushing it again. I'd love it if someone from #jfrog could clue me in on how to do it.
UPDATE: Attention! I got the question wrong. This is how you get the local BuildInfo object in a declarative pipeline script.
I managed this by using an internal API from the jenkins-artifactory-plugin.
// found in org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils
/**
 * Get build info as defined in previous rtBuildInfo{...} scope.
 *
 * @param rootWs - Step's root workspace.
 * @param build - Step's build.
 * @param customBuildName - Step's custom build name if exist.
 * @param customBuildNumber - Step's custom build number if exist.
 * @return build info object as defined in previous rtBuildInfo{...} scope or a new build info.
 */
public static BuildInfo getBuildInfo(FilePath rootWs, Run<?, ?> build, String customBuildName, String customBuildNumber, String project) throws IOException, InterruptedException {
...
}
With this code you can fetch the BuildInfo inside a declarative pipeline script step.
def buildInfo = org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils.getBuildInfo(new hudson.FilePath(new java.io.File(env.WORKSPACE)), currentBuild.rawBuild, null, null, null);
UPDATE: Beware of custom build names and numbers. If you have defined a custom build name and/or build number, you have to provide it in the getBuildInfo call.
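For example, assuming an earlier rtBuildInfo { } step had set a custom name 'my-app' and number '42' (both made up here), the call becomes:
def buildInfo = org.jfrog.hudson.pipeline.declarative.utils.DeclarativePipelineUtils.getBuildInfo(
    new hudson.FilePath(new java.io.File(env.WORKSPACE)),
    currentBuild.rawBuild,
    'my-app', // custom build name
    '42',     // custom build number
    null      // project key, if any
)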
I'm using the JFrog Artifactory plugin in my Jenkins pipeline to pull some in-house utilities that the pipelines use. I specify which version of the utility I want using a parameter.
After executing server.download, I'd like to verify and report which version of the file was actually downloaded, but I can't seem to find any way to do that. I do get a buildInfo object returned from the server.download call, but I can't find any way to pull information from it; I just get an object reference if I try to print it. I'd like to abort the build and send out a report if the version of the utility downloaded is incorrect.
The question I have is, "How does one verify that a file specified by a download spec is successfully downloaded?"
This functionality is only available in scripted pipelines at the moment, and is described in the documentation.
For example:
node {
    def server = Artifactory.server SERVER_ID
    def downloadSpec = readFile 'downloadSpec.json'
    def buildInfo = server.download spec: downloadSpec
    if (buildInfo.getDependencies().size() > 0) {
        def localPath = buildInfo.getDependencies()[0].getLocalPath()
        def remotePath = buildInfo.getDependencies()[0].getRemotePath()
        def md5 = buildInfo.getDependencies()[0].getMd5()
        def sha1 = buildInfo.getDependencies()[0].getSha1()
        echo localPath
    }
    server.publishBuildInfo buildInfo
}
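If you want the build to abort when the expected version didn't arrive, the same dependency accessors can drive an explicit check. A sketch, where the expected file name is a hypothetical placeholder derived from your version parameter:
def expected = "my-utility-1.2.3.zip" // hypothetical expected file name
def paths = buildInfo.getDependencies().collect { it.getLocalPath() }
if (!paths.any { it.endsWith(expected) }) {
    // 'error' is a standard pipeline step; it fails the build with this message
    error "Expected ${expected} but the download spec fetched: ${paths}"
}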
I've defined a variable in my TFS/Azure DevOps build definition (say it's time) and I assign its value using a PowerShell task within the build definition, like:
Type: Inline Script.
Inline script:
$date=$(Get-Date -Format g);
Write-Host "##vso[task.setvariable variable=time]$date"
You can refer to this similar example.
Now I want to get this value in my release definition pipeline. I configured this build definition as continuous deployment into my release definition.
My question is: how can I get the value of time in my release definition via some other variable? Is this possible?
There is no official way to pass variables from Build to Release. The only way to accomplish this is to store the values in a file (JSON, XML, YAML, what have you) and attach that as a build artifact. That way you can read the file in the release and set the variable again.
Martin Hinshelwood seems to have gotten frustrated enough by this problem and turned that functionality into an extension for Azure DevOps Pipelines.
Tasks included
Variable Save Task - During your build you can save the variables to a json file stored with your other build assets
Variable Load Task - During your Release you can load the saved variables and gain access to them.
What about using a variable within a variable group?
I managed to set the value of a variable in a variable group to a variable coming from a build pipeline, and then to read that variable in a release pipeline.
For that, it is necessary:
To have previously created the variable group and the variable (its name identified by $(variableName); let's assume its value would be stored in $(variableValue)).
To find the variable group ID (stored in $(variableGroupId)), which can be done by navigating on Azure DevOps to that variable group. The group ID will then be in the URL.
A Personal Access Token (PAT) with Read & Write access to group variables (called $(personalAccessToken)).
CI pipeline
- powershell: |
    az pipelines variable-group variable update --group-id $(variableGroupId) --name $(variableName) --value $(variableValue)
  displayName: 'Store the variable in a group variable'
  env:
    AZURE_DEVOPS_EXT_PAT: $(personalAccessToken)
Then all that's necessary is to link the variable group to the release pipeline, and the variable will be seeded into that pipeline.
I found another solution that works even without text files, just by using the Azure REST API to get the build parameters.
In my release pipeline I execute this PowerShell task first to extract any build pipeline parameters:
function Get-ObjectMember {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True, ValueFromPipeline=$True)]
        [PSCustomObject]$obj
    )
    $obj | Get-Member -MemberType NoteProperty | ForEach-Object {
        $key = $_.Name
        [PSCustomObject]@{Key = $key; Value = $obj."$key"}
    }
}

$url = "https://dev.azure.com/$(account)/$(project)/_apis/build/builds/$(Build.BuildId)?api-version=6.0"
$build = Invoke-RestMethod -Uri $url -Headers @{
    Authorization = "Bearer $env:SYSTEM_ACCESSTOKEN"
}
$params = $build.templateParameters
if ($params) {
    $params | Get-ObjectMember | foreach {
        Write-Host $_.Key : $_.Value
        echo "##vso[task.setvariable variable=$($_.Key);]$($_.Value)"
    }
}
My build pipeline contained the parameter RELEASE_VERSION, and after executing the above code I can use $(RELEASE_VERSION) in my other release tasks.
I'm trying to set up a Jenkins pipeline for publishing a zip file to JFrog Artifactory.
I am using the com.jfrog.artifactory plugin to do so. This works great from command-line Gradle: I can run the artifactoryPublish task to publish the artifacts and tie them back to the module, which then has a tie back to the artifacts.
The artifacts show up with the properties:
build.name = `projectname`
build.number = `some large number`
And I can click from them to the build/module and back to the artifact.
However, when I run this from a Jenkinsfile pipeline, the artifacts get published and tied back to the module, but the module does not successfully tie back to the artifacts.
The artifacts do not receive the build.name and build.number properties, and I cannot click from the module back to the artifacts, as the module cannot find or resolve the paths back to them (a zip file and a generated pom).
I am passing the params from Jenkins like:
ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}
which seems to work on other projects... but for whatever reason I cannot shake this one.
I can include more of the Jenkinsfile if that would help debug, but I'm really just checking out a repository and trying to publish it.
I have been reading the documentation here heavily:
https://www.jfrog.com/confluence/display/RTF/Gradle+Artifactory+Plugin
and haven't been able to make it work through the -P project-property route.
Does anyone have any idea what else I can try? I don't really want to use the Jenkins pipeline Artifactory plugin directly, because it's so nice to be able to deploy from the command line too.
build.gradle:
publishing {
    publications {
        ManualUpdaterPackage(MavenPublication) {
            artifact assembleManualUpdaterPackage
        }
    }
}

artifactory {
    contextUrl = "${artifactoryUrl}" // The base Artifactory URL if not overridden by the publisher/resolver
    publish {
        defaults {
            publications('ManualUpdaterPackage')
        }
        repository {
            repoKey = project.version.endsWith('-SNAPSHOT') ? snapshotRepo : releaseRepo
            username = "${artifactory_user}"
            password = "${artifactory_password}"
            maven = true
        }
    }
}

task assembleManualUpdaterPackage(type: Zip) {
    dependsOn anotherTask
    from(packageDir + "/")
    include '**'
    // archiveName "manualUpdaterPackage-${version}.zip"
    destinationDir(file(manualUpdaterZipDir))
}
Jenkinsfile snippet:
withCredentials(
    [
        [
            $class          : 'UsernamePasswordMultiBinding',
            credentialsId   : 'validcreds',
            passwordVariable: 'ORG_GRADLE_PROJECT_artifactory_password',
            usernameVariable: 'ORG_GRADLE_PROJECT_artifactory_user'
        ]
    ]
) {
    withEnv(
        [
            "ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}",
            "ORG_GRADLE_PROJECT_buildInfo.build.name=${artifactName}",
            "ORG_GRADLE_PROJECT_buildInfo.build.url=${env.JOB_URL}"
        ]
    ) {
        sh 'chmod +x gradlew'
        sh "./gradlew --no-daemon clean artifactoryPublish"
    }
}
https://www.jfrog.com/confluence/display/RTF/Working+With+Pipeline+Jobs+in+Jenkins#WorkingWithPipelineJobsinJenkins-GradleBuildswithArtifactory
Eventually my coworker recommended looking into the Artifactory Pipeline Gradle plugin instead. It is very nice to work with, and we've had much quicker success with it.
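In case it helps anyone else, a minimal sketch of that approach, adapted from JFrog's "Working With Pipeline Jobs in Jenkins" page; the server ID and repository names are placeholders:
node {
    def server = Artifactory.server 'my-artifactory' // server ID from Jenkins global config
    def rtGradle = Artifactory.newGradleBuild()
    rtGradle.useWrapper = true
    rtGradle.deployer repo: 'libs-release-local', server: server
    rtGradle.resolver repo: 'libs-release', server: server
    // Runs the build and collects build info; the plugin sets the build.name/build.number
    // properties on the deployed artifacts for you
    def buildInfo = rtGradle.run rootDir: '.', buildFile: 'build.gradle', tasks: 'clean artifactoryPublish'
    server.publishBuildInfo buildInfo
}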
I am new to Artifactory, and I find that there is a lot of documentation on the different ways to get artifacts into Artifactory, but nothing that is complete.
For example:
The pipeline plugin used by Jenkins:
sample code:
node {
    def server = Artifactory.newServer url: SERVER_URL, credentialsId: CREDENTIALS
    def rtMaven = Artifactory.newMavenBuild()

    stage 'Build'
    git url: 'https://github.com/jfrogdev/project-examples.git'

    stage 'Artifactory configuration'
    rtMaven.tool = MAVEN_TOOL // Tool name from Jenkins configuration
    rtMaven.deployer releaseRepo: 'libs-release-local', snapshotRepo: 'libs-snapshot-local', server: server
    rtMaven.resolver releaseRepo: 'libs-release', snapshotRepo: 'libs-snapshot', server: server
    def buildInfo = Artifactory.newBuildInfo()

    stage 'Exec Maven'
    rtMaven.run pom: 'maven-example/pom.xml', goals: 'clean install', buildInfo: buildInfo

    stage 'Publish build info'
    server.publishBuildInfo buildInfo
}
I am not sure how to set up some variables like CREDENTIALS, especially if I want to use an API key rather than a user ID and password.
Also, if I want to use the Artifactory REST API to build and promote my project (a Maven build), should I be using:
curl -X PUT "http://localhost:8080/artifactory/api/build" -H "Content-Type: application/json" --upload-file build.json
where build.json is the sample JSON at https://www.jfrog.com/confluence/display/RTF/Artifactory+REST+API under Build Upload?
If I use the API, do I still need the above Jenkins plugin code, or can I just use the API? Where do I pass my credentials (user ID and API key) in the curl command?
If any experienced user could guide me through these questions, that would be a great help.
I would like to have a post-build hook or similar, so that I can have the same output as e.g. the IRC plugin, but hand it to a script.
I was able to get all the info except for the actual build status. That just doesn't work, neither as a "Post-build script", a "Post-build task", a "Parameterized Trigger", and so on.
It is possible with some very ugly workarounds, but I wanted to ask, in case someone has a nicer option... short of writing my own plugin.
It works as mentioned with the Groovy Post-Build Plugin, but without any extra quoting within the string that gets executed. So I had to put the actual functionality into a shell script that does the call to curl, which in turn needs quoting for the POST parameters and so on.
// Gather build metadata via the Groovy Post-Build plugin's 'manager' binding
def result = manager.build.result
def build_number = manager.build.number
def env = manager.build.getEnvironment(manager.listener)
def build_url = env['BUILD_URL']
def build_branch = env['SVN_BRANCH']
def short_branch = (build_branch =~ /branches\//).replaceFirst("")
def host = env['NODE_NAME']
def svn_rev = env['SVN_REVISION']
def job_name = manager.build.project.getName()

// Hand everything to an external shell script, which does its own quoting for curl
"/usr/local/bin/skypeStagingNotify.sh Deployed ${short_branch} on ${host} - ${result} - ${build_url}".execute()
Use a Groovy script in a post-build step via the Groovy Post-Build plugin. You can then access Jenkins internals via the Jenkins Java API. The plugin provides the script with a variable called manager that can be used to access important parts of the API (see the Usage section in the plugin documentation).
For example, here's how you can execute a simple external Python script on Windows and output its result (as well as the build result) to build console:
def command = """cmd /c python -c "for i in range(1,5): print i" """
manager.listener.logger.println command.execute().text
def result = manager.build.result
manager.listener.logger.println "And the result is: ${result}"
For this I really like the Conditional Build Step plugin. It's very flexible, and you can choose which actions to take based on build failure or success. For instance, I have used a conditional build step to send a notification on build failure.
You can also use a conditional build step to set an environment variable or write to a log file that you use in subsequent "execute shell" steps. For instance, you might create a build with three steps: one to compile the code and run tests, another to set a STATUS="failed" environment variable, and a third that sends an email like "The build finished with a status: ${STATUS}".
Really easy solution, maybe not too elegant, but it works!
1: Catch all the build results you want to catch (in this case SUCCESS).
2: Inject an environment variable valued with the job status.
3: Do the same for any other kind of status (in this case I catch everything from aborted to unstable).
4: Afterwards you'll be able to use the value for whatever you want to do; in this case I'm passing it to an ANT script! (Or you can load it directly from ANT as an environment variable...)
Hope it can help!
Groovy script solution:
Here I am using the Groovy script plugin to take the build status and set it as an environment variable, so the environment variable can be used in post-build scripts via the Post-build Task plugin.
Groovy script:
import hudson.EnvVars
import hudson.model.Environment

// Grab the currently running build and its result
def build = Thread.currentThread().executable
def result = manager.build.result.toString()

// Prepend an environment contributor so later steps see BUILD_STATUS
def vars = [BUILD_STATUS: result]
build.environments.add(0, Environment.create(new EnvVars(vars)))
Post-build script:
echo BUILD_STATUS="${BUILD_STATUS}"
Try the Post Build Task plugin...
It lets you specify conditions based on the log output...
Basic solution (please don't laugh)
#!/bin/bash
STATUS='Not set'
if [ ! -z "$UPSTREAM_BUILD_DIR" ]; then
    ISFAIL=$(ls -l "/var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds" | grep "lastFailedBuild\|lastUnsuccessfulBuild" | grep "$UPSTREAM_BUILD_NR")
    ISSUCCESS=$(ls -l "/var/lib/jenkins/jobs/$UPSTREAM_BUILD_DIR/builds" | grep "lastSuccessfulBuild\|lastStableBuild" | grep "$UPSTREAM_BUILD_NR")
    if [ ! -z "$ISFAIL" ]; then
        echo "$ISFAIL"
        STATUS='FAIL'
    elif [ ! -z "$ISSUCCESS" ]; then
        STATUS='SUCCESS'
    fi
fi
echo "$STATUS"
where
$UPSTREAM_BUILD_DIR=$JOB_NAME
$UPSTREAM_BUILD_NR=$BUILD_NUMBER
are passed in from the upstream build.
Of course, "/var/lib/jenkins/jobs/" depends on your Jenkins installation.