Gradle artifact dependency - Artifactory artifact - How to find the path

When building a Java/Groovy project, tasks like compileJava, compileGroovy, test, etc. require various jar artifacts, which Gradle provides as long as you have properly declared what each task needs.
I'm making those available to the build script (build.gradle) and everything is working fine.
Another project I'm working on requires not only jar artifacts but also an .xml file as an artifact, for doing JIBX / XSLT transformation/processing.
My simple question:
- The Gradle build process knows how to fetch artifacts from Artifactory (I have listed those Artifactory repositories in the init.d/common.gradle file), and during the build it feeds the compile/test etc. tasks with those jars. Now, if I have this .xml artifact uploaded to Artifactory as well, then:
a. How can I get the .xml artifact available to me in build.gradle so that I can perform some operation on it, for example copy that .xml file to an x/y/z folder in the resulting project's jar/war file? The jar files I can access via project.configurations.compile.each or .find or something like that, but I'm not sure if I can access the .xml file the same way. The following code works fine for unjarring a jar file into the build/tmpJibx/ folder, i.e. if I need httpunit-1.1.1.jar during my build, then calling this function will unjar it into build/tmpJibx/httpunit.
// Unpack jar
def unpackJarItem( jarName ) {
    println 'unpacking: ' + jarName
    def dirName = "$buildDir/tmpJibx/$jarName"
    new File( dirName ).mkdirs()
    project.configurations.compile.find {
        def nameJar = it.name
        def iPos = nameJar.lastIndexOf( '-' )
        if( iPos > 0 ) {
            nameJar = nameJar.substring( 0, iPos )
            if( nameJar == jarName ) {
                def srcJar = it.toString()
                ant {
                    unjar( src: "$srcJar", dest: "$dirName" )
                }
            }
        }
    }
}
Gradle maintains artifacts in its cache under the user's home directory (~/.gradle, or C:\Users\<user>\.gradle on Windows), under ...\caches...\artifactory...\filestore...
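For example, just to see where Gradle actually resolved each jar to (a rough diagnostic sketch, not part of my real build), I can print the resolved paths instead of guessing at the cache layout:

// Diagnostic sketch: print the on-disk path of every resolved compile dependency
task printResolvedPaths {
    doLast {
        configurations.compile.each { File f ->
            println "${f.name} -> ${f.absolutePath}"
        }
    }
}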
All I'm trying to achieve is to be able to do something like the below:
copy {
    into "build/war/WEB-INF/conf"
    from "......<THIS_PATH_IS_WHAT_Im_LOOKING_FOR>"
    include "thatxmlartifactfile.xml"
}
I tried defining the entry under the dependencies { ... } section, like below, but I'm not sure if Gradle will automatically make it available to me somehow, as Gradle is so great.
dependencies {
    compile 'groupid:artifactid:x.x.x'
    compile group: 'xxxx', artifac...: 'yyyy', version: 'x.x.x'
    // for ex:
    compile 'httpunit:httpunit:1.1.1'
    jibxAnt 'groupidnameofxmlfile:artifactidnameofxmlfile:versionnumberofxml#xml'
    ...
    ...
    ...
}
It seems like I have to first copy that .xml from wherever Gradle knows it's available to some location, and then from that location to my target folder.
// Add libraries for acceptance tests
project.configurations.acceptanceTestCompile.each { File f ->
    if( f.isFile() ) {
        def nameJar = f.getName()
        def jarName = f.getName()
        def fileInc = true
        def iPos = nameJar.lastIndexOf( '-' )
        if( iPos > -1 ) {
            jarName = nameJar.substring( 0, iPos )
            // Here I can say that one of the file/entry will be that .xml file
            // Now I have that in jarName variable and I can play with it, right?
            // i.e. if jarName == name of that xml file, then
            copy {
                into "some/folder/location"
                from jarName
            }
        }
    }
}
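In other words, something along these lines is what I'm after (a rough sketch, assuming the .xml ends up attached to the jibxAnt configuration I declared above):

// Rough sketch: grab the resolved .xml from the jibxAnt configuration and copy it into the exploded war folder
def xmlFile = project.configurations.jibxAnt.find { it.name.endsWith( '.xml' ) }
copy {
    into "$buildDir/war/WEB-INF/conf"
    from xmlFile
}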

The easiest solution is to commit the XML file to source control. If you put it under src/main/webapp/WEB-INF/conf/thatxmlartifactfile.xml, it will get included in the War automatically.
If you need to get the file from Artifactory, you can do so as follows:
configurations {
    jibx
}

dependencies {
    jibx "some.group:someArtifact:1.0@xml"
}

war {
    from { configurations.jibx.singleFile }
}
PS: It's often possible, and also preferable, to add files directly to the final archive, rather than going through intermediate copy steps.
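If the file needs to land under WEB-INF/conf as in the question, a sketch along these lines should work (assuming the jibx configuration above):

war {
    // place the resolved jibx artifact inside WEB-INF/conf within the war
    from( configurations.jibx ) {
        into 'WEB-INF/conf'
    }
}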

Related

Jenkins DSL custom config file folder

We are using DSL to build/setup our Jenkins structure.
In it, we create our folder structure and then all our jobs within the folders.
The jobs end up in the correct folders by including the folder name in the job name
pipelineJob('folder/subfolder/Job Name') {}
While the UI lets me create a config file within a folder, I cannot find a way within the DSL groovy script hierarchy to put a custom config file in a folder.
While I can easily create a config file:
configFiles {
    customConfig {
        name('myCustom.yaml')
        id('59f394fc-40fe-489d-989c-7556c1a01153')
        content('yaml content goes here')
    }
}
There seems to be no way to put this file into a folder / subfolder.
While the Job DSL plugin does not offer an easy way to do this, you can use a configure block to directly modify the XML.
folder('Config-File Example') {
    description("Example of a Folder with a Config-File, created via Job DSL")
    configure { folder ->
        folder / 'properties' << 'org.jenkinsci.plugins.configfiles.folder.FolderConfigFileProperty'() {
            configs(class: 'sorted-set') {
                comparator(class: 'org.jenkinsci.plugins.configfiles.ConfigByIdComparator')
                'org.jenkinsci.plugins.configfiles.json.JsonConfig'() {
                    id 'my-config-file-id'
                    providerId 'org.jenkinsci.plugins.configfiles.json.JsonConfig'
                    name 'My Config-File Name'
                    comment 'This contains my awesome configuration data'
                    // Use special characters as-is, they will be encoded automatically
                    content '[ "1", \'2\', "<>$%&" ]'
                }
            }
        }
    }
}

How do I make Jenkins fileCopyOperation copy all files including those starting with a dot

I have a Jenkins pipeline that copies all my source files; unfortunately it skips one file named ".DS_store". I have been unable to locate sufficiently detailed documentation to figure out how to include hidden files, or how to control in detail which files get included. The best I could find is: https://jenkins.io/doc/pipeline/steps/file-operations/
def CopyJobs = [:]
def include = Include(Target)
// copy all folders to c:\svn\vs\... for building there
for(int i = 0; i < include.size(); i++){
    def index = i
    CopyJobs["CopyJob${i}"] = {
        Run.dir("D:\\Svn\\vs\\")
        {
            def includeItem = "${include[index]}\\**"
            Run.echo "CopyJobs ${index} ${includeItem}"
            Run.fileOperations([
                Run.fileCopyOperation(excludes: '', flattenFiles: false, includes: "${includeItem}", targetLocation: "C:\\Svn\\vs\\")
            ])
        }
    }
}
Run.parallel CopyJobs
Edit:
The puzzle has deepened: I assumed the problem was the leading dot (.), but I have another file starting with a dot that gets copied without problems. That file, however, is not a binary, while other binaries not starting with a dot get copied OK.

Defining FOLDER level variables in Jenkins using a shared \vars library

So I'm trying to define folder-level variables by putting them in a groovy file in the \vars directory.
Alas, the documentation is so bad, that it's impossible to figure out how to do that...
Assuming we have two globals G1 and G2, is this how we define them in the groovy file?
#!Groovy
static string G1 = "G1"
static string G2 = "G2"
Assuming the Groovy file is called XYZ.Groovy, how do I define it in the folder so it's available to the folder's script?
Assuming I get over that, and that LIBXYZ is the name the folder associates with the stuff in the /vars directory, is it correct to assume that when I call
#Library("LIBXYZ") _
it will make XYZ available?
In that case, is XYZ.G1 the way to access the globals?
thanks, a.
I have a working example here as I was recently curious about this. I agree that the documentation is wretched.
The following is similar to the info in README.md.
Prep: note that folder here refers to Jenkins Folders from the CloudBees Folder plugin. It is a way to organize jobs.
Code Layout
The first part to note is src/net/codetojoy/shared/Bar.groovy :
package net.codetojoy.shared

class Bar {
    static def G1 = "G1"
    static def G2 = "G2"

    def id

    def emitLog() {
        println "TRACER hello from Bar. id: ${id}"
    }
}
The second part is vars/folderFoo.groovy:
def emitLog(message) {
    println "TRACER folderFoo. message: ${message}"
    def bar = new net.codetojoy.shared.Bar(id: 5150)
    bar.emitLog()
    println "TRACER test : " + net.codetojoy.shared.Bar.G1
}
Edit: To use a static/"global" variable in the vars folder, consider the following vars/Keys.groovy:
class Keys {
    static def MY_GLOBAL_VAR3 = "beethoven"
}
The folderFoo.groovy script can use Keys.MY_GLOBAL_VAR3.
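For instance, a hypothetical extra step in vars/folderFoo.groovy (not part of the original example) could read it directly:

// Hypothetical addition to vars/folderFoo.groovy; Keys.groovy sits alongside it in vars/
def emitGlobal() {
    println "TRACER global var : " + Keys.MY_GLOBAL_VAR3
}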
And then usage (in my example: Basic.Folder.Jenkinsfile):
@Library('folderFoo') _

stage "use shared library"

node {
    script {
        folderFoo.emitLog 'pipeline test!'
    }
}
Jenkins Setup: Folder
Go to New Item and create a new Folder
configure the folder with a new Pipeline library:
Name is folderFoo
Default version is master
Retrieval Method is Modern SCM
Source Code Management in my example is this repo
Jenkins Setup: Pipeline Job
create a new Pipeline job in the folder created above
though a bit confusing (and self-referential), I create a pipeline job that uses this same repo
specify the Jenkinsfile Basic.Folder.Jenkinsfile
the job should run and use the library

Upload multiple files using s3upload in Jenkins pipeline

Can we upload multiple files (not an entire folder) to S3 using s3Upload in a Jenkinsfile?
I was trying to upload all rpm files (*.rpm) in the root directory to S3 using the s3Upload function.
You can upload all the files with the following one-line command.
s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*.svg', workingDir:'dist')
Further explaining, you can create your own filtering based on the following two possibilities:
1. Include all the files of a certain extension.
s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*.svg', workingDir:'dist')
2. Include all the files except a certain file extension.
s3Upload(bucket:"my-bucket", path:'path/to/targetFolder/', includePathPattern:'**/*', workingDir:'dist', excludePathPattern:'**/*.svg')
Reference: https://github.com/jenkinsci/pipeline-aws-plugin (Check under s3Upload)
findFiles solved the issue. Below is the snippet used for the same.
files = findFiles(glob: '*.rpm')
files.each {
    println "RPM: ${it}"
    withAWS(credentials: '****************'){
        s3Upload(file:"${it}", bucket:'rpm-repo', path:"${bucket_path}")
    }
}
Refer to the AWS S3 documentation; in particular, see the section 'Use of Exclude and Include Filters'.
Here is a way to upload multiple files of a particular type.
If you only want to upload files with a particular extension, you need to first exclude all files, then re-include the files with the particular extension. This command will upload only files ending with .jpg:
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
This works for the AWS Command Line Interface.
For pipelines, you need to wrap the iteration in script, like
pipeline {
    environment {
        // Extract concise branch name.
        BRANCH = GIT_BRANCH.substring(GIT_BRANCH.lastIndexOf('/') + 1, GIT_BRANCH.length())
    }
    ...
    post {
        success {
            script {
                def artifacts = ['file1', 'dir2/file3']
                artifacts.each {
                    withAWS(credentials:'my-aws-token', region:'eu-west-1') {
                        s3Upload(
                            file: "build/${it}",
                            bucket: 'my-artifacts',
                            path: 'my-repo/',
                            metadatas: ["repo:${env.JOB_NAME}", "branch:${env.BRANCH}", "commit:${env.GIT_COMMIT}"]
                        )
                    }
                }
            }
        }
    }
}

Zip files/Directories in Groovy with AntBuilder

I am trying to zip files and directories in Groovy using AntBuilder. I have the following code:
def ant = new AntBuilder()
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:file.name)
This zips the file "blah.txt", but not the file "New Text Document.txt". I think the issue is the spaces. I've tried the following:
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:"${file.name}")
ant.zip(basedir: "./Testing", destfile:"${file}.zip",includes:"\"${file.name}\"")
Neither of the above resolved the issue. I'm using Ant because it will zip directories, and I don't have access to org.apache.commons.io.compression at work.
If you look at the docs for the ant zip task, the includes parameter is described as:
comma- or space-separated list of patterns of files that must be included
So you're right, that it is the space separator that's breaking it...
You need to use the longer route to get this to work:
new AntBuilder().zip( destFile: "${file}.zip" ) {
    fileset( dir: './Testing' ) {
        include( name: file.name )
    }
}
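Since the question also mentions zipping whole directories, that case is simpler (a minimal sketch, assuming the same ./Testing directory):

// Zip the entire ./Testing directory into Testing.zip, preserving its structure
new AntBuilder().zip( destFile: 'Testing.zip', basedir: './Testing' )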
