Jenkins DSL custom config file folder

We are using the Job DSL plugin to build and set up our Jenkins structure.
In it, we create our folder structure and then all our jobs within the folders.
The jobs end up in the correct folders because the folder path is included in the job name:
pipelineJob('folder/subfolder/Job Name') {}
While the UI lets me create a config file within a folder, I cannot find a way within the DSL Groovy script hierarchy to put a custom config file in a folder.
While I can easily create a config file:
configFiles {
    customConfig {
        name('myCustom.yaml')
        id('59f394fc-40fe-489d-989c-7556c1a01153')
        content('yaml content goes here')
    }
}
There seems to be no way to put this file into a folder / subfolder.

While the Job DSL plugin does not offer an easy way to do this, you can use a configure block to modify the generated XML directly.
folder('Config-File Example') {
    description("Example of a Folder with a Config-File, created via Job DSL")
    configure { folder ->
        folder / 'properties' << 'org.jenkinsci.plugins.configfiles.folder.FolderConfigFileProperty'() {
            configs(class: 'sorted-set') {
                comparator(class: 'org.jenkinsci.plugins.configfiles.ConfigByIdComparator')
                'org.jenkinsci.plugins.configfiles.json.JsonConfig'() {
                    id 'my-config-file-id'
                    providerId 'org.jenkinsci.plugins.configfiles.json.JsonConfig'
                    name 'My Config-File Name'
                    comment 'This contains my awesome configuration data'
                    // Use special characters as-is, they will be encoded automatically
                    content '[ "1", \'2\', "<>$%&" ]'
                }
            }
        }
    }
}
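Once the folder-level config file exists, a job inside that folder can consume it with the Config File Provider step. A minimal sketch, assuming the file ID 'my-config-file-id' from the DSL above; the target file name is arbitrary:

```groovy
// Hypothetical pipeline for a job under 'Config-File Example';
// requires the Config File Provider plugin.
pipeline {
    agent any
    stages {
        stage('Use folder config') {
            steps {
                // Copies the folder-scoped config file into the workspace
                configFileProvider([configFile(fileId: 'my-config-file-id',
                                               targetLocation: 'config.json')]) {
                    sh 'cat config.json'
                }
            }
        }
    }
}
```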

Related

How to call and define a properties file in a Job DSL job in Jenkins

I want to use a properties file in a DSL job, which will pick up my project name for the job name and the SVN location. Does anyone have an idea how to write this, and what the syntax is?
For handling properties files stored outside your repository, there is a plugin called the Config File Provider Plugin.
You use it like this:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-file-in-jenkins', targetLocation: 'path/destinationfile')]) {
            // some block
        }
    }
}
It is capable of replacing tokens in JSON and XML, or the whole file (as in the example).
For handling data coming from SVN or the project name, you can access the environment variables.
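Accessing those environment variables from a pipeline looks like this (a sketch; JOB_NAME is always set by Jenkins, while SVN_URL is only present when the job checks out from Subversion, so treat that variable name as an assumption depending on your SCM setup):

```groovy
pipeline {
    agent any
    stages {
        stage('Show environment') {
            steps {
                // env.JOB_NAME includes the folder path of the job
                echo "Project name: ${env.JOB_NAME}"
                // SVN_URL depends on the Subversion SCM configuration
                echo "SVN location: ${env.SVN_URL ?: 'not set'}"
            }
        }
    }
}
```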

Jenkins : [parameterized-trigger] Properties file

I am using the Parameterized Trigger Plugin to trigger a child job. I am using "parameters from properties file", and in "Use properties from file" I need to pass the name of the file as a variable... I get this error:
[parameterized-trigger] Properties file $propeties_file did not exist.
If you click on the ? you will see the usage / syntax for the properties file:
Comma separated list of absolute or relative paths to file(s) that contain the parameters for the new project. Relative paths originate from the workspace. The file should have KEY=value pairs, one per line (Java properties file format). Backslashes are used for escaping, so use "\\" for a single backslash. Current build parameters and/or environment variables can be used in the form ${PARAM} or $PARAM.
So your file needs to exist, and you should put the path to the file where you currently have $properties_file. I don't believe the field will accept a variable; put the actual file name in there. (Also note that the variable in your error message is spelled $propeties_file, which may simply be a typo in the variable name.)
A sample pipeline that triggers a parameterized build using parameters from a properties file:
pipeline {
    agent any
    stages {
        stage('S1') {
            steps {
                echo 'In S1'
                sh '''
                    echo "param1=value1" > my.properties
                    echo "param2=value2" >> my.properties
                '''
            }
        }
        stage('S2') {
            steps {
                script {
                    // readProperties comes from the Pipeline Utility Steps plugin
                    def props = readProperties file: "${WORKSPACE}/my.properties"
                    build job: 'called_job', parameters: props.collect { string(name: it.key, value: it.value) }
                }
            }
        }
    }
}

Defining FOLDER level variables in Jenkins using a shared vars library

So I'm trying to define folder-level variables by putting them in a Groovy file in the vars directory.
Alas, the documentation is so bad that it's impossible to figure out how to do that...
Assuming we have two globals G1 and G2, is this how we define them in the Groovy file?
#!Groovy
static string G1 = "G1"
static string G2 = "G2"
Assuming the Groovy file is called XYZ.groovy, how do I define it in the folder so it's available to the folder's scripts?
Assuming I get past that, and that LIBXYZ is the name the folder associates with the stuff in the vars directory, is it correct to assume that when I call
@Library("LIBXYZ") _
it will make XYZ available?
In that case, is XYZ.G1 the way to access the globals?
I have a working example here as I was recently curious about this. I agree that the documentation is wretched.
The following is similar to the info in README.md.
Prep: note that folder here refers to Jenkins Folders from the CloudBees Folder plugin. It is a way to organize jobs.
Code Layout
The first part to note is src/net/codetojoy/shared/Bar.groovy:
package net.codetojoy.shared

class Bar {
    static def G1 = "G1"
    static def G2 = "G2"

    def id

    def emitLog() {
        println "TRACER hello from Bar. id: ${id}"
    }
}
The second part is vars/folderFoo.groovy:
def emitLog(message) {
    println "TRACER folderFoo. message: ${message}"
    def bar = new net.codetojoy.shared.Bar(id: 5150)
    bar.emitLog()
    println "TRACER test : " + net.codetojoy.shared.Bar.G1
}
Edit: To use a static/"global" variable in the vars folder, consider the following vars/Keys.groovy:
class Keys {
    static def MY_GLOBAL_VAR3 = "beethoven"
}
The folderFoo.groovy script can use Keys.MY_GLOBAL_VAR3.
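For instance, a step in vars/folderFoo.groovy could read the shared constant like this (a sketch; the method name emitKey is illustrative, not from the repo):

```groovy
// in vars/folderFoo.groovy; Keys is defined in vars/Keys.groovy above
def emitKey() {
    // scripts in vars/ can reference other vars/ classes by simple name
    println "TRACER key: ${Keys.MY_GLOBAL_VAR3}"
}
```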
And then usage (in my example: Basic.Folder.Jenkinsfile):
@Library('folderFoo') _

stage "use shared library"

node {
    script {
        folderFoo.emitLog 'pipeline test!'
    }
}
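To answer the original question about XYZ.G1-style access: once the library is loaded, static fields on classes under src/ can also be referenced directly from the Jenkinsfile by their fully qualified name. A sketch using the Bar class above:

```groovy
@Library('folderFoo') _

node {
    // static fields on src/ classes are reachable once the library is loaded
    echo "G1 is: ${net.codetojoy.shared.Bar.G1}"
}
```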
Jenkins Setup: Folder
Go to New Item and create a new Folder
configure the folder with a new Pipeline library:
Name is folderFoo
Default version is master
Retrieval Method is Modern SCM
Source Code Management in my example is this repo
Jenkins Setup: Pipeline Job
create a new Pipeline job in the folder created above
though a bit confusing (and self-referential), I create a pipeline job that uses this same repo
specify the Jenkinsfile Basic.Folder.Jenkinsfile
the job should run and use the library

Upload multiple files using s3upload in Jenkins pipeline

Can we upload multiple files (not an entire folder) to S3 using s3Upload in a Jenkinsfile?
I was trying to upload all the rpm files (*.rpm) in the root directory to S3 using the s3Upload function.
You can upload all the files with the following one-line command:
s3Upload(bucket: "my-bucket", path: 'path/to/targetFolder/', includePathPattern: '**/*.svg', workingDir: 'dist')
Further explained, you can create your own filtering based on the following two possibilities:
1. Include all files with a certain extension:
s3Upload(bucket: "my-bucket", path: 'path/to/targetFolder/', includePathPattern: '**/*.svg', workingDir: 'dist')
2. Include all files except those with a certain extension:
s3Upload(bucket: "my-bucket", path: 'path/to/targetFolder/', includePathPattern: '**/*', workingDir: 'dist', excludePathPattern: '**/*.svg')
Reference: https://github.com/jenkinsci/pipeline-aws-plugin (check under s3Upload)
findFiles solved the issue. Below is the snippet used:
files = findFiles(glob: '*.rpm')
files.each {
    println "RPM: ${it}"
    withAWS(credentials: '****************') {
        s3Upload(file: "${it}", bucket: 'rpm-repo', path: "${bucket_path}")
    }
}
Refer to the AWS S3 documentation, in particular the section 'Use of Exclude and Include Filters'.
Here is a way to upload multiple files of a particular type.
If you only want to upload files with a particular extension, you need to first exclude all files, then re-include the files with that extension. This command uploads only files ending with .jpg:
aws s3 cp /tmp/foo/ s3://bucket/ --recursive --exclude "*" --include "*.jpg"
This works for the AWS Command Line Interface.
For pipelines, you need to wrap the iteration in a script block, like:
pipeline {
    environment {
        // Extract concise branch name.
        BRANCH = GIT_BRANCH.substring(GIT_BRANCH.lastIndexOf('/') + 1, GIT_BRANCH.length())
    }
    ...
    post {
        success {
            script {
                def artifacts = ['file1', 'dir2/file3']
                artifacts.each {
                    withAWS(credentials: 'my-aws-token', region: 'eu-west-1') {
                        s3Upload(
                            file: "build/${it}",
                            bucket: 'my-artifacts',
                            path: 'my-repo/',
                            metadatas: ["repo:${env.JOB_NAME}", "branch:${env.BRANCH}", "commit:${env.GIT_COMMIT}"]
                        )
                    }
                }
            }
        }
    }
}

Grails configuration ConfigSlurper

I want to separate my config files into a few small parts. In Config.groovy I have defined the grails.config.locations array to point to these files:
grails.config.locations = [
    "classpath:config.properties",
    "classpath:some-config.groovy",
]
And then I check the configuration map by accessing grailsApplication.config.
The first configuration file is a Java properties file, which loads properly:
config.properties:
grails.serverURL=http://localhost:8080/selly
..
The second one is a .groovy file which, according to the documentation (http://grails.org/doc/latest/guide/conf.html#configExternalized), should be loaded automatically in the parsed ConfigSlurper format:
some-config.groovy:
app {
    testvar {
        foo = true
    }
}
But grailsApplication.config.app does not exist (no field in debug, and println returns an empty map [:]).
Can anyone give an example of loading Groovy config files?
Files are placed in grails-app\conf\, for example grails-app\conf\config.properties.
It looks like you've configured both files correctly. grailsApplication.config.app might appear empty simply because it is not a leaf node; have you tried grailsApplication.config.app.testvar.foo?
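The leaf-node behaviour can be seen with ConfigSlurper directly, outside Grails. A plain Groovy sketch:

```groovy
// Plain Groovy illustration of ConfigSlurper leaf access
def config = new ConfigSlurper().parse('''
app {
    testvar {
        foo = true
    }
}
''')

// The leaf value is a plain boolean
assert config.app.testvar.foo == true
// config.app itself is a nested ConfigObject (a map), not a simple value
println config.app
```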
