How to write to a file using DSL [Jenkins]?

I'm presently making a build flow using DSL.
After searching for a while I've found out how to read from a text file, but not how to write to one.
Is there a command for it in DSL?
I'd also like to take the opportunity to ask where I can find a tutorial or command reference for the DSL?

Since the DSL is Groovy based, you should be able to write any Groovy code and it should work; see http://grails.asia/groovy-file-examples for an example of how to write to a file. The DSL commands are documented at https://jenkinsci.github.io/job-dsl-plugin/ and you can try them out in the playground at http://job-dsl.herokuapp.com/.
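For illustration, here is a minimal plain-Groovy sketch of writing a file from a Job DSL (seed) script; the file name output.txt is just a placeholder, and script security may block new File(...) on a locked-down instance (see the next answer for the sandbox-friendly writeFile step):
// Plain Groovy: write a text file relative to the process working directory.
// May be rejected by the script security sandbox on hardened Jenkins setups.
def outFile = new File('output.txt')
outFile.text = 'generated by the seed job'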

By default, you cannot use new File(...).text for security reasons. You can use the writeFile step instead:
writeFile file: "myfile.txt", text: "File content."
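For context, writeFile is a Pipeline step, so it runs inside a stage of a (scripted or declarative) pipeline and the path is relative to the workspace; a minimal sketch:
pipeline {
    agent any
    stages {
        stage('write') {
            steps {
                // creates myfile.txt in the job's workspace
                writeFile file: 'myfile.txt', text: 'File content.'
            }
        }
    }
}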

This is the best I've been able to come up with; it uses writeFile:
// Read a file from the seed job's workspace and escape it so it can be embedded
// in a double-quoted string inside the generated pipeline script.
def readEscape(String file) {
    return readFileFromWorkspace(file).replace("\\", "\\\\").replace("\"", "\\\"").replace("\n", "\\n").replace("\r", "\\r").replace("\$", '\\$')
}

def Dockerfile = readEscape('./Dockerfile')

pipelineJob('sample-write-file') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage("prep-files") {
                            steps {
                                writeFile file: './Dockerfile', text: "''' + Dockerfile + '''"
                            }
                        }
                    }
                }
            ''')
        }
    }
}
The question is unclear about whether the file needs to be written while the Job DSL is processed or during the generated job's execution. Since I needed it at job execution time, that is what my example shows: the file is read while the job is created and embedded in the generated job definition.

Related

how to call property file syntax and define in JOB DSL in jenkins

I want to use a properties file in a DSL job, which will provide my project name (used in the job name) and the SVN location. Does anyone have an idea how to write this, and what the syntax is?
For handling properties files stored outside your repository, you have a plugin called "Config File Provider Plugin".
You use it like this:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-file-in-jenkins', targetLocation: 'path/destinationfile')]) {
            // some block
        }
    }
}
It is capable of replacing tokens in JSON and XML files, or the whole file (as in the example).
For data coming from SVN or the project name, you can access the environment variables; see this thread and this link.
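For example, a minimal sketch of reading such environment variables inside a pipeline step (SVN_URL is only set when the build checks out from Subversion; the echo is just illustrative):
steps {
    script {
        def jobName = env.JOB_NAME   // always provided by Jenkins
        def svnUrl  = env.SVN_URL    // set by the Subversion SCM checkout, if used
        echo "Job ${jobName} built from ${svnUrl}"
    }
}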

Jenkins Shared Library - Importing classes from the /src folder in /vars

I am trying to write a Jenkins Shared Library for my CI process. I'd like to reference a class from the /src folder inside a global function defined in the /vars folder, since that would allow me to put most of the logic in classes instead of in the global functions. I am following the repository structure documented in the official Jenkins documentation:
Jenkins Shared Library structure
Here's a simplified example of what I have:
/src/com/example/SrcClass.groovy
package com.example

class SrcClass {
    def aFunction() {
        return "Hello from src folder!"
    }
}
/vars/classFromVars.groovy
import com.example.SrcClass

def call(args) {
    def sc = new SrcClass()
    return sc.aFunction()
}
Jenkinsfile
@Library('<lib-name>') _
pipeline {
    ...
    post {
        always {
            classFromVars()
        }
    }
}
My goal was for the global classes in the /vars folder to act as a sort of public facade and to use it in my Jenkinsfile as a custom step without having to instantiate a class in a script block (making it compatible with declarative pipelines). It all seems pretty straightforward to me, but I am getting this error when running the classFromVars file:
<root>\vars\classFromVars.groovy: 1: unable to resolve class com.example.SrcClass
 @ line 1, column 1.
   import com.example.SrcClass
   ^
1 error
I tried running the classFromVars class directly with the groovy CLI locally and on the Jenkins server and I have the same error on both environments. I also tried specifying the classpath when running the /vars script, getting the same error, with the following command:
<root>>groovy -cp <root>\src\com\example vars\classFromVars.groovy
Is what I'm trying to achieve possible? Or should I simply put all of my logic in the /vars class and avoid using the /src folder?
I have found several repositories on GitHub that seem to indicate this is possible, for example this one: https://github.com/fabric8io/fabric8-pipeline-library, which uses the classes in the /src folder in many of the classes in the /vars folder.
As @Szymon Stepniak pointed out, the -cp parameter in my groovy command was incorrect. It now works locally and on the Jenkins server. I have yet to explain why it wasn't working on the Jenkins server, though.
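Presumably the fix was to point -cp at the library's source root rather than at the package directory, something like:
<root>>groovy -cp <root>\src vars\classFromVars.groovy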
I found that when I wanted to import a class from my shared library into a script in /vars, I needed to do it like this:
// thanks to the '_', the classes are imported automatically.
// MUST have the '@' at the beginning, otherwise it will not work.
// when not using "@BRANCH", it will use the default branch from the git repo.
@Library('my-shared-library@BRANCH') _

// only by calling them you can tell if they exist or not.
def exampleObject = new example.GlobalVars()
// then call methods or attributes from the class.
exampleObject.runExample()

Jenkins : [parameterized-trigger] Properties file

I am using "Parameterized Trigger Plugin" to trigger child job. I am using "parametres from properties file" and in the "Use properties from file" in need to pass the name of the file as a variable...I get this error.
[parameterized-trigger] Properties file $propeties_file did not exist.
If you click on the ? you will see the usage / syntax for the property file:
Comma separated list of absolute or relative paths to file(s) that contain the parameters for the new project. Relative paths originate from the workspace. The file should have KEY=value pairs, one per line (Java properties file format). Backslashes are used for escaping, so use "\\" for a single backslash. Current build parameters and/or environment variables can be used in the form ${PARAM} or $PARAM.
So your file needs to exist, and you should put the path to the file where you currently have $properties_file: I don't believe the field will accept a variable, so put the actual file name in there.
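For reference, a minimal my.properties in the Java properties format described above (the keys and values are placeholders):
# one KEY=value pair per line (Java properties format)
param1=value1
# backslashes must be doubled for escaping
build_dir=C:\\builds\\myproject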
A sample pipeline to trigger a parameterized build using parameters from a properties file:
pipeline {
    agent any
    stages {
        stage('S1') {
            steps {
                echo 'In S1'
                sh '''
                    echo "param1=value1" > my.properties
                    echo "param2=value2" >> my.properties
                '''
            }
        }
        stage('s2') {
            steps {
                script {
                    def props = readProperties file: "${WORKSPACE}/my.properties"
                    build job: 'called_job', parameters: props.collect { string(name: it.key, value: it.value) }
                }
            }
        }
    }
}

Can I import a groovy script from a relative directory into a Jenkinsfile?

I've got a project structured like this:
/
    Jenkinsfile
    build_tools/
        pipeline.groovy       # Functions which define the pipeline
        reporting.groovy      # Other misc build reporting stuff
        dostuff.sh            # A shell script used by the pipeline
        domorestuff.sh        # Another pipeline supporting shell-script
Is it possible to import the groovy files in /build_tools so that I can use functions inside those 2 files in my Jenkinsfile?
Ideally, I'd like to have a Jenkins file that looks something like this (pseudocode):
from build_tools.pipeline import build_pipeline
build_pipeline(project_name="my project", reporting_id=12345)
The bit I'm stuck on is how you write a working equivalent of that pretend import statement on line #1 of my pseudocode.
PS. Why I'm doing this: The build_tools folder is actually a git submodule shared by many projects. I'm trying to give each project access to a common set of build tooling to stop each project maintainer from reinventing this wheel.
The best-supported way to load shared groovy code is through shared libraries.
If you have a shared library like this:
simplest-jenkins-shared-library master % cat src/org/foo/Bar.groovy
package org.foo;

def awesomePrintingFunction() {
    println "hello world"
}
Shove it into source control and configure it in your Jenkins job, or even globally (this is one of the only things you do through the Jenkins UI when using pipeline),
and then use it, for example, like this:
pipeline {
    agent { label 'docker' }
    stages {
        stage('build') {
            steps {
                script {
                    @Library('simplest-jenkins-shared-library')
                    def bar = new org.foo.Bar()
                    bar.awesomePrintingFunction()
                }
            }
        }
    }
}
Output from the console log for this build would of course include:
hello world
There are lots of other ways to write shared libraries (like using classes) and to use them (like defining vars so you can use them in Jenkinsfiles in super-slick ways). You can even load non-Groovy files as resources. Check out the shared library docs for these extended use cases.
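As a small sketch of the resources mechanism (the resource path here is hypothetical), a file placed under the library's resources/ folder can be loaded with the libraryResource step:
// reads resources/org/foo/config.json from the shared library
def text = libraryResource 'org/foo/config.json'
writeFile file: 'config.json', text: text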

Writing to a json file in workspace using Jenkins

I have a Jenkins job with a few parameters set up, and I have a JSON file in the workspace which has to be updated with the parameters that I pass through Jenkins.
Example:
I have the following parameters, for which I'll take input from the user who triggers the job:
Environment (Consider user selects "ENV2")
Filename (Consider user keeps the default value)
I have a json file in my workspace under run/job.json with the following contents:
{
    environment: "ENV1",
    filename: "abc.txt"
}
Now whatever value the user gives before triggering the job has to be substituted into job.json.
So when the user triggers the job, the job.json file should be:
{
    environment: "ENV2",
    filename: "abc.txt"
}
Please note the environment value in the JSON, which has to be updated.
I've tried the Config File Provider Plugin (https://wiki.jenkins-ci.org/display/JENKINS/Config+File+Provider+Plugin), but I'm unable to find any help on parameterizing the values.
Kindly suggest how to configure this plugin, or suggest any other plugin which can serve my purpose.
The Config File Provider Plugin doesn't allow you to pass parameters to configuration files. You can solve your problem with any scripting language. My favorite approach is using the Groovy plugin. Tick the check-box "Execute system Groovy script" and paste the following script:
import groovy.json.*
// read build parameters
env = build.getEnvironment(listener)
environment = env.get('environment')
filename = env.get('filename')
// prepare json
def builder = new JsonBuilder()
builder environment: environment, filename: filename
json = builder.toPrettyString()
// print to console and write to a file
println json
new File(build.workspace.toString() + "\\job.json").write(json)
Output sample:
{
    "environment": "ENV2",
    "filename": "abc.txt"
}
With Pipeline Utility Steps plugin this is very easy to achieve.
jsonfile = readJSON file: 'path/to/your.json'
jsonfile['environment'] = 'ENV2'
writeJSON file: 'path/to/your.json', json: jsonfile
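Wired to the parameters from the question (assuming they are named Environment and Filename), that could look roughly like this inside a script block:
// read, update from build parameters, and write back the JSON file
def jsonfile = readJSON file: 'run/job.json'
jsonfile['environment'] = params.Environment
jsonfile['filename'] = params.Filename
writeJSON file: 'run/job.json', json: jsonfile, pretty: 4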
I will keep it simple: a Windows batch file or a shell script (depending on the OS) can read the parameter values, open the JSON file, and make the changes.
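A rough sketch of that idea as a pipeline sh step (assuming the build parameter is named Environment and the file is at run/job.json, as in the question):
steps {
    sh '''
        # rewrite the value of the environment key in place
        sed -i 's/environment: ".*"/environment: "'"$Environment"'"/' run/job.json
    '''
}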
