Inject variable in Jenkins pipeline with Groovy script

I am building a Jenkins pipeline whose job can be triggered remotely, and I need to know which IP triggered the job. I have a little Groovy script that returns the remote IP. With the EnvInject plugin I can easily use this variable in a normal freestyle job, but how can I use it in the pipeline script? I can't use the EnvInject plugin together with the pipeline plugin :(
Here is the little script for getting the IP:
import hudson.model.*
import static hudson.model.Cause.RemoteCause

def ipaddress = ""
for (CauseAction action : currentBuild.getActions(CauseAction.class)) {
    for (Cause cause : action.getCauses()) {
        if (cause instanceof RemoteCause) {
            ipaddress = cause.addr
            break
        }
    }
}
return ["ip": ipaddress]

You can create a shared library function (see here for examples and the directory structure). This is one of the undocumented (or really hard to find documentation for) features of Jenkins.
Put a file ipTrigger.groovy in the vars directory, which sits in the workflow-libs directory at the root level of JENKINS_HOME, and put your code in that file.
The full filename will then be $JENKINS_HOME/workflow-libs/vars/ipTrigger.groovy
(You can even make a Git repo for your shared libraries and clone it into that directory.)
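For reference, the resulting layout looks like this:

$JENKINS_HOME/
└── workflow-libs/
    └── vars/
        └── ipTrigger.groovy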
// workflow-libs/vars/ipTrigger.groovy
import hudson.model.*
import static hudson.model.Cause.RemoteCause

@com.cloudbees.groovy.cps.NonCPS
def call(currentBuild) {
    def ipaddress = ""
    for (CauseAction action : currentBuild.getActions(CauseAction.class)) {
        for (Cause cause : action.getCauses()) {
            if (cause instanceof RemoteCause) {
                ipaddress = cause.addr
                break
            }
        }
    }
    return ["ip": ipaddress]
}
After a restart of Jenkins, you can call the method from your pipeline script by the filename you gave it.
So from your pipeline, just call def trigger = ipTrigger(currentBuild)
The IP address will then be trigger.ip (sorry for the bad naming, couldn't come up with something original)
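Put together, a minimal pipeline using it could look like this (a sketch; the stage layout and the echo are just for illustration):

pipeline {
    agent any
    stages {
        stage('report trigger') {
            steps {
                script {
                    // call the shared library step by its filename
                    def trigger = ipTrigger(currentBuild)
                    echo "Triggered from IP: ${trigger.ip}"
                }
            }
        }
    }
}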

Related

Executing shell commands from inside Pipeline Shared Library

I'm writing a shared library that will get used in Pipelines.
class Deployer implements Serializable {
    def steps

    Deployer(steps) {
        this.steps = steps
    }

    def deploy(env) {
        // convert environment from steps to list
        def envlist = env.getEnvironment().collect { key, value -> "${key}=${value}" }
        def output = new StringBuffer()
        def error = new StringBuffer()
        def process = "ls -l".execute(envlist, null)
        process.consumeProcessOutput(output, error)
        process.waitFor()
        println output
        println error
    }
}
In the Jenkinsfile, I import the library, instantiate the class, and execute the deploy function inside a script section:
stage('mystep') {
    steps {
        script {
            def deployer = com.mypackage.HelmDeployer("test")
            deployer.deploy()
        }
    }
}
However, no output or errors are printed on the Console log.
Is it possible to execute stuff inside a shared library class? If so, how, and what am I doing wrong?
Yes, it is possible, but the solution is not really obvious. Every call that is usually made in the Jenkinsfile but has been moved into the shared library needs to go through the steps object you passed in.
You can also reference the Jenkins environment by calling steps.env.
Here is a short example:
class Deployer implements Serializable {
    def steps

    Deployer(steps) {
        this.steps = steps
    }

    def callMe() {
        // Always call through the steps object
        steps.echo("Test")
        steps.echo("${steps.env.BRANCH_NAME}")
        steps.sh("ls -al")
        // Your command could look something like this:
        // def process = steps.sh(script: "ls -l", returnStdout: true).execute(steps.env, null)
        ...
    }
}
You also have to import the class from the shared library and create an instance of it. Define the following outside of your pipeline block:
import com.mypackage.Deployer // path is relative to the src/ folder of the shared library
def deployer = new Deployer(this) // 'this' refers to the steps object of the Jenkins pipeline
Then you can call it in your pipeline like this:
... script { deployer.callMe() } ...
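Putting it all together, a complete Jenkinsfile could look roughly like this (a sketch; the library name my-shared-lib is an assumption, use whatever name is configured in Jenkins):

// Jenkinsfile
@Library('my-shared-lib') // assumed library name as configured in Jenkins
import com.mypackage.Deployer

def deployer = new Deployer(this) // 'this' is the steps object

pipeline {
    agent any
    stages {
        stage('example') {
            steps {
                script {
                    deployer.callMe()
                }
            }
        }
    }
}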

Jenkinsfile shared params in source control

I'm new to Jenkins and inherited a bunch of declarative pipelines of unknown code quality. Each pipeline uses folder properties to set shared default parameter values. This puts essential variables outside of source control, which kills our PR process and our history for debugging. For example:
// pipelineA/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    // etc.
}

// pipelineB/Jenkinsfile
pipeline {
    parameters {
        string name: 'important_variable', defaultValue: folderProperty('important_variable')
    }
    // etc.
}
Then, in the root folder, a property important_variable is set to "Hello World".
Is there a way to get this into source control, either by having the folder property extract the variable from a YAML file, or by using shared libraries?
Thank you for any help!
In case anyone reads this, we ended up doing the following:
Create a bootstrap.groovy file. This file MUST go in a /vars directory at the absolute top of your repo.
Using the Jenkins UI, we went to the pipeline's parent directory > Config and created a shared library called config-lib that points at our repo and automatically exposes the bootstrap.groovy methods, as long as the file is in the right place.
The bootstrap.groovy file has a call method that returns a map with key/value pairs for our default parameters. This method has to be named call.
In the Jenkinsfile for the pipeline we include the following two lines:
@Library("config-lib") _
config = bootstrap()
The @Library annotation (note the line ends with _) imports the config-lib methods defined in the Jenkins UI.
The bootstrap() call invokes the call method from the bootstrap.groovy file in that config-lib library.
In the Jenkinsfile, use the config map to populate the parameter default values:
pipeline {
    parameters {
        string name: 'foo', defaultValue: config.foo
    }
}
And it's done.
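For reference, a minimal vars/bootstrap.groovy could look something like this (a sketch; the keys and values are placeholders):

// vars/bootstrap.groovy
// Returns the default parameter values shared by all pipelines.
def call() {
    return [
        foo: 'Hello World',
        important_variable: 'Hello World'
    ]
}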
This video helped immensely: https://youtu.be/Wj-weFEsTb0

Use multi-method global variables in Jenkins Shared Libraries

Consider this Groovy file in a repo that is loaded as a shared library in Jenkins:
vars/
└── Utility.groovy
// Utility.groovy
def funcA() { ... }
def funcB() { ... }
And in the Jenkinsfile:
// Jenkinsfile
@Library('LibName') _
pipeline {
    ...
    steps {
        script {
            def util = new Utility()
            util.funcA()
        }
    }
}
This works fine. But if I try to load the library dynamically:
// Jenkinsfile
pipeline {
    ...
    steps {
        script {
            library 'LibName'
            def util = new Utility()
        }
    }
}
That doesn't work...
Can someone explain this, with respect to the following quote from the Shared Libraries documentation:
Internally, scripts in the vars directory are instantiated on-demand as singletons. This allows multiple methods to be defined in a single .groovy file for convenience.
Loading a Jenkins shared library dynamically has some limitations and challenges, because, as explained here:
Using classes from the src/ directory is also possible, but trickier. Whereas the @Library annotation prepares the “classpath” of the script prior to compilation, by the time a library step is encountered the script has already been compiled. Therefore you cannot import or otherwise “statically” refer to types from the library.
And it seems this question is similar to this one.
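In other words, the compiler rejects new Utility() because the type is not known when the script is compiled. A workaround (a sketch, relying on the documented behavior that global variables from vars/ become accessible after the library step) is to use the global variable directly instead of instantiating it:

// Jenkinsfile
pipeline {
    ...
    steps {
        script {
            library 'LibName'
            // After the library step, the vars/Utility.groovy singleton
            // is available by name; no 'new' is needed.
            Utility.funcA()
        }
    }
}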

How do I use `when` condition on a file's last modified time in Jenkins pipeline syntax

I am creating a Jenkins pipeline, and I want a certain stage to be triggered only when a particular log file's last modified date is updated after the initiation of the pipeline job (the log file is located on the server node where all the stages run). I understand we need to use a when condition, but I am not really sure how to implement it.
I tried referring to some pipeline-related portals but was not able to find an answer.
Can someone please help me with this?
Thanks in advance!
Getting data about a file is quite tricky in a Jenkins pipeline when using the Groovy sandbox, since you're not allowed to do new File(...).lastModified. However, there is the findFiles step, which basically returns a list of wrapped File objects with a getter for the last modified time in millis, so we can use findFiles(glob: "...")[0].lastModified.
The returned array may be empty, so we should check for that (see the full example below).
The current build's start time in millis is accessible via currentBuild.startTimeInMillis.
Now that we have both, we can use them in an expression:
pipeline {
    agent any
    stages {
        stage("create file") {
            steps {
                touch "testfile.log"
            }
        }
        stage("when file") {
            when {
                expression {
                    def files = findFiles(glob: "testfile.log")
                    // run only if the file exists and was modified after the build started
                    files && files[0].lastModified > currentBuild.startTimeInMillis
                }
            }
            steps {
                echo "i ran"
            }
        }
    }
}
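Note that both touch and findFiles are provided by the Pipeline Utility Steps plugin, so it has to be installed for this example to work.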

Can a Job DSL script be tested

Ideally I'd like to be able to invoke the script with some kind of unit test before having it execute on Jenkins.
Is there any way to test a Job DSL script other than having Jenkins run it?
Besides the examples in job-dsl-gradle-example, you can also go a step further and write tests for individual files or jobs. For example, let's assume you have a job configuration file located at jobs/deployJob.groovy:
import javaposse.jobdsl.dsl.DslScriptLoader
import javaposse.jobdsl.dsl.MemoryJobManagement
import javaposse.jobdsl.dsl.ScriptRequest
import spock.lang.Specification

class TestDeployJobs extends Specification {

    def 'test basic job configuration'() {
        given:
        URL scriptURL = new File('jobs').toURI().toURL()
        ScriptRequest scriptRequest = new ScriptRequest('deployJob.groovy', null, scriptURL)
        MemoryJobManagement jobManagement = new MemoryJobManagement()

        when:
        DslScriptLoader.runDslEngine(scriptRequest, jobManagement)

        then:
        jobManagement.savedConfigs.each { String name, String xml ->
            with(new XmlParser().parse(new StringReader(xml))) {
                // Make sure jobs only run manually
                triggers.'hudson.triggers.TimerTrigger'.spec.text().isEmpty()
                // only deploy every environment once at a time
                concurrentBuild.text().equals('false')
                // do a workspace cleanup
                buildWrappers.'hudson.plugins.ws__cleanup.PreBuildCleanup'
                // make sure masked passwords are active
                !buildWrappers.'com.michelin.cio.hudson.plugins.maskpasswords.MaskPasswordsBuildWrapper'.isEmpty()
            }
        }
    }
}
This way you are able to go through every XML node you want, to make sure all the right values are set.
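To run such a spec outside Jenkins you only need job-dsl-core and Spock on the test classpath; a minimal Gradle sketch (the versions are placeholders, check the example repo for current ones):

// build.gradle (sketch)
repositories {
    mavenCentral()
    maven { url 'https://repo.jenkins-ci.org/public/' }
}

dependencies {
    testImplementation 'org.jenkins-ci.plugins:job-dsl-core:1.77'
    testImplementation 'org.spockframework:spock-core:2.3-groovy-3.0'
}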
Have a look at the job-dsl-gradle-example. The repo contains a test for DSL scripts.
You can do it the same way as crasp, but using the Jenkins test harness as explained on the Jenkins Unit Test page. This is slower, but it also works with auto-generated DSL that would otherwise give syntax errors, as explained here.
After setting up the code as explained there, you can write a test like this one:
@Unroll
void 'check descriptions #file.name'(File file) {
    given:
    // jobManagement and jenkinsRule come from the Jenkins test harness setup
    JobManagement jobManagement = new JenkinsJobManagement(System.out, [:], new File('.'))
    Jenkins jenkins = jenkinsRule.jenkins

    when:
    GeneratedItems items = new DslScriptLoader(jobManagement).runScript(file.text)

    then:
    if (!items.jobs.isEmpty()) {
        items.jobs.each { GeneratedJob generatedJob ->
            String text = getItemXml(generatedJob, jenkins)
            with(new XmlParser().parse(new StringReader(text))) {
                // Has some description
                !description.text().isEmpty()
            }
        }
    }

    where:
    file << TestUtil.getJobFiles()
}
