How can I define a custom pipeline directive in Jenkins?

For example something like:
pipeline {
    customDirective {
        sh "env"
        ..
    }
}

This is currently not possible. You can only define custom pipeline steps via a Shared Library and use them within the stage/steps section and in the condition blocks of the post section. If you need more customization for any reason, you have to look into the Scripted Pipeline syntax, which allows most of Groovy's functionality and is therefore very flexible.

This works:
customDirective.groovy (Shared Library)
def call(Closure body) {
    // Collect the configuration assigned in the closure body into a map
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    // Run the closure stored under the 'script' key
    config.script()
}
Jenkinsfile
customDirective {
    url = "http://something.another.com"
    title = "The Title"
    script = {
        sh "env"
    }
}
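Note that in the example above the url and title keys are collected into config but never read. A sketch of how call() could consume them as well (the echo line is a hypothetical addition, not part of the original answer):

```groovy
// vars/customDirective.groovy -- sketch extending the example above
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    // Every key assigned in the Jenkinsfile block is now available in the map
    echo "Running '${config.title}' against ${config.url}"
    config.script()
}
```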

Related

How to pass parameters in a stage call in Jenkinsfile

Currently my Jenkinsfile looks like this:
@Library('my-libs') _
myPipeline {
    my_build_stage(project: 'projectvalue', tag: '1.0')
    my_deploy_stage()
}
I am trying to pass these two variables (project and tag) to my_build_stage.groovy, but it is not working.
What is the correct syntax to be able to use $params.project or $params.tag in my_build_stage.groovy?
Please see the code below, which passes the parameters. In your Jenkinsfile, write:
// Global variable used to hold the loaded shared library (a Groovy file)
def mylibrary
def PROJECT_VALUE = "projectvalue"
def TAG = 1
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    // Load the shared library Groovy file; adjust the path to wherever your mylibs file (containing the function definitions) lives
                    mylibrary = load 'C:\\Jenkins\\mylibs'
                    // Call my_build_stage and pass the parameters
                    mylibrary.my_build_stage(PROJECT_VALUE, TAG)
                }
            }
        }
        stage('Deploy') {
            steps {
                script {
                    // Call my_deploy_stage
                    mylibrary.my_deploy_stage()
                }
            }
        }
    }
}
Create a file named mylibs (a Groovy file):
#!groovy
// Functions (stage definitions) that will be called from your Jenkinsfile
def my_build_stage(PROJECT_VALUE, TAG_VALUE) {
    echo "${PROJECT_VALUE} : ${TAG_VALUE}"
}
def my_deploy_stage() {
    echo "In deploy stage"
}
return this

Passing environment variable as a pipeline parameter to Jenkins shared library

I have a shared Jenkins library that holds the pipeline for my Jenkinsfile. The library is structured as follows:
myPipeline.groovy file
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = params.name
                }
            }
        }
    }
}
And my Jenkinsfile is as follows:
Jenkinsfile
@Library("some-shared-lib") _
myPipeline {
    name = "Some name"
}
Now I would like to replace the "Some name" string with "${env.JOB_NAME}". Normally in a Jenkinsfile I would use name = "${env.JOB_NAME}" to get this info, but because I am using my shared library instead, it fails to work. The error message is as follows:
java.lang.NullPointerException: Cannot get property 'JOB_NAME' on null object
I tried playing around with brackets and other notation but never got it to work. I think I am passing the parameter incorrectly. I would like the Jenkinsfile to assign "${env.JOB_NAME}" to the projectName variable once the library runs the pipeline I am calling (via myPipeline {}).
You can do it like this in myPipeline.groovy:
def call(body) {
    def params = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = params
    body()
    pipeline {
        // My entire pipeline is here
        // Demo stage
        stage("Something") {
            steps {
                script {
                    projectName = "${env.JOB_NAME}"
                }
            }
        }
    }
}

jenkins declarative pipeline set variables derived from parameters

I am using a declarative pipeline in a Jenkinsfile but I would like to derive some variables from a parameter.
For example given:
parameters {
    choice(name: 'Platform', choices: ['Debian9', 'CentOS7'], description: 'Target OS platform')
}
I would like to add a block like:
script {
    switch(param.Platform) {
        case "Centos7":
            def DockerFile = 'src/main/docker/Jenkins-Centos.Dockerfile'
            def PackageType = 'RPM'
            def PackageSuffix = '.rpm'
            break
        case "Debian9":
        default:
            def DockerFile = 'src/main/docker/Jenkins-Debian.Dockerfile'
            def PackageType = 'DEB'
            def PackageSuffix = '.deb'
            break
    }
}
Such that I can use variables elsewhere in the pipeline. For example:
agent {
    dockerfile {
        filename "$DockerFile"
    }
}
etc..
but script is illegal in the parameters, environment & agent sections.
It can only be used in steps.
I need to use the parameter in the agent block, and I want to avoid repeating myself where the variables are used in different steps.
Is there a sane way to achieve this? My preferences in order are:
a declarative pipeline
a scripted pipeline (less good)
via a plugin to the Jenkins UI (least good)
A shared library might be appropriate here regardless of whether it is actually shared.
The intention is to support a multi-configuration project by creating a parameterised build and invoking it for different parameter sets with a red/blue status light for each configuration.
It could be that I have assumed an 'old fashioned' design. In which case an acceptable answer would explain the modern best practice for creating a multi-configuration multi-branch pipeline. Something like: https://support.cloudbees.com/hc/en-us/articles/115000088431-Create-a-Matrix-like-flow-with-Pipeline or Jenkins Pipeline Multiconfiguration Project
See also Multiconfiguration / matrix build pipeline in Jenkins for less specific discussion of best practices.
I've never really used the Jenkins declarative pipeline before, but I think the way you refer to params is incorrect?
It should be params.Platform (or "${params.Platform}" inside a string) instead of param.
So something like the below, maybe:
pipeline {
    agent any
    stages {
        stage('example') {
            steps {
                script {
                    switch(params.Platform) {
                        ...
                    }
                }
            }
        }
    }
}
As I said, never really used it before so not 100%. I was just looking at the syntax used for parameters on the docs: https://jenkins.io/doc/book/pipeline/syntax/#parameters
I think the key to solving your issue is the declaration of your variables. Do not use def if you want your variable to be accessible from other stages.
Here is an example of a solution for your issue :
pipeline {
    agent none
    parameters {
        choice(name: 'Platform', choices: ['Debian9', 'CentOS7'], description: 'Target OS platform')
    }
    stages {
        stage('Setting stage') {
            agent any
            steps {
                script {
                    switch(params.Platform) {
                        case 'CentOS7':
                            DockerFile = 'src/main/docker/Jenkins-Centos.Dockerfile'
                            PackageType = 'RPM'
                            PackageSuffix = '.rpm'
                            break
                        case 'Debian9':
                            DockerFile = 'src/main/docker/Jenkins-Debian.Dockerfile'
                            PackageType = 'DEB'
                            PackageSuffix = '.deb'
                            break
                    }
                }
            }
        }
        stage('Echo stage') {
            agent {
                dockerfile {
                    filename "$DockerFile"
                }
            }
            steps {
                echo PackageType
                echo PackageSuffix
            }
        }
    }
}
What is definitely possible on Windows:
stage('var from groovy') {
    steps {
        script {
            anyvar = "foo"
        }
        bat "${anyvar}"
    }
}
This is an example that I have in production:
def dest_app_instance = "${params.Destination}"
switch(dest_app_instance) {
    case "CASE1":
        dest_server = "server1"
        break
    case "CASE2":
        dest_server = "server2"
        break
}
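For completeness (not part of the original answers): newer Declarative Pipeline versions ship a built-in matrix directive aimed at exactly this multi-configuration case. A minimal sketch, assuming a reasonably recent Pipeline plugin; the stage body is a placeholder:

```groovy
pipeline {
    agent none
    stages {
        stage('Build all platforms') {
            matrix {
                agent any
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'Debian9', 'CentOS7'
                    }
                }
                stages {
                    stage('Build') {
                        steps {
                            // PLATFORM is exposed as an environment variable per matrix cell
                            echo "Building for ${PLATFORM}"
                        }
                    }
                }
            }
        }
    }
}
```

Each axis value gets its own cell with its own status, which covers the red/blue-light-per-configuration requirement without a parameterised re-invocation.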

Evaluation of the configuration body of a custom pipeline step implementation

For a custom Jenkins pipeline step, I have a pipeline using the step like this:
library 'infrastructure'
infrastructurePipeline {
    project = 'jenkins'
    artifact = ['master', 'backup']
    dockerRegistry = [
        credentialsId: 'dockerRegistryDeployer',
        url          : "http://${env.DOCKER_REGISTRY}"
    ]
}
However, in this context the variable env doesn't seem to be bound, hence the expression can't be evaluated.
I can replace env with System.getenv() and it will work, but that comes with a side effect I would rather avoid: I have to approve the usage of System.getenv(), which is flagged as a security vulnerability.
The custom step implements configuration evaluation as recommended in https://jenkins.io/doc/book/pipeline/shared-libraries/#defining-a-more-structured-dsl.
The most relevant code for the step is in vars/infrastructurePipeline.groovy
def call(Closure body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    def label = config.label ?: 'docker'
    String tag = env.BUILD_NUMBER
    String deliveryTag = null
    String project = config.project
    def artifact = config.artifact
    ...
}

Jenkins Declarative Pipeline: How to inject properties

I have Jenkins 2.19.4 with Pipeline: Declarative Agent API 1.0.1. How does one use readProperties if you cannot define a variable to assign the read properties to?
For example, to capture the SVN revision number, I currently capture it with the following in Scripted style:
```
echo "SVN_REVISION=\$(svn info ${svnUrl}/projects | \
grep Revision | \
sed 's/Revision: //g')" > svnrev.txt
```
def svnProp = readProperties file: 'svnrev.txt'
Then I can access using:
${svnProp['SVN_REVISION']}
Since it is not legal to def svnProp in Declarative style, how is readProperties used?
You can use the script step inside the steps tag to run arbitrary pipeline code.
So, something along the lines of:
pipeline {
    agent any
    stages {
        stage('A') {
            steps {
                writeFile file: 'props.txt', text: 'foo=bar'
                script {
                    def props = readProperties file: 'props.txt'
                    env['foo'] = props['foo']
                }
            }
        }
        stage('B') {
            steps {
                echo env.foo
            }
        }
    }
}
Here I'm using env to propagate the values between stages, but it might be possible to do other solutions.
Jon S's answer requires granting script approval because it sets environment variables. This is not needed when running in the same stage:
pipeline {
    agent any
    stages {
        stage('A') {
            steps {
                writeFile file: 'props.txt', text: 'foo=bar'
                script {
                    def props = readProperties file: 'props.txt'
                    // props is only in scope inside this script block,
                    // so run the shell step here as well
                    sh "echo ${props['foo']}"
                }
            }
        }
    }
}
To define general variables available to all stages, define values in props.txt, for example:
version=1.0
fix=alfa
and mix scripted and declarative Jenkins pipeline syntax as:
def props
def VERSION
def FIX
def RELEASE
node {
    props = readProperties file: 'props.txt'
    VERSION = props['version']
    FIX = props['fix']
    RELEASE = VERSION + "_" + FIX
}
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo "${RELEASE}"
            }
        }
    }
}
