Jenkins pipeline - How to give choice parameters dynamically

pipeline {
    agent any
    stages {
        stage("foo") {
            steps {
                script {
                    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
                        parameters: [choice(name: 'RELEASE_SCOPE', choices: 'patch\nminor\nmajor',
                                            description: 'What is the release scope?')]
                }
                echo "${env.RELEASE_SCOPE}"
            }
        }
    }
}
In the above code, the choices are hardcoded (patch\nminor\nmajor). My requirement is to supply the choice values in the dropdown dynamically.
I get the values by calling an API that returns the artifact (.zip) file names from Artifactory.
Also, the example above requests input during the build, but I want a "Build with parameters" flow instead.
Please suggest/help on this.

Depending on how you get the data from the API, there are different options. For example, let's imagine you get the data as a list of strings (call it releaseScope); in that case your code would be the following:
...
script {
    def releaseScopeChoices = ''
    releaseScope.each {
        releaseScopeChoices += it + '\n'
    }
    parameters: [choice(name: 'RELEASE_SCOPE', choices: releaseScopeChoices, description: 'What is the release scope?')]
}
...
Hope it will help.
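For completeness, here is a minimal runnable sketch of the same idea wired into the input step from the question; the hardcoded releaseScope list stands in for whatever your API call returns, and join('\n') replaces the manual loop:
script {
    // stand-in for the list your Artifactory API call would return
    def releaseScope = ['patch', 'minor', 'major']
    env.RELEASE_SCOPE = input message: 'User input required', ok: 'Release!',
        parameters: [choice(name: 'RELEASE_SCOPE',
                            choices: releaseScope.join('\n'),
                            description: 'What is the release scope?')]
}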

This is a cut-down version of what we use. We separate things into shared libraries, but I have consolidated a bit to make it easier to follow.
The Jenkinsfile looks something like this:
#!groovy
@Library('shared') _

def imageList = pipelineChoices.artifactoryArtifactSearchList(repoName, env.BRANCH_NAME)
imageList.add(0, 'build')

properties([
    buildDiscarder(logRotator(numToKeepStr: '20')),
    parameters([
        choice(name: 'ARTIFACT_NAME', choices: imageList.join('\n'), description: '')
    ])
])
The shared library method that queries Artifactory is pretty simple: essentially it makes a GET request (with auth credentials), then filters/splits the result to whittle it down to the desired values and returns the list to the Jenkinsfile.
import com.cloudbees.groovy.cps.NonCPS
import groovy.json.JsonSlurper
import java.util.regex.Pattern
import java.util.regex.Matcher

List artifactoryArtifactSearchList(String repoKey, String artifact_name, String artifact_archive, String branchName) {
    // URL components
    String baseUrl = "https://org.jfrog.io/org/api/search/artifact"
    String url = baseUrl + "?name=${artifact_name}&repos=${repoKey}"
    Object responseJson = getRequest(url)
    String regexPattern = "(.+)${artifact_name}-(\\d+).(\\d+).(\\d+).${artifact_archive}\$"
    Pattern regex = ~regexPattern
    // each element is a result map; keep only URIs that match the version pattern
    List outlist = responseJson.results.findAll({ it['uri'].matches(regex) })
    List<String> artifactlist = []
    for (i in outlist) {
        artifactlist.add(i['uri'].tokenize('/')[-1])
    }
    return artifactlist.reverse()
}
// Artifactory GET request - consumed by other methods
// JsonSlurper returns a non-serializable LazyMap, hence @NonCPS on this method
@NonCPS
Object getRequest(String url_string) {
    URL url = url_string.toURL()
    // Open connection
    URLConnection connection = url.openConnection()
    connection.setRequestProperty("Authorization", basicAuthString())
    // Open input stream
    InputStream inputStream = connection.getInputStream()
    def json_data = new JsonSlurper().parseText(inputStream.text)
    // Close the stream
    inputStream.close()
    return json_data
}
// Build the Authorization header value - consumed by other methods
@NonCPS
String basicAuthString() {
    // Retrieve password
    String username = "artifactoryMachineUsername"
    String credid = "artifactoryApiKey"
    def apiKey
    def credentials_store = jenkins.model.Jenkins.instance.getExtensionList(
        'com.cloudbees.plugins.credentials.SystemCredentialsProvider'
    )
    credentials_store[0].credentials.each { it ->
        if (it instanceof org.jenkinsci.plugins.plaincredentials.StringCredentials) {
            if (it.getId() == credid) {
                apiKey = it.getSecret()
            }
        }
    }
    // Create the authorization header using Base64 encoding
    String userpass = username + ":" + apiKey
    String basicAuth = "Basic " + javax.xml.bind.DatatypeConverter.printBase64Binary(userpass.getBytes())
    return basicAuth
}

I could achieve it without any plugin.
With Jenkins 2.249.2, using a declarative pipeline, the following pattern prompts the user with a dynamic dropdown menu (for them to choose a branch).
(The surrounding withCredentials block is optional, required only if your script and Jenkins configuration use credentials.)
node {
    withCredentials([[$class: 'UsernamePasswordMultiBinding',
                      credentialsId: 'user-credential-in-gitlab',
                      usernameVariable: 'GIT_USERNAME',
                      passwordVariable: 'GITLAB_ACCESS_TOKEN']]) {
        BRANCH_NAMES = sh(script: 'git ls-remote -h https://${GIT_USERNAME}:${GITLAB_ACCESS_TOKEN}@dns.name/gitlab/PROJS/PROJ.git | sed \'s/\\(.*\\)\\/\\(.*\\)/\\2/\'', returnStdout: true).trim()
    }
}
pipeline {
    agent any
    parameters {
        choice(
            name: 'BranchName',
            choices: "${BRANCH_NAMES}",
            description: 'to refresh the list, go to configure, disable "this build has parameters", launch a build (without parameters) to reload the list and stop it, then launch it again (with parameters)'
        )
    }
    stages {
        stage("Run Tests") {
            steps {
                sh "echo SUCCESS on ${BranchName}"
            }
        }
    }
}
The drawback is that one has to refresh the Jenkins configuration and use a blank run for the list to be refreshed by the script.
Solution (not from me): this limitation can be made less annoying by using an additional parameter specifically to refresh the values:
parameters {
    booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false, description: 'refresh BRANCH_NAMES branch list and launch no step')
}
then within each stage:
stage('a stage') {
    when {
        expression {
            return !params.REFRESH_BRANCHES.toBoolean()
        }
    }
    ...
}
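Put together, a minimal sketch of the refresh trick (BRANCH_NAMES is assumed to be computed in the node block shown earlier):
pipeline {
    agent any
    parameters {
        choice(name: 'BranchName', choices: "${BRANCH_NAMES}", description: 'branch to build')
        booleanParam(name: 'REFRESH_BRANCHES', defaultValue: false,
                     description: 'refresh BRANCH_NAMES branch list and launch no step')
    }
    stages {
        stage('Run Tests') {
            when {
                // skip the real work when the run is only meant to reload the choices
                expression { return !params.REFRESH_BRANCHES.toBoolean() }
            }
            steps {
                sh "echo SUCCESS on ${params.BranchName}"
            }
        }
    }
}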

This is my solution:
def envList
def dockerId
node {
    envList = "defaultValue\n" + sh(script: 'kubectl get namespaces --no-headers -o custom-columns=":metadata.name"', returnStdout: true).trim()
}
pipeline {
    agent any
    parameters {
        choice(choices: "${envList}", name: 'DEPLOYMENT_ENVIRONMENT', description: 'please choose the environment you want to deploy?')
        booleanParam(name: 'SECURITY_SCAN', defaultValue: false, description: 'container vulnerability scan')
    }
    // ... stages ...
}

The example Jenkinsfile below contains an AWS CLI command to get the list of Docker images from AWS ECR dynamically, but it can be replaced with your own command. The Active Choices plugin is required.
Note: you need to approve the script specified in the parameters after the first run, in "Manage Jenkins" -> "In-process Script Approval", or open the job configuration and save it to approve it automatically (might require administrator permissions).
properties([
    parameters([[
        $class: 'ChoiceParameter',
        choiceType: 'PT_SINGLE_SELECT',
        name: 'image',
        description: 'Docker image',
        filterLength: 1,
        filterable: false,
        script: [
            $class: 'GroovyScript',
            fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
            script: [
                classpath: [],
                sandbox: false,
                script: '''\
def repository = "frontend"
def aws_ecr_cmd = "aws ecr list-images" +
    " --repository-name ${repository}" +
    " --filter tagStatus=TAGGED" +
    " --query imageIds[*].[imageTag]" +
    " --region us-east-1 --output text"
def aws_ecr_out = aws_ecr_cmd.execute() | "sort -V".execute()
def images = aws_ecr_out.text.tokenize().reverse()
return images
'''.stripIndent()
            ]
        ]
    ]])
])
pipeline {
    agent any
    stages {
        stage('First stage') {
            steps {
                sh 'echo "${image}"'
            }
        }
    }
}

choiceArray = ['patch', 'minor', 'major']
properties([
    parameters([
        choice(choices: choiceArray.join('\n'),
               description: '',
               name: 'SOME_CHOICE')
    ])
])
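As a side note, recent versions of the choice parameter also accept a list directly, so the join can likely be dropped altogether (a minimal sketch, assuming a reasonably current Jenkins):
choiceArray = ['patch', 'minor', 'major']
properties([
    parameters([
        // newer Jenkins versions accept a List for choices directly
        choice(choices: choiceArray, description: '', name: 'SOME_CHOICE')
    ])
])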

Related

How to pass parameters and variables from a file to jenkinsfile?

I'm trying to convert my Jenkins pipeline to a shared library since it can be reused across most of our applications. As part of that I have created a groovy file in the vars folder, kept the pipeline in a Jenkinsfile in GitHub, and am able to call it in Jenkins successfully.
As part of improving this I want to pass params, variables, and node labels through a file, so that we never have to touch the Jenkins pipeline itself; if we want to modify any vars or params, we do that in the git repo instead.
pipeline {
    agent {
        node {
            label 'jks_deployment'
        }
    }
    environment {
        ENV_CONFIG_ID = 'jenkins-prod'
        ENV_CONFIG_FILE = 'test.groovy'
        ENV_PLAYBOOK_NAME = 'test.tar.gz'
    }
    parameters {
        string(
            defaultValue: 'test.x86_64',
            description: 'Enter app version',
            name: 'app_version'
        )
        choice(
            choices: ['10.0.0.1', '10.0.0.2', '10.0.0.3'],
            description: 'Select a host to be deployed',
            name: 'host'
        )
    }
    stages {
        stage("reading properties from properties file") {
            steps {
                // Use a script block to do custom scripting
                script {
                    def props = readProperties file: 'extravars.properties'
                    env.var1 = props.var1
                    env.var2 = props.var2
                }
                echo "The variable 1 value is $var1"
                echo "The variable 2 value is $var2"
            }
        }
    }
}
In the above code I used the Pipeline Utility Steps plugin and was able to read variables from the extravars.properties file. Can we do the same for the Jenkins parameters, or is there a more suitable method for passing these parameters via a file from the git repo?
Also, is it possible to pass a variable for the node label?
=====================================================================
Below are the improvements I have made in this project.
I used the node label plugin to pass the node name as a variable.
Below is my vars/sayHello.groovy file content:
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def pipelineParams = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = pipelineParams
    body()
    pipeline {
        agent {
            node {
                label "${pipelineParams.slaveName}"
            }
        }
        stages {
            stage("reading properties from properties file") {
                steps {
                    // Use a script block to do custom scripting
                    script {
                        readProperties(file: 'extravars.properties').each { key, value -> env[key] = value }
                    }
                    echo "The variable 1 value is $var1"
                    echo "The variable 2 value is $var2"
                }
            }
            stage('stage2') {
                steps {
                    sh "echo ${var1}"
                    sh "echo ${var2}"
                    sh "echo ${pipelineParams.appVersion}"
                    sh "echo ${pipelineParams.hostIp}"
                }
            }
        }
    }
}
Below is my vars/params.groovy file
properties( [
parameters([
choice(choices: ['10.80.66.171','10.80.67.6','10.80.67.200'], description: 'Select a host to be delpoyed', name: 'host')
,string(defaultValue: 'fxxxxx.x86_64', description: 'Enter app version', name: 'app_version')
])
] )
Below is my Jenkinsfile:
def _hostIp = params.host
def _appVersion = params.app_version
sayHello {
slaveName = 'master'
hostIp = _hostIp
appVersion = _appVersion
}
Can this still be improved? Any suggestions, let me know.
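One further improvement to consider: the parameter values themselves can come from a file in the repo as well. A minimal sketch, assuming a hypothetical params.yaml with hosts and appVersion keys and the Pipeline Utility Steps plugin for readYaml; the usual caveat applies that the job must run once before new values appear:
// params.yaml (hypothetical):
//   hosts: ['10.80.66.171', '10.80.67.6']
//   appVersion: 'fxxxxx.x86_64'
node('master') {
    checkout scm
    def cfg = readYaml file: 'params.yaml'   // Pipeline Utility Steps plugin
    properties([
        parameters([
            choice(choices: cfg.hosts.join('\n'), description: 'Select a host to be deployed', name: 'host'),
            string(defaultValue: cfg.appVersion, description: 'Enter app version', name: 'app_version')
        ])
    ])
}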

How to configure dynamic parameters in declarative pipeline (Jenkinsfile)?

Using the declarative pipeline syntax, I want to be able to define parameters based on an array of repos, so that when starting the build, the user can check/uncheck the repos that should not be included when the job runs.
final String[] repos = [
    'one',
    'two',
    'three',
]

pipeline {
    parameters {
        booleanParam(name: ...) // static param
        // now include a booleanParam for each item in the `repos` array
        // like this, but it's not allowed
        script {
            repos.each {
                booleanParam(name: it, defaultValue: true, description: "Include the ${it} repo in the release?")
            }
        }
    }
    // later on, I'll loop through each repo and do stuff only if its value in `params` is `true`
}
Of course, you can't have a script within the parameters block, so this won't work. How can I achieve this?
Using the Active Choices Parameter plugin is probably the best choice, but if for some reason you can't (or don't want to) use a plugin, you can still achieve dynamic parameters in a Declarative Pipeline.
Here is a sample Jenkinsfile:
def list_wrap() {
    sh(script: 'echo choice1 choice2 choice3 choice4 | sed -e "s/ /\\n/g"', returnStdout: true)
}
pipeline {
    agent any
    stages {
        stage('Gather Parameters') {
            steps {
                timeout(time: 30, unit: 'SECONDS') {
                    script {
                        properties([
                            parameters([
                                choice(
                                    description: 'List of arguments',
                                    name: 'service_name',
                                    choices: 'DEFAULT\n' + list_wrap()
                                ),
                                booleanParam(
                                    defaultValue: false,
                                    description: 'Whether we should apply changes',
                                    name: 'apply'
                                )
                            ])
                        ])
                    }
                }
            }
        }
        stage('Run command') {
            when { expression { params.apply == true } }
            steps {
                sh """
                    echo choice: ${params.service_name} ;
                """
            }
        }
    }
}
This embeds a script {} in a stage, which calls a function, which runs a shell script on the agent/node of the Declarative Pipeline, and uses the script's output to set the choices for the parameters. The parameters are then available in the next stages.
The gotcha is that you must first run the job with no build parameters in order for Jenkins to populate the parameters, so they're always going to be one run out of date. That's why the Active Choices Parameter plugin is probably a better idea.
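One way to make that first parameter-less run less confusing is to detect the missing parameter right after the properties() call and stop early; a minimal sketch using the names from the example above:
script {
    // on the very first run the job has no parameters yet, so this is null;
    // fail fast with a clear message instead of building with a missing value
    if (params.service_name == null) {
        error('Build parameters were just loaded -- please run the job again.')
    }
}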
You could also combine this with an input command to cause the pipeline to prompt the user for a parameter:
script {
    def INPUT_PARAMS = input message: 'Please Provide Parameters', ok: 'Next',
        parameters: [
            choice(name: 'ENVIRONMENT', choices: ['dev', 'qa'].join('\n'), description: 'Please select the Environment'),
            choice(name: 'IMAGE_TAG', choices: getDockerImages(), description: 'Available Docker Images')
        ]
    env.ENVIRONMENT = INPUT_PARAMS.ENVIRONMENT
    env.IMAGE_TAG = INPUT_PARAMS.IMAGE_TAG
}
Credit goes to Alex Lashford (https://medium.com/disney-streaming/jenkins-pipeline-with-dynamic-user-input-9f340fb8d9e2) for this method.
You can use the CHOICE parameter of Jenkins, in which the user can select a repository.
pipeline {
    agent any
    parameters {
        choice(name: "REPOS", choices: ['REPO1', 'REPO2', 'REPO3'])
    }
    stages {
        stage('stage 1') {
            steps {
                // the repository selected by the user will be printed
                println("$params.REPOS")
            }
        }
    }
}
You can also use the Active Choices plugin if you want multiple selection: https://plugins.jenkins.io/uno-choice/#documentation
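For reference, a multi-select variant of the Active Choices parameter from the earlier answer would use the PT_CHECKBOX choice type; a minimal sketch (the selected values arrive as a single comma-separated string):
properties([
    parameters([[
        $class: 'ChoiceParameter',
        choiceType: 'PT_CHECKBOX',      // checkbox list allows multiple selection
        name: 'REPOS',
        description: 'Repositories to include',
        script: [
            $class: 'GroovyScript',
            fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
            script: [classpath: [], sandbox: false, script: 'return ["one", "two", "three"]']
        ]
    ]])
])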
You can also visit the Pipeline Syntax snippet generator, configure the parameters there to generate a code snippet, and paste that snippet at the start of the Jenkinsfile.

Parse and Return Jenkins Console Output

Within Jenkins, I would like to parse the ansible playbook "Play Recap" output section for the failing hostname(s). I want to put the information into an email or other notification. This could also be used to fire off another Jenkins job.
I'm currently submitting an ansible-playbook as a Jenkins job to deploy software across a number of systems. I'm using a Jenkins Pipeline script, which was necessary for sshagent to be applied correctly.
pipeline {
    agent any
    options {
        ansiColor('xterm')
    }
    stages {
        stage("setup environment") {
            steps {
                deleteDir()
            } //steps
        } //stage - setup environment
        stage("clone the repo") {
            environment {
                GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
            } //environment
            steps {
                sshagent(['my_git']) {
                    sh "git clone ssh://git@github.com/~usr/ansible.git"
                } //sshagent
            } //steps
        } //stage - clone the repo
        stage("run ansible playbook") {
            steps {
                sshagent(credentials: ['apps']) {
                    withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
                        dir('ansible') {
                            ansiblePlaybook(
                                becomeUser: null,
                                colorized: true,
                                credentialsId: 'apps',
                                disableHostKeyChecking: true,
                                forks: 50,
                                hostKeyChecking: false,
                                inventory: 'hosts',
                                limit: 'production:&*generic',
                                playbook: 'demo_play.yml',
                                sudoUser: null,
                                extras: '-vvvvv'
                            ) //ansiblePlaybook
                        } //dir
                    } //withEnv
                } //sshagent
            } //steps
        } //stage - run ansible playbook
    } //stages
    post {
        failure {
            emailext body: "Please go to ${env.BUILD_URL}/consoleText for more details.",
                recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
                subject: "${env.JOB_NAME}",
                to: 'our.dev.team@gmail.com',
                attachLog: true
            office365ConnectorSend message: "A production system appears to be unreachable.",
                status: "Failed",
                color: "f00000",
                factDefinitions: [[name: "Credentials ID", template: "apps"],
                                  [name: "Build Duration", template: "${currentBuild.durationString}"],
                                  [name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
                webhookUrl: 'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
        } //failure
    } //post
} //pipeline
There are several Jenkins plug-ins for parsing the console output, but none will let me capture and utilize the text. I have looked at log-parser and text-finder.
The only lead I have is using Groovy to script this:
https://devops.stackexchange.com/questions/5363/jenkins-groovy-to-parse-console-output-and-mark-build-failure
An example of "Play Recap" within the console output is:
PLAY RECAP **************************************************************************************************************************************************
some.host.name : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0
I am trying to get either a list or a delimited string of each host that is failing. Although, in the case of a list, I need to figure out how to send multiple notifications.
If anyone could help me with the full solution, I would very much appreciate your help.
Q: "Parse the ansible playbook 'Play Recap' output section."
A: Use the json callback and parse the output with jq. For example:
shell> ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml | jq .stats
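Inside a pipeline, that callback output can be captured and parsed without scraping the console log at all. A minimal sketch, assuming the Pipeline Utility Steps plugin for readJSON and the json callback's per-host stats keys (failures, unreachable):
script {
    // '|| true' keeps the sh step from failing before we can read the stats
    def out = sh(
        script: 'ANSIBLE_STDOUT_CALLBACK=json ansible-playbook pb.yml || true',
        returnStdout: true
    )
    def stats = readJSON(text: out).stats   // Pipeline Utility Steps plugin
    // collect hosts with a non-zero failed or unreachable count
    def badHosts = stats.findAll { host, s -> s.failures > 0 || s.unreachable > 0 }.keySet()
    echo "Failing hosts: ${badHosts.join(' ')}"
}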
There are a few 'gotchas' that I came across as I solved this problem.
The only successful way I could access the output of the ansible plugin was through pulling the raw log file. def log = currentBuild.rawBuild.getLog(100) In this case I only pulled the last 100 lines, as I'm only looking for the Play Recap box. This method requires special permissions. The console log will display the error and provide a link where the functions can be allowed.
The ansible output should not be colorized (colorized: false); colorized output is quite difficult to parse. The console log doesn't show you the colorized markup, but if you look at consoleText you will see it.
When using regex, you will most likely have a matcher object which is non-serializable. To use this in Jenkins, it may need to be placed in a function tagged @NonCPS, which stops Jenkins from trying to serialize the object. I had mixed results with needing this, so I don't exhaustively understand where it's required.
The regex statement was one of the harder parts for me. I came up with a generic statement that could be easily modified for different scenarios e.g. failed or unreachable. I also had more luck using the 'slashy-style' regex in groovy which places a forward slash on either end of the statement with no need for quotes of any kind. You'll note the 'failed' portion is different failed=([1-9]|[1-9][0-9]), so that it only matches a statement where the failure is non-zero.
/([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
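To see the pattern in action, a quick plain-Groovy sketch against the sample "Play Recap" lines from the question:
def sample = '''some.host.name : ok=25 changed=2 unreachable=0 failed=1 skipped=2 rescued=0 ignored=0
some.ip.address : ok=22 changed=2 unreachable=0 failed=0 skipped=1 rescued=0 ignored=0'''

def matches = sample =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
matches.each { m -> println m[0] }   // prints only some.host.name (failed=1)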
Here's the full pipeline code that I came up with.
pipeline {
    agent any
    options {
        ansiColor('xterm')
    }
    stages {
        stage("setup environment") {
            steps {
                deleteDir()
            } //steps
        } //stage - setup environment
        stage("clone the repo") {
            environment {
                GIT_SSH_COMMAND = "ssh -o StrictHostKeyChecking=no"
            } //environment
            steps {
                sshagent(['my_git']) {
                    sh "git clone ssh://git@github.com/~usr/ansible.git"
                } //sshagent
            } //steps
        } //stage - clone the repo
        stage("run ansible playbook") {
            steps {
                sshagent(credentials: ['apps']) {
                    withEnv(['ANSIBLE_CONFIG=ansible.cfg']) {
                        dir('ansible') {
                            ansiblePlaybook(
                                becomeUser: null,
                                colorized: false,
                                credentialsId: 'apps',
                                disableHostKeyChecking: true,
                                forks: 50,
                                hostKeyChecking: false,
                                inventory: 'hosts',
                                limit: 'production:&*generic',
                                playbook: 'demo_play.yml',
                                sudoUser: null,
                                extras: '-vvvvv'
                            ) //ansiblePlaybook
                        } //dir
                    } //withEnv
                } //sshagent
            } //steps
        } //stage - run ansible playbook
    } //stages
    post {
        failure {
            script {
                problem_hosts = get_the_hostnames()
            }
            emailext body: "${problem_hosts} has failed. Please go to ${env.BUILD_URL}/consoleText for more details.",
                recipientProviders: [[$class: 'DevelopersRecipientProvider'], [$class: 'RequesterRecipientProvider']],
                subject: "${env.JOB_NAME}",
                to: 'our.dev.team@gmail.com',
                attachLog: true
            office365ConnectorSend message: "${problem_hosts} has failed.",
                status: "Failed",
                color: "f00000",
                factDefinitions: [[name: "Credentials ID", template: "apps"],
                                  [name: "Build Duration", template: "${currentBuild.durationString}"],
                                  [name: "Full Name", template: "${currentBuild.fullDisplayName}"]],
                webhookUrl: 'https://outlook.office.com/webhook/[really long alphanumeric key]/IncomingWebhook/[another super-long alphanumeric key]'
        } //failure
    } //post
} //pipeline
//@NonCPS
def get_the_hostnames() {
    // Get the last 100 lines of the log
    def log = currentBuild.rawBuild.getLog(100)
    print log
    // grep the log for the failed hostnames
    def matches = log =~ /([0-9a-zA-Z\.\-]+)(?=[ ]*:[ ]*ok=([0-9]|[1-9][0-9])[ ]*changed=([0-9]|[1-9][0-9])[ ]*unreachable=([0-9]|[1-9][0-9])[ ]*failed=([1-9]|[1-9][0-9]))/
    def hostnames = null
    // if any matches occurred
    if (matches) {
        // iterate over the matches
        for (int i = 0; i < matches.size(); i++) {
            // if there is already a name, concatenate the next one
            // else populate it
            if (hostnames?.trim()) {
                hostnames = hostnames + " " + matches[i][0]
            } else {
                hostnames = matches[i][0]
            } // if/else
        } // for
    } // if
    if (!hostnames?.trim()) {
        hostnames = "No hostnames identified."
    }
    return hostnames
}

Jenkins dynamic parameters based on groovy method

I am trying to do the things below as part of a Jenkins pipeline DSL.
I have a yaml file where I store all my static values.
I created a pipeline job which should show 2 parameters:
a) region: northamerica/europe
b) environment: this should be populated based on the region selected.
I am defining the 2 functions outside of the pipeline so that I can use them during the parameters section.
Syntax:
#!/usr/bin/env groovy
def yaml_file = "JenkinsFiles/environments.yaml"

def getRegions() {
    def var_regions = []
    yaml_settings.environments.each { key, value -> var_regions.add(key) }
    return var_regions
}

def getEnvironments(String region) {
    def var_envs = []
    yaml_settings.environments."${region}".non_prod.each { key, value -> var_envs.add("\"" + key + "\"") }
    return var_envs
}

environment {
    yaml_settings = {}
}
pipeline {
    agent {
        node {
            label 'docker'
        }
    }
    stages {
        stage('Prepare') {
            steps {
                script {
                    yaml_settings = readYaml file: "${yaml_file}"
                    list_regions = getRegions()
                    properties([
                        parameters([
                            choice(choices: list_regions, description: 'Please select region to deploy', name: 'REGION'),
                            [$class: 'CascadeChoiceParameter',
                             choiceType: 'PT_SINGLE_SELECT',
                             description: 'Please select environment to deploy',
                             filterLength: 1,
                             filterable: false,
                             name: 'ACP_ENVIRONMENTS',
                             randomName: 'choice-parameter-deploy-env',
                             referencedParameters: 'REGION',
                             script: [$class: 'GroovyScript',
                                      fallbackScript: [classpath: [], sandbox: false, script: ''],
                                      script: [classpath: [], sandbox: true, script: """
                                          envs = getEnvironments($REGION)
                                          return $envs
                                      """]]]
                        ])
                    ])
                }
            }
        }
    }
}
Issue:
The getEnvironments method is not returning its value into the variable, so it has no effect on the parameter; the $REGION value does come through, however. I could write if/else branches based on the referenced parameter to get the value, but I don't want to, because I will have many values down the line.
Help appreciated!
As with many other questions, the issue here is that Jenkins needs to know the parameters before it executes your pipeline. Once the pipeline is running, the parameters have already been defined, and any change to the parameters won't impact this build.
You may want to take a look at the ActiveChoice plugin to address this.
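A sketch of how that can look: because the Active Choices script runs on its own, it cannot call functions defined in your Jenkinsfile, so one workaround is to serialize the map derived from the YAML into the script text itself. This assumes the environments.<region>.non_prod layout from the question; the envsByRegion name is illustrative:
script {
    def yaml_settings = readYaml file: yaml_file
    // flatten the YAML into a region -> [environments] map up front
    def envsByRegion = yaml_settings.environments.collectEntries { region, v ->
        [(region): v.non_prod.keySet() as List]
    }
    properties([
        parameters([
            choice(choices: envsByRegion.keySet().join('\n'), description: 'Please select region to deploy', name: 'REGION'),
            [$class: 'CascadeChoiceParameter',
             choiceType: 'PT_SINGLE_SELECT',
             name: 'ACP_ENVIRONMENTS',
             referencedParameters: 'REGION',
             script: [$class: 'GroovyScript',
                      fallbackScript: [classpath: [], sandbox: false, script: 'return ["none"]'],
                      // inline the map as a Groovy literal: the Active Choices
                      // script runs on its own and cannot call pipeline methods
                      script: [classpath: [], sandbox: true,
                               script: "def envs = ${envsByRegion.inspect()}\nreturn envs[REGION]"]]]
        ])
    ])
}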

Jenkins - matrix jobs - variables on different slaves overwrite each other?

I think I don't get how matrix builds work. When I set some variable in a stage depending on which node I run on, then in the rest of the stages this variable is sometimes set as it should be, and sometimes it gets values from other nodes (axes). In the example below, a job running on ub18-1 sometimes has VARIABLE1='Linux node' and sometimes VARIABLE1='Windows node'. Likewise, gitmethod is sometimes created from LinuxGitInfo and sometimes from WindowsGitInfo.
Source I based this on:
https://jenkins.io/doc/book/pipeline/syntax/#declarative-matrix
The script is almost exactly the same as the real one:
@Library('firstlibrary') _

import mylib.shared.*

pipeline {
    parameters {
        booleanParam name: 'AUTO', defaultValue: true, description: 'Auto mode sets some parameters for every slave separately'
        choice(name: 'SLAVE_NAME', choices: ['all', 'ub18-1', 'win10'], description: 'Run on specific platform')
        string(name: 'BRANCH', defaultValue: 'master', description: 'Preferably common label for entire group')
        booleanParam name: 'SONAR', defaultValue: false, description: 'Scan and gateway'
        booleanParam name: 'DEPLOY', defaultValue: false, description: 'Deploy to Artifactory'
    }
    agent none
    stages {
        stage('BuildAndTest') {
            matrix {
                agent {
                    label "$NODE"
                }
                when {
                    anyOf {
                        expression { params.SLAVE_NAME == 'all' }
                        expression { params.SLAVE_NAME == env.NODE }
                    }
                }
                axes {
                    axis {
                        name 'NODE'
                        values 'ub18-1', 'win10'
                    }
                }
                stages {
                    stage('auto mode') {
                        when {
                            expression { return params.AUTO }
                        }
                        steps {
                            echo "Setting parameters for each slave"
                            script {
                                nodeLabelsList = env.NODE_LABELS.split()
                                if (nodeLabelsList.contains('ub18-1')) {
                                    println("Setting params for ub18-1")
                                    VARIABLE1 = 'Linux node'
                                }
                                if (nodeLabelsList.contains('win10')) {
                                    println("Setting params for Win10")
                                    VARIABLE1 = 'Windows node'
                                }
                                if (isUnix()) {
                                    gitmethod = new LinuxGitInfo(this, env)
                                } else {
                                    gitmethod = new WindowsGitInfo(this, env)
                                }
                            }
                        }
                    }
                    stage('GIT') {
                        steps {
                            checkout scm
                        }
                    }
                    stage('Info') {
                        steps {
                            script {
                                sh 'printenv'
                                echo "branch: " + env.BRANCH_NAME
                                echo "SLAVE_NAME: " + env.NODE_NAME
                                echo VARIABLE1
                                gitinfo = new GitInfo(gitmethod)
                                gitinfo.init()
                                echo gitinfo.author
                                echo gitinfo.id
                                echo gitinfo.msg
                                echo gitinfo.buildinfo
                            }
                        }
                    }
                    stage('install') {
                        steps {
                            sh 'make install'
                        }
                    }
                    stage('test') {
                        steps {
                            sh 'make test'
                        }
                    }
                }
            }
        }
    }
}
OK, I solved the problem by defining variable maps with node/slave names as keys. A friend even suggested defining the variables in a yml/json file in the repository and parsing them. Maybe I will, but so far this works well.
Example:
Before the pipeline:
def DEPLOYmap = [
    'ub18-1': false,
    'win10': true
]
In the stages:
when {
    equals expected: true, actual: DEPLOYmap[NODE]
}
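The same lookup-by-axis pattern covers the VARIABLE1 case from the question; a minimal sketch:
// before the pipeline: one map per node-specific value, keyed by the NODE axis
def VARIABLE1map = [
    'ub18-1': 'Linux node',
    'win10' : 'Windows node'
]
Then inside the matrix stages, always read through the map (e.g. echo VARIABLE1map[NODE]) instead of assigning to a shared script-global variable, so parallel branches cannot overwrite each other's values.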
