How to define and get/put values in a Jenkinsfile Groovy map

I have this Jenkinsfile below. I am trying to look up a value in a map by its key, but I am getting "java.lang.NoSuchMethodError: No such DSL method 'get' found among steps". Can someone help me resolve this?
def country_capital = {
    [Australia : [best: 'xx1', good: 'xx2', bad: 'xx3'],
     America : [best: 'yy1', good: 'yy2', bad: 'yy3']]
}
pipeline {
    agent any
    stages {
        stage('Test Map') {
            steps {
                script {
                    echo country_capital.get('Australia')['best']
                }
            }
        }
    }
}

You can get the value this way. Note that in your example country_capital is wrapped in { }, which makes it a closure rather than a map literal, which is why the lookup fails; define it as a plain map instead:
def country_capital = [
    Australia: [
        best: 'xx1',
        good: 'xx2',
        bad: 'xx3'
    ],
    America: [
        best: 'yy1',
        good: 'yy2',
        bad: 'yy3'
    ]
]
pipeline {
    agent any
    stages {
        stage('Test Map') {
            steps {
                script {
                    echo country_capital['Australia'].best
                }
            }
        }
    }
}
// Output
xx1
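As a side note, Groovy offers several equivalent ways to read the nested value; a minimal sketch:
// Each of these prints xx1
echo country_capital['Australia']['best']
echo country_capital.Australia.best
echo country_capital.get('Australia').get('best')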

For the above example, one can also iterate over the map:
country_capital.each { capital_key, capital_value ->
    try {
        echo "Testing ${capital_value.best}..."
    }
    catch (ex) {
        echo "Test failed: ${capital_value.bad}"
    }
}

Related

Parallel execution inside the post step

I am building for 2 different environments in the same pipeline and I want to run the cleanup for both environments in parallel.
As I understand it, parallel does not work inside the post step (see: post step parallel).
Any suggestions? Here is an example of my code:
post {
    always {
        script {
            cleanup(env1)
            cleanup(env2)
        }
    }
}
def cleanup(env) {
    withEnv(env) {
        sh "./cleanup.py"
    }
}
The parallel keyword can work inside a post condition as long as it is encapsulated inside a script block, since the script block is just a fallback to scripted pipeline, which allows you to run parallel execution wherever you want.
The following should work fine:
post {
    always {
        script {
            def environments = ['env1', 'env2', 'env3']
            parallel environments.collectEntries {
                ["Cleanup ${it}" : {
                    cleanup(it)
                }]
            }
        }
    }
}
def cleanup(env) {
    withEnv(env) {
        sh "./cleanup.py"
    }
}
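As a side note (an assumption about the intended shape of env1 and env2, not stated in the original): withEnv expects a list of 'NAME=value' strings, so the arguments passed to cleanup would need to look roughly like this, where TARGET_ENV is a hypothetical variable name:
// Hypothetical shape of the arguments handed to cleanup()/withEnv
def env1 = ['TARGET_ENV=env1']
def env2 = ['TARGET_ENV=env2']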
Just don't forget to allocate an agent using the node keyword if the steps in the post section are required to run on a specific agent.
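For example, a minimal sketch of wrapping the parallel cleanup in a node allocation, assuming a hypothetical agent label 'linux':
post {
    always {
        script {
            // Allocate an executor and workspace explicitly inside the post section
            node('linux') {
                def environments = ['env1', 'env2']
                parallel environments.collectEntries { envName ->
                    ["Cleanup ${envName}" : { cleanup(envName) }]
                }
            }
        }
    }
}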
A better idea in my opinion is to clean up per stage, before you possibly lose the node to another job:
parallel {
    stage('env1') {
        agent { node { label "env1" } }
        steps {
            script {
                println "Inside env1"
            }
        }
        post {
            cleanup { script { my_cleanup_func("env1") } }
        }
    }
    stage('env2') {
        agent { node { label "env2" } }
        steps {
            script {
                println "Inside env2"
            }
        }
        post {
            cleanup { script { my_cleanup_func("env2") } }
        }
    }
    ...
def my_cleanup_func(String env) {
    // ...
}

Execute a script in post on both unstable and success in a jenkins pipeline

My pipeline looks like this:
pipeline {
    ...
    post {
        always {
            archiveArtifacts artifacts: 'artifacts/**/*'
            script {
                ...
            }
            rtp stableText: '${FILE:artifacts/summary.html}', parserName: 'HTML'
        }
        success {
            script {
                ...
            }
        }
    }
}
I'd like the script which is executed on success to also be executed on unstable. How can I achieve that?
Is there a way to specify something like success or unstable {?
Or is there a way to declare the action somewhere else and "invoke" it in both the success and unstable blocks?
You can also do it like below:
def commonPostSteps() {
    echo "Hello World"
    script {
        def x = 10
        print x + 20
    }
}
pipeline {
    agent any
    stages {
        stage('one') {
            steps {
                echo "${env.STAGE_NAME}"
            }
        }
    }
    post {
        always {
            echo "post always"
        }
        success {
            commonPostSteps()
        }
        unstable {
            commonPostSteps()
        }
    }
}

java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps

This question ties in with one of my earlier questions here
Tl;dr of the linked question:
Basically I want a generic pipeline to generate distributable bundles (zip files etc) of any of my applications. An application can have multiple components (almost all components are Java/Spring or NodeJS projects).
The plan was to store a pipeline descriptor of each application in a JSON file like such:
{
    "app": "MyApp",
    "components": [
        {
            "name": "Component1",
            "scmUrl": "https://git.mycompany.com/app/component1.git",
            "buildCmd": "mvn clean install"
        },
        {
            "name": "Component2",
            "scmUrl": "https://git.mycompany.com/app/component2.git",
            "buildCmd": "npm run build"
        }
    ]
}
There will be a descriptor for each application, and they will be checked into a separate repository.
When the pipeline is run, the required application name will be an input parameter; the above repo will be cloned and the respective JSON descriptor loaded.
This is where things start to get tricky. All components will have some common stages (Checkout, Build, Docker Build). So I am trying to loop the components array and run the stages in parallel:
def parallelCheckoutStages = components.collectEntries {
    [ "Checkout ${it.name}", generateCheckoutStage(it) ]
}
def generateCheckoutStage(component) {
    return {
        stage("Stage: ${component.name}") {
            script {
                git(url: component.scmUrl, branch: component.branch)
            }
        }
    }
}
def parallelBuildStages = components.collectEntries {
    [ "Build ${it.name}", generateBuildStage(it) ]
}
def generateBuildStage(component) {
    return {
        stage("Stage: ${component.name}") {
            script {
                sh script: "${component.buildCmd}"
            }
        }
    }
}
pipeline {
    agent any
    stages {
        .
        . // clone repo and load json
        .
        stage("Checkout Components") {
            steps {
                script {
                    parallel parallelCheckoutStages
                }
            }
        }
        stage("Build Components") {
            steps {
                script {
                    parallel parallelBuildStages
                }
            }
        }
    }
}
Sometimes I need to run the build command inside a Docker container (only for some components). To accomplish this I want to edit the generateBuildStage method to something like this:
def generateBuildStage(component) {
    if (component.requiresDocker) {
        return {
            stage("Stage: ${component.name}") {
                agent {
                    docker {
                        image 'jdk11-mvn3.6'
                    }
                }
                script {
                    sh script: "${component.buildCmd}"
                }
            }
        }
    } else {
        return {
            stage("Stage: ${component.name}") {
                script {
                    sh script: "${component.buildCmd}"
                }
            }
        }
    }
}
When I run the above code I get an error java.lang.NoSuchMethodError: No such DSL method 'agent' found among steps
As a second part to my question: does my pipeline seem hacky? I could replace the parallel stages entirely by creating individual jobs for each component and calling them from the pipeline. Which approach is better?
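One possible direction (a sketch, assuming the Docker Pipeline plugin is installed): agent is a declarative-only directive and is not available inside the scripted closures that parallel executes, but the plugin's scripted docker.image(...).inside API achieves the same effect:
def generateBuildStage(component) {
    if (component.requiresDocker) {
        return {
            stage("Stage: ${component.name}") {
                // Scripted equivalent of the declarative docker agent
                docker.image('jdk11-mvn3.6').inside {
                    sh script: "${component.buildCmd}"
                }
            }
        }
    } else {
        return {
            stage("Stage: ${component.name}") {
                sh script: "${component.buildCmd}"
            }
        }
    }
}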

Jenkins Pipeline Conditional Environmental Variables

I have a set of static environment variables in the environment directive section of a declarative pipeline. These values are available to every stage in the pipeline.
I want the values to change based on an arbitrary condition.
Is there a way to do this?
pipeline {
    agent any
    environment {
        if ${params.condition} {
            var1 = '123'
            var2 = abc
        } else {
            var1 = '456'
            var2 = def
        }
    }
    stages {
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Looking for the same thing, I found a nice answer in another question.
Basically, it is to use the ternary conditional operator:
pipeline {
    agent any
    environment {
        var1 = "${params.condition == true ? "123" : "456"}"
        var2 = "${params.condition == true ? abc : def}"
    }
}
Note: keep in mind that, in the way you wrote your question (and I wrote my answer), the numbers are Strings and the letters are variables.
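If the letters were meant to be literal values as well, a minimal sketch would simply quote them inside the expression:
environment {
    var1 = "${params.condition == true ? '123' : '456'}"
    var2 = "${params.condition == true ? 'abc' : 'def'}"
}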
I would suggest creating an "Environment" stage and declaring your variables according to the condition you want, something like below:
pipeline {
    agent any
    environment {
        // Declare variables which will remain the same throughout the build
    }
    stages {
        stage('Environment') {
            agent { node { label 'master' } }
            steps {
                script {
                    // Write a condition for the variables which need to change
                    if (params.condition) {
                        env.var1 = '123'
                        env.var2 = abc
                    } else {
                        env.var1 = '456'
                        env.var2 = def
                    }
                    sh "printenv"
                }
            }
        }
        stage('One') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
        stage('Two') {
            steps {
                script {
                    ...
                    echo env.var1
                    echo env.var2
                    ...
                }
            }
        }
    }
}
Suppose we want to use optional params for a downstream job when it is called from an upstream job, and default params when the downstream job is run by itself.
But we don't want to have "holder" params with a default value in the downstream job for some reason.
This could be done via a Groovy function:
upstream Jenkinsfile - param CREDENTIALS_ID is passed downstream
pipeline {
    agent any
    stages {
        stage('Trigger downstream') {
            steps {
                build job: "my_downstream_job_name",
                    parameters: [string(name: 'CREDENTIALS_ID', value: 'other_credentials_id')]
            }
        }
    }
}
downstream Jenkinsfile - if param CREDENTIALS_ID is not passed from upstream, the function returns a default value
def getCredentialsId() {
    if (params.CREDENTIALS_ID) {
        return params.CREDENTIALS_ID
    } else {
        return "default_credentials_id"
    }
}
pipeline {
    environment {
        TEST_PASSWORD = credentials("${getCredentialsId()}")
    }
}
You can get another level of flexibility by using maps:
stage("set_env_vars") {
    steps {
        script {
            def MY_MAP1 = [A: "123", B: "456", C: "789"]
            def MY_MAP2 = [A: "abc", B: "def", C: "ghi"]
            env.var1 = MY_MAP1."${env.switching_var}"
            env.var2 = MY_MAP2."${env.switching_var}"
        }
    }
}
This way, more choices are possible.
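For example, if env.switching_var is set to B, env.var1 resolves to "456" and env.var2 to "def".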

Dynamically defining parallel steps in declarative jenkins pipeline

I am trying to parallelize a dynamically defined set of functions as follows:
def somefunc() {
    echo 'echo1'
}
def somefunc2() {
    echo 'echo2'
}
running_set = [
    { somefunc() },
    { somefunc2() }
]
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                parallel(running_set)
            }
        }
    }
}
And what I end up with is:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 17: No "steps" or "parallel" to execute within stage "Run" @ line 17, column 9.
    stage('Run') {
Although steps are defined within stage 'Run'. Anyway, what I would like to achieve is running a dynamically defined set of functions in parallel.
If you want to use a dynamic parallel block with a declarative pipeline script, you have to apply two changes to your Jenkinsfile:
You have to define running_set as a Map like ["task 1": { somefunc() }, "task 2": { somefunc2() }] - the keys from this map are used as the parallel stage names
You have to pass running_set to the parallel method inside a script {} block
Here is what the updated Jenkinsfile could look like:
def somefunc() {
    echo 'echo1'
}
def somefunc2() {
    echo 'echo2'
}
running_set = [
    "task1": {
        somefunc()
    },
    "task2": {
        somefunc2()
    }
]
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel(running_set)
                }
            }
        }
    }
}
It is not obvious, but Szymon's approach can also be used inline very straightforwardly:
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel([
                        'parallelTask1_Name': {
                            // any code you like
                        },
                        'parallelTask2_Name': {
                            // any other code you like
                        }
                        // ... etc
                    ])
                }
            }
        }
    }
}
