In a Jenkins declarative pipeline, is there a way to execute a precondition that loads build parameters from a file? Jenkins offers an option to restart an individual stage, so I want each stage to load its parameters from a Groovy file.
Currently I have:
pipeline {
    agent any
    stages {
        stage("Grep the values") {
            steps {
                load "${WORKSPACE}/file-parameter.groovy"
            }
        }
        stage("Perform Deployment") {
            when {
                expression { "${Perform_Deployment}" == "true" }
            }
            steps {
                withCredentials([
                    usernamePassword(credentialsId: "LoginID", passwordVariable: "LoginPassword", usernameVariable: "LoginUser")
                ]) {
                    ansiblePlaybook (
                        playbook: "${WORKSPACE}/ansible-playbook.yml",
                        forks: 5,
                        extraVars: [
                            loginUser: "${LoginUser}",
                            loginPassword: "${LoginPassword}"
                        ]
                    )
                }
            }
        }
    }
}
How can I load "${WORKSPACE}/file-parameter.groovy" in the stage before the when condition? My expectation is something like this:
pipeline {
    agent any
    stages {
        stage("Grep the values") {
            steps {
                load "${WORKSPACE}/file-parameter.groovy"
            }
        }
        stage("Perform Deployment") {
            load "${WORKSPACE}/file-parameter.groovy"
            when {
                expression { "${Perform_Deployment}" == "true" }
            }
            steps {
                withCredentials([
                    usernamePassword(credentialsId: "LoginID", passwordVariable: "LoginPassword", usernameVariable: "LoginUser")
                ]) {
                    ansiblePlaybook (
                        playbook: "${WORKSPACE}/ansible-playbook.yml",
                        forks: 5,
                        extraVars: [
                            loginUser: "${LoginUser}",
                            loginPassword: "${LoginPassword}"
                        ]
                    )
                }
            }
        }
    }
}
The load step returns whatever the Groovy script returned when it was executed, so you need to store that result in a variable.
file-parameter.groovy could either look like this:
return [
    performDeployment: true,
    // other variables
]
or like this:
performDeployment = true
// other variables and methods
return this
In both cases you could use it in your pipeline like so:
stage("Grep the values") {
steps {
script {
fileParams = load("${WORKSPACE}/file-parameter.groovy")
}
}
}
stage("Perform Deploynment) {
when {
expression { fileParams.performDeployment }
}
I am pretty sure there is no need for the string comparison you are doing; you can just use the boolean value instead.
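Note that stage restarts add one wrinkle: when you use "Restart from Stage", the earlier "Grep the values" stage does not re-run, so fileParams would be unset. A hedged workaround (a sketch, assuming the when expression is evaluated with the workspace available and that file-parameter.groovy returns a map) is to re-load the file inside the expression itself:

stage("Perform Deployment") {
    when {
        expression {
            // re-load here so the value survives a "Restart from Stage";
            // assumes the script returns a map containing performDeployment
            load("${WORKSPACE}/file-parameter.groovy").performDeployment
        }
    }
    // steps as before ...
}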
I looked at Call stage from function in Jenkinsfile, which did some of what I wanted, but I had issues adding the code from that answer to my pipeline. I want to scan for files in a folder and generate a stage for each file:
def foo = sh (
    script: 'find ./collections/*.json -printf "%f\n"',
    returnStdout: true
).trim().split("\n")

def parallelStagesFromMap = foo.collectEntries {
    ["Build ${it}" : generateStage(it)]
}

def generateStage(bar) {
    return {
        stage("Build ${bar}") {
            echo "Building for ${bar}"
        }
    }
}
pipeline {
    agent { label 'myExecutor' }
    triggers {
        pollSCM('') // empty cron expression string
        cron('H 6 * * 1-5') // run every weekday at 06:00 https://crontab.guru/#0_6_*_*_1-5
    }
    options {
        timeout(time: 20, unit: "MINUTES")
        buildDiscarder(logRotator(numToKeepStr: '10', artifactNumToKeepStr: '10'))
        disableConcurrentBuilds()
    }
    stages {
        stage('Setup') {
            steps {
                script {
                    foo = ["1", "2", "3", "4", "5"]
                }
            }
        }
        stage('parallel') {
            steps {
                script {
                    parallel parallelStagesFromMap
                    generateStage("skipped") // no invocation, stage is skipped
                    generateStage("nonparallel").call()
                }
            }
        }
    }
    post {
        always {
            echo ' * * * Doing POST actions'
            //...
        }
    }
}
node {
    parallel parallelStagesFromMap
    generateStage("skipped") // no invocation, stage is skipped
    generateStage("nonparallel").call()
}
This is essentially a hybrid between declarative and scripted pipelines, but it can be done.
Essentially, you need to define the variables and execute the lookup in the Setup stage, and then run the stage-generating code in a subsequent stage.
The node block, as you have placed it, is executed after the pipeline, and the variable definitions are evaluated before the code is checked out.
def foo
def parallelStagesFromMap

def generateStage(bar) {
    return {
        stage("Build ${bar}") {
            echo "Building based on file ${bar}"
        }
    }
}

pipeline {
    agent { label 'myExecutor' }
    triggers {
        pollSCM('') // empty cron expression string
        cron('H 6 * * 1-5') // run every weekday at 06:00 https://crontab.guru/#0_6_*_*_1-5
    }
    options {
        timeout(time: 20, unit: "MINUTES")
        buildDiscarder(logRotator(numToKeepStr: '10', artifactNumToKeepStr: '10'))
        disableConcurrentBuilds()
    }
    stages {
        stage('Setup') {
            steps {
                script {
                    // scan collections folder for .json files
                    foo = sh (
                        script: 'find ./collections/*.json -printf "%f\n"',
                        returnStdout: true
                    ).trim().split("\n")
                    parallelStagesFromMap = foo.collectEntries {
                        ["Build ${it}" : generateStage(it)]
                    }
                }
                // do other setup-y stuff...
            }
        }
        stage('Parallel Dynamic') {
            steps {
                script {
                    // https://stackoverflow.com/questions/55340071/call-stage-from-function-in-jenkinsfile
                    parallel parallelStagesFromMap
                    //generateStage("skipped") // no invocation, stage is skipped
                    //generateStage("nonparallel").call()
                }
            }
        }
        stage('static stage') {
            steps {
                echo 'step to the mic ...' // placeholder step
            }
        }
        // more stages ...
    }
    post {
        always {
            echo ' * * * Doing POST actions'
            //...
        }
    }
}
I'm using a Jenkins declarative pipeline and I want to make a stage conditional on an environment variable, which is set according to the existence of a file.
So I just want something like this: if a Dockerfile exists, perform the next stage, else skip it.
To achieve this I tried:
pipeline {
    // ...
    stages {
        stage('Docker') {
            environment {
                IS_DOCKERFILE = fileExists 'Dockerfile'
            }
            when {
                environment name: 'IS_DOCKERFILE', value: true
            }
            stage('Build') {
                // ...
            }
        }
    }
}
Or:
pipeline {
    // ...
    stages {
        stage('Docker') {
            environment {
                IS_DOCKERFILE = fileExists 'Dockerfile'
            }
            when {
                expression {
                    env.IS_DOCKERFILE == true
                }
            }
            stage('Build') {
                // ...
            }
        }
    }
}
In both cases the Dockerfile exists and is in the workspace. I also tried with strings ("true"), but every time the pipeline continues without executing the 'Build' stage.
Any suggestions?
This is because the expression:
IS_DOCKERFILE = fileExists 'Dockerfile'
creates the environment variable with the boolean value stored as a string:
$ set
IS_DOCKERFILE='false'
So the solution would be to use .toBoolean() like this:
environment {
    IS_DOCKERFILE = fileExists 'Dockerfile'
}
stages {
    stage("build docker image") {
        when {
            expression {
                env.IS_DOCKERFILE.toBoolean()
            }
        }
        steps {
            echo 'fileExists'
        }
    }
    stage("build libraries") {
        when {
            expression {
                !env.IS_DOCKERFILE.toBoolean()
            }
        }
        steps {
            echo 'fileNotExists'
        }
    }
}
As @Sergey already posted, the problem is that you're comparing a string to a boolean. See fileExists: Verify if file exists in workspace.
Besides his answer, you can compare directly to a string:
environment {
    IS_DOCKERFILE = fileExists 'Dockerfile'
}
stages {
    stage("build docker image") {
        when {
            expression { IS_DOCKERFILE == 'true' }
        }
        steps {
            echo 'fileExists'
        }
    }
    stage("build libraries") {
        when {
            expression { IS_DOCKERFILE == 'false' }
        }
        steps {
            echo 'fileNotExists'
        }
    }
}
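If the flag is only needed in one place, it may be simpler to call fileExists directly inside the when expression, which avoids the string conversion entirely (a minimal sketch, assuming the stage runs on an agent with the workspace checked out):

stage("build docker image") {
    when {
        // fileExists returns a real boolean here, so no toBoolean() is needed
        expression { fileExists 'Dockerfile' }
    }
    steps {
        echo 'fileExists'
    }
}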
I would like to set a variable that will be available to all stages, a variable that depends on the chosen parameter, something like this:
parameters {
    choice(name: 'Environment', choices: ['Dev', 'Stage'], description: 'Deploy to chosen environment')
}
environment {
    // set the config file depending on params.Environment, e.g.
    //   case params.Environment of
    //     Dev   -> CONFIG_FILE="deploy/file_1.conf"
    //     Stage -> CONFIG_FILE="deploy/other_file.conf"
}
stages {
    stage('check-params') {
        steps {
            sh "echo \"config file: ${CONFIG_FILE}\""
        }
    }
    stage('build-frontend') {
        steps {
            sh "build-frontend.sh ${CONFIG_FILE}"
        }
    }
    stage('deploy-backend') {
        steps {
            sh "deploy-backend.sh ${CONFIG_FILE}"
        }
    }
}
but according to the Pipeline Syntax this is not allowed (I get ERROR: Expected name=value pairs).
Does anyone know how I can achieve this without using a script { ... } block in every stage's steps, as described in this post?
You can have an init stage which will set the correct variable depending on the parameter, something like this:
parameters {
    choice(name: 'Environment', choices: ['Dev', 'Stage'], description: 'Deploy to chosen environment')
}
stages {
    // ----------------------------
    stage('init-env-variables') {
        steps {
            script {
                switch(params.Environment) {
                    case "Dev":
                        env.setProperty('CONFIG_FILE', 'deploy/file_1.conf')
                        break;
                    case "Stage":
                        env.setProperty('CONFIG_FILE', 'deploy/other_file.conf')
                        break;
                }
            }
        }
    }
    // -----------------------
    stage('check-params') {
        steps {
            sh "echo \"config file: ${CONFIG_FILE}\""
        }
    }
    stage('build-frontend') {
        steps {
            sh "build-frontend.sh ${CONFIG_FILE}"
        }
    }
    stage('deploy-backend') {
        steps {
            sh "deploy-backend.sh ${CONFIG_FILE}"
        }
    }
}
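If the mapping stays a simple one-to-one, an alternative worth considering (a hedged sketch, not from the original answer) is to compute the value inline in the environment block, since environment values are Groovy string expressions:

environment {
    // assumption: a GString containing a ternary is evaluated here as usual
    CONFIG_FILE = "${params.Environment == 'Dev' ? 'deploy/file_1.conf' : 'deploy/other_file.conf'}"
}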
I have written a Jenkinsfile script which determines whether documents or code were updated in the current GitHub commit and starts the stages accordingly. If only documents were updated, I don't run the code-testing stage again.
So now, if the previous build failed and the current Git commit only updates documents, the code-testing stage will not run. I therefore want a way to know which stage failed during the last Jenkins build so that, if needed, I can run it in the current build.
For example, if the code-testing stage failed in the previous build, I'll need to run the code-testing stage for this build; otherwise I can just run the document-zipping stage.
As a workaround to get the failed stages from a Jenkins build, a function such as the one below can be used. I could not find a simpler way to do it. Note that this code must run outside the Groovy sandbox, or you would need to whitelist a lot of Jenkins method signatures (which is not recommended). The Blue Ocean plugin also has to be installed.
import io.jenkins.blueocean.rest.impl.pipeline.PipelineNodeGraphVisitor
import io.jenkins.blueocean.rest.impl.pipeline.FlowNodeWrapper
import org.jenkinsci.plugins.workflow.flow.FlowExecution
import org.jenkinsci.plugins.workflow.graph.FlowNode
import org.jenkinsci.plugins.workflow.job.WorkflowRun

@NonCPS
List getFailedStages(WorkflowRun run) {
    List failedStages = []
    FlowExecution exec = run.getExecution()
    PipelineNodeGraphVisitor visitor = new PipelineNodeGraphVisitor(run)
    def flowNodes = visitor.getPipelineNodes()
    for (node in flowNodes) {
        if (node.getType() != FlowNodeWrapper.NodeType.STAGE) { continue }
        String nodeName = node.getDisplayName()
        def nodeResult = node.getStatus().getResult()
        println String.format('{"displayName": "%s", "result": "%s"}',
            nodeName, nodeResult)
        def resultSuccess = io.jenkins.blueocean.rest.model.BlueRun$BlueRunResult.SUCCESS
        if (nodeResult != resultSuccess) {
            failedStages.add(nodeName)
        }
    }
    return failedStages
}

// Ex. Get last build of "test_job"
WorkflowRun run = Jenkins.instance.getItemByFullName("test_job")._getRuns()[0]
failedStages = getFailedStages(run)
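Tying this back to the question, the returned list could then gate the conditional stage (a hedged sketch; the job name, the stage name, and the codeChanged flag are hypothetical placeholders):

// assumption: getFailedStages() is defined as above and the sandbox is off
def failed = getFailedStages(Jenkins.instance.getItemByFullName("my_job")._getRuns()[0])

stage('Code Testing') {
    when {
        // run if the code changed OR this stage failed in the last build
        expression { codeChanged || failed.contains('Code Testing') }
    }
    // ...
}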
I think this could fit: use buildVariables from the previous build, timeout/input in case you need to change something, and try/catch to record each stage's status. Code example:
// yourJob
// with try/catch block
def stageOneStatus;
def stageTwoStatus;
def stageThreeStatus;

pipeline {
    agent any
    stages {
        stage("STAGE 1") {
            // For the initial run, pass FAILURE so every stage executes
            when { expression { params.stageOne == "FAILURE" } }
            steps {
                script {
                    try {
                        // do the work
                    } catch (Exception e) {
                        stageOneStatus = "FAILURE";
                    }
                }
            }
        }
        stage("STAGE 2") {
            when { expression { params.stageTwo == "FAILURE" } }
            steps {
                script {
                    try {
                        // do the work
                    } catch (Exception e) {
                        stageTwoStatus = "FAILURE";
                    }
                }
            }
        }
        stage("STAGE 3") {
            when { expression { params.stageThree == "FAILURE" } }
            steps {
                script {
                    try {
                        // do the work
                    } catch (Exception e) {
                        stageThreeStatus = "FAILURE";
                    }
                }
            }
        }
    }
}
// Checking JOB
def pJob;

pipeline {
    agent any
    stages {
        // Run the job, inheriting variables from its build
        stage("Inheriting job") {
            steps {
                script {
                    pJob = build(job: "yourJob", parameters: [
                        [$class: 'StringParameterValue', name: 'stageOne', value: 'FAILURE'],
                        [$class: 'StringParameterValue', name: 'stageTwo', value: 'FAILURE'],
                        [$class: 'StringParameterValue', name: 'stageThree', value: 'FAILURE']
                    ], propagate: false)
                    if (pJob.result == 'FAILURE') {
                        error("${pJob.projectName} FAILED")
                    }
                }
            }
        }
        // Wait for a fix, then re-run the job
        stage('Wait for fix') {
            steps {
                timeout(time: 24, unit: 'HOURS') {
                    input "Ready to rerun?"
                }
            }
        }
        // Re-run the job after changes in the code
        stage("Re-run Job") {
            steps {
                script {
                    build(
                        job: "yourJob",
                        parameters: [
                            [$class: 'StringParameterValue', name: 'stageOne', value: pJob.buildVariables.stageOneStatus],
                            [$class: 'StringParameterValue', name: 'stageTwo', value: pJob.buildVariables.stageTwoStatus],
                            [$class: 'StringParameterValue', name: 'stageThree', value: pJob.buildVariables.stageThreeStatus]
                        ]
                    )
                }
            }
        }
    }
}
I am trying to parallelize a dynamically defined set of functions as follows:
def somefunc() {
    echo 'echo1'
}

def somefunc2() {
    echo 'echo2'
}

running_set = [
    { somefunc() },
    { somefunc2() }
]

pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                parallel(running_set)
            }
        }
    }
}
And what I end up with is:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 17: No "steps" or "parallel" to execute within stage "Run" @ line 17, column 9.
    stage('Run') {
This happens even though steps are defined within the 'Run' stage. What I would like to achieve is to run a dynamically defined set of functions in parallel.
If you want to use a dynamic parallel block with a declarative pipeline script, you have to apply two changes to your Jenkinsfile:
You have to define running_set as a Map like ["task 1": { somefunc() }, "task 2": { somefunc2() }]; the keys of this map are used as the parallel stage names.
You have to pass running_set to the parallel method inside a script {} block.
Here is what the updated Jenkinsfile could look like:
def somefunc() {
    echo 'echo1'
}

def somefunc2() {
    echo 'echo2'
}

running_set = [
    "task1": {
        somefunc()
    },
    "task2": {
        somefunc2()
    }
]

pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel(running_set)
                }
            }
        }
    }
}
In the Blue Ocean UI, each entry of the map shows up as its own parallel branch inside the Run stage.
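One practical note beyond the original answer: because the pipeline uses agent none, branch bodies that call node-bound steps such as sh have to allocate their own executor, for example (a hedged sketch):

running_set = [
    "task1": { node { sh 'echo from task1' } },
    "task2": { node { sh 'echo from task2' } }
]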
It is not obvious, but Szymon's approach boils down to something very straightforward:
pipeline {
    agent none
    stages {
        stage('Run') {
            steps {
                script {
                    parallel([
                        'parallelTask1_Name': {
                            // any code you like
                        },
                        'parallelTask2_Name': {
                            // any other code you like
                        },
                        // ... etc
                    ])
                }
            }
        }
    }
}