In the first stage, I have a shell script that checks a directory on the remote server, and the result should be passed on to the next stage. I have tried the following, but it seems the variable is not readable in the execute stage. Is there a proper way to do this?
pipeline {
    agent any
    stages {
        stage("validate") {
            steps {
                sh '''
                    dir_path="/home/servicenamedir"
                    ssh username@host bash -c "'
                    if [ -d "$dir" ]
                    then
                        checkdir="true"
                    else
                        checkdir="false"
                    fi
                    '"
                '''
            }
        }
        stage("execute") {
            steps {
                sh '''
                    if [ "$checkdir" == "true" ]
                    then
                        echo "directory already exist, please double check";
                        exit;
                    elif [ "$checkdir" == "false" ]
                    then
                        echo "execute ./install-service.sh"
                    fi
                '''
            }
        }
    }
}
Create a variable:
def var
Use the option returnStdout: true and parse the output:
def var = sh(script: "ls -la", returnStdout: true).split("\n")
Use var in the 'execute' stage:
if (var[0] == "true") { ... } else { ... }
https://www.jenkins.io/doc/pipeline/steps/workflow-durable-task-step/
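Applied to the pipeline in the question, a minimal sketch could look like the following (the ssh target and directory path are the question's placeholders, and CHECKDIR is just an illustrative variable name): the validate stage captures the remote check's output with returnStdout inside a script block and stores it in env.CHECKDIR, which later sh steps see as an ordinary environment variable.
pipeline {
    agent any
    stages {
        stage("validate") {
            steps {
                script {
                    // capture the remote check's stdout and keep it for later stages
                    env.CHECKDIR = sh(
                        script: 'ssh username@host \'[ -d /home/servicenamedir ] && echo true || echo false\'',
                        returnStdout: true
                    ).trim()
                }
            }
        }
        stage("execute") {
            steps {
                // env.CHECKDIR is exported to the shell, so $CHECKDIR is visible here
                sh '''
                    if [ "$CHECKDIR" = "true" ]; then
                        echo "directory already exists, please double check"
                        exit 1
                    else
                        echo "execute ./install-service.sh"
                    fi
                '''
            }
        }
    }
}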
I came here from this post Defining a variable in shell script portion of Jenkins Pipeline
My situation is the following: I have a pipeline that updates some files and generates a PR in my repo if the generated files have changed (they change every couple of weeks or less).
At the end of my pipeline I have a post action to send the result by email to our teams connector.
I wanted to know if I could somehow generate a variable and include that variable in my email.
It looks something like this, but of course it does not work.
#!groovy
String WasThereAnUpdate = '';
pipeline {
    agent any
    environment {
        GRADLE_OPTS = '-Dorg.gradle.java.home=$JAVA11_HOME'
    }
    stages {
        stage('File Update') {
            steps {
                sh './gradlew updateFiles -P updateEnabled'
            }
        }
        stage('Create PR') {
            steps {
                withCredentials(...) {
                    sh '''
                        if [ -n \"$(git status --porcelain)\" ]; then
                            WasThereAnUpdate=\"With Updates\"
                            ...
                        else
                            WasThereAnUpdate=\"Without updates\"
                        fi
                    '''
                }
            }
        }
    }
    post {
        success {
            office365ConnectorSend(
                message: "Scheduler finished: " + WasThereAnUpdate,
                status: 'Success',
                color: '#1A5D1C',
                webhookUrl: 'https://outlook.office.com/webhook/1234'
            )
        }
    }
}
I've tried referencing my variable in different ways (${}, etc.), but I'm pretty sure the assignment is not working.
I know I could probably do it with a script block, but I'm not sure how I would put a script block inside the sh step itself, or whether that is even possible.
Thanks to the response from MaratC (https://stackoverflow.com/a/64572833/5685482) and this documentation, I'll do it something like this:
#!groovy
def date = new Date()
String newBranchName = 'protoUpdate_'+date.getTime()
pipeline {
    agent any
    stages {
        stage('ensure a diff') {
            steps {
                sh 'touch oneFile.txt'
            }
        }
        stage('AFTER') {
            steps {
                script {
                    env.STATUS2 = sh(script:'git status --porcelain', returnStdout: true).trim()
                }
            }
        }
    }
    post {
        success {
            office365ConnectorSend(
                message: "test ${env.STATUS2}",
                status: 'Success',
                color: '#1A5D1C',
                webhookUrl: 'https://outlook.office.com/webhook/1234'
            )
        }
    }
}
In your code
sh '''
    if [ -n \"$(git status --porcelain)\" ]; then
        WasThereAnUpdate=\"With Updates\"
        ...
    else
        WasThereAnUpdate=\"Without updates\"
    fi
'''
Your code creates an sh session (most likely bash). That session inherits the environment variables from the process that started it (Jenkins). Once it runs git status, it sets a bash variable WasThereAnUpdate (which is a different variable from the similarly named Groovy variable).
This bash variable is what gets updated in your code.
Once your sh session ends, the bash process is destroyed, and all of its variables are destroyed with it.
This whole process has no influence whatsoever on the Groovy variable named WasThereAnUpdate, which just stays what it was before.
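Following that reasoning, one possible fix (a minimal sketch reusing the question's variable name; keeping the decision logic in Groovy rather than bash is my assumption) is to bring the shell output back into Groovy with returnStdout and set the Groovy variable inside a script block:
stage('Create PR') {
    steps {
        script {
            // run the check in the shell, but return its output to Groovy
            def porcelain = sh(script: 'git status --porcelain', returnStdout: true).trim()
            // set the Groovy variable on the Groovy side instead of inside bash
            WasThereAnUpdate = porcelain ? 'With Updates' : 'Without updates'
        }
    }
}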
I have defined a global variable in a Jenkins pipeline:
def BUILDNRO = '0'
pipeline { ...
Then I manipulate the variable with a shell script to allow builds to run in parallel, using the job build number as an identifier so we don't mix up different Docker swarms.
stage('Handle BUILD_NUMBER') {
    steps {
        script {
            BUILDNRO = sh( script: '''#!/bin/bash
                Build=`echo ${BUILD_NUMBER} | grep -o '..$'`
                # Check if BUILD first character is 0
                if [[ ${Build:0:1} == "0" ]]; then
                    # replace BUILD first character from 0 to 5
                    Build=`echo $Build | sed s/./5/1`
                fi
                echo $Build
            ''',returnStdout: true).trim()
        }
    }
}
I get the value out of the previous stage and then try to use the global variable in the next stage:
stage('DOCKER: Init docker swarm') {
    steps {
        echo "BUILDNRO is: ${BUILDNRO}" --> Value is here.
        sh '''#!/bin/bash
            echo Buildnro is: ${BUILDNRO} --> This is empty.
            ...
        '''
    }
}
This leaves the global variable empty. Why? In the previous stage it had a value.
EDIT 1.
Modified code blocks to reflect current status.
I managed to figure it out. Here is the solution.
BUILDNRO is a Groovy variable, and to use it as a bash variable it has to be passed in using withEnv. BUILD_NUMBER in the first stage is available to bash as an environment variable, hence it can be used directly in the first stage's script.
def BUILDNRO = '0'
pipeline {
    ....
    stages {
        stage('Handle BUILD_NUMBER') {
            steps {
                script {
                    BUILDNRO = sh( script: '''#!/bin/bash
                        Build=`echo ${BUILD_NUMBER} | grep -o '..$'`
                    ''',returnStdout: true).trim()
                }
            }
        }
        stage('DOCKER: Init docker swarm') {
            steps {
                dir("prose_env/prose_api_dev_env") {
                    withEnv(["MYNRO=${BUILDNRO}"]) {
                        sh(returnStdout: false, script: '''#!/bin/bash
                            echo Buildnro is: ${MYNRO}
                        '''.stripIndent())
                    }
                }
            }
        }
    }
}
If you use single quotes (''') for the sh step, Jenkins treats every variable as a bash variable. The solution is to use double quotes ("""), but then any bash variable you define has to be escaped. Below is an example covering your use case, with an escaped bash variable:
pipeline {
    agent any
    stages {
        stage('Handle BUILD_NUMBER') {
            steps {
                script {
                    BUILDNRO = sh(script: 'pwd', returnStdout: true).trim()
                    echo "BUILDNRO is: ${BUILDNRO}"
                }
            }
        }
        stage('DOCKER: Init docker swarm') {
            steps {
                sh """#!/bin/bash
                    echo Buildnro is: ${BUILDNRO}
                    variable=world
                    echo "hello \${variable}"
                """
            }
        }
    }
}
output of the second stage:
Buildnro is: /var/lib/jenkins/workspace/stack1
hello world
I am attempting to write a scripted Jenkinsfile using the Groovy DSL that will have parallel steps within a set of stages.
Here is my jenkinsfile:
node {
    stage('Build') {
        sh 'echo "Build stage"'
    }
    stage('API Integration Tests') {
        parallel Database1APIIntegrationTest: {
            try {
                sh 'echo "Build Database1APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }, Database2APIIntegrationTest: {
            try {
                sh 'echo "Build Database2APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }, Database3APIIntegrationTest: {
            try {
                sh 'echo "Build Database3APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }
    }
    stage('System Tests') {
        parallel Database1APIIntegrationTest: {
            try {
                sh 'echo "Build Database1APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }, Database2APIIntegrationTest: {
            try {
                sh 'echo "Build Database2APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }, Database3APIIntegrationTest: {
            try {
                sh 'echo "Build Database3APIIntegrationTest parallel stage"'
            }
            finally {
                sh 'echo "Finished this stage"'
            }
        }
    }
}
I want to have 3 stages: Build; Integration Tests and System Tests.
Within the two test stages, I want to have 3 sets of the tests executed in parallel, each one against a different database.
I have 3 available executors: one on the master and 2 agents, and I want each parallel step to run on any available executor.
What I've noticed is that after running my pipeline, I only see the 3 stages, each marked out as green. I don't want to have to view the logs for that stage to determine whether any of the parallel steps within that stage were successful/unstable/failed.
I want to be seeing the 3 steps within my test stages - marked as either green, yellow or red (Success, unstable or failed).
I've considered expanding the tests out into their own stages, but have realised that parallel stages are not supported (does anyone know whether this will ever be supported?), so I cannot do that, since running them serially would make the pipeline take far too long to complete.
Any insight would be much appreciated, thanks
In a Jenkins scripted pipeline, parallel(...) takes a Map describing each stage to be built. You can therefore construct your build stages programmatically up front, a pattern which allows flexible serial/parallel switching.
I've used code similar to this, where prepareBuildStages returns a List of Maps; each List element is executed in sequence, whilst the Map describes the parallel stages at that point.
// main script block
// could use eg. params.parallel build parameter to choose parallel/serial
def runParallel = true
def buildStages

node('master') {
    stage('Initialise') {
        // Set up List<Map<String,Closure>> describing the builds
        buildStages = prepareBuildStages()
        println("Initialised pipeline.")
    }

    for (builds in buildStages) {
        if (runParallel) {
            parallel(builds)
        } else {
            // run serially (nb. Map is unordered! )
            for (build in builds.values()) {
                build.call()
            }
        }
    }

    stage('Finish') {
        println('Build complete.')
    }
}

// Create List of build stages to suit
def prepareBuildStages() {
    def buildStagesList = []
    for (i=1; i<5; i++) {
        def buildParallelMap = [:]
        for (name in [ 'one', 'two', 'three' ] ) {
            def n = "${name} ${i}"
            buildParallelMap.put(n, prepareOneBuildStage(n))
        }
        buildStagesList.add(buildParallelMap)
    }
    return buildStagesList
}

def prepareOneBuildStage(String name) {
    return {
        stage("Build stage:${name}") {
            println("Building ${name}")
            sh(script:'sleep 5', returnStatus:true)
        }
    }
}
The resulting pipeline appears as:
There are certain restrictions on what can be nested within a parallel block; refer to the pipeline documentation for exact details. Unfortunately, much of the reference seems biased towards the declarative pipeline, despite it being rather less flexible than scripted (IMHO).
The pipeline examples page was the most helpful.
Here's a simple example without loops or functions, based on @Ed Randall's post:
node('docker') {
    stage('unit test') {
        parallel([
            hello: {
                echo "hello"
            },
            world: {
                echo "world"
            }
        ])
    }
    stage('build') {
        def stages = [:]
        stages["mac"] = {
            echo "build for mac"
        }
        stages["linux"] = {
            echo "build for linux"
        }
        parallel(stages)
    }
}
...which yields this:
Note that the values of the Map don't need to be stages. You can give the steps directly.
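For instance, both of the following map values are valid parallel branches (a small sketch; the branch names are arbitrary): the first is a bare closure of steps, the second wraps its steps in a stage so it gets its own entry in the stage view.
parallel(
    // plain steps, no stage wrapper
    'lint': {
        sh 'echo "run linter"'
    },
    // steps wrapped in a stage
    'unit': {
        stage('Unit tests') {
            sh 'echo "run unit tests"'
        }
    }
)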
Here is an example from the Jenkins docs:
Parallel execution
The example in the section above runs tests across two different platforms in a linear series. In practice, if the make check execution takes 30 minutes to complete, the "Test" stage would now take 60 minutes to complete!
Fortunately, Pipeline has built-in functionality for executing portions of Scripted Pipeline in parallel, implemented in the aptly named parallel step.
Refactoring the example above to use the parallel step:
// Jenkinsfile (Scripted Pipeline)
stage('Build') {
    /* .. snip .. */
}
stage('Test') {
    parallel linux: {
        node('linux') {
            checkout scm
            try {
                unstash 'app'
                sh 'make check'
            }
            finally {
                junit '**/target/*.xml'
            }
        }
    },
    windows: {
        node('windows') {
            /* .. snip .. */
        }
    }
}
To simplify @Ed Randall's answer here:
Remember, this is a scripted Jenkinsfile (not declarative).
stage("Some Stage") {
// Stuff ...
}
stage("Parallel Work Stage") {
// Prealocate dict/map of branchstages
def branchedStages = [:]
// Loop through all parallel branched stage names
for (STAGE_NAME in ["Branch_1", "Branch_2", "Branch_3"]) {
// Define and add to stages dict/map of parallel branch stages
branchedStages["${STAGE_NAME}"] = {
stage("Parallel Branch Stage: ${STAGE_NAME}") {
// Parallel stage work here
sh "sleep 10"
}
}
}
// Execute the stages in parallel
parallel branchedStages
}
stage("Some Other Stage") {
// Other stuff ...
}
Please pay attention to the curly braces.
This produces the following result (with the Blue Ocean Jenkins plugin):
I was also trying a similar sort of approach to execute parallel stages and display all of them in the stage view. You should write a stage inside the parallel step, as shown in the following code block.
// Jenkinsfile (Scripted Pipeline)
stage('Build') {
    /* .. Your code/scripts .. */
}
stage('Test') {
    parallel 'linux': {
        stage('Linux') {
            /* .. Your code/scripts .. */
        }
    }, 'windows': {
        stage('Windows') {
            /* .. Your code/scripts .. */
        }
    }
}
The above example with a for loop is wrong, as the variable STAGE_NAME will be overwritten every time (each closure ends up referring to the same variable); I had the same problem as Wei Huang.
Found the solution here:
https://www.convalesco.org/notes/2020/05/26/parallel-stages-in-jenkins-scripted-pipelines.html
def branchedStages = [:]
def STAGE_NAMES = ["Branch_1", "Branch_2", "Branch_3"]
STAGE_NAMES.each { STAGE_NAME ->
    // Define and add to stages dict/map of parallel branch stages
    branchedStages["${STAGE_NAME}"] = {
        stage("Parallel Branch Stage: ${STAGE_NAME}") {
            // Parallel stage work here
            sh "sleep 10"
        }
    }
}
parallel branchedStages
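If you would rather keep the for loop, another option (a sketch of the same idea, not taken from the linked article) is to copy the loop variable into a def local inside the loop body, so each closure captures its own copy instead of the shared STAGE_NAME:
def branchedStages = [:]
for (name in ["Branch_1", "Branch_2", "Branch_3"]) {
    def stageName = name // fresh local per iteration, safely captured by the closure
    branchedStages[stageName] = {
        stage("Parallel Branch Stage: ${stageName}") {
            sh "sleep 10"
        }
    }
}
parallel branchedStages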
I have used the approach below, where the three stages run in parallel.
def testCases() {
    stage('Test Cases') {
        def stages = [:] // empty map of parallel branches
        stages['Unit Testing'] = {
            sh "echo Unit Testing completed"
        }
        stages['Integration Testing'] = {
            sh "echo Integration Testing completed"
        }
        stages['Function Testing'] = {
            sh "echo Function Testing completed"
        }
        parallel(stages) // run the branches in parallel
    }
}
I have used stage {} inside parallel blocks several times, and each stage shows up in the Stage View. The parent stage that contains parallel doesn't include the timing for all the parallel stages, but each parallel stage appears in the Stage View.
In Blue Ocean, the parallel branches appear separately rather than as a single stage. If there is a parent stage, it shows as the parent of the parallel stages.
If you don't have the same experience, maybe a plugin upgrade is due.
I am trying to start a Docker image from Jenkins.
(Not getting Docker to run from within Jenkins)
I think I'm really close, but this part still has some issues.
Can please anyone help?
stage('build Dockerimage 1') {
    steps {
        apitestimage = docker.build('apitestimage', '--no-cache=true dockerbuild')
    }
}
stage('start Dockerimage and Tests 2') {
    steps {
        apitestimage.inside {
            sh 'cd testing && ctest'
        }
    }
}
Jenkins reports:
WorkflowScript: 21: Expected a step @ line 21, column 15. apitestimage = docker.build('apitestimage', '--no-cache=true dockerbuild')
and also
WorkflowScript: 27: Method calls on objects not allowed outside "script" blocks. @ line 27, column 13. apitestimage.inside {
Your error shows that you're missing a script block in your steps. You need a script block whenever you call Pipeline DSL methods such as docker.build inside a declarative steps block.
stage('build Dockerimage 1') {
    steps {
        script {
            // assign without 'def' so apitestimage stays visible in later stages
            apitestimage = docker.build('apitestimage', '--no-cache=true dockerbuild')
        }
    }
}
stage('start Dockerimage and Tests 2') {
    steps {
        script {
            apitestimage.inside {
                sh 'cd testing && ctest'
            }
        }
    }
}
References:
https://jenkins.io/doc/book/pipeline/syntax/#script
In my case, I had declared a variable named stages: def stages = [buildStage].
Changing this variable to another name fixed my issue.