I have noticed that the Jenkins pipeline file (Jenkinsfile) has two syntaxes:
Declarative
Scripted
I have made a Declarative script that specifies the node to run my task on. However, I don't know how to rewrite it in the Scripted syntax.
My Declarative script:
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'my-label' }
            steps {
                echo 'Building..'
                sh '''
                '''
            }
        }
        stage('Test') {
            agent { label 'my-label' }
            steps {
                echo 'Testing..'
                sh '''
                '''
            }
        }
        stage('Deploy') {
            agent { label 'my-label' }
            steps {
                echo 'Deploying....'
                sh '''
                '''
            }
        }
    }
}
I have tried it this way:
node('my-label') {
    stage 'SCM'
    git xxxx
    stage 'Build'
    sh ''' '''
}
But it seems Jenkins cannot find a node to run it on.
How about this simple example?
stage("one") {
node("linux") {
echo "One"
}
}
stage("two") {
node("linux") {
echo "two"
}
}
stage("three") {
node("linux") {
echo "three"
}
}
Or the answer below: this way you are guaranteed that the stages run on the same node when there are multiple nodes with the same label, and that the run is not interrupted by another job.
The example above releases the node after every stage; the example below holds the node for all three stages.
node("linux") {
stage("one") {
echo "One"
}
stage("two") {
echo "two"
}
stage("three") {
echo "three"
}
}
Related
I have a Jenkins pipeline that runs several stages in parallel. Some of those stages produce intermediate build files that I'd like to reuse in a later step:
pipeline {
    stages {
        stage("Parallel build") {
            parallel {
                stage("A") { /* produces file A */ }
                stage("B") { /* produces file B */ }
                stage("C") { /* produces nothing relevant */ }
            }
        }
        stage("Combine") {
            /* runs a task that needs files A and B */
        }
    }
}
As far as I've been able to tell, Jenkins will randomly give me the workspace from one of the parallel stages. So my Combine step will have file A, B or neither, but not both.
How do I resolve this issue?
There are a few ways to do this.
You can copy the files to a directory whose path you know (for example, a subdirectory named after the build ID, so it is unique) and access them from there, as sketched below.
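A minimal sketch of that first approach, assuming both stages run on agents that share a filesystem; the /tmp/shared path and the A.txt file name are made up for illustration:
stage("A") {
    steps {
        sh '''
            # hypothetical shared directory, keyed by build ID so it is unique
            mkdir -p "/tmp/shared/${BUILD_ID}"
            cp A.txt "/tmp/shared/${BUILD_ID}/"
        '''
    }
}
stage("Combine") {
    steps {
        // copy the file back into this stage's workspace
        sh 'cp "/tmp/shared/${BUILD_ID}/A.txt" .'
    }
}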
You can stash the files in the initial stages, then unstash them and use them in the later stages. Here is the documentation.
stash includes: 'something/A.txt', name: 'filea'
unstash 'filea'
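Applied to the pipeline above, a minimal sketch; the file names, stash names, and echo commands are assumptions:
pipeline {
    agent none
    stages {
        stage("Parallel build") {
            parallel {
                stage("A") {
                    agent any
                    steps {
                        sh 'echo "A" > A.txt'                   // produces file A
                        stash includes: 'A.txt', name: 'filea'  // save it for later stages
                    }
                }
                stage("B") {
                    agent any
                    steps {
                        sh 'echo "B" > B.txt'                   // produces file B
                        stash includes: 'B.txt', name: 'fileb'
                    }
                }
            }
        }
        stage("Combine") {
            agent any
            steps {
                unstash 'filea'   // restores A.txt into this stage's workspace
                unstash 'fileb'   // restores B.txt into this stage's workspace
                sh 'ls A.txt B.txt'
            }
        }
    }
}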
Save the workspace location to a global variable and use it in the stages.
pipeline {
    agent any
    stages {
        stage('Run Tests') {
            parallel {
                stage('Stage A') {
                    steps {
                        script {
                            sh '''
                                echo "STAGE AAAA"
                                pwd
                                echo "STAGEA" > a.txt
                            '''
                            stageAWS = "$WORKSPACE"
                        }
                    }
                }
                stage('Stage B') {
                    steps {
                        script {
                            sh '''
                                echo "STAGE B"
                                pwd
                            '''
                            stageBWS = "$WORKSPACE"
                        }
                    }
                }
            }
        }
        stage('Stage C') {
            steps {
                script {
                    echo "$stageAWS"
                    echo "$stageBWS"
                }
            }
        }
    }
}
pipeline {
    agent any
    stages {
        stage('a') {
            steps {
                sh """
                    #!/bin/bash
                    a="test"
                """
            }
        }
        stage('b') {
            steps {
                sh """
                    #!/bin/bash
                    echo ${a}   # I want this to print the value set in stage 'a' ("test")
                """
            }
        }
    }
}
Hi, I want to use a shell variable from the sh step of stage 'a' in the sh step of stage 'b', but I don't want to use the readFile function.
Please tell me how to configure the Jenkins pipeline.
Thank you.
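A minimal sketch of one common way to do this, assuming the value can be captured from the command's standard output: run the command with returnStdout and store the result in env, which every later stage and sh step can read. The variable name A is just an illustration:
pipeline {
    agent any
    stages {
        stage('a') {
            steps {
                script {
                    // capture the command's output instead of exporting a shell variable
                    env.A = sh(script: 'echo "test"', returnStdout: true).trim()
                }
            }
        }
        stage('b') {
            steps {
                sh 'echo "${A}"'   // the shell sees A as an environment variable and prints "test"
            }
        }
    }
}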
I want to execute multiple jobs from a single pipeline in parallel, using declarative syntax. Is this possible? I know we can make a declarative parallel pipeline using the parallel directive.
pipeline {
    agent any
    stages {
        stage('Test1') {
            steps {
                sh 'pip install -r requirements.txt'
            }
        }
        stage('Test2') {
            steps {
                echo 'Stage 2'
                sh 'behave -f allure_behave.formatter:AllureFormatter -o allure-results features/scenarios/**/*.feature'
            }
        }
        stage('Test3') {
            steps {
                script {
                    allure([
                        includeProperties: false,
                        jdk: '',
                        properties: [],
                        reportBuildPolicy: 'ALWAYS',
                        results: [[path: 'allure-results']]
                    ])
                }
            }
        }
    }
}
The image below shows the flow that I want. Any approach for how to do it?
// Pipeline project: SO-69680107-1-parallel-downstream-jobs-matrix
pipeline {
    agent any
    stages {
        stage('Clean Workspace') {
            steps {
                cleanWs()
            }
        }
        stage('Job matrix') {
            matrix {
                axes {
                    axis {
                        name 'job'
                        values 'SO-69680107-2', 'SO-69680107-3', 'SO-69680107-k' // , ...
                    }
                }
                stages {
                    stage('Run job') {
                        steps {
                            build "$job"
                            copyFiles( "$WORKSPACE\\..\\$job", "$WORKSPACE" )
                        }
                    } // stage 'Run job'
                }
            } // matrix
        } // stage 'Job matrix'
        stage('List upstream workspace') {
            steps {
                bat "#dir /b \"$WORKSPACE\""
            }
        }
    } // stages
}

def copyFiles( downstreamWorkspace, upstreamWorkspace ) {
    dir("$downstreamWorkspace") {
        bat """
            #set prompt=\$g\$s
            #echo Begin: %time%
            dir /b
            xcopy /f *.* \"$upstreamWorkspace\\\"
            #echo End: %time%
        """
    }
}
Template for downstream projects SO-69680107-2, SO-69680107-3, SO-69680107-k:
// Pipeline project: SO-69680107-X
pipeline {
    agent any
    stages {
        stage('Stage X') {
            steps {
                sh 'set +x; echo "Step X" | tee SO-69680107-X.log; date; sleep 3; date'
            }
        }
    }
}
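Matrix cells run in parallel by default; the same fan-out can also be written explicitly with the parallel directive. A minimal sketch using the same placeholder job names as above:
pipeline {
    agent any
    stages {
        stage('Run jobs in parallel') {
            parallel {
                stage('SO-69680107-2') {
                    steps {
                        build 'SO-69680107-2'   // triggers the downstream job and waits for its result
                    }
                }
                stage('SO-69680107-3') {
                    steps {
                        build 'SO-69680107-3'
                    }
                }
                stage('SO-69680107-k') {
                    steps {
                        build 'SO-69680107-k'
                    }
                }
            }
        }
    }
}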
I'd like to set an env variable in one Stage and have it available in all subsequent Stages and Steps. Something like this:
pipeline {
    stages {
        stage('One') {
            steps {
                sh 'export MY_NAME=$(whoami)'
            }
        }
        stage('Two') {
            steps {
                sh 'echo "I am ${MY_NAME}"'
            }
        }
        stage('Three') {
            steps {
                sh 'echo "I am ${MY_NAME}"'
            }
        }
    }
}
Those sh steps seem to be independent of each other, and the exported variable is not preserved even for the next Step, let alone the next Stage.
One way I can think of is to write the variable to a shell file, like echo "FOLDER_CONTENT=$(ls -lh)", and then source it in a later Step; but then I'd have to do the sourcing in every subsequent Step, which is suboptimal.
Is there a better way to do that?
I was finally able to achieve it like so:
pipeline {
    stages {
        stage('One') {
            steps {
                script {
                    env.MY_NAME = sh(
                        script: 'whoami',
                        returnStdout: true
                    ).trim()
                }
            }
        }
        stage('Two') {
            steps {
                echo "I am ${MY_NAME}"
            }
        }
        stage('Three') {
            steps {
                sh 'echo "I am ${MY_NAME}"'
            }
        }
    }
}
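For reference, the same technique works in scripted syntax too; a minimal sketch. Anything assigned to env.* becomes part of the build environment, which is why later echo and sh steps can both read it:
node {
    stage('One') {
        // the assignment is visible in all later stages and sh steps
        env.MY_NAME = sh(script: 'whoami', returnStdout: true).trim()
    }
    stage('Two') {
        echo "I am ${env.MY_NAME}"
    }
    stage('Three') {
        sh 'echo "I am ${MY_NAME}"'
    }
}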
In my scripted pipeline I would like to set the global timestamps and ansiColor options.
The scripted pipeline below does not work. How can I add these two options in a scripted pipeline?
Declarative Pipeline
pipeline {
    agent none
    options {
        timestamps()
        ansiColor('xterm')
    }
    stages {
        stage('Checkout') {
            agent { label 'linux' }
            steps {
                echo "test"
            }
        }
    }
}
Scripted Pipeline
node('linux') {
    options {
        timestamps()
        ansiColor('xterm')
    }
    stage('Pre Build Setup') {
        task('Display env') {
            echo "test"
        }
    }
}
In the case of a scripted pipeline, all you have to do is wrap your script in the timestamps and ansiColor('xterm') steps, as shown in the example below:
node {
    timestamps {
        ansiColor("xterm") {
            stage("A") {
                echo 'This is stage A'
                sh 'printf "\\e[31mHello World\\e[0m\\n"'
                sh "sleep 3s"
            }
            stage("B") {
                echo "This is stage B"
            }
        }
    }
}
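Note that, as far as I know, the timestamps step comes from the Timestamper plugin and the ansiColor step from the AnsiColor plugin, so both plugins need to be installed for this to work.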