I want to execute multiple jobs in parallel from a single pipeline using declarative syntax. Is this possible? I know we can make a declarative parallel pipeline using the "parallel" directive.
pipeline {
    agent any
    stages {
        // in declarative syntax, parallel must be nested inside a stage
        stage('Tests') {
            parallel {
                stage('Test1') {
                    steps {
                        sh 'pip install -r requirements.txt'
                    }
                }
                stage('Test2') {
                    steps {
                        echo 'Stage 2'
                        sh 'behave -f allure_behave.formatter:AllureFormatter -o allure-results features/scenarios/**/*.feature'
                    }
                }
                stage('Test3') {
                    steps {
                        script {
                            allure([
                                includeProperties: false,
                                jdk: '',
                                properties: [],
                                reportBuildPolicy: 'ALWAYS',
                                results: [[path: 'allure-results']]
                            ])
                        }
                    }
                }
            }
        }
    }
}
The image below shows the flow I want. Is there any approach for how to do it?
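One possible approach, shown below: run a matrix stage in the upstream pipeline that triggers each downstream job in parallel and then copies the files from each downstream workspace back into the upstream workspace.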
// Pipeline project: SO-69680107-1-parallel-downstream-jobs-matrix
pipeline {
    agent any
    stages {
        stage('Clean Workspace') {
            steps {
                cleanWs()
            }
        }
        stage('Job matrix') {
            matrix {
                axes {
                    axis {
                        name 'job'
                        values 'SO-69680107-2', 'SO-69680107-3', 'SO-69680107-k' // , ...
                    }
                }
                stages {
                    stage('Run job') {
                        steps {
                            build "$job"
                            copyFiles( "$WORKSPACE\\..\\$job", "$WORKSPACE")
                        }
                    } // stage 'Run job'
                }
            } // matrix
        } // stage 'Job matrix'
        stage('List upstream workspace') {
            steps {
                bat "@dir /b \"$WORKSPACE\""
            }
        }
    } // stages
}

def copyFiles( downstreamWorkspace, upstreamWorkspace ) {
    dir("$downstreamWorkspace") {
        bat """
            @set prompt=\$g\$s
            @echo Begin: %time%
            dir /b
            xcopy /f *.* \"$upstreamWorkspace\\\"
            @echo End: %time%
        """
    }
}
Template for downstream projects SO-69680107-2, SO-69680107-3, SO-69680107-k:
// Pipeline project: SO-69680107-X
pipeline {
    agent any
    stages {
        stage('Stage X') {
            steps {
                sh 'set +x; echo "Step X" | tee SO-69680107-X.log; date; sleep 3; date'
            }
        }
    }
}
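If you prefer to stay with plain declarative parallel stages instead of the matrix directive, a minimal sketch of the same idea (using the example downstream job names from above) could look like this:
pipeline {
    agent any
    stages {
        stage('Run downstream jobs') {
            parallel {
                stage('SO-69680107-2') {
                    steps {
                        build 'SO-69680107-2'
                    }
                }
                stage('SO-69680107-3') {
                    steps {
                        build 'SO-69680107-3'
                    }
                }
                stage('SO-69680107-k') {
                    steps {
                        build 'SO-69680107-k'
                    }
                }
            }
        }
    }
}
Each parallel branch just triggers one downstream job with the build step; you could call copyFiles in the same way afterwards if you need the downstream files in the upstream workspace.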
Related
I have two stages in a Jenkins pipeline, and it is expecting me to add steps in the Execute Jmeter stage.
Could someone help to resolve this?
I got the error below:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 339: Expected one of "steps", "stages", or "parallel" for stage "Execute Jmeter" # line 339, column 9.
stage('Execute Jmeter') {
Below is the code snippet of the Jenkins pipeline:
pipeline {
    agent {
        label 'qatest'
    }
    tools {
        maven 'Maven'
        jdk 'JDK8'
    }
    environment {
        VIRTUOSO_URL = 'qa.myapp.com'
    }
    stages {
        stage('BUILD') {
            steps {
                sh 'mvn clean verify'
            }
        }
        stage('Execute Jmeter') {
            post {
                always {
                    dir("scenarioLoadTests/target/jmeter/results/") {
                        sh 'pwd'
                        sh 'mv *myapp_UserLoginAndLogout.csv UserLoginAndLogout.csv '
                        sh 'mv *myapp_myappPortfolioScenario.csv myappPortfolioScenario.csv '
                        sh 'mv *myapp_myappDesign.csv myappDesign.csv '
                        perfReport '*.csv'
                    }
                }
            }
        }
    }
}
Shouldn't you use script blocks like:
pipeline {
    agent {
        label 'qatest'
    }
    tools {
        maven 'Maven'
        jdk 'JDK8'
    }
    environment {
        VIRTUOSO_URL = 'qa.myapp.com'
    }
    stages {
        stage('BUILD') {
            steps {
                script { // <----------------- here
                    sh 'mvn clean verify'
                }
            }
        }
        stage('Execute Jmeter') {
            post {
                always {
                    dir('scenarioLoadTests/target/jmeter/results/') {
                        script { // <-------------------------------------- and here
                            sh 'pwd'
                            sh 'mv *myapp_UserLoginAndLogout.csv UserLoginAndLogout.csv '
                            sh 'mv *myapp_myappPortfolioScenario.csv myappPortfolioScenario.csv '
                            sh 'mv *myapp_myappDesign.csv myappDesign.csv '
                            perfReport '*.csv'
                        }
                    }
                }
            }
        }
    }
}
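Note that the compilation error itself comes from the 'Execute Jmeter' stage containing only a post block: declarative syntax requires every stage to have steps, stages, or parallel. So, in addition to the script blocks, that stage also needs a steps block, for example (the Maven call below is only a guess at how the JMeter tests are launched, and the post block is abbreviated; adapt both to your project):
stage('Execute Jmeter') {
    steps {
        // hypothetical JMeter invocation; replace with whatever actually runs your tests
        sh 'mvn -pl scenarioLoadTests jmeter:jmeter'
    }
    post {
        always {
            dir('scenarioLoadTests/target/jmeter/results/') {
                perfReport '*.csv'
            }
        }
    }
}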
More information:
Pipeline Maven Integration
Running a JMeter Test via Jenkins Pipeline - A Tutorial
In my scripted pipeline I would like to set the global timestamps and ansiColor options.
The scripted pipeline below is not working. How can we add these two options in a scripted pipeline?
Declarative Pipeline
pipeline {
    agent none
    options {
        timestamps()
        ansiColor('xterm')
    }
    stages {
        stage('Checkout') {
            agent { label 'linux' }
            steps {
                echo "test"
            }
        }
    }
}
Scripted Pipeline
node('linux') {
    options {
        timestamps()
        ansiColor('xterm')
    }
    stage('Pre Build Setup') {
        task('Display env') {
            echo "test"
        }
    }
}
In the case of a scripted pipeline, all you have to do is wrap your script with the timestamps and ansiColor('xterm') steps, as shown in the example below (the options directive is declarative-only syntax and task is not a pipeline step, which is why the snippet above fails):
node {
    timestamps {
        ansiColor("xterm") {
            stage("A") {
                echo 'This is stage A'
                sh 'printf "\\e[31mHello World\\e[0m\\n"'
                sh "sleep 3s"
            }
            stage("B") {
                echo "This is stage B"
            }
        }
    }
}
I have noticed that the Jenkins pipeline file, the Jenkinsfile, has two syntaxes:
Declarative
Scripted
I have made a declarative script work that specifies the node to run my task on. However, I don't know how to convert my script to scripted syntax.
My Declarative Script
pipeline {
    agent none
    stages {
        stage('Build') {
            agent { label 'my-label' }
            steps {
                echo 'Building..'
                sh '''
                '''
            }
        }
        stage('Test') {
            agent { label 'my-label' }
            steps {
                echo 'Testing..'
                sh '''
                '''
            }
        }
        stage('Deploy') {
            agent { label 'my-label' }
            steps {
                echo 'Deploying....'
                sh '''
                '''
            }
        }
    }
}
I have tried to use it this way:
node('my-label') {
    stage 'SCM'
    git xxxx
    stage 'Build'
    sh ''' '''
}
But it seems Jenkins cannot find a node to run it on.
How about this simple example?
stage("one") {
node("linux") {
echo "One"
}
}
stage("two") {
node("linux") {
echo "two"
}
}
stage("three") {
node("linux") {
echo "three"
}
}
Or use the approach below; this way you are guaranteed that the stages run on the same node if there are multiple nodes with the same label, and that they are not interrupted by another job in between.
The above example will release the node after every stage; the below example will hold the node for all three stages.
node("linux") {
stage("one") {
echo "One"
}
stage("two") {
echo "two"
}
stage("three") {
echo "three"
}
}
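Applied to your original pipeline, a scripted version that holds a single 'my-label' node for all three stages could look roughly like this (shell steps left as placeholders, as in your declarative script):
node('my-label') {
    stage('Build') {
        echo 'Building..'
        // your shell commands, e.g. sh 'make build'
    }
    stage('Test') {
        echo 'Testing..'
        // your shell commands
    }
    stage('Deploy') {
        echo 'Deploying....'
        // your shell commands
    }
}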
Here's what I have:
#!/usr/bin/env groovy
pipeline {
    agent none
    stages {
        stage('Checkout SCM') {
            agent { label 'win' && 'apple' && 'rhel' }
            steps {
                echo "Cloning Repository"
                checkout([$class: 'GitSCM',
                    branches: [[name: "*/develop"]],
                    doGenerateSubmoduleConfigurations: false,
                    extensions: [[$class: 'WipeWorkspace']],
                    submoduleCfg: [],
                    userRemoteConfigs: [[credentialsId: 'UNAME', url: 'URL']],
                    browser: [$class: 'BitbucketWeb', repoUrl: 'URL'],
                ])
            }
        }
        stage('Building Win64, Linux764, MacOS') {
            agent { label 'win&&rhel&&apple' }
            steps {
                script {
                    echo '************************'
                    echo '*****BUILDING JOBS******'
                    echo '************************'
                    sh 'python build.py'
                    sh 'cd ion-js && npm run prepublishOnly'
                }
            }
        }
    }
}
However, I get the "There are no nodes with the label ‘win && rhel && apple’" error. Does anyone happen to know how to run a declarative Jenkins pipeline where one of the stages is run on multiple agent labels in parallel?
I want to check out the same git repo on 3 different nodes at the same time. I've tried agent { label 'win' && 'apple' && 'rhel' } and agent { label 'win&&apple&&rhel' }, but it just says it can't find that label.
Here they say you can use ||, and using && should work, but I'm not sure what I'm missing. I could write 3 different checkout stages, but I figured there was a better way.
To add to the answer from Micah:
Instead of repeating the stage with different labels, you can define a function that creates the stage for you; the generated stage is then executed on the agent nodes of your choice.
def agents = ['win64', 'linux64', 'macos']

def generateStage(nodeLabel) {
    return {
        stage("Runs on ${nodeLabel}") {
            node(nodeLabel) {
                script {
                    echo "Running on ${nodeLabel}"
                    echo '************************'
                    echo '*****BUILDING JOBS******'
                    echo '************************'
                    sh 'python build.py'
                    sh 'cd ion-js && npm run prepublishOnly'
                }
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : generateStage(it)]
}

pipeline {
    agent none
    stages {
        stage('non-parallel stage') {
            steps {
                echo 'This stage will be executed first.'
            }
        }
        stage('parallel stage') {
            steps {
                script {
                    parallel parallelStagesMap
                }
            }
        }
    }
}
Since we use the parallel keyword when calling parallelStagesMap, the same stage is executed in parallel on the agent nodes.
ProTip: You can define more steps inside the function that are common to all agents.
If you want to define both the label and the stage name, you can add another argument (for example stageName) and pass it into the generateStage function, as shown below.
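For example, a variant of generateStage that also takes the stage name as an argument (stageName here is just an illustrative parameter name) could look like this:
def generateStage(stageName, nodeLabel) {
    return {
        stage("${stageName} on ${nodeLabel}") {
            node(nodeLabel) {
                echo "Running ${stageName} on ${nodeLabel}"
                sh 'python build.py'
            }
        }
    }
}

def parallelStagesMap = agents.collectEntries {
    ["${it}" : generateStage('Build', it)]
}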
I have tried the same things with no success. (A label expression like 'win&&rhel&&apple' matches only nodes that have all three labels at once, which is why Jenkins reports that no such node exists.) The only solution I'm aware of is to have a parallel block and define the stage multiple times, once for each agent/node/label.
stage('Building Win64, Linux764, MacOS') {
    parallel {
        stage('Win64') {
            agent {
                label 'win-10-x64'
            }
            steps {
                ...
            }
        }
        stage('Linux64') {
            agent {
                label 'linux-x64'
            }
            steps {
                ...
            }
        }
        stage('MacOS') {
            agent {
                label 'macos'
            }
            steps {
                ...
            }
        }
    }
}
I am migrating a job from a multijob project to a Jenkins declarative pipeline job. I am unable to run the parallel steps on multiple executors.
For example, in the pipeline below, I see only one executor being used when I run the pipeline.
I was wondering why only a single executor is used. The idea is that each parallel step would be invoking a make target that builds a Docker image.
pipeline {
    agent none
    stages {
        stage('build libraries') {
            agent { label 'master' }
            steps {
                parallel(
                    "nodejs_lib": {
                        dir(path: 'nodejs_lib') {
                            sh 'sleep 110'
                        }
                    },
                    "python_lib": {
                        dir(path: 'python_lib') {
                            sh 'sleep 100'
                        }
                    }
                )
            }
        }
    }
    options {
        ansiColor('gnome-terminal')
        buildDiscarder(logRotator(artifactDaysToKeepStr: '', artifactNumToKeepStr: '', daysToKeepStr: '', numToKeepStr: '30'))
        timestamps()
    }
}
In your pipeline the whole 'build libraries' stage runs inside a single node allocation (the agent labelled 'master'), so both parallel branches share that one executor; to use multiple executors, each branch needs its own node block. You can try the following way to perform parallel task execution in your pipeline job:
def tasks = [:]

tasks["Task No.1"] = {
    stage("TASK1") {
        node('master') {
            sh '<docker_build_command_here>'
        }
    }
}

tasks["Task No.2"] = {
    stage("TASK2") {
        node('master') {
            sh '<docker_build_command_here>'
        }
    }
}

tasks["Task No.3"] = {
    stage("TASK3") {
        node('remote_node') {
            sh '<docker_build_command_here>'
        }
    }
}

parallel tasks
If you want to execute the parallel tasks on a single node and also want them to share the same workspace, then you can go with the following approach:
node('master') {
    def tasks = [:]
    tasks["Task No.1"] = {
        stage("TASK1") {
            sh '<docker_build_command_here>'
        }
    }
    tasks["Task No.2"] = {
        stage("TASK2") {
            sh '<docker_build_command_here>'
        }
    }
    parallel tasks
}