Run Jenkins jobs in parallel

I have 3 different jobs (Build, Undeploy, and Deploy); I want to execute Build and Undeploy in parallel, and after that Deploy.
From searching I learned that the Build Flow Plugin has been deprecated.
Please suggest a plugin.

You can write a declarative Jenkinsfile in the following format (note that the top-level agent goes before the stages section):
pipeline {
    agent { node { label 'master' } }
    stages {
        stage('Build/Undeploy') {
            parallel {
                stage('Build') {
                    agent { node { label 'Build' } }
                    steps {
                        script {
                            // Call your build script
                        }
                    }
                }
                stage('Undeploy') {
                    agent { node { label 'Undeploy' } }
                    steps {
                        script {
                            // Call your undeploy script
                        }
                    }
                }
            }
        }
        stage('Deploy') {
            agent { node { label 'Deploy' } }
            steps {
                script {
                    // Call your deploy script
                }
            }
        }
    }
}
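Since the question describes three already-existing jobs, an alternative is a scripted pipeline that simply triggers them by name with the build step. A minimal sketch, assuming the jobs are literally named Build, Undeploy, and Deploy as in the question:

```groovy
// Scripted-pipeline sketch: trigger the three existing jobs.
// The job names are assumptions taken from the question text.
node {
    parallel(
        'Build':    { build job: 'Build' },
        'Undeploy': { build job: 'Undeploy' }
    )
    build job: 'Deploy'  // runs only after both parallel branches succeed
}
```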


Parallel execution inside the post step

I am deploying to two different environments in the same pipeline and I want to run the cleanup for both environments in parallel.
As I understand it, parallel does not work inside the post step.
Any suggestions? Example of my code:
post {
    always {
        script {
            cleanup(env1)
            cleanup(env2)
        }
    }
}
def cleanup(env) {
    withEnv(env) {
        sh "./cleanup.py"
    }
}
The parallel keyword can work inside a post condition as long as it is encapsulated in a script block, since the script block is just a fallback to the scripted pipeline, which allows you to run parallel execution wherever you want.
The following should work fine:
post {
    always {
        script {
            def environments = ['env1', 'env2', 'env3']
            parallel environments.collectEntries {
                ["Cleanup ${it}": {
                    cleanup(it)
                }]
            }
        }
    }
}
def cleanup(env) {
    withEnv(env) {
        sh "./cleanup.py"
    }
}
Just don't forget to allocate an agent using the node keyword if the steps in the post section are required to run on a specific agent.
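For example, if cleanup.py must run on a particular machine, the script block can allocate a node explicitly. A minimal sketch, assuming a 'linux' agent label exists on your controller:

```groovy
post {
    always {
        script {
            node('linux') {  // the 'linux' label is an assumption
                parallel(['env1', 'env2'].collectEntries {
                    ["Cleanup ${it}": { sh "./cleanup.py" }]
                })
            }
        }
    }
}
```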
A better idea in my opinion is to clean up right away in each branch, before you possibly lose the node to another job:
parallel {
    stage('env1') {
        agent { node { label "env1" } }
        steps {
            script {
                println "Inside env1"
            }
        }
        post {
            cleanup { script { my_cleanup_func("env1") } }
        }
    }
    stage('env2') {
        agent { node { label "env2" } }
        steps {
            script {
                println "Inside env2"
            }
        }
        post {
            cleanup { script { my_cleanup_func("env2") } }
        }
    }
}
...
def my_cleanup_func(String env) {
    // ...
}

How can I import a shared groovy script into a pipeline from the same Git repo?

I have the following two pipelines in my repository
@Field String ANDROID_EMULATOR = "android-emulator"
pipeline {
    agent { label "android-emulator" }
    stages {
        stage("build") {
            steps {
                gradlew(":build")
            }
        }
    }
}
void gradlew(String tasks) {
    sh "./gradlew $tasks --profile"
}
@Field String ANDROID_EMULATOR = "android-emulator"
pipeline {
    agent none
    stages {
        stage("PR checks") {
            parallel {
                stage("build 1") {
                    agent { label ANDROID_EMULATOR }
                    steps {
                        gradlew(":one:build")
                    }
                }
                stage("build 2") {
                    agent { label ANDROID_EMULATOR }
                    steps {
                        gradlew(":two:build")
                    }
                }
            }
        }
    }
}
void gradlew(String tasks) {
    sh "./gradlew $tasks --profile"
}
As you can see, there is some code duplication between the two - ANDROID_EMULATOR and void gradlew(..).
I would like to move them into their own shared.groovy file:
@Field String ANDROID_EMULATOR = "android-emulator"
void gradlew(String tasks) {
    sh "./gradlew $tasks --profile"
}
And be able to import it into my other pipelines with a single line of code. Gradle allows this to be done with apply('shared.groovy').
Jenkins seems to allow only shared libraries (which are global), and load statements (which need to be loaded as a part of a node, which does not scale well). Does Jenkins lack support for this basic style of code sharing here?
You can use the pipeline load step, which is simpler than a shared library, especially when you want shared.groovy to live in the same repo as your Jenkinsfiles.
// shared.groovy
def gradlew(String tasks) {
    sh "./gradlew $tasks --profile"
}
return this // the 'return this' is required so 'load' can return the script object
// pipeline 1
pipeline {
    agent { label "android-emulator" }
    stages {
        stage("build") {
            steps {
                script {
                    shared = load 'shared.groovy'
                    shared.gradlew(":build")
                }
            }
        }
    }
}
// pipeline 2
pipeline {
    agent { label "android-emulator" }
    stages {
        stage("build") {
            steps {
                script {
                    shared = load 'shared.groovy'
                    shared.gradlew("one:build")
                }
            }
        }
    }
}
Jenkins shared libraries have a well-defined folder structure: https://www.jenkins.io/doc/book/pipeline/shared-libraries/#directory-structure
You can try:
to implement this folder structure in a subfolder of your repo
to use dynamic retrieval with an SCM config that checks out that specific folder
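A sketch of the dynamic-retrieval option using the library step; the library identifier and repository URL below are placeholders, and the checked-out folder must follow the shared-library layout:

```groovy
// Load a library at runtime instead of configuring it globally.
// 'shared@main' and the remote URL are assumptions, not real values.
library identifier: 'shared@main',
        retriever: modernSCM([$class: 'GitSCMSource',
                              remote: 'https://example.com/your/repo.git'])
```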
I'm afraid this approach is too complicated, and it may not even be possible.
I think the best approach is to create a global shared-library repo and implement a gradleBuild custom step. In that case your code will look like:
Pipeline 1:
@Library('somelib')
@Field String ANDROID_EMULATOR = "android-emulator"
pipeline {
    agent { label "android-emulator" }
    stages {
        stage("build") {
            steps {
                gradleBuild ":build"
            }
        }
    }
}
Pipeline 2:
@Library('somelib')
@Field String ANDROID_EMULATOR = "android-emulator"
pipeline {
    agent none
    stages {
        stage("PR checks") {
            parallel {
                stage("build 1") {
                    agent { label ANDROID_EMULATOR }
                    steps {
                        gradleBuild ":one:build"
                    }
                }
                stage("build 2") {
                    agent { label ANDROID_EMULATOR }
                    steps {
                        gradleBuild ":two:build"
                    }
                }
            }
        }
    }
}
Shared library vars/gradleBuild.groovy file:
def call(String tasks) {
    sh "./gradlew $tasks --profile"
}

How can I have one pipeline executed even if the other has failed in Jenkins

I have the following (part of a) pipeline
stages {
    stage('bootstrap') {
        parallel {
            stage("Linux") {
                agent { label 'linux' }
                steps {
                    sh 'bootstrap.sh'
                }
            }
            stage("Windows") {
                agent { label 'Windows' }
                steps {
                    bat 'bootstrap.bat'
                }
            }
        }
    }
    stage('devenv') {
        parallel {
            stage('Linux') {
                agent { label 'linux' }
                steps {
                    sh 'devenv.sh'
                }
            }
            stage('Windows') {
                agent { label 'Windows' }
                steps {
                    bat 'devenv.bat'
                }
            }
        }
    }
}
post {
    always {
        echo "Done"
    }
}
The problem is that when bootstrap.bat fails on Windows, the devenv stage is considered failed, and the Linux devenv won't run. I would like to get the results of the Linux pipeline even if the Windows one fails early.
One option would be to separate the stages so that the full Linux pipeline is one branch of the parallel execution and Windows is the other, but maybe there's a trick I am not aware of, because I tried it and it does not seem to be valid syntax.
Edit
Suggested fix does not work. This is the pipeline:
pipeline {
    agent none
    parallel {
        stage('Linux') {
            agent { label 'linux' }
            stages {
                stage('bootstrap') {
                    sh "ls"
                }
                stage('devenv') {
                    sh "ls"
                }
            }
        }
        stage('windows') {
            agent { label 'Windows' }
            stages {
                stage('bootstrap') {
                    bat 'dir'
                }
                stage('devenv') {
                    bat 'dir'
                }
            }
        }
    }
}
This is the error message:
WorkflowScript: 8: Undefined section "parallel" @ line 8, column 5.
    parallel {
    ^
WorkflowScript: 1: Missing required section "stages" @ line 1, column 1.
pipeline {
^
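The errors mean a declarative pipeline must have a top-level stages section, and parallel is only valid inside a stage. A corrected sketch of the pipeline above (the wrapper stage name 'OS builds' is an assumption; labels and commands are taken from the snippet, and each inner stage needs a steps block):

```groovy
pipeline {
    agent none
    stages {
        stage('OS builds') {       // wrapper stage required for parallel
            parallel {
                stage('Linux') {
                    agent { label 'linux' }
                    stages {       // sequential stages inside a parallel branch
                        stage('bootstrap') { steps { sh 'ls' } }
                        stage('devenv')    { steps { sh 'ls' } }
                    }
                }
                stage('Windows') {
                    agent { label 'Windows' }
                    stages {
                        stage('bootstrap') { steps { bat 'dir' } }
                        stage('devenv')    { steps { bat 'dir' } }
                    }
                }
            }
        }
    }
}
```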

Check parallel stages status

I have something like this:
stages {
    stage('createTemplate') {
        parallel {
            stage('template_a') {
                creating template a
            }
            stage('template_b') {
                creating template b
            }
        }
    }
    stage('deployVm') {
        parallel {
            stage('deploy_a') {
                deploy vm a
            }
            stage('deploy_b') {
                deploy vm b
            }
        }
    }
}
How can I make sure that deployVm stages run when and only when respective createTemplate stages were successful?
You may want to restructure this as a single parallel block, like this:
parallel {
    stage('a') {
        stages {
            stage('template_a') { ... }
            stage('deploy_a') { ... }
        }
    }
    stage('b') {
        stages {
            stage('template_b') { ... }
            stage('deploy_b') { ... }
        }
    }
}
This ensures each deploy stage runs only after its own template stage has succeeded.
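If, conversely, you want the other branch to be aborted as soon as either branch fails, the stage containing the parallel block also supports a failFast option. A minimal sketch; the stage names and echo steps are placeholders:

```groovy
stage('createAndDeploy') {
    failFast true   // abort the remaining branch as soon as one fails
    parallel {
        stage('a') { steps { echo 'branch a' } }
        stage('b') { steps { echo 'branch b' } }
    }
}
```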

Post directive at pipeline level of jenkins declarative pipeline does not get executed on failure

Below is the skeleton of my Jenkinsfile. The post directive is executed on success but not in case of failure. Is this the expected behavior of Jenkins?
Thanks
#!/usr/bin/env groovy
pipeline {
    agent {
        node { label 'ent_linux_node' }
    }
    stages {
        stage('Prepare') {
            steps {
                // some steps
            }
        }
        stage('Build') {
            steps {
                // Fails at this stage
            }
        }
        stage('ArtifactoryUploads') {
            steps {
                // skips since previous stage failed
            }
        }
    }
    post {
        always {
            // Doesn't get executed but I am expecting it to execute
        }
    }
}
