So I spent the whole day trying to figure out how to configure a simple Jenkins Pipeline with multiple Docker images and I am not happy at all.
I need a few stages (prepare, build, test, docs) executed in a couple of different Docker containers (for now I just picked three standard Python containers). It would be nice if those ran in parallel, but I only found this solution, which combines all stages into a single one (and thus creates a not very informative overview in the Blue Ocean UI): Jenkins Pipeline Across Multiple Docker Images
So I ended up with the configuration below, which is ugly as hell (code repetition everywhere) but more or less creates a good-looking overview in the classic UI:
A not so informative overview in the Blue Ocean UI
And an acceptable test overview from junit, which combines all the tests from each stage; if any test fails, the corresponding "version" is shown:
The most annoying thing, however, is that you cannot see which step has failed. If Python 2.7 fails, everything else is also marked as failed, and you don't even see which stage failed.
I tried so many different approaches and I am wondering how this should be done. This should be such a common thing to do with Jenkins, so I guess I have some general misunderstanding of this (for me absolutely new) pipeline/nodes/labels/stages/steps/declarative/scripted/groovy/blueocean stuff...
It should be possible to define a list of Docker containers and some (maybe customisable) stages/steps for each of them, run them in parallel, and have it displayed nicely in both Blue Ocean and the classic UI, shouldn't it?
node {
    stage("Python 2.7.14") {
        checkout scm
        docker.image('python:2.7.14').inside { // just a dummy for now
            stage("Prepare") { sh 'python --version' }
            stage("Build") { sh 'ls -al' }
        }
    }
    stage("Python 3.5.4") {
        checkout scm
        docker.image('python:3.5.4').inside {
            stage("Prepare") { sh 'python -m venv venv' }
            stage("Build") {
                sh """
                . venv/bin/activate
                make install-dev
                """
            }
            stage('Test') {
                sh """
                . venv/bin/activate
                make test
                """
            }
            stage('Docs') {
                sh """
                . venv/bin/activate
                make doc-dependencies
                cd docs
                make html
                """
            }
        }
    }
    stage("Python 3.6.4") {
        checkout scm
        docker.image('python:3.6.4').inside {
            stage("Prepare") { sh 'python -m venv venv' }
            stage("Build") {
                sh """
                . venv/bin/activate
                make install-dev
                """
            }
            stage('Test') {
                sh """
                . venv/bin/activate
                make test
                """
            }
            stage('Docs') {
                sh """
                . venv/bin/activate
                make doc-dependencies
                cd docs
                make html
                """
            }
        }
    }
}
Update: this is what it looks like in the Blue Ocean UI when a step fails; in this case "Test" failed in both Python 3.5.4 and 3.6.4, but it looks like everything has failed.
Also, the Python 2.7.14 and 3.5.4 stages are collapsed and cannot be viewed separately. If I click on one of them, all the steps are shown in green, although in this case the ". venv/bin/activate && make test" step failed:
So this is what I ended up with. There are surely better solutions, but I have to move on. I hope to gather some (better) answers in time; I'll not mark this as "the solution" yet ;)
First, some credits to Stephen King's slides (the title says "Declarative" but there are some nice examples regarding the scripted pipeline): (Declarative) Jenkins Pipelines
Here is my gist on GitHub with the following snippet:
def docker_images = ["python:2.7.14", "python:3.5.4", "python:3.6.2"]

def get_stages(docker_image) {
    def stages = {
        docker.image(docker_image).inside {
            stage("${docker_image}") {
                echo "Running in ${docker_image}"
            }
            stage("Stage A") {
                switch (docker_image) {
                    case "python:2.7.14":
                        sh 'exit 123' // for python 2.7.14 we force an error for fun
                        break
                    default:
                        sh 'sleep 10' // for any other docker image, we sleep 10s
                }
                sh 'echo this is stage A' // this is executed for all
            }
            stage("Stage B") {
                sh 'sleep 5'
                sh 'echo this is stage B'
            }
            stage("Stage C") {
                sh 'sleep 8'
                sh 'echo this is stage C'
            }
        }
    }
    return stages
}

node('master') {
    def stages = [:]
    for (int i = 0; i < docker_images.size(); i++) {
        def docker_image = docker_images[i]
        stages[docker_image] = get_stages(docker_image)
    }
    parallel stages
}
I tried to make it easy to use:
you add your Docker images to the list at the top and then you define the stages in the get_stages() function
add the common stages and steps
if any Docker image needs special treatment (like python:2.7.14 in my example), you can use a simple switch. This could also be realised with a nested map for the special cases ('image' -> 'stage' -> 'steps') and a fallback nested map for defaults, but I'll leave it as an exercise for the reader; a rough sketch follows below. (To be honest, I could not figure out the correct, supported Groovy syntax.)
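For the curious, here is a minimal, untested sketch of that nested-map idea. All names here (default_steps, special_steps, get_step) are made up for illustration and are not part of the gist above; the returned closures would still have to be invoked inside the docker.image(...).inside block:

// Hypothetical sketch: per-image step overrides with a fallback to defaults.
def default_steps = [
    'Stage A': { sh 'sleep 10' },
    'Stage B': { sh 'sleep 5' },
]
def special_steps = [
    'python:2.7.14': ['Stage A': { sh 'exit 123' }], // override just this stage
]
// Look up the override for this image/stage, falling back to the default.
def get_step(docker_image, stage_name) {
    def overrides = special_steps[docker_image] ?: [:]
    return overrides[stage_name] ?: default_steps[stage_name]
}
// usage inside a stage: get_step(docker_image, 'Stage A')()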
This is what it looks like when everything is fine in both the classic and the Blue Ocean UIs (it is a known issue that the Blue Ocean UI fails to display multiple stages in parallel runs, see JENKINS-38442):
Classic UI
Blue Ocean UI
And this is the output if Stage A in python:2.7.14 fails:
Classic UI
Blue Ocean UI
Related
I am trying to run a job with parallel and sequential stages. My code looks like this; each stage will be a Groovy script, therefore I am running with "parallel name:" and not "parallel {}":
stage('Start profiling') {
    steps {
        script {
            parallel stageParallel: {
                stage("parallel") {
                    echo "parallel"
                }
            }, runTest: {
                stage("sequential 1") {
                    echo "sequential 1"
                }
                stage("sequential 2") {
                    echo "sequential 2"
                }
            }
        }
    }
}
Image of Blue Ocean
As you can see it gets executed, but I can't see it in Open Blue Ocean as a step.
Image of stage view
Displaying sequential stages within parallel stages tends to be flaky in Blue Ocean. There is an open bug relating to your problem in the Jenkins bug backlog, but it hasn't been looked into since 2019. Unfortunately, there is little you can do from your end, since your pipeline code is obviously correct.
I'm trying to learn Groovy for my pipeline set-up and I'm stuck on something really basic, but I don't know where to start looking to solve my issue.
What I'm trying to do is basically create a stage with multiple named steps. Below I've posted a basic example of what I'm trying to do; what would be the go-to way to do this? (It just creates a folder with a zipped file inside it.)
Sidenote: This code currently throws the error
WorkflowScript: 23: Expecting "interface jenkins.tasks.SimpleBuildStep" but got Mkdir
pipeline {
    agent any
    stages {
        stage('Zipfile') {
            steps {
                step('Mkdir') {
                    sh 'mkdir LocalDir'
                }
                step('Touch File') {
                    sh '''cd LocalDir
                    touch File
                    cd ..'''
                }
                step('Zip') {
                    sh '''cd LocalDir
                    zip File.zip File
                    cd ..'''
                }
            }
        }
    }
}
I think I have to disappoint you. You can't name a step. The official documentation doesn't mention such a thing.
From the official documentation:
steps
The steps section defines a series of one or more steps to be
executed in a given stage directive.
Required: Yes
Parameters: None
Allowed: Inside each stage block.
I think I found what the question was expecting... I should have added named stages with steps in them:
pipeline {
    agent any
    stages {
        stage('Zipfile') {
            stages {
                stage('mkdir') {
                    steps {
                        sh 'mkdir LocalDir'
                    }
                }
                stage('Touch File') {
                    steps {
                        sh '''cd LocalDir
                        touch File
                        cd ..'''
                    }
                }
                stage('Zip') {
                    steps {
                        sh '''cd LocalDir
                        zip File.zip File
                        cd ..'''
                    }
                }
            }
        }
    }
}
I want to pass a variable which I read in stage A to stage B somehow. I see in some examples that people write it to a file, but I guess that is not really a nice solution. I tried writing it to an environment variable, but I was not really successful with that. How can I set this up properly?
To get it working I tried a lot of things, and read that I should use """ instead of ''' to start a shell, and escape shell variables as \${foo}, for example.
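As a quick illustration of that escaping rule (a made-up snippet, not from the pipeline below): inside a Groovy triple-double-quoted string, ${...} is expanded by Groovy before the shell ever runs, while \${...} is passed through untouched for the shell to expand:

sh """
echo ${env.JOB_NAME}   # interpolated by Groovy, before the shell runs
echo \${HOME}          # left for the shell to expand
"""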
Below is what I have as a pipeline:
#!/usr/bin/env groovy

pipeline {
    agent { node { label 'php71' } }
    environment {
        packageName = 'my-package'
        packageVersion = ''
        groupId = 'vznl'
        nexus_endpoint = 'http://nexus.devtools.io'
        nexus_username = 'jenkins'
        nexus_password = 'J3nkins'
    }
    stages {
        // Package dependencies
        stage('Install dependencies') {
            steps {
                sh '''
                echo Skip composer installation
                #composer install --prefer-dist --optimize-autoloader --no-interaction
                '''
            }
        }
        // Unit tests
        stage('Unit Tests') {
            steps {
                sh '''
                echo Running PHP code coverage tests...
                #composer test
                '''
            }
        }
        // Create artifact
        stage('Package') {
            steps {
                echo 'Create package refs'
                sh """
                mkdir -p ./build/zpk
                VERSIONTAG=\$(grep 'version' composer.json)
                REGEX='"version": "([0-9]+.[0-9]+.[0-9]+)"'
                if [[ \${VERSIONTAG} =~ \${REGEX} ]]
                then
                    env.packageVersion=\${BASH_REMATCH[1]}
                    /usr/bin/zs-client packZpk --folder=. --destination=./build/zpk --name=${env.packageName}-${env.packageVersion}.zpk --version=${env.packageVersion}
                else
                    echo "No version found!"
                    exit 1
                fi
                """
            }
        }
        // Publish ZPK package to Nexus
        stage('Publish packages') {
            steps {
                echo "Publish ZPK Package"
                sh "curl -u ${env.nexus_username}:${env.nexus_password} --upload-file ./build/zpk/${env.packageName}-${env.packageVersion}.zpk ${env.nexus_endpoint}/repository/zpk-packages/${groupId}/${env.packageName}-${env.packageVersion}.zpk"
                archive includes: './build/**/*.{zpk,rpm,deb}'
            }
        }
    }
}
As you can see, the packageVersion which I read in the Package stage needs to be used in the Publish stage as well.
Overall tips on the pipeline are, of course, always welcome as well.
A problem in your code is that you are assigning the version to an environment variable within the sh step. This step executes in its own isolated process, inheriting the parent process's environment variables.
However, the only ways of passing data back to the parent are STDOUT/STDERR or the exit code. As you want a string value, it is best to echo the version from the sh step and assign it to a variable within the script context.
If you reuse the node, the script context will persist, and the variables will be available in the subsequent stage. A working example is below. Note that any attempt to put this inside a parallel block is prone to failure, as the version variable could be written to by multiple processes.
#!/usr/bin/env groovy

pipeline {
    environment {
        AGENT_INFO = ''
    }
    agent {
        docker {
            image 'alpine'
            reuseNode true
        }
    }
    stages {
        stage('Collect agent info') {
            steps {
                echo "Current agent info: ${env.AGENT_INFO}"
                script {
                    def agentInfo = sh script: 'uname -a', returnStdout: true
                    println "Agent info within script: ${agentInfo}"
                    AGENT_INFO = agentInfo.replace("\n", "") // strip the trailing newline
                    env.AGENT_INFO = AGENT_INFO
                }
            }
        }
        stage("Print agent info") {
            steps {
                script {
                    echo "Collected agent info: ${AGENT_INFO}"
                    echo "Environment agent info: ${env.AGENT_INFO}"
                }
            }
        }
    }
}
Another option, which doesn't involve using script but is purely declarative, is to stash things in a little temporary environment file.
You can then use this stash (like a temporary cache that only lives for the run) even if the workload is sprayed out across parallel or distributed nodes as needed.
Something like:
pipeline {
    agent any
    stages {
        stage('first stage') {
            steps {
                // Write out any environment variables you like to a temporary file
                sh 'echo export FOO=baz > myenv'
                // Stash it away for later use
                stash 'myenv'
            }
        }
        stage('later stage') {
            steps {
                // Unstash the temporary file and apply it
                unstash 'myenv'
                // Use the unstashed vars ('.' is the POSIX-portable spelling of 'source')
                sh '. myenv && echo $FOO'
            }
        }
    }
}
I have some code that needs running (build, test, and package in actuality, but for the example just running tox) on different OSes. Currently my Jenkinsfile looks like this:
pipeline {
    // Where to run stuff.
    agent {
        node {
            label 'CentOS7'
            customWorkspace '/home/build/jenkins/workspace/pipelines/ook'
        }
    }
    // What to run goes here.
    stages {
        stage('Tox') {
            steps {
                sh 'tox -v --recreate'
            }
        }
    }
    // Clean up after ourselves.
    post {
        failure {
            mail subject: "\u2639 ${env.JOB_NAME} (${env.BUILD_NUMBER}) has failed",
                 body: """Build ${env.BUILD_URL} is failing!
Somebody should do something about that\u2026""",
                 to: "devs@example.com",
                 replyTo: "devs@example.com",
                 from: 'jenkins@example.com'
        }
    }
}
The middle bit, I want to run on two different nodes: one for OS 1 and one for OS 2.
How do I do that?
Sure, you would want to label your slave nodes somehow. I didn't look up what tox is, but maybe something like 'os_linux' and 'os_mac'. You can then use the node step in your Jenkinsfile to run some commands in the context of each slave. So your Tox stage might look like:
stage('Tox') {
    steps {
        node('os_linux') {
            sh 'tox -v --recreate'
        }
        node('os_mac') {
            sh 'tox -v --recreate'
        }
    }
}
This will run the tasks serially; Jenkinsfile syntax also supports running those two tox commands in parallel on different nodes, as sketched below. Use the "Pipeline Syntax" link in the left nav of your Jenkins UI (only on pipeline jobs) to play around with node and parallel. Rock on.
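For completeness, a hedged sketch of the parallel variant (the 'linux'/'mac' branch names are arbitrary, and the node labels are the hypothetical ones from above):

stage('Tox') {
    steps {
        script {
            // Run tox on both labelled nodes at the same time.
            parallel linux: {
                node('os_linux') {
                    sh 'tox -v --recreate'
                }
            }, mac: {
                node('os_mac') {
                    sh 'tox -v --recreate'
                }
            }
        }
    }
}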
How can we run two stages in parallel in a Jenkins 2.0 Pipeline project?
For example, in the following code I want the two stages "Build_First_Repo" and "Build_Second_Repo" to run in parallel.
stage "Build_First_Repo"
node { sh '''echo 'Build first repo'
source ~/.bashrc'''
export path_to_perl_library="/path/to/perl/lib/perl5/5.8.8"
perl -I <include_files> build_first_view.pl --inputfile ${input_params}
}
stage "Build_Second_Repo"
node { sh '''echo 'Build second repo'
source ~/.bashrc'''
export path_to_perl_library="/path/to/perl/lib/perl5/5.8.8"
perl -I <include_files> build_second_view.pl --inputfile ${input_params}
}
I tried to use the "parallel" keyword but it didn't work.
In a declarative pipeline you cannot run stages within a parallel step, because steps are part of the stage directive. The declarative flow is stages -> stage -> steps -> the actual step that you perform.
But, you can achieve it in scripted pipeline. A sample is as follows:
node() {
    parallel first: {
        stage('stage1') {
            echo 'helloworld'
        }
    },
    second: {
        stage('stage2') {
            echo 'helloworld2'
        }
    }
}
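As a side note (not part of the original answer): newer versions of Declarative Pipeline (1.2 and later) added a parallel directive for stages, so on a current Jenkins a declarative sketch like the following should work as well; the echo steps stand in for the actual build commands:

pipeline {
    agent any
    stages {
        stage('Build repos') {
            parallel {
                stage('Build_First_Repo') {
                    steps {
                        echo 'Build first repo'
                    }
                }
                stage('Build_Second_Repo') {
                    steps {
                        echo 'Build second repo'
                    }
                }
            }
        }
    }
}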