Will a Jenkins pipeline compile twice when building a tag?

I want to set up a Jenkins pipeline that builds a Docker image whenever Jenkins is building a tag, so I used buildingTag() in the when condition. This works fine, but I have some trouble understanding Jenkins at this point.
Every commit triggers the "Compile" stage. If a tag is built, will the "Compile" stage be executed twice? First in a run on e.g. the master branch, and a second time when the "Tag" build job starts? If so, how can this be avoided?
pipeline {
    agent any
    environment {
        APP_NAME = 'myapp'
    }
    stages {
        stage('Compile') {
            steps {
                echo "Start compiling..."
            }
        }
        stage('Build Docker Image') {
            when { buildingTag() }
            steps {
                echo "Building a Docker image..."
            }
        }
    }
}

For a multibranch project, branch builds are separate from tag builds, so yes, each build will run the "Compile" stage. They also have separate workspaces, so they should not affect each other.
If you don't want a stage to run during a tag build, just add a when { not { buildingTag() } } condition to that stage.
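For example, the "Compile" stage from the question could be skipped during tag builds like this (a minimal sketch based on the pipeline above):

```groovy
stage('Compile') {
    // skip compilation when this build was triggered for a tag
    when { not { buildingTag() } }
    steps {
        echo "Start compiling..."
    }
}
```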

Related

Jenkins Docker Cloud - Dynamic Agent Template Image

Setup
I'm trying to set up a Jenkins CI pipeline:
A Jenkins main node which manages the pipeline
A build node which generates Docker containers (stages 1 & 2)
Another node which runs these containers; they are launched by Jenkins using the Docker remote API (stage 3)
Pipeline
Here is the pipeline (truncated)
pipeline {
    agent {
        label 'Build'
    }
    options {
        skipDefaultCheckout()
    }
    stages {
        stage('1') {
            steps {
                sh script: "$WORKSPACE/jenkins/build_packages.sh", label: "Build Packages"
            }
        }
        stage('2') {
            steps {
                sh script: "$WORKSPACE/jenkins/build_container.sh", label: "Build Container"
            }
        }
        stage('3') {
            steps {
                script {
                    for (def key in testsuites.keySet()) {
                        def testsuite = key
                        executions[testsuite] = {
                            node('Docker-Test') {
                                execute_testsuite(testsuite, testsuites[testsuite])
                            }
                        }
                    }
                    parallel(executions)
                }
            }
        }
    }
}
Stage 2 pushes the built image to a remote registry; stage 3 then uses this image. This is working perfectly.
Problem
My problem is that if the CI is triggered by two different developers at the same time, the image may change during the execution of the test suites...
What I would like to do is push a distinct image (e.g. using the git hash as the Docker tag) and use that image in stage 3, and probably use the built-in Jenkins functionality to build and push the image instead of my own scripts.
But the image is set in the Docker Cloud - Agent Template configuration (see the screen capture below). Is there a way to modify this from within the pipeline?
[Screenshot: Docker Cloud - Agent Template configuration]
This seems to be possible, at least with Kubernetes:
https://support.cloudbees.com/hc/en-us/articles/360049905312-Dynamically-selecting-the-docker-image-for-a-downstream-Pipeline-using-Kubernetes-Pod-Templates
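The first part of this (one image per commit) can be sketched in the pipeline itself. The registry URL, credentials ID, and image name below are hypothetical, and this assumes the Docker Pipeline plugin is installed:

```groovy
stage('2') {
    steps {
        script {
            // GIT_COMMIT is set by the checkout step; use it as a unique tag
            def tag = env.GIT_COMMIT.take(12)
            // registry URL, credentials ID and image name are placeholders
            docker.withRegistry('https://registry.example.com', 'registry-creds') {
                def image = docker.build("myproject/tests:${tag}")
                image.push()
            }
        }
    }
}
```

The agent-template image itself cannot be changed this way, though; as the CloudBees article above shows for Kubernetes, selecting a per-build agent image needs support from the cloud plugin.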

Continuous Delivery with Jenkins: Deploy to production as manual step

We are doing trunk-based development. So we just have a main branch.
There we have a Jenkins pipeline with these stages:
Build -> Test -> Deploy to Test
Now I would like to add a manual stage that deploys to production. But I don't want a stage that waits and blocks the pipeline, just an optional manual stage that can be triggered by a user and that will deploy the build to production.
How can I achieve this?
You can use the Pipeline: Input Step to implement the deployment to production as a manual step.
Here you can find an example of the same:
How can I approve a deployment and add an optional delay for the deployment
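A minimal sketch of such an input step (the stage and message text are illustrative):

```groovy
stage('Deploy to Production') {
    steps {
        // pauses the build until a user approves or aborts it
        input message: 'Deploy this build to production?'
        echo 'Deploying to Production...'
    }
}
```

Note that input inside a stage holds the build while it waits for approval, which is one reason to prefer the parameter-based approach in the edited answer.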
Edited Answer
Create a choice parameter (e.g. DEPLOY_TO_PRODUCTION with the choices Yes and No) using the This project is parameterised option in the job configuration.
Sample pipeline code with a condition deciding whether to run the stage:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Deploy to Production') {
            when {
                expression { params.DEPLOY_TO_PRODUCTION == 'Yes' }
            }
            steps {
                echo 'Deploying to Production...'
            }
        }
    }
}
Output:
When the user selects Yes, the Deploy to Production stage runs.
When the user selects No, the Deploy to Production stage is skipped.
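Alternatively, the parameter can be declared in the Jenkinsfile itself rather than in the UI, which keeps the whole configuration in one place (a sketch equivalent to the UI choice parameter described above):

```groovy
parameters {
    // declarative equivalent of the "This project is parameterised" choice
    choice(name: 'DEPLOY_TO_PRODUCTION',
           choices: ['No', 'Yes'],
           description: 'Deploy this build to production?')
}
```

This block goes directly inside the top-level pipeline { } block.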

How to use a Jenkinsfile for these build steps?

I'm learning how to use Jenkins and working on configuring a Jenkinsfile instead of building via the Jenkins UI.
The source code management step for building from Bitbucket: [screenshot]
The build step for building a Docker container: [screenshot]
The build is of type multi-configuration project: [screenshot]
I'm reading the Jenkinsfile documentation at https://www.jenkins.io/doc/book/pipeline/jenkinsfile/index.html and creating a new build using Pipeline.
I'm unsure how to configure the steps I've set up via the UI: Source Code Management & Build. How can I convert the Docker and Bitbucket configuration for use with a Jenkinsfile?
The SCM configuration does not change, regardless of whether you use the UI configuration or a pipeline; although, in theory, you can do the git clone from steps in the pipeline if you really insist on converting the SCM configuration into pure pipeline steps.
The pipeline can have multiple stages, and each stage can have a different execution environment. You can use the Docker Pipeline plugin, or you can use a plain sh step to issue docker commands on the build agent.
Here is small sample from one of my manual build pipelines:
pipeline {
    agent none
    stages {
        stage('Init') {
            agent { label 'docker-x86' }
            steps {
                checkout scm
                sh 'docker stop demo-001c || true'
                sh 'docker rm demo-001c || true'
            }
        }
        stage('Build Back-end') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker build -t demo-001:latest ./docker'
            }
        }
        stage('Test') {
            agent {
                docker {
                    label 'docker-x86'
                }
            }
            steps {
                sh 'docker run --name demo-001c demo-001:latest'
                sh 'cd test && make test-back-end'
            }
        }
    }
}
You need to create a Pipeline type of project and specify the SCM configuration in the General tab. In the Pipeline tab, you will have the option to select Pipeline script or Pipeline script from SCM. It's always better to start with Pipeline script while you are building and modifying your workflow. Once it has stabilized, you can add the Jenkinsfile to the repository and switch to Pipeline script from SCM.

Jenkins Declarative Pipeline - SCM

I am taking some Jenkins tutorial. The sample code I read is
pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --verbose --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
        stage('Deliver') {
            agent {
                docker {
                    image 'cdrx/pyinstaller-linux:python2'
                }
            }
            steps {
                sh 'pyinstaller --onefile sources/add2vals.py'
            }
            post {
                success {
                    archiveArtifacts 'dist/add2vals'
                }
            }
        }
    }
}
So basically there are three stages: Build, Test and Deliver. They all use different images, so each runs in a different container. But this Jenkins job is configured to use Git as the SCM.
So when this Jenkins build runs, the project is built in the first container. Then the second stage tests the project in another container, followed by delivery in the third container. How does this Jenkins job make sure that these three stages operate on the code sequentially?
Based on my understanding, each stage would need to perform a git clone/git pull, and before the stage finishes, a git push would be required.
If SCM is configured through Jenkins to use Git, do we need to include git clone/git pull as well as git push in the corresponding shell scripts (under steps), or is it already taken care of by the SCM configuration of Jenkins?
Thanks
In this case, you must ensure that the binary in the QA environment is the same one that goes to the UAT environment and then to production.
For this, you should use an artifact repository or registry (Artifactory, Nexus, Docker Registry, etc.) and promote the artifacts through to the production environment.
See this link for how it was done in the pipeline.
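A minimal sketch of this promotion idea, applied to the Deliver stage from the question (the repository URL and credentials are hypothetical):

```groovy
stage('Deliver') {
    agent {
        docker {
            image 'cdrx/pyinstaller-linux:python2'
        }
    }
    steps {
        sh 'pyinstaller --onefile sources/add2vals.py'
        // publish the exact binary under a unique version, so later
        // environments promote this artifact instead of rebuilding it
        sh "curl -u user:token -T dist/add2vals " +
           "https://nexus.example.com/repository/releases/add2vals-${env.BUILD_NUMBER}"
    }
}
```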

Build Name Setter before SCM on Jenkins

I'm using Build Name Setter plugin on Jenkins and it works great.
I'm running latest Jenkins version (2.73.1)
The only problem is that I want it to set the build name before the SCM step runs, as my SCM operation itself can take 20 minutes and I want to see the build name before then. It currently runs only after SCM and before the actual build steps.
Is there a way to run the plugin before SCM, or is there an alternative method of setting the build name in a pre-SCM build step?
pipeline {
    agent any
    stages {
        stage('init') {
            steps {
                script {
                    currentBuild.displayName = "#${BUILD_NUMBER}, blablaaaa1"
                    currentBuild.description = "#${BUILD_NUMBER}, blablaaaa2"
                }
            }
        }
        stage('Git') {
            steps {
                echo "git ..."
            }
        }
    }
}
