Jenkins Declarative Pipeline - SCM

I am working through a Jenkins tutorial. The sample code I am reading is:
pipeline {
    agent none
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'python:2-alpine'
                }
            }
            steps {
                sh 'python -m py_compile sources/add2vals.py sources/calc.py'
            }
        }
        stage('Test') {
            agent {
                docker {
                    image 'qnib/pytest'
                }
            }
            steps {
                sh 'py.test --verbose --junit-xml test-reports/results.xml sources/test_calc.py'
            }
            post {
                always {
                    junit 'test-reports/results.xml'
                }
            }
        }
        stage('Deliver') {
            agent {
                docker {
                    image 'cdrx/pyinstaller-linux:python2'
                }
            }
            steps {
                sh 'pyinstaller --onefile sources/add2vals.py'
            }
            post {
                success {
                    archiveArtifacts 'dist/add2vals'
                }
            }
        }
    }
}
So basically there are three stages: Build, Test and Deliver. They all use different images to spin up different containers, but this Jenkins job is configured to use Git as the SCM.
So when this Jenkins build runs, say the project is built in the first container. Then the second stage tests the project in another container, followed by Deliver in a third container. How does this Jenkins job make sure that these three stages operate on the code sequentially?
Based on my understanding, each stage would need to perform a git clone/git pull, and before the stage finishes, a git push would be required.
If the SCM is configured through Jenkins to use Git, do we need to include the 'git clone'/'git pull', as well as 'git push', in the corresponding shell scripts (under steps), or is this already taken care of by the SCM function of Jenkins?
Thanks

In this case, you must ensure that the binary in the QA environment is the same one that goes to the UAT environment and then to Production.
For this, you should use an artifact repository or registry (Artifactory, Nexus, Docker Registry, etc.) to promote the artifacts to the Production environment.
See this link for how it was done in the Pipeline.
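As a rough sketch of that promotion idea (assuming the Docker Pipeline plugin; the registry URL, credentials ID, and image name below are hypothetical), the Deliver stage could push an immutably tagged image that each later environment pulls:

stage('Deliver') {
    agent { label 'docker' }
    steps {
        script {
            // Tag with the build number so QA, UAT and Production
            // all promote the exact same artifact.
            docker.withRegistry('https://registry.example.com', 'registry-credentials') {
                def image = docker.build("myapp:${env.BUILD_NUMBER}")
                image.push()
            }
        }
    }
}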

Jenkins Docker Cloud - Dynamic Agent Template Image

Setup
I'm trying to set up a Jenkins CI pipeline:
A Jenkins main node which manages the pipeline
A build node which generates Docker containers (Stages 1 & 2)
Another node which runs these containers; they are launched by Jenkins using the Docker remote API (Stage 3)
Pipeline
Here is the pipeline (truncated):
pipeline {
    agent {
        label 'Build'
    }
    options {
        skipDefaultCheckout()
    }
    stages {
        stage('1') {
            steps {
                sh script: "$WORKSPACE/jenkins/build_packages.sh", label: "Build Packages"
            }
        }
        stage('2') {
            steps {
                sh script: "$WORKSPACE/jenkins/build_container.sh", label: "Build Container"
            }
        }
        stage('3') {
            steps {
                script {
                    for (def key in testsuites.keySet()) {
                        def testsuite = key
                        executions[testsuite] = {
                            node('Docker-Test') {
                                execute_testsuite(testsuite, testsuites[testsuite])
                            }
                        }
                    }
                    parallel(executions)
                }
            }
        }
    }
}
Stage 2 pushes the built image to a remote registry. In Stage 3, this image is used. This is working perfectly.
Problem
My problem is that if the CI is triggered by two different developers at the same time, the image may change during the execution of the test suites...
What I would like to do is push a differently tagged image (e.g. using the git hash as the Docker tag) and use that image in Stage 3, and probably use the internal Jenkins functions to build and push the image instead of my own scripts.
But the image is set within the Docker Cloud - Agent Template configuration (see the screen capture below). Is there a way to modify this from within the pipeline?
(Screen capture: Docker Cloud - Agent Template configuration)
This seems to be possible, at least using Kubernetes:
https://support.cloudbees.com/hc/en-us/articles/360049905312-Dynamically-selecting-the-docker-image-for-a-downstream-Pipeline-using-Kubernetes-Pod-Templates
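The build-and-push half of that could look like the following sketch with the Docker Pipeline plugin (the image name, registry URL, and credentials ID are hypothetical; env.GIT_COMMIT is populated by the git plugin during checkout):

stage('2') {
    steps {
        script {
            // Tag the image with the commit hash so two concurrent
            // builds cannot overwrite each other's image.
            def image = docker.build("example/app:${env.GIT_COMMIT}")
            docker.withRegistry('https://registry.example.com', 'registry-credentials') {
                image.push()
            }
        }
    }
}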

How to use a Jenkinsfile for these build steps?

I'm learning how to use Jenkins and working on configuring a Jenkinsfile instead of building via the Jenkins UI.
The source code management step for building from Bitbucket: (screenshot)
The build step for building a Docker container: (screenshot)
The build is of type multi configuration project: (screenshot)
Reading the Jenkinsfile documentation at https://www.jenkins.io/doc/book/pipeline/jenkinsfile/index.html and creating a new build using Pipeline:
I'm unsure how to configure the steps I've configured via the UI: Source Code Management & Build. How do I convert the Docker and Bitbucket config so that it can be used with a Jenkinsfile?
The SCM configuration will not change regardless of whether you are using UI configuration or a pipeline, although in theory you can do the git clone from the steps in the pipeline if you really insist on converting the SCM steps into pure pipeline steps.
The pipeline can have multiple stages, and each of the stages can have a different execution environment. You can use the Docker Pipeline plug-in, or you can use plain sh to issue the docker commands on the build agent.
Here is a small sample from one of my manual build pipelines:
pipeline {
    agent none
    stages {
        stage('Init') {
            agent { label 'docker-x86' }
            steps {
                checkout scm
                sh 'docker stop demo-001c || true'
                sh 'docker rm demo-001c || true'
            }
        }
        stage('Build Back-end') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker build -t demo-001:latest ./docker'
            }
        }
        stage('Test') {
            agent { label 'docker-x86' }
            steps {
                sh 'docker run --name demo-001c demo-001:latest'
                sh 'cd test && make test-back-end'
            }
        }
    }
}
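For the Docker Pipeline plug-in route mentioned above, a stage can instead run its steps directly inside a container (the image name here is hypothetical):

stage('Test') {
    agent {
        docker {
            image 'node:18-alpine'
            label 'docker-x86'
        }
    }
    steps {
        sh 'cd test && make test-back-end'
    }
}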
You need to create a Pipeline type of project and specify the SCM configuration in the General tab. In the Pipeline tab, you will have the option to select Pipeline script or Pipeline script from SCM. It's always better to start with Pipeline script while you are building and modifying your workflow. Once it's stabilized, you can add it to the repository.
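As a minimal sketch, the two UI steps from the question could translate into a Jenkinsfile along these lines (the repository URL, credentials ID, agent label, and image tag are hypothetical):

pipeline {
    agent { label 'docker' }
    stages {
        stage('Checkout') {
            steps {
                // Replaces the Source Code Management section from the UI
                git url: 'https://bitbucket.org/example/repo.git',
                    branch: 'master',
                    credentialsId: 'bitbucket-credentials'
            }
        }
        stage('Build Docker Image') {
            steps {
                // Replaces the Docker build step from the UI
                sh 'docker build -t example/app:latest .'
            }
        }
    }
}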

Will a Jenkins pipeline compile twice when building a tag?

I want to set up a Jenkins pipeline which builds a Docker image whenever Jenkins is building a tag, so I used buildingTag() in the when condition. This works fine, but I have some trouble understanding Jenkins at this point.
Every commit triggers the "Compile" stage. If a tag is built, will the "Compile" stage be executed twice? First in a run on e.g. the master branch, and a second time when the "Tag" build job is started explicitly? If so, how can this be avoided?
pipeline {
    agent any
    environment {
        APP_NAME = 'myapp'
    }
    stages {
        stage('Compile') {
            steps {
                echo "Start compiling..."
            }
        }
        stage('Build Docker Image') {
            when { buildingTag() }
            steps {
                echo "Building a Docker image..."
            }
        }
    }
}
For a multibranch project, branch builds are separate from tag builds, so yes, each build would run the Compile stage. They also have separate workspaces, so they should not affect each other.
If you don't want a stage to run for tag builds, just add a when { not { buildingTag() } } expression to that stage.
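Applied to the pipeline above, the Compile stage would then be skipped for tag builds:

stage('Compile') {
    when { not { buildingTag() } }
    steps {
        echo "Start compiling..."
    }
}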

How to clean up pull request directories in the Jenkins home workspace on the Jenkins master

We use Docker containers to build all jobs, so we don't have to worry about cleaning up the workspace, as the container goes away with it. But I see:
$JENKINS_HOME/workspace/_job_name_PR-251-4MSWSIVZHFONFLOHNITFTR6R5CAJMNKESIVZHFONFLOHNITFTR6@script
I tried the below, but it did not clean it up:
pipeline {
    agent {
        label 'jenkins-slave'
    }
    stages {
        stage('Build') {
            steps {
                sh '''npm install
                npm run build'''
            }
        }
    }
    post {
        cleanup {
            cleanWs()
        }
    }
}
I could have a nightly Jenkins job or a cron job delete those directories, but I'm looking for a better way.

Jenkins declarative pipeline: input with conditional steps without blocking the executor

I'm trying to get the following features to work in Jenkins' Declarative Pipeline syntax:
Conditional execution of certain stages only on the master branch
input to ask for user confirmation to deploy to a staging environment
While waiting for confirmation, it doesn't block an executor
Here's what I've ended up with:
pipeline {
    agent none
    stages {
        stage('1. Compile') {
            agent any
            steps {
                echo 'compile'
            }
        }
        stage('2. Build & push Docker image') {
            agent any
            when {
                branch 'master'
            }
            steps {
                echo "build & push docker image"
            }
        }
        stage('3. Deploy to stage') {
            when {
                branch 'master'
            }
            input {
                message "Deploy to stage?"
                ok "Deploy"
            }
            agent any
            steps {
                echo 'Deploy to stage'
            }
        }
    }
}
The problem is that stage 2 needs the output from stage 1, but that output is not available when it runs. If I replace the various agent directives with a global agent any, then the output is available, but the executor is blocked while waiting for user input at stage 3. And if I try to combine stages 1 & 2 into a single stage, then I lose the ability to conditionally run some steps only on master.
Is there any way to achieve all the behaviour I'm looking for?
You need to use the stash command at the end of your first stage and then unstash when you need the files.
I think these are available in the snippet generator.
As per the documentation:
Saves a set of files for use later in the same build, generally on another node/workspace. Stashed files are not otherwise available and are generally discarded at the end of the build. Note that the stash and unstash steps are designed for use with small files. For large data transfers, use the External Workspace Manager plugin, or use an external repository manager such as Nexus or Artifactory.
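Applied to the pipeline above, it could look like this (the stash name and file pattern are illustrative):

stage('1. Compile') {
    agent any
    steps {
        echo 'compile'
        // Save the compile output for later stages on other nodes
        stash name: 'build-output', includes: 'build/**'
    }
}
stage('2. Build & push Docker image') {
    agent any
    when {
        branch 'master'
    }
    steps {
        // Restore the files stashed in stage 1
        unstash 'build-output'
        echo "build & push docker image"
    }
}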
