Continuous Delivery with Jenkins: Deploy to production as manual step

We are doing trunk-based development, so we only have a main branch.
On that branch we have a Jenkins pipeline with these stages:
Build -> Test -> Deploy to Test
Now I would like to add a manual stage that deploys to production. But I don't want a stage that waits and blocks the pipeline, just an optional manual stage that can be triggered by a user and that will deploy the build to production.
How can I achieve this?

You can use the Pipeline: Input Step to implement the deployment to production as a manual step.
Here you can find an example of the same:
How can I approve a deployment and add an optional delay for the deployment
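That approach would look roughly like this (a sketch; the stage name and messages are purely illustrative). Note that input pauses the stage until someone responds, which is why the edited answer below switches to a parameter instead:

pipeline {
    agent any
    stages {
        stage('Deploy to Production') {
            steps {
                // Pauses here until a user clicks "Deploy" (or aborts) in the Jenkins UI
                input message: 'Deploy this build to production?', ok: 'Deploy'
                echo 'Deploying to Production...'
            }
        }
    }
}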
Edited Answer
Create a choice parameter using the option This project is parameterised, as shown below.
Sample pipeline code with a condition that decides whether the stage runs:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Deploy to Production') {
            when {
                expression { params.DEPLOY_TO_PRODUCTION == 'Yes' }
            }
            steps {
                echo 'Deploying to Production...'
            }
        }
    }
}
Output:
When the user selects Yes, the Deploy to Production stage will run.
When the user selects No, the Deploy to Production stage will not run.
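If you would rather declare the parameter in the Jenkinsfile instead of the job configuration, roughly the same setup can be expressed with a parameters directive (a sketch; the parameter name and choice values must match what the when expression compares against):

pipeline {
    agent any
    parameters {
        // The first choice is the default, so ordinary runs skip the production stage
        choice(name: 'DEPLOY_TO_PRODUCTION', choices: ['No', 'Yes'], description: 'Deploy this build to production?')
    }
    stages {
        stage('Deploy to Production') {
            when {
                expression { params.DEPLOY_TO_PRODUCTION == 'Yes' }
            }
            steps {
                echo 'Deploying to Production...'
            }
        }
    }
}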

Related

Checkout and run SCM pipeline only on master node

I coded a generic pipeline which accepts several parameters in order to deploy releases from a pre-defined GitHub repository to specific nodes. I wanted to host this pipeline in a Jenkinsfile on GitHub, so I configured the job to work with a "Pipeline script from SCM". The fact is, when I try to build the job, the Jenkinsfile gets checked out on every node. Is it possible to check out and execute the Jenkinsfile only on, say, the master node and run the pipeline as intended?
EDIT: As I stated before, the pipeline works just fine and as intended when the job is set to use an inline pipeline script. The thing is, when I change it to a "Pipeline script from SCM", the Jenkinsfile gets checked out on every agent, which is a problem since I don't have git installed on any agent other than master. I want the Jenkinsfile to be checked out only on the master agent and executed as intended. FYI, the pipeline is below:
def agents = "$AGENTS".toString()
def agentLabel = "${ println 'Agents: ' + agents; return agents; }"

pipeline {
    agent none
    stages {
        stage('Prep') {
            steps {
                script {
                    if (agents == null || agents == "") {
                        println "Skipping build"
                        skipBuild = true
                    }
                    if (!skipBuild) {
                        println "Agents set for this build: " + agents
                    }
                }
            }
        }
        stage('Powershell deploy script checkout') {
            agent { label 'master' }
            when {
                expression {
                    !skipBuild
                }
            }
            steps {
                git url: 'https://github.com/owner/repo.git', credentialsId: 'git-credentials', branch: 'main'
                stash includes: 'deploy-script.ps1', name: 'deploy-script'
            }
        }
        stage('Deploy') {
            agent { label agentLabel }
            when {
                expression {
                    !skipBuild
                }
            }
            steps {
                unstash 'deploy-script'
                script {
                    println "Execute powershell deploy script on agents set for deploy"
                }
            }
        }
    }
}
I think that skipDefaultCheckout is what you are looking for:
pipeline {
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('Prep') {
            steps {
                script {
                    ........................
                }
            }
        }
    }
}
Take a look at the documentation:
skipDefaultCheckout
Skip checking out code from source control by default in the agent directive.
https://www.jenkins.io/doc/book/pipeline/syntax/
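Combined with the pipeline from the question, that could look roughly like this (a sketch, reusing the repository URL, credentials ID and AGENTS value from the question): skipDefaultCheckout suppresses the implicit checkout scm on every agent that picks up a stage, and the only explicit git step runs on master:

pipeline {
    agent none
    options {
        // Do not run the implicit "checkout scm" in each stage's agent workspace
        skipDefaultCheckout true
    }
    stages {
        stage('Powershell deploy script checkout') {
            agent { label 'master' }
            steps {
                // Explicit checkout only on master, then stash the script for the other agents
                git url: 'https://github.com/owner/repo.git', credentialsId: 'git-credentials', branch: 'main'
                stash includes: 'deploy-script.ps1', name: 'deploy-script'
            }
        }
        stage('Deploy') {
            // AGENTS is assumed to be the same job variable/parameter used in the question
            agent { label "${AGENTS}" }
            steps {
                unstash 'deploy-script'
            }
        }
    }
}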
I think you are requesting the impossible.
Now (with your inline pipeline script):
Your Jenkinsfile is inside your Jenkins configuration and is sent as such to each of your agents. There is no need for git on your agents.
Pipeline script from SCM:
Since you use git, SCM = git. So you are saying: my pipeline needs to be fetched from a git repository. You declare the Deploy stage to run on agent { label agentLabel }, so that stage is supposed to run on an agent other than master.
How would you imagine that agent could get the content of the Jenkinsfile to know what to do, without using git?
What happens in Jenkins?
Your master agent gets triggered that it needs to build.
The master agent checks out the Jenkinsfile using git (since it is a Pipeline script from SCM).
Jenkins reads the Jenkinsfile and sees what has to be done.
For the Prep stage, I'm not quite sure what happens without an agent; I guess it runs on the master agent.
The Powershell deploy script checkout stage is marked to run on the master agent, so it runs on the master agent (note that the repository will get checked out with git two more times):
before starting the stage, because Jenkins needs to know what to execute
one more checkout because you specify git url: 'https://github.com/owner/repo.git'...
The Deploy stage is marked to run on agentLabel, so Jenkins tries to check out your Jenkinsfile on that agent (using git)...
You can use a Scripted Pipeline to do this; it should basically look like this:
node('master') {
    checkout scm
    stash includes: 'deploy-script.ps1', name: 'deploy-script'
}

def stepsForParallel = [:]
env.AGENTS.split(' ').each { agent ->
    stepsForParallel["deploy ${agent}"] = { ->
        node(agent) {
            unstash 'deploy-script'
        }
    }
}
parallel stepsForParallel
You can find all info about the Jenkins agent section here.
In short: you can call any agent by name or label.
pipeline {
    agent {
        label 'master'
    }
}
If that does not work for you, you will need to set a label on the master node and call it by that label:
pipeline {
    agent {
        label 'master_label_here'
    }
}

Will a Jenkins pipeline compile twice when building a tag?

I want to set up a Jenkins pipeline which builds a Docker image whenever Jenkins is building a tag, so I used buildingTag() in the when condition. This works fine, but I have some trouble understanding Jenkins at this point.
Every commit triggers the "Compile" stage. If a tag is built, will the "Compile" stage be executed twice: once in the run for e.g. the master branch, and a second time when the "Tag" build job is explicitly started? If so, how could this be avoided?
pipeline {
    agent any
    environment {
        APP_NAME = 'myapp'
    }
    stages {
        stage('Compile') {
            steps {
                echo "Start compiling..."
            }
        }
        stage('Build Docker Image') {
            when { buildingTag() }
            steps {
                echo "Building a Docker image..."
            }
        }
    }
}
For a multibranch project, branch builds are separate from tag builds, so yes, each build would have the Compile stage running. They also have separate workspaces, so they should not affect each other.
If you don't want a stage to run in tag builds, just add a when { not { buildingTag() } } expression to that stage.
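Applied to the pipeline above, guarding the Compile stage the other way around could look like this (a sketch; only sensible if the tag build does not itself need the compile output):

stage('Compile') {
    // Skip compilation when this run was triggered for a tag
    when { not { buildingTag() } }
    steps {
        echo "Start compiling..."
    }
}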

Jenkins declarative pipeline: input with conditional steps without blocking the executor

I'm trying to get the following features to work in Jenkins' Declarative Pipeline syntax:
Conditional execution of certain stages only on the master branch
input to ask for user confirmation to deploy to a staging environment
While waiting for confirmation, it doesn't block an executor
Here's what I've ended up with:
pipeline {
    agent none
    stages {
        stage('1. Compile') {
            agent any
            steps {
                echo 'compile'
            }
        }
        stage('2. Build & push Docker image') {
            agent any
            when {
                branch 'master'
            }
            steps {
                echo "build & push docker image"
            }
        }
        stage('3. Deploy to stage') {
            when {
                branch 'master'
            }
            input {
                message "Deploy to stage?"
                ok "Deploy"
            }
            agent any
            steps {
                echo 'Deploy to stage'
            }
        }
    }
}
The problem is that stage 2 needs the output from 1, but this is not available when it runs. If I replace the various agent directives with a global agent any, then the output is available, but the executor is blocked waiting for user input at stage 3. And if I try and combine 1 & 2 into a single stage, then I lose the ability to conditionally run some steps only on master.
Is there any way to achieve all the behaviour I'm looking for?
You need to use the stash command at the end of your first stage and then unstash when you need the files.
I think these are available in the snippet generator.
As per the documentation:
Saves a set of files for use later in the same build, generally on another node/workspace. Stashed files are not otherwise available and are generally discarded at the end of the build. Note that the stash and unstash steps are designed for use with small files. For large data transfers, use the External Workspace Manager plugin, or use an external repository manager such as Nexus or Artifactory.
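Applied to the pipeline in the question, that could look roughly like this (a sketch; the stash name and file pattern are illustrative). Stage 1 stashes whatever it produced, stage 2 unstashes it on whichever node it lands on, and the input in stage 3 still sits outside any node, so no executor is held while waiting:

stage('1. Compile') {
    agent any
    steps {
        echo 'compile'
        // Save the compile output for later stages, which may run on a different node
        stash includes: 'build/**', name: 'build-output'
    }
}
stage('2. Build & push Docker image') {
    agent any
    when {
        branch 'master'
    }
    steps {
        // Restore the compile output into this stage's workspace
        unstash 'build-output'
        echo "build & push docker image"
    }
}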

Manual trigger of delivery in Jenkins 2.0 for a successful build

I have a pipeline with build and test stages, and on success it archives the result:
node {
    try {
        stage('build') {
            sh 'make'
        }
        stage('test') {
            sh 'make test'
        }
        stage('archive') {
            archiveArtifacts artifacts: "build/"
        }
    } finally {
        junit "tests/*.xml"
    }
}
I'd like to trigger a 'deploy' stage manually from the build page. For instance, on the build page there is a link to download the artifact; I'd like to have a custom button that triggers a 'deploy' action. This action would need access to the archived artifact and would call a web service.
I have tried the 'input' step in my pipeline, but it's not satisfactory, as the build doesn't appear as successful and stays active until I decide to proceed or abort. Since I'm using concurrentBuild=false, it's not a practical solution.
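One commonly suggested pattern, not shown in the answers on this page, is a separate manually triggered, parameterized deploy job that pulls the archived artifact back with the Copy Artifact plugin and then calls the web service. A rough sketch, assuming that plugin is installed; the job name, parameter, artifact path and URL are all hypothetical placeholders:

// Separate "deploy" pipeline job, parameterized with a string parameter BUILD_NUMBER
node {
    stage('deploy') {
        // Copy the archived artifact from the selected run of the build job (hypothetical job name)
        copyArtifacts projectName: 'my-build-job', selector: specific("${params.BUILD_NUMBER}"), filter: 'build/**'
        // Call the deployment web service (placeholder URL and artifact path)
        sh 'curl -X POST -F artifact=@build/app.tar.gz https://deploy.example.com/api/deploy'
    }
}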

One Jenkins job with 2 branches

Currently working on a basic deployment pipeline in Jenkins (with pipeline). I am looking for the best way of doing the following:
When the developer pushes to the development branch, all stages but deploy are executed.
When the developer pushes to the master branch, all stages including deploy are executed.
I have read about branch matching patterns you can use, but I'm not sure this is the right way, as the information I read was dated.
My Jenkins pipeline file
node {
    stage('Preparation') {
        git 'git@bitbucket.org:foo/bar.git'
    }
    stage('Build') {
        sh 'mkdir -p app/cache app/logs web/media/cache web/uploads'
        sh 'composer install'
    }
    stage('Test') {
        sh 'codecept run'
    }
    stage('Deploy') {
        sh 'mage deploy to:prod'
    }
}
There's no magic here; this is just Groovy code. The branch in scope will be available to the pipeline in some way. Inside the "stage" block, add an "if" check that compares the branch name with whatever logic you need, and either execute the body or not, depending on which branch is in scope.
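In a multibranch job the branch in scope is exposed as env.BRANCH_NAME, so one possible shape of that check is the following (a sketch; substitute whatever variable or parameter carries the branch in your setup):

stage('Deploy') {
    // Only deploy from master; development branch builds skip the deploy command
    if (env.BRANCH_NAME == 'master') {
        sh 'mage deploy to:prod'
    } else {
        echo "Skipping deploy for branch ${env.BRANCH_NAME}"
    }
}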
