Using Jenkins Environment Variable in Pipeline SH script - jenkins

I can't find a way to use the BUILD_NUMBER provided by Jenkins in an sh script. I have read some answers to similar questions, but nothing seems to help.
node {
    echo "Build number 1 $BUILD_NUMBER"
    // output ok
    stage('stage1') {
        echo "Build number 2 $BUILD_NUMBER"
        // output ok
        def BUILD_NUMBER = "$BUILD_NUMBER"
        withCredentials([sshUserPrivateKey(credentialsId: 'github-rsa-key', variable: 'RSAKEY')]) {
            echo "Build number 3 " + BUILD_NUMBER
            // output ok
            echo "Build number 4 $BUILD_NUMBER"
            // output ok
            // -----------------
            sh 'echo $BUILD_NUMBER' // NullPointer
            sh "echo $BUILD_NUMBER" // NullPointer
            sh "echo \$BUILD_NUMBER" // NullPointer
            sh "echo BUILD_NUMBER" // NullPointer
            withEnv(["BUILD_NUMBER=BUILD_NUMBER"]) {
                sh "echo $BUILD_NUMBER" // NullPointer!!
            }
            env.BUILD_NUMER = "$BUILD_NUMBER"
            sh "echo $BUILD_NUMBER" // NullPointer
            sh "echo ${env.BUILD_NUMBER}" // NullPointer
        }
    }
}
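The attempts above mix two interpolation layers: Groovy expands `$BUILD_NUMBER` in double-quoted strings before the step ever runs, while single-quoted strings pass the text through untouched so the shell resolves it from its own environment. A minimal plain-shell sketch of the second layer (this is my illustration, not from the original post; nothing here is Jenkins-specific):

```shell
# Jenkins exports BUILD_NUMBER into the environment of every sh step.
# A child shell reading '$BUILD_NUMBER' therefore works as long as the
# variable really is in the environment:
BUILD_NUMBER=5
export BUILD_NUMBER
sh -c 'echo "Build number: $BUILD_NUMBER"'   # the child shell reads the env var
# With Groovy double quotes, the substitution happens *before* the shell runs,
# so the shell would just receive the literal text: echo "Build number: 5"
```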

Basic solution: wrap the shell script in a """ block
node {
    echo "Build number 1: $BUILD_NUMBER"
    // output ok
    stage('stage1') {
        echo "Build number 2: $BUILD_NUMBER"
        // output ok
        def BUILD_NUMBER = "$BUILD_NUMBER"
        echo "Build number 3: " + BUILD_NUMBER
        // output ok
        echo "Build number 4: $BUILD_NUMBER"
        // output ok
        // -----------------
        sh 'printenv'
        sh """
            echo "Build number in sh script: ${env.BUILD_NUMBER}"
            echo "Job base name: ${env.JOB_BASE_NAME}"
        """
        // output ok
    }
}
Console Output:
Running on Jenkins in /var/lib/jenkins/workspace/test-infra-env
[Pipeline] {
[Pipeline] echo
Build number 1: 5
[Pipeline] stage
[Pipeline] { (stage1)
[Pipeline] echo
Build number 2: 5
[Pipeline] echo
Build number 3: 5
[Pipeline] echo
Build number 4: 5
[Pipeline] sh
+ printenv
JENKINS_HOME=/var/lib/jenkins
MAIL=/var/mail/jenkins
USER=jenkins
...
...
JOB_BASE_NAME=test-infra-env
BUILD_NUMBER=5
...
...
[Pipeline] sh
+ echo Build number in sh script: 5
Build number in sh script: 5
+ echo Job base name: test-infra-env
Job base name: test-infra-env
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS

There may be a more idiomatic approach (please share if you know one), but it works if you define it in an environment block first. Something like:
stage('Show Build Number') {
    environment {
        BUILD_NUMBER = "${env.BUILD_NUMBER}"
    }
    steps {
        sh '''
            echo "This is build $BUILD_NUMBER"
        '''
    }
}
There is a good post on code maven with useful examples.

Here's a simple example that works for me (Jenkins 2.164.2).
Edited to add a physical script as well. /tmp/script.sh contains:
#!/bin/bash
echo "Script: - Build number: $BUILD_NUMBER"
And the Jenkins job
node {
    echo "Node: Build number: $BUILD_NUMBER"
    stage('stage1') {
        echo "Stage: Build number: $BUILD_NUMBER"
        sh ("echo Shell: Build number: $BUILD_NUMBER")
        sh ("/tmp/script.sh")
    }
}
This example uses a "withCredentials" block. Note the single quotes, which are explained here: https://jenkins.io/doc/pipeline/steps/credentials-binding/
node {
    echo "Build number 1 $BUILD_NUMBER"
    // output ok
    stage('stage1') {
        withCredentials([string(credentialsId: 'my_password', variable: 'TOKEN')]) {
            sh '''
                echo "Shell: Build number: $BUILD_NUMBER"
            '''
            sh ('/tmp/script.sh')
        }
    }
}
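A side note on why the single quotes matter with credentials: with Groovy double quotes the secret would be interpolated into the command text (and could end up in the build log), whereas single quotes leave it for the shell to read from the environment at run time. A plain-shell sketch of the environment route, using a hypothetical TOKEN value (my addition, not from the linked docs):

```shell
# The secret travels via the environment rather than being pasted into the
# command text, so it never appears in the logged command line.
TOKEN='s3cr3t'    # hypothetical secret value, stands in for the bound credential
export TOKEN
sh -c 'echo "token length: ${#TOKEN}"'   # prints "token length: 6"
```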

Related

Jenkins fails on empty grep result

I need to filter a file in Jenkins. Filtering works as long as the result is not empty, but if the resulting output is empty, the pipeline fails with ERROR: script returned exit code 1 / Finished: FAILURE.
Example:
#!groovy
pipeline {
    agent any
    stages {
        stage ('mystage') {
            steps {
                script {
                    sh "echo '' > myfile"
                    sh "echo 'foo 0' >> myfile"
                    sh "echo 'foo 1' >> myfile"
                    sh "grep foo myfile"
                    sh "grep ba myfile"
                }
            }
        }
    }
}
output:
+ echo ''
[Pipeline] sh
+ echo 'foo 0'
[Pipeline] sh
+ echo 'foo 1'
[Pipeline] sh
+ grep foo myfile
foo 0
foo 1
[Pipeline] sh
+ grep ba myfile
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 1
Finished: FAILURE
Routing the output to a file with grep ba myfile > catchoutput does not work either.
How can I output the grep result without the pipeline failing in this edge case?
Adding a dummy line like sh "echo 'dummyline that won't match' >> myfile" seems to work, but it is a hack. Is there a clean solution?
We can capture the return value in a variable:
def ret = sh(script: 'grep ba myfile', returnStdout: true)
More info: https://www.jenkins.io/doc/pipeline/steps/workflow-durable-task-step/#code-sh-code-shell-script
Note that you can also pass returnStatus: true, so that the step does not fail even if the command exits with a non-zero code.
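Another shell-level option (my suggestion, sketched here outside Jenkins) is to mask grep's "no match" exit status inside the script itself, so the sh step keeps printing the possibly empty match list without failing:

```shell
# grep exits 1 when it finds no match; that non-zero status is what fails the sh step.
printf 'foo 0\nfoo 1\n' > myfile
grep ba myfile || true    # '|| true' masks the "no match" status
echo "caller sees exit code: $?"   # prints "caller sees exit code: 0"
```

Beware that `|| true` also hides exit code 2 (a real grep error, e.g. a missing file), which returnStatus lets you distinguish.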

jenkins get pipeline environment variable in ssh agent plugin sh command

I am trying to use an environment variable declared in the pipeline, but unfortunately the pipeline environment variable is not available in the ssh-agent shell command.
Please find the code below:
#!groovy
library 'reference-pipeline'

pipeline {
    agent {
        label 'Weblogic||Tomcat'
    }
    environment {
        HostName = 'test.prod.com'
        sshserver = "ssh -o StrictHostKeyChecking=no user@${HostName}"
        SERVER_ADDRESS = '192.25.58.201'
        CONFIG = 'PRODUCTION'
    }
    stages {
        stage("Check TLA version") {
            steps {
                script {
                    sshagent(credentials: ['SSH_Credentials']) {
                        sh """
set -e
$sshserver << "EOF"
echo "Configuration:$CONFIG" // output "Configuration: " should be "Configuration:production"
echo " Server:$SERVER_ADDRESS" // output "Server: " should be "Server: 192.25.58.201"
echo " Server Host : $hostname" // output "server host: testgood"
echo "started"
'`git describe`'
echo "ended"
cd /var/lib/ubuntu/test-srv/current
server_version="`git describe`"
echo "Current server version: $server_version"
if [[ $server_version != *'1.0.0_Release'* ]]; then
    echo "Error: The underlying server version is not 1.0.0_Release Release. Exiting ..."
    exit 1
fi
EOF
"""
                    }
                }
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
Instead of:
echo "Configuration:$CONFIG"
try:
echo "Configuration: ${env.CONFIG}"
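Groovy interpolates ${env.CONFIG} before the script is even sent to the remote host, so the value is baked into the here-document text. Independently of that, the quoting of the here-document delimiter decides which side expands $-variables, which is why the quoted "EOF" above left them to the remote shell. A plain-shell sketch of the difference (my illustration):

```shell
CONFIG='PRODUCTION'
# Unquoted delimiter: the local shell expands $CONFIG before the text is sent.
cat << EOF
Configuration: $CONFIG
EOF
# prints: Configuration: PRODUCTION

# Quoted delimiter: $CONFIG is passed through literally, so a remote shell
# that does not define CONFIG would print an empty value.
cat << "EOF"
Configuration: $CONFIG
EOF
# prints: Configuration: $CONFIG
```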

How to run the build inside docker container in Jenkins?

In my application I have a build script in package.json.
The build creates a dist folder that contains my application.
I set up a Jenkins master and a Jenkins agent as described in the boxboat "setup Jenkins with Docker" guide and watched the video on YouTube.
But after doing this, I don't think my bash commands are running inside a container.
I want to clone the repo and run npm i and npm run build inside the Docker container.
How do I modify this configuration to do that?
throttle(['throttleDocker']) {
    node('docker') {
        wrap([$class: 'AnsiColorBuildWrapper']) {
            try {
                stage('Build') {
                    checkout scm
                    sh '''
                        echo "in Setup"
                        docker ps -a
                        echo "after docker"
                        # ./ci/docker-down.sh
                        # ./ci/docker-up.sh
                    '''
                }
                stage('Test') {
                    parallel (
                        "unit": {
                            sh '''
                                echo "in unit"
                                # ./ci/test/unit.sh
                            '''
                        },
                        "functional": {
                            sh '''
                                echo "in functional"
                                # ./ci/test/functional.sh
                            '''
                        }
                    )
                }
                stage('Capacity Test') {
                    sh '''
                        echo "in Capacity Test"
                        # ./ci/test/stress.sh
                    '''
                }
            }
            finally {
                stage('Cleanup') {
                    sh '''
                        echo "in Cleanup"
                        # ./ci/docker-down.sh
                    '''
                }
            }
        }
    }
}
I tried the code below, but it doesn't work. I also tried adding an agent block after the try.
stage('Build') {
    agent {
        docker {
            label 'docker'
            image 'node:latest'
        }
    }
    steps {
        checkout scm
        sh 'node -v'
    }
}
...
You can try the scripted pipeline below:
node {
    docker.image('yourimage').inside {
        stage('Build') {
            sh 'echo "Build stage inside container"'
        }
        stage('Test') {
            sh 'echo "Test Stage inside container"'
        }
    }
}

pass variables between stages jenkins pipeline

I'm creating a Jenkins pipeline for a simple deployment into a Kubernetes cluster, using my own private Docker registry. The pipeline clones my repo, builds a Docker image, writes the built image id into the Kubernetes deployment manifest, and deploys the pod. But I'm having trouble passing the built image id to the next stage. After some research I managed to pass the id to the next stage, but when I try to add the new id to the deployment manifest, it comes out empty.
here is my pipeline
pipeline {
    environment {
        BUILD_IMAGE_ID = ''
    }
    agent any
    stages {
        stage('Cloning Git') {
            steps {
                git( url: 'https://xxxxxx.git',
                     credentialsId: 'id',
                     branch: 'master')
            }
        }
        stage('Login Docker Registry') {
            steps {
                script {
                    sh 'docker login --username=xxxx --password=xxxx registry.xxxx.com'
                }
            }
        }
        stage('Building Image') {
            steps {
                script {
                    def IMAGE_ID = sh script: 'docker run -e REPO_APP_BRANCH=xxxx -e REPO_APP_NAME=xxx --volume /var/run/docker.sock:/var/run/docker.sock registry.xxxx/image-build', returnStdout: true
                    println "Build image id: ${IMAGE_ID} "
                    BUILD_IMAGE_ID = IMAGE_ID.replace("/n","")
                    env.BUILD_IMAGE_ID = BUILD_IMAGE_ID
                }
            }
        }
        stage('Integration') {
            steps {
                script {
                    echo "passed: ${BUILD_IMAGE_ID} "
                    //update deployment manifests with latest docker tag
                    sh 'sed -i s,BUILD_ID,${BUILD_IMAGE_ID},g deployment-manifests/development/Service-deployments.yaml'
                }
            }
        }
    }
}
I don't want to save the value to a file and read it back to do the operation.
output
[Pipeline] echo
Build image id:
registry.xxxx.com/service:3426d51-baeffc2
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Integration)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
passed:
registry.xxxx.com/service:3426d51-baeffc2
[Pipeline] sh
[orderservice] Running shell script
+ sed -i s,BUILD_ID,,g deployment-manifests/development/service-deployments.yaml
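Two details in the question's code stand out (my observations, since this excerpt ends without an accepted answer): the replace("/n","") call uses "/n" where a newline is "\n" (trim() would be safer), and the single-quoted Groovy string leaves ${BUILD_IMAGE_ID} for the shell to resolve from its environment. A plain-shell sketch of the substitution once a non-empty value actually reaches the shell (the file name and image id below are hypothetical):

```shell
# Hypothetical image id, standing in for what the build stage would export.
BUILD_IMAGE_ID='registry.example.com/service:3426d51-baeffc2'
export BUILD_IMAGE_ID
printf 'image: BUILD_ID\n' > service-deployments.yaml
# Quote the expression so the comma-delimited sed program survives word splitting.
sed -i "s,BUILD_ID,${BUILD_IMAGE_ID},g" service-deployments.yaml
cat service-deployments.yaml   # prints: image: registry.example.com/service:3426d51-baeffc2
```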

Hung mail notification in Jenkins Pipeline or Workflow

I have created a test workflow as below:
node("master") {
    ws("/opt/mount1/jenkins/jobs/GoogleFlow/workspace/${env.BUILD_NUMBER}") {
        try {
            stage name: 'sync', concurrency: 3
            echo "before sync"
            sh '''touch buildFile
            echo "This is from ${BUILD_NUMBER}" >> buildFile
            cat buildFile'''
            sh "sleep 5"
            echo "after sync"
            sh "date"
            stage name: 'build', concurrency: 1
            echo "before build"
            sh "date"
            sh '''sleep 10
            cat buildFile'''
            echo "build 1/3"
            sh "sleep 5"
            echo "build 2/3"
            sh '''sleep 5
            cat buildFile'''
            echo "build 3/3"
            sh "date"
            stage name: 'test', concurrency: 3
            echo "before test"
            sh "date"
            sh '''sleep 10
            cat buildFile'''
            sh "date"
            stage name: 'delete', concurrency: 1
            sh '''pwd
            ls -al'''
            //deleteDir()
            //sh '''pwd
            //ls -al'''
        }
        catch (err) {
            stage 'Send Notification'
            mail (to: 'XXXXXXX@gmail.com',
                  subject: "test",
                  body: "test");
        }
    }
}
I am trying to get an email notification using the try-catch. I referred to this blog post, but when it comes to the 'Send Notification' stage it just hangs.
But if I use the old-style Jenkins job, I receive emails, which shows that the SMTP setup is working. Below is the console output when it hangs:
[Pipeline] Allocate node : Start
Running on master in /opt/mount1/jenkins/jobs/GoogleFlow/workspace#4
[Pipeline] node {
[Pipeline] Allocate workspace : Start
Running in /opt/mount1/jenkins/jobs/GoogleFlow/workspace/205
[Pipeline] ws {
[Pipeline] stage: sync
Entering stage sync
Proceeding
[Pipeline] echo
before sync
[Pipeline] sh
[205] Running shell script
+ touch buildFile
+ echo 'This is from 205'
+ cat buildFile
This is from 205
[Pipeline] sh
[205] Running shell script
+ sleep 5
[Pipeline] echo
after sync
[Pipeline] sh
[205] Running shell script
+ date
Tue Feb 2 22:54:52 UTC 2016
[Pipeline] stage: build
Entering stage build
Waiting for builds [204]
Canceled since #206 got here
[Pipeline] stage: Send Notification
Entering stage Send Notification
Proceeding
[Pipeline] mail
Any idea how to fix this?
Jenkins - 1.643
Mailer plugin - 1.11
I had the same problem and fixed it by upgrading the Mailer plugin to v1.16.
