I have tried all sorts of ways but nothing seems to be working. Here is my Jenkinsfile:
def ZIP_NODE
def CODE_VERSION
pipeline{
/*A declarative pipeline*/
agent {
/*Agent section*/
// where would you like to run the code
label 'ubuntu'
}
options{
timestamps()
}
parameters {
choice(choices: ['dev'], description: 'Name of the environment', name: 'ENV')
choice(choices: ['us-east-1', 'us-west-1','us-west-2','us-east-2','ap-south-1'], description: 'What AWS region?', name: 'AWS_DEFAULT_REGION')
string(defaultValue: "", description: '', name: 'APP_VERSION')
}
stages{
/*stages section*/
stage('Initialize the variables') {
// Each stage is made up of steps
steps{
script{
CODE_VERSION='${BUILD_NUMBER}-${ENV}'
ZIP_NODE='abcdefgh-0.0.${CODE_VERSION}.zip'
}
}
}
stage ('code - Checkout') {
steps{
checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxxxxxxxxxxxxxxxxxxxxxxxx', url: 'http://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.git']]])
}
}
stage ('code - Build'){
steps{
sh '''
echo ${JOB_NAME}
pwd
echo ${ZIP_NODE}
echo 'remove already existing zip files'
rm -rf *.zip
zip -r ${ZIP_NODE} .
chmod 777 $ZIP_NODE
'''
}
}
stage('Deploy on Beanstalk'){
steps{
build job: 'abcdefgh-PHASER' , parameters: [[$class: 'StringParameterValue', name: 'vpc', value: ENV], [$class: 'StringParameterValue', name: 'ZIP_NODE', value: ZIP_NODE], [$class: 'StringParameterValue', name: 'CODE_VERSION', value: CODE_VERSION], [$class: 'StringParameterValue', name: 'APP_VERSION', value: BUILD_NUMBER], [$class: 'StringParameterValue', name: 'AWS_DEFAULT_REGION', value: AWS_DEFAULT_REGION], [$class: 'StringParameterValue', name: 'ParentJobName', value: JOB_NAME]]
}
}
}
}
The output of the script step in stage 'Initialize the variables' gives me nothing; it is not setting the value of the global variable ZIP_NODE:
[Pipeline] stage
[Pipeline] { (Initialize the variables)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
And then when we get to stage 'code - Build' we do not get the value of ZIP_NODE. See the echo statement at 22:34:17:
[Pipeline] stage
[Pipeline] { (code - Build)
[Pipeline] sh
22:34:16 [abcdefgh-ci-dev-pipeline] Running shell script
22:34:17 + echo abcdefgh-ci-dev-pipeline
22:34:17 abcdefgh-ci-dev-pipeline
22:34:17 + pwd
22:34:17 /home/advisor/Jenkins/workspace/abcdefgh-ci-dev-pipeline
22:34:17 + echo
22:34:17
22:34:17 + echo remove already existing zip files
Thanks to @awefsome, I had some observations which I would like to add in detail:
When I use the code below, I get the desired output, i.e. the correct value of ZIP_NODE:
stage ('code - Build'){
steps{
sh "echo ${JOB_NAME} && pwd && echo ${ZIP_NODE} && echo 'remove alraedy existing zip files' && rm -rf *.zip && zip -r ${ZIP_NODE} . && chmod 777 $ZIP_NODE"
}
}
But when I use the code below, I do not get the value of ZIP_NODE:
stage ('code - Build'){
steps{
sh '''
echo ${ZIP_NODE}
echo ${JOB_NAME}
pwd
echo ${ZIP_NODE}
echo ${CODE_VERSION}
#rm -rf .ebextensions
echo 'remove already existing zip files'
rm -rf *.zip
zip -r ${ZIP_NODE} .
chmod 777 $ZIP_NODE
'''
}
}
sh '''
'''
should be
sh """
"""
With single quotes the variables don't get processed: Groovy only interpolates ${...} expressions inside double-quoted (or triple-double-quoted) strings, so a triple-single-quoted script reaches the shell verbatim, and the shell finds no ZIP_NODE in its environment.
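A minimal sketch of the difference, reusing ZIP_NODE from the question (the value here is made up):

script {
    ZIP_NODE = 'abcdefgh-0.0.1-dev.zip'
}
// Triple single quotes: Groovy passes the text through untouched; the shell
// looks for an environment variable named ZIP_NODE, finds none, prints a blank line.
sh '''
echo ${ZIP_NODE}
'''
// Triple double quotes: Groovy substitutes ${ZIP_NODE} before the shell ever runs,
// so this prints abcdefgh-0.0.1-dev.zip.
sh """
echo ${ZIP_NODE}
"""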
Try the following and see how it goes:
def ZIP_NODE
def CODE_VERSION
pipeline{
/*A declarative pipeline*/
agent {
/*Agent section*/
// where would you like to run the code
label 'master'
}
options{
timestamps()
}
parameters {
choice(choices: ['dev'], description: 'Name of the environment', name: 'ENV')
choice(choices: ['us-east-1', 'us-west-1','us-west-2','us-east-2','ap-south-1'], description: 'What AWS region?', name: 'AWS_DEFAULT_REGION')
string(defaultValue: "", description: '', name: 'APP_VERSION')
}
stages{
/*stages section*/
stage('Initialize the variables') {
// Each stage is made up of steps
steps{
script{
CODE_VERSION="${BUILD_NUMBER}-${ENV}"
ZIP_NODE="abcdefgh-0.0.${CODE_VERSION}.zip"
}
}
}
stage ('code - Checkout') {
steps{
println "checkout skipped"
//checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxxxxxxxxxxxxxxxxxxxxxxxx', url: 'http://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.git']]])
}
}
stage ('code - Build'){
steps{
sh "echo ${JOB_NAME} && pwd && echo ${ZIP_NODE} && echo 'remove alraedy existing zip files' && rm -rf *.zip && zip -r ${ZIP_NODE} . && chmod 777 $ZIP_NODE"
}
}
stage('Deploy on Beanstalk'){
steps{
println "build job skipped"
//build job: 'abcdefgh-PHASER' , parameters: [[$class: 'StringParameterValue', name: 'vpc', value: ENV], [$class: 'StringParameterValue', name: 'ZIP_NODE', value: ZIP_NODE], [$class: 'StringParameterValue', name: 'CODE_VERSION', value: CODE_VERSION], [$class: 'StringParameterValue', name: 'APP_VERSION', value: BUILD_NUMBER], [$class: 'StringParameterValue', name: 'AWS_DEFAULT_REGION', value: AWS_DEFAULT_REGION], [$class: 'StringParameterValue', name: 'ParentJobName', value: JOB_NAME]]
}
}
}
}
I got the following output:
Started by user jenkins
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /Users/Shared/Jenkins/Home/workspace/test
[Pipeline] {
[Pipeline] timestamps
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Initialize the variables)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (code - Checkout)
[Pipeline] echo
21:19:06 checkout skipped
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (code - Build)
[Pipeline] sh
21:19:06 [test] Running shell script
21:19:06 + echo test
21:19:06 test
21:19:06 + pwd
21:19:06 /Users/Shared/Jenkins/Home/workspace/test
21:19:06 + echo abcdefgh-0.0.17-dev.zip
21:19:06 abcdefgh-0.0.17-dev.zip
21:19:06 + echo 'remove already existing zip files'
21:19:06 remove already existing zip files
21:19:06 + rm -rf '*.zip'
21:19:06 + zip -r abcdefgh-0.0.17-dev.zip .
21:19:06
21:19:06 zip error: Nothing to do! (try: zip -r abcdefgh-0.0.17-dev.zip . -i .)
[Pipeline] }
[Pipeline] // stage
Global variables on Jenkins should be declared outside the pipeline; they can be initialized/used inside script blocks and are of course known to all scopes inside the pipeline. Example:
def internal_ip
pipeline {
agent { node { label "test" } }
stages {
stage('init') {
steps {
script {
def x
if(env.onHold.toBoolean() == true){
x=1
}else{
x=2
}
internal_ip = sh (
script: "echo 192.168.0.${x}",
returnStdout: true
).trim()
}
}
}
stage('test') {
steps {
script {
echo "my internal ip is: ${internal_ip}"
}
}
}
}
}
In addition to @avivamg's answer, which is correct:
One remaining problem is that if you want to access Jenkins' environment variables (see http://<JENKINS_URL>/env-vars.html) to initialize these globals, e.g.:
def workspaceDir = "${WORKSPACE}"
you get:
groovy.lang.MissingPropertyException: No such property: WORKSPACE for class: groovy.lang.Binding
The idea of:
def workspaceDir
pipeline {
agent any
environment {
workspaceDir = "${WORKSPACE}"
}
stages {
stage('Globals test') {
steps {
script {
echo "workspaceDir=${workspaceDir}"
echo workspaceDir
}
}
}
}
}
leads to the output:
workspaceDir=null
null
since there are two different contexts involved, environment and Groovy, which are apparently independent of each other.
It works with:
environment {
workspaceDir = "${WORKSPACE}"
}
but that's an environment variable then, not a variable in the Groovy context.
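If the environment route is acceptable, you read the value back through env (a minimal sketch, assuming any agent is fine):

pipeline {
    agent any
    environment {
        workspaceDir = "${WORKSPACE}"
    }
    stages {
        stage('Globals test') {
            steps {
                script {
                    // resolved through env, not through the Groovy binding
                    echo "workspaceDir=${env.workspaceDir}"
                }
                // environment entries are visible to the shell as well
                sh 'echo $workspaceDir'
            }
        }
    }
}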
Declaring and initializing in the Groovy context works via stage(s):
def workspaceDir
pipeline {
agent any
stages {
stage('Initializing globals') {
steps {
script {
workspaceDir = "${WORKSPACE}"
}
}
}
stage('Globals test') {
steps {
script {
echo "workspaceDir=${workspaceDir}"
echo workspaceDir
}
}
}
}
}
Output:
workspaceDir=C:\Users\jenkins\AppData\Local\Jenkins\.jenkins\workspace\Globals-test-project
C:\Users\jenkins\AppData\Local\Jenkins\.jenkins\workspace\Globals-test-project
My experience with using global variables inside a shell script is to declare them globally and then use a stage-level environment variable. It looks like the shell script only has access to these stage-private environment variables, not the globals. As a sample:
environment{
COMMIT_ID = "foo"
}
stages{
stage('Build'){
environment {
commitId = sh(returnStdout: true, script: 'git rev-parse HEAD').trim()
}
steps {
script {
COMMIT_ID = commitId
}
}
}
stage ('Build-Win-Container'){
environment {
commitId = "${COMMIT_ID}"
}
steps{
sh label: 'build/push container', script: '''
echo "With Windows commitId: ${commitId}"
'''
}
}
}
I tried dozens of ways, including several from Stack Overflow, but none of them let the shell script access the variable. I think this is because the shell script is a private instance inside a step and only has access to variables inside that step (my opinion).
So I set a global environment variable, filled it inside the first stage with a script, then set a stage-level environment variable inside the next stage and filled it from the global there. Finally it was available inside the shell script. No other way worked for me.
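For comparison, a shorter route that may also work, and which is not from the original post, is to hand the Groovy variable to the shell explicitly with withEnv (a sketch):

script {
    withEnv(["COMMIT_ID=${COMMIT_ID}"]) {
        // the shell now sees COMMIT_ID as a real environment variable
        sh 'echo "commitId: ${COMMIT_ID}"'
    }
}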
stages{
stage('Update pipline'){
when{
expression{ env.Update_Pipeline = true}
}
steps{
script{
currentBuild.result = 'ABORTED'
error("Updated pipeline & please re-run pipeline again")
}
}
}
stage('Checkout'){
steps{
script{
dir('app'){
checkout([$class:'GitSCM',branches:[[name:'*/master']],
doGenerateSubmoduleConfigurations: false,
extensions: [],
submoduleCfg:[],
userRemoteConfigs: [[credentialsId:'Bitbucket-credentials',
url: 'git@bitbucket.org:exium-c2/exops-nodemon.git']]
])
//GIT_BRANCH= sh(returnStdout: true,script:"git rev-parse --abbrev-ref HEAD").trim()
current_tag=sh(returnStdout: true, script: "git describe --tags --abbrev=0").trim()
echo "${current_tag}"
}
}
}
}
stage('CD-Deploy'){
steps{
script{
if (( env.current_tag ='stg-v.2.0')){
sh 'ssh -o StrictHostKeyChecking=no centos@65.0.194.82 "cd /etc/release/exops_nodemon && docker-compose down && docker-compose up -d"'
}
}
}
}
}
}
I have created tags like stg-v.1.0, stg-v.2.0, etc., so when the condition doesn't match, the stage [CD-Deploy] should be skipped; but my condition is matched and the stage still gets skipped. Please help me understand this issue. Thanks.
Note: my requirement is to skip the stage based on the tag; if the tag does not start with 'stg', the stage needs to be skipped.
output:
git describe --tags --abbrev=0
stg-v.2.0
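An observation, not from the original thread: both conditions use a single = (assignment) where == (comparison) is intended. In particular, when{ expression{ env.Update_Pipeline = true } } assigns true, so the expression is always truthy, the 'Update pipline' stage always runs, and its error(...) aborts the build; that is why every later stage, including CD-Deploy, shows up as skipped. The if (( env.current_tag ='stg-v.2.0')) line has the same problem. A sketch of the stated requirement (run CD-Deploy only when the tag starts with 'stg'):

stage('CD-Deploy') {
    when {
        // current_tag is assigned without 'def' in the Checkout stage,
        // so it is a binding variable and visible here
        expression { current_tag != null && current_tag.startsWith('stg') }
    }
    steps {
        sh 'ssh -o StrictHostKeyChecking=no centos@65.0.194.82 "cd /etc/release/exops_nodemon && docker-compose down && docker-compose up -d"'
    }
}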
I am using more than one agent in my declarative pipeline. Is there any way to copy artifacts (input.txt) from agent1 to agent2? Here is my declarative pipeline:
pipeline {
agent none
stages {
stage('Build') {
agent {
label 'agent1'
}
steps {
sh 'echo arjun > input.txt'
}
post {
always {
archiveArtifacts artifacts: 'input.txt',
fingerprint: true
}
}
}
stage('Test') {
agent {
label 'agent2'
}
steps {
sh 'cat input.txt'
}
}
}
}
You can use the Copy Artifact Plugin, which does exactly that.
Given your Jenkinsfile, it then turns into this:
pipeline {
agent none
stages {
stage('Build') {
agent { label 'agent1' }
steps {
sh 'echo arjun > input.txt'
}
post {
always {
archiveArtifacts artifacts: 'input.txt', fingerprint: true
}
}
}
stage('Test') {
agent { label 'agent2' }
steps {
// this brings in artifacts from the job with the same name as this one, from this very build
step([
$class: 'CopyArtifact',
filter: 'input.txt',
fingerprintArtifacts: true,
optional: true,
projectName: env.JOB_NAME,
selector: [$class: 'SpecificBuildSelector',
buildNumber: env.BUILD_NUMBER]
])
sh 'cat input.txt'
}
}
}
}
Use stash and unstash. Example from https://www.jenkins.io/doc/pipeline/examples/:
// First we'll generate a text file in a subdirectory on one node and stash it.
stage "first step on first node"
// Run on a node with the "first-node" label.
node('first-node') {
// Make the output directory.
sh "mkdir -p output"
// Write a text file there.
writeFile file: "output/somefile", text: "Hey look, some text."
// Stash that directory and file.
// Note that the includes could be "output/", "output/*" as below, or even
// "output/**/*" - it all works out basically the same.
stash name: "first-stash", includes: "output/*"
}
// Next, we'll make a new directory on a second node, and unstash the original
// into that new directory, rather than into the root of the build.
stage "second step on second node"
// Run on a node with the "second-node" label.
node('second-node') {
// Run the unstash from within that directory!
dir("first-stash") {
unstash "first-stash"
}
// Look, no output directory under the root!
// pwd() outputs the current directory Pipeline is running in.
sh "ls -la ${pwd()}"
// And look, output directory is there under first-stash!
sh "ls -la ${pwd()}/first-stash"
}
I am working on a declarative Jenkins pipeline for Terraform deployments. I want to have the terraform init / select workspace / plan in one stage, ask for approval in another stage, and then do the apply in yet another stage. I have the agent at the top set to none, and the stages use a Kubernetes agent for a Docker image we created that has the packages we need; I declare that image in each stage. When I execute the pipeline, I get an error in the apply stage saying I need to reinitialize Terraform, even though I initialized it in the init/plan stage. I figure this is in the nature of the stages running on different nodes.
I have it working by doing init / plan and stashing the plan. In the apply stage, it unstashes the plan, calls init / select workspace again, and then finally applies the unstashed plan.
I realize I could set the agent at the top, but according to the Jenkins documentation that is bad practice, as waiting for user input would block the executor.
I feel like there has to be a way to do this more elegantly. Any suggestions?
Here's my code:
def repositoryURL = env.gitlabSourceRepoHttpUrl != null && env.gitlabSourceRepoHttpUrl != "" ? env.gitlabSourceRepoHttpUrl : env.RepoURL
def repositoryBranch = env.gitlabTargetBranch != null && env.gitlabTargetBranch != "" ? env.gitlabTargetBranch : env.RepoBranch
def notificationEmail = env.gitlabUserEmail != null && env.gitlabUserEmail != "" ? env.gitlabUserEmail : env.Email
def projectName = env.ProjectName
def deployAccountId = env.AccountId
pipeline {
agent none
stages {
stage("Checkout") {
agent any
steps {
git branch: "${repositoryBranch}", credentialsId: '...', url: "${repositoryURL}"
stash name: 'tf', useDefaultExcludes: false
}
}
stage("Terraform Plan") {
agent {
kubernetes {
label 'myagent'
containerTemplate {
name 'cis'
image 'docker-local.myrepo.com/my-image:v2'
ttyEnabled true
command 'cat'
}
}
}
steps {
container('cis') {
unstash 'tf'
script {
sh "terraform init"
try {
sh "terraform workspace select ${deployAccountId}_${projectName}_${repositoryBranch}"
} catch (Exception e) {
sh "terraform workspace new ${deployAccountId}_${projectName}_${repositoryBranch}"
}
sh "terraform plan -out=${deployAccountId}_${projectName}_${repositoryBranch}_plan.tfplan -input=false"
stash includes: "*.tfplan" name: "tf-plan", useDefaultExcludes: false
}
}
}
post{
success{
echo "Terraform init complete"
}
failure{
echo "Terraform init failed"
}
}
}
stage ("Terraform Plan Approval") {
agent none
steps {
script {
def userInput = input(id: 'confirm', message: 'Apply Terraform?', parameters: [ [$class: 'BooleanParameterDefinition', defaultValue: false, description: 'Apply terraform', name: 'confirm'] ])
}
}
}
stage ("Terraform Apply") {
agent {
kubernetes {
label 'myagent'
containerTemplate {
name 'cis'
image 'docker-local.myrepo.com/my-image:v2'
ttyEnabled true
command 'cat'
}
}
}
steps {
container("cis") {
withCredentials([[
$class: 'AmazonWebServicesCredentialsBinding',
credentialsId: 'my-creds',
accessKeyVariable: 'AWS_ACCESS_KEY_ID',
secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
]]) {
script {
unstash "tf"
unstash "tf-plan"
sh "terraform init"
try {
sh "terraform workspace select ${deployAccountId}_${projectName}_${repositoryBranch}"
} catch (Exception e) {
sh "terraform workspace new ${deployAccountId}_${projectName}_${repositoryBranch}"
}
sh """
set +x
temp_role="\$(aws sts assume-role --role-arn arn:aws:iam::000000000000:role/myrole --role-session-name jenkinzassume)" > /dev/null 2>&1
export AWS_ACCESS_KEY_ID=\$(echo \$temp_role | jq .Credentials.AccessKeyId | xargs) > /dev/null 2>&1
export AWS_SECRET_ACCESS_KEY=\$(echo \$temp_role | jq .Credentials.SecretAccessKey | xargs) > /dev/null 2>&1
export AWS_SESSION_TOKEN=\$(echo \$temp_role | jq .Credentials.SessionToken | xargs) > /dev/null 2>&1
set -x
terraform apply ${deployAccountId}_${projectName}_${repositoryBranch}_plan.tfplan
"""
}
}
}
}
}
}
}
I have a pipeline in which I build my image through a Docker container, and it outputs the image tag. I want to pass that image tag to the next stage. When I echo it in the next stage it prints out, but when I use it in a shell it comes up empty. Here is my pipeline:
pipeline {
agent any
stages {
stage('Cloning Git') {
steps {
git( url: 'https://xxx@bitbucket.org/xxx/xxx.git',
credentialsId: 'xxx',
branch: 'master')
}
}
stage('Building Image') {
steps{
script {
env.IMAGE_TAG = sh script: "docker run -e REPO_APP_BRANCH=master -e REPO_APP_NAME=exampleservice -e DOCKER_HUB_REPO_NAME=exampleservice --volume /var/run/docker.sock:/var/run/docker.sock registry.xxxx/build", returnStdout: true
}
}
}
stage('Integration'){
steps{
script{
echo "passed: ${env.IMAGE_TAG}"
sh """
helm upgrade exampleservice charts/exampleservice --set image.tag=${env.IMAGE_TAG}
"""
sh "sleep 5"
}
}
}
}
}
pipeline output
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Integration)
[Pipeline] script
[Pipeline] {
[Pipeline] echo
passed:
b79c3bf-b6eec4f
[Pipeline] sh
[test101] Running shell script
+ helm upgrade exampleservice charts/exampleservice --set image.tag=
I am getting an empty image tag.
You should pass this value along using env. Replace your code with this one:
pipeline {
agent any
stages {
stage('Cloning Git') {
steps {
git( url: 'https://xxx@bitbucket.org/xxx/xxx.git',
credentialsId: 'xxx',
branch: 'master')
}
}
stage('Building Image') {
steps{
script {
env.IMAGE_TAG = sh script: "docker run -e REPO_APP_BRANCH=master -e REPO_APP_NAME=exampleservice -e DOCKER_HUB_REPO_NAME=exampleservice --volume /var/run/docker.sock:/var/run/docker.sock registry.xxxx/build", returnStdout: true
}
}
}
stage('Integration'){
steps{
script{
echo "passed: ${env.IMAGE_TAG}"
sh """
helm upgrade exampleservice charts/exampleservice \
--set image.tag="${env.IMAGE_TAG}"
"""
sh "sleep 5"
}
}
}
}
}
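Independent of the quoting, the blank line in the output above ("passed:" followed by the tag on its own line) suggests the captured tag still contains a newline: returnStdout returns the command's output verbatim, including the trailing newline. Trimming when you capture is the usual fix:

env.IMAGE_TAG = sh(
    script: 'docker run -e REPO_APP_BRANCH=master -e REPO_APP_NAME=exampleservice -e DOCKER_HUB_REPO_NAME=exampleservice --volume /var/run/docker.sock:/var/run/docker.sock registry.xxxx/build',
    returnStdout: true
).trim()  // strip the trailing newline so the tag is safe to embed in shell commands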
I am using the declarative syntax for my pipeline and would like to store the path to the workspace used in one of my stages, so that the same path can be used in a later stage.
I have seen that I can call pwd() to get the current directory, but how do I assign it to a variable that can be used between stages?
EDIT
I have tried to do this by defining my own custom variable and using it like so with the ws step:
pipeline {
agent { label 'master' }
stages {
stage('Build') {
steps {
script {
def workspace = pwd()
}
sh '''
npm install
bower install
gulp set-staging-node-env
gulp prepare-staging-files
gulp webpack
'''
stash includes: 'dist/**/*', name: 'builtSources'
stash includes: 'config/**/*', name: 'appConfig'
node('Protractor') {
dir('/opt/foo/deploy/') {
unstash 'builtSources'
unstash 'appConfig'
}
}
}
}
stage('Unit Tests') {
steps {
parallel (
"Jasmine": {
node('master') {
ws("${workspace}"){
sh 'gulp karma-tests-ci'
}
}
},
"Mocha": {
node('master') {
ws("${workspace}"){
sh 'gulp mocha-tests'
}
}
}
)
}
post {
success {
sh 'gulp combine-coverage-reports'
sh 'gulp clean-lcov'
publishHTML(target: [
allowMissing: false,
alwaysLinkToLastBuild: false,
keepAll: false,
reportDir: 'test/coverage',
reportFiles: 'index.html',
reportName: 'Test Coverage Report'
])
}
}
}
}
}
In the Jenkins build console, I see this happens:
[Jasmine] Running on master in /var/lib/jenkins/workspace/_Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ@2
[Pipeline] [Jasmine] {
[Pipeline] [Jasmine] ws
[Jasmine] Running in /var/lib/jenkins/workspace/_Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ@2@2
The original workspace allocated in the first stage is actually _Pipelines_IACT-Jenkinsfile-UL3RGRZZQD3LOPY2FUEKN5XCY4ZZ6AGJVM24PLTO3OPL54KTJCEQ
So it doesn't look like it's working. What am I doing wrong here?
Thanks
pipeline {
agent none
stages {
stage('Stage-One') {
steps {
echo 'StageOne.....'
script{ name = 'StackOverFlow'}
}
}
stage('Stage-Two'){
steps{
echo 'StageTwo.....'
echo "${name}"
}
}
}
}
The above prints StackOverFlow in Stage-Two for echo "${name}": assigning to name without def inside the script block puts it into the binding, where later stages can see it.
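Applied to the question above, that means dropping the def when assigning inside the script block; def makes workspace local to that block, so the later ws("${workspace}") never receives the stored path and Jenkins ends up allocating a fresh workspace (the ...@2@2 directory in the console output). A sketch:

stage('Build') {
    steps {
        script {
            // no 'def': the variable lands in the binding and stays
            // visible to later stages
            workspace = pwd()
        }
    }
}
stage('Unit Tests') {
    steps {
        node('master') {
            ws("${workspace}") {
                sh 'gulp karma-tests-ci'
            }
        }
    }
}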
You can also use sh "echo ${env.WORKSPACE}" to get the absolute path of the directory assigned to the build as a workspace.
You could put the value into an environment variable, as described in this answer:
CURRENT_PATH = sh(
script: 'pwd',
returnStdout: true
).trim()
Which version are you running? Maybe you can just assign the WORKSPACE variable to an environment var?
Or did I totally misunderstand, and this is what you are looking for?