I have added Helm as a pod template in the values.yaml file:
podTemplates:
  helm: |
    - name: helm
      label: jenkins-helm
      serviceAccount: jenkins
      containers:
        - name: helm
          image: lachlanevenson/k8s-helm:v3.1.1
          command: "/bin/sh -c"
          args: "cat"
          ttyEnabled: true
          privileged: true
          resourceRequestCpu: "400m"
          resourceRequestMemory: "512Mi"
          resourceLimitCpu: "1"
          resourceLimitMemory: "1024Mi"
So I want to run Helm in the pipeline as:
steps {
  container('helm') {
    sh "helm upgrade --install --force ./helm"
  }
}
but I got this error:
/home/jenkins/workspace/coverwhale#tmp/durable-4d1fbfd5/script.sh: 1: /home/jenkins/workspace/coverwhale#tmp/durable-4d1fbfd5/script.sh: helm: not found
Version of Helm and Kubernetes:
Helm Version:
$ helm version
version.BuildInfo{Version:"v3.5.2", GitCommit:"167aac70832d3a384f65f9745335e9fb40169dc2", GitTreeState:"dirty", GoVersion:"go1.15.7"}
Kubernetes Version:
$ kubectl version
Client Version: version.Info{Major:"1", Minor:"20", GitVersion:"v1.20.2", GitCommit:"faecb196815e248d3ecfb03c680a4507229c2a56", GitTreeState:"clean", BuildDate:"2021-01-13T13:28:09Z", GoVersion:"go1.15.5", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"20", GitVersion:"v1.20.2", GitCommit:"faecb196815e248d3ecfb03c680a4507229c2a56", GitTreeState:"clean", BuildDate:"2021-01-13T13:20:00Z", GoVersion:"go1.15.5", Compiler:"gc", Platform:"linux/amd64"}
What happened:
/home/jenkins/workspace/coverwhale#tmp/durable-4d1fbfd5/script.sh: 1: /home/jenkins/workspace/coverwhale#tmp/durable-4d1fbfd5/script.sh: helm: not found
What you expected to happen:
run helm chart
Pipeline code:
pipeline {
  agent any
  stages {
    stage('Initialize Docker') {
      steps {
        script {
          def docker = tool 'whaledocker'
          echo "${docker}"
          echo "${env.PATH}"
          env.PATH = "${docker}/bin:${env.PATH}"
          echo "${env.PATH}"
        }
      }
    }
    stage('Checkout Source') {
      steps {
        git url: 'https://github.com/alialrabi/laravel-example.git', branch: 'uat', credentialsId: 'github'
      }
    }
    stage("Build image") {
      steps {
        script {
          myapp = docker.build("alialrabi/coverwhale:${env.BUILD_ID}")
        }
      }
    }
    stage("Run Test") {
      steps {
        script {
          docker.image("alialrabi/coverwhale:${env.BUILD_ID}").inside {
            // sh 'composer install'
            // sh 'php artisan test'
          }
        }
      }
    }
    stage("Push image") {
      steps {
        script {
          docker.withRegistry('https://registry.hub.docker.com', 'dockerhub') {
            myapp.push("latest")
            myapp.push("${env.BUILD_ID}")
          }
        }
      }
    }
    stage('Deploy Uat') {
      steps {
        script {
          echo "Done Uat"
          sh "helm upgrade --install --force"
        }
      }
    }
  }
}
I have solved it by adding a containerTemplate to the stage agent:
stage('Deploy dev') {
  agent {
    kubernetes {
      containerTemplate {
        name 'helm'
        image 'lachlanevenson/k8s-helm:v3.1.1'
        ttyEnabled true
        command 'cat'
      }
    }
  }
  steps {
    container('helm') {
      sh "helm upgrade full-cover ./helm"
    }
  }
}
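Alternatively, since the pod template in values.yaml is already labeled jenkins-helm, the pipeline could select that template by its label instead of redeclaring a containerTemplate. A minimal sketch, assuming the chart registers the template with the Kubernetes plugin under that label:

pipeline {
  // 'jenkins-helm' is the label given to the pod template in values.yaml;
  // the Kubernetes plugin provisions that pod when a build asks for this label.
  agent { label 'jenkins-helm' }
  stages {
    stage('Deploy Uat') {
      steps {
        // the step must run inside the 'helm' container of that pod;
        // outside of container('helm') the default jnlp container is used
        // and helm is not on the PATH, hence "helm: not found"
        container('helm') {
          sh 'helm upgrade --install full-cover ./helm'
        }
      }
    }
  }
}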
You can also install Helm on the instance on which your Jenkins pipeline runs:
stage("helm install"){
steps{
echo "Helm install"
sh 'curl -o kubectl https://amazon-eks.s3.us-west-2.amazonaws.com/1.18.9/2020-11-02/bin/linux/amd64/kubectl'
sh 'curl -LO "https://dl.k8s.io/release/$(curl -L -s https://dl.k8s.io/release/stable.txt)/bin/linux/amd64/kubectl" '
sh 'sudo cp kubectl /usr/bin'
sh 'sudo chmod +x /usr/bin/kubectl'
sh 'wget https://get.helm.sh/helm-v3.6.1-linux-amd64.tar.gz'
sh 'ls -a'
sh 'tar -xvzf helm-v3.6.1-linux-amd64.tar.gz'
sh 'sudo cp linux-amd64/helm /usr/bin'
}
}
I am trying to run the Jenkinsfile below, which contains two sed commands, but I ran into various string-interpolation issues when I cat the file.
Do you know how I can run it inside the Jenkinsfile?
Thanks in advance.
pipeline {
  agent any
  tools { nodejs "NodeJS 6.7.0" }
  stages {
    stage('checking out gitlab branch master') {
      steps {
        checkout([$class: 'GitSCM', branches: [[name: '*/development']]])
      }
    }
    stage('executing release process') {
      environment {
        ARTIFACTORY_APIKEY = credentials('sandbox-gms-password')
      }
      steps {
        sh 'cp bowerrc.template .bowerrc'
        sh 'sed -i -e "s/username/zest-jenkins/g" .bowerrc'
        sh 'sed -i -e "s/password/${ARTIFACTORY_APIKEY}/g" .bowerrc'
        sh 'cat .bowerrc'
      }
    }
  }
}
Put the commands in a single sh block; please take the reference from below:
pipeline {
  agent any
  tools { nodejs "NodeJS 6.7.0" }
  stages {
    stage('checking out gitlab branch master') {
      steps {
        checkout([$class: 'GitSCM', branches: [[name: '*/development']]])
      }
    }
    stage('executing release process') {
      environment {
        ARTIFACTORY_APIKEY = credentials('sandbox-gms-password')
      }
      steps {
        sh '''
          cp bowerrc.template .bowerrc
          sed -i -e "s/username/zest-jenkins/g" .bowerrc
          sed -i -e "s/password/${ARTIFACTORY_APIKEY}/g" .bowerrc
          cat .bowerrc
        '''
      }
    }
  }
}
I am working on a declarative Jenkins pipeline for Terraform deployments. I want to run terraform init / select workspace / plan in one stage, ask for approval in another stage, and then do the apply in yet another stage. The agent at the top is set to none, and I use a Kubernetes agent with a Docker image we created that has the packages we need for the stages; I declare that image in each stage. When I execute the pipeline, I get an error that I need to reinitialize Terraform in the apply stage even though I initialized in the init/plan stage. I figure this is due to the stages running on different nodes.
I have it working by doing init / plan and stashing the plan. In the apply stage, it unstashes the plan, calls init / select workspace again, and then finally applies the unstashed plan.
I realize I could set the agent at the top, but according to Jenkins documentation, that is bad practice, as waiting for user input will block the execution.
I feel like there has to be a way to do this more elegantly. Any suggestions?
Here's my code:
def repositoryURL = env.gitlabSourceRepoHttpUrl != null && env.gitlabSourceRepoHttpUrl != "" ? env.gitlabSourceRepoHttpUrl : env.RepoURL
def repositoryBranch = env.gitlabTargetBranch != null && env.gitlabTargetBranch != "" ? env.gitlabTargetBranch : env.RepoBranch
def notificationEmail = env.gitlabUserEmail != null && env.gitlabUserEmail != "" ? env.gitlabUserEmail : env.Email
def projectName = env.ProjectName
def deployAccountId = env.AccountId
pipeline {
  agent none
  stages {
    stage("Checkout") {
      agent any
      steps {
        git branch: "${repositoryBranch}", credentialsId: '...', url: "${repositoryURL}"
        stash name: 'tf', useDefaultExcludes: false
      }
    }
    stage("Terraform Plan") {
      agent {
        kubernetes {
          label 'myagent'
          containerTemplate {
            name 'cis'
            image 'docker-local.myrepo.com/my-image:v2'
            ttyEnabled true
            command 'cat'
          }
        }
      }
      steps {
        container('cis') {
          unstash 'tf'
          script {
            sh "terraform init"
            try {
              sh "terraform workspace select ${deployAccountId}_${projectName}_${repositoryBranch}"
            } catch (Exception e) {
              sh "terraform workspace new ${deployAccountId}_${projectName}_${repositoryBranch}"
            }
            sh "terraform plan -out=${deployAccountId}_${projectName}_${repositoryBranch}_plan.tfplan -input=false"
            stash includes: "*.tfplan", name: "tf-plan", useDefaultExcludes: false
          }
        }
      }
      post {
        success {
          echo "Terraform init complete"
        }
        failure {
          echo "Terraform init failed"
        }
      }
    }
    stage("Terraform Plan Approval") {
      agent none
      steps {
        script {
          def userInput = input(id: 'confirm', message: 'Apply Terraform?', parameters: [ [$class: 'BooleanParameterDefinition', defaultValue: false, description: 'Apply terraform', name: 'confirm'] ])
        }
      }
    }
    stage("Terraform Apply") {
      agent {
        kubernetes {
          label 'myagent'
          containerTemplate {
            name 'cis'
            image 'docker-local.myrepo.com/my-image:v2'
            ttyEnabled true
            command 'cat'
          }
        }
      }
      steps {
        container("cis") {
          withCredentials([[
            $class: 'AmazonWebServicesCredentialsBinding',
            credentialsId: 'my-creds',
            accessKeyVariable: 'AWS_ACCESS_KEY_ID',
            secretKeyVariable: 'AWS_SECRET_ACCESS_KEY'
          ]]) {
            script {
              unstash "tf"
              unstash "tf-plan"
              sh "terraform init"
              try {
                sh "terraform workspace select ${deployAccountId}_${projectName}_${repositoryBranch}"
              } catch (Exception e) {
                sh "terraform workspace new ${deployAccountId}_${projectName}_${repositoryBranch}"
              }
              sh """
                set +x
                temp_role="\$(aws sts assume-role --role-arn arn:aws:iam::000000000000:role/myrole --role-session-name jenkinzassume)" > /dev/null 2>&1
                export AWS_ACCESS_KEY_ID=\$(echo \$temp_role | jq .Credentials.AccessKeyId | xargs) > /dev/null 2>&1
                export AWS_SECRET_ACCESS_KEY=\$(echo \$temp_role | jq .Credentials.SecretAccessKey | xargs) > /dev/null 2>&1
                export AWS_SESSION_TOKEN=\$(echo \$temp_role | jq .Credentials.SessionToken | xargs) > /dev/null 2>&1
                set -x
                terraform apply ${deployAccountId}_${projectName}_${repositoryBranch}_plan.tfplan
              """
            }
          }
        }
      }
    }
  }
}
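For the approval step specifically, the declarative stage-level input directive pauses before the stage's agent is allocated, so the wait does not hold an executor or a pod. A minimal sketch of what the apply stage could look like with that directive; it is not a drop-in replacement, and the fresh pod still needs terraform init and workspace selection:

stage("Terraform Apply") {
  // The input directive pauses BEFORE this stage's agent (the pod) is provisioned,
  // so nothing is blocked while waiting for the approval.
  input {
    message 'Apply Terraform?'
    ok 'Apply'
  }
  agent {
    kubernetes {
      label 'myagent'
      containerTemplate {
        name 'cis'
        image 'docker-local.myrepo.com/my-image:v2'
        ttyEnabled true
        command 'cat'
      }
    }
  }
  steps {
    container('cis') {
      unstash 'tf'
      unstash 'tf-plan'
      // the fresh pod has no .terraform directory, so init and
      // workspace selection are still required before the apply
      sh 'terraform init'
      sh "terraform workspace select ${deployAccountId}_${projectName}_${repositoryBranch}"
      sh "terraform apply ${deployAccountId}_${projectName}_${repositoryBranch}_plan.tfplan"
    }
  }
}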
I have tried all sorts of ways but nothing seems to be working. Here is my Jenkinsfile.
def ZIP_NODE
def CODE_VERSION
pipeline {
  /* A declarative pipeline */
  agent {
    /* Agent section */
    // where would you like to run the code
    label 'ubuntu'
  }
  options {
    timestamps()
  }
  parameters {
    choice(choices: ['dev'], description: 'Name of the environment', name: 'ENV')
    choice(choices: ['us-east-1', 'us-west-1', 'us-west-2', 'us-east-2', 'ap-south-1'], description: 'What AWS region?', name: 'AWS_DEFAULT_REGION')
    string(defaultValue: "", description: '', name: 'APP_VERSION')
  }
  stages {
    /* stages section */
    stage('Initialize the variables') {
      // Each stage is made up of steps
      steps {
        script {
          CODE_VERSION='${BUILD_NUMBER}-${ENV}'
          ZIP_NODE='abcdefgh-0.0.${CODE_VERSION}.zip'
        }
      }
    }
    stage('code - Checkout') {
      steps {
        checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxxxxxxxxxxxxxxxxxxxxxxxx', url: 'http://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.git']]])
      }
    }
    stage('code - Build') {
      steps {
        sh '''
          echo ${JOB_NAME}
          pwd
          echo ${ZIP_NODE}
          echo 'remove alraedy existing zip files'
          rm -rf *.zip
          zip -r ${ZIP_NODE} .
          chmod 777 $ZIP_NODE
        '''
      }
    }
    stage('Deploy on Beanstalk') {
      steps {
        build job: 'abcdefgh-PHASER', parameters: [[$class: 'StringParameterValue', name: 'vpc', value: ENV], [$class: 'StringParameterValue', name: 'ZIP_NODE', value: ZIP_NODE], [$class: 'StringParameterValue', name: 'CODE_VERSION', value: CODE_VERSION], [$class: 'StringParameterValue', name: 'APP_VERSION', value: BUILD_NUMBER], [$class: 'StringParameterValue', name: 'AWS_DEFAULT_REGION', value: AWS_DEFAULT_REGION], [$class: 'StringParameterValue', name: 'ParentJobName', value: JOB_NAME]]
      }
    }
  }
}
The output of the script step in stage ('Initialize the variables') gives me nothing; it is not setting the value of the global variable ZIP_NODE:
[Pipeline] stage
[Pipeline] { (Initialize the variables)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
And then when we get to stage ('code - Build') we do not get the value of ZIP_NODE. See the echo statement at 22:34:17:
[Pipeline] stage
[Pipeline] { (code - Build)
[Pipeline] sh
22:34:16 [abcdefgh-ci-dev-pipeline] Running shell script
22:34:17 + echo abcdefgh-ci-dev-pipeline
22:34:17 abcdefgh-ci-dev-pipeline
22:34:17 + pwd
22:34:17 /home/advisor/Jenkins/workspace/abcdefgh-ci-dev-pipeline
22:34:17 + echo
22:34:17
22:34:17 + echo remove alraedy existing zip files
Thanks to @awefsome's answer, I have some observations which I would like to add in detail.
When I use the code below, I get the desired output, i.e. the correct value of ZIP_NODE:
stage('code - Build') {
  steps {
    sh "echo ${JOB_NAME} && pwd && echo ${ZIP_NODE} && echo 'remove alraedy existing zip files' && rm -rf *.zip && zip -r ${ZIP_NODE} . && chmod 777 $ZIP_NODE"
  }
}
But when I use the code below, I do not get the value of ZIP_NODE:
stage('code - Build') {
  steps {
    sh '''
      echo ${ZIP_NODE}
      echo ${JOB_NAME}
      pwd
      echo ${ZIP_NODE}
      echo ${CODE_VERSION}
      #rm -rf .ebextensions
      echo 'remove alraedy existing zip files'
      rm -rf *.zip
      zip -r ${ZIP_NODE} .
      chmod 777 $ZIP_NODE
    '''
  }
}
sh '''
'''
should be
sh """
"""
With single quotes, Groovy does not interpolate the variables; the shell then looks for environment variables of those names, and since ZIP_NODE and CODE_VERSION are plain Groovy variables, it finds nothing.
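A small illustration of the difference, using a throwaway Groovy variable FOO defined just for this example:

script {
  def FOO = "bar"
  // Single quotes: Groovy passes ${FOO} to the shell untouched; the shell looks for
  // an environment variable named FOO, finds none, and prints an empty value.
  sh 'echo "single quotes: ${FOO}"'
  // Double quotes: Groovy interpolates FOO before the script reaches the shell,
  // so this prints "double quotes: bar".
  sh "echo \"double quotes: ${FOO}\""
}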
Try the following and see how it goes:
def ZIP_NODE
def CODE_VERSION
pipeline {
  /* A declarative pipeline */
  agent {
    /* Agent section */
    // where would you like to run the code
    label 'master'
  }
  options {
    timestamps()
  }
  parameters {
    choice(choices: ['dev'], description: 'Name of the environment', name: 'ENV')
    choice(choices: ['us-east-1', 'us-west-1', 'us-west-2', 'us-east-2', 'ap-south-1'], description: 'What AWS region?', name: 'AWS_DEFAULT_REGION')
    string(defaultValue: "", description: '', name: 'APP_VERSION')
  }
  stages {
    /* stages section */
    stage('Initialize the variables') {
      // Each stage is made up of steps
      steps {
        script {
          CODE_VERSION="${BUILD_NUMBER}-${ENV}"
          ZIP_NODE="abcdefgh-0.0.${CODE_VERSION}.zip"
        }
      }
    }
    stage('code - Checkout') {
      steps {
        println "checkout skipped"
        //checkout([$class: 'GitSCM', branches: [[name: '*/master']], doGenerateSubmoduleConfigurations: false, extensions: [], submoduleCfg: [], userRemoteConfigs: [[credentialsId: 'xxxxxxxxxxxxxxxxxxxxxxxxxx', url: 'http://xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx.git']]])
      }
    }
    stage('code - Build') {
      steps {
        sh "echo ${JOB_NAME} && pwd && echo ${ZIP_NODE} && echo 'remove alraedy existing zip files' && rm -rf *.zip && zip -r ${ZIP_NODE} . && chmod 777 $ZIP_NODE"
      }
    }
    stage('Deploy on Beanstalk') {
      steps {
        println "build job skipped"
        //build job: 'abcdefgh-PHASER', parameters: [[$class: 'StringParameterValue', name: 'vpc', value: ENV], [$class: 'StringParameterValue', name: 'ZIP_NODE', value: ZIP_NODE], [$class: 'StringParameterValue', name: 'CODE_VERSION', value: CODE_VERSION], [$class: 'StringParameterValue', name: 'APP_VERSION', value: BUILD_NUMBER], [$class: 'StringParameterValue', name: 'AWS_DEFAULT_REGION', value: AWS_DEFAULT_REGION], [$class: 'StringParameterValue', name: 'ParentJobName', value: JOB_NAME]]
      }
    }
  }
}
I got the following output:
Started by user jenkins
Running in Durability level: MAX_SURVIVABILITY
[Pipeline] node
Running on Jenkins in /Users/Shared/Jenkins/Home/workspace/test
[Pipeline] {
[Pipeline] timestamps
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Initialize the variables)
[Pipeline] script
[Pipeline] {
[Pipeline] }
[Pipeline] // script
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (code - Checkout)
[Pipeline] echo
21:19:06 checkout skipped
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (code - Build)
[Pipeline] sh
21:19:06 [test] Running shell script
21:19:06 + echo test
21:19:06 test
21:19:06 + pwd
21:19:06 /Users/Shared/Jenkins/Home/workspace/test
21:19:06 + echo abcdefgh-0.0.17-dev.zip
21:19:06 abcdefgh-0.0.17-dev.zip
21:19:06 + echo 'remove alraedy existing zip files'
21:19:06 remove alraedy existing zip files
21:19:06 + rm -rf '*.zip'
21:19:06 + zip -r abcdefgh-0.0.17-dev.zip .
21:19:06
21:19:06 zip error: Nothing to do! (try: zip -r abcdefgh-0.0.17-dev.zip . -i .)
[Pipeline] }
[Pipeline] // stage
Global variables in a Jenkins pipeline should be declared outside the pipeline block; they can be initialized and used inside script blocks, and are then known to all scopes inside the pipeline.
example:
def internal_ip
pipeline {
  agent { node { label "test" } }
  stages {
    stage('init') {
      steps {
        script {
          def x
          if (env.onHold.toBoolean() == true) {
            x = 1
          } else {
            x = 2
          }
          internal_ip = sh(
            script: "echo 192.168.0.${x}",
            returnStdout: true
          ).trim()
        }
      }
    }
    stage('test') {
      steps {
        script {
          echo "my internal ip is: ${internal_ip}"
        }
      }
    }
  }
}
In addition to @avivamg's answer, which is correct:
One remaining problem is that if you want to access Jenkins' environment variables (see http://<JENKINS_URL>/env-vars.html/) to initialize these globals, e.g.:
def workspaceDir=${WORKSPACE}
you get:
groovy.lang.MissingPropertyException: No such property: WORKSPACE for class: groovy.lang.Binding
The idea of:
def workspaceDir
pipeline {
  environment {
    workspaceDir=${WORKSPACE}
  }
  stages {
    stage('Globals test') {
      steps {
        script {
          echo "workspaceDir=${workspaceDir}"
          echo workspaceDir
        }
      }
    }
  }
}
leads to the output:
workspaceDir=null
null
since there are two different contexts involved, environment and Groovy, which are apparently independent of each other.
It works with:
environment {
  workspaceDir=${WORKSPACE}
}
but that's an environment variable then, not a variable in the Groovy context.
Declaring and initializing in the Groovy context works via stage(s):
def workspaceDir
pipeline {
  stages {
    stage('Initializing globals') {
      steps {
        script {
          workspaceDir = "${WORKSPACE}"
        }
      }
    }
    stage('Globals test') {
      steps {
        script {
          echo "workspaceDir=${workspaceDir}"
          echo workspaceDir
        }
      }
    }
  }
}
Output:
workspaceDir=C:\Users\jenkins\AppData\Local\Jenkins\.jenkins\workspace\Globals-test-project
C:\Users\jenkins\AppData\Local\Jenkins\.jenkins\workspace\Globals-test-project
My experience with using global variables inside a Groovy shell script is to declare them globally and then use a stage-level environment variable. It looks like the shell script only has access to these private environment variables and not to the global Groovy ones. As a sample:
environment {
  COMMIT_ID = "foo"
}
stages {
  stage('Build') {
    environment {
      commitId = sh(returnStdout: true, script: 'git rev-parse HEAD').trim()
    }
    steps {
      script {
        COMMIT_ID = commitId
      }
    }
  }
  stage('Build-Win-Container') {
    environment {
      commitId = "${COMMIT_ID}"
    }
    steps {
      sh label: 'build/push container', script: '''
        echo "With Windows commitId: ${commitId}"
      '''
    }
  }
}
I tried dozens of ways, including many from Stack Overflow, but none of them gave the shell script access to the variable. I think this is because the shell script is a private instance inside a step and only has access to variables inside this step (my opinion).
So I set a global environment variable,
fill it inside the first stage with a script,
then set an environment variable inside the next stage and fill it there.
Finally it was available inside the shell script. No other way worked for me.
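As a side note, wrapping the sh step in withEnv is another common way to expose a Groovy value to a single-quoted shell script; a minimal sketch, using the COMMIT_ID variable from the example above:

script {
  // withEnv injects the Groovy value as a real environment variable for the
  // duration of the block, so the single-quoted script can expand ${COMMIT_ID}.
  withEnv(["COMMIT_ID=${COMMIT_ID}"]) {
    sh 'echo "commitId from the shell: ${COMMIT_ID}"'
  }
}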
How can I publish build info when pushing an npm package to an Artifactory registry?
I can do it with Maven using these steps
def rtMaven = Artifactory.newMavenBuild()
def buildInfo = rtMaven.run pom: 'maven-example/pom.xml', goals: 'clean install'
Currently I am just using npm publish
But I would like to have build info for my tgz files. Is it possible?
Thanks!
Here is a sample pipeline doing a simple build.
The only trouble with this script is that I can't figure out how to fill in the dependency information.
node {
  stage('Cleaning') {
    deleteDir()
  }
  stage('Preparing') { // for display purposes
    // Get some code from Git
    git credentialsId: 'MyGitCredential', url: 'http://gitRepourl/MyGitProject.git'
  }
  stage('Fetch Dependencies') {
    // Run the node build
    nodejs(nodeJSInstallationName: 'MyNode', configId: 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa') {
      sh 'npm install'
    }
  }
  stage('Build') {
    // Run the node build
    nodejs(nodeJSInstallationName: 'MyNode', configId: 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa') {
      sh 'npm pack'
    }
  }
  stage('Test') {
    // Run the node build
    nodejs(nodeJSInstallationName: 'MyNode', configId: 'aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa') {
      sh 'npm run test'
    }
  }
  stage('Results') {
    junit 'test-results.xml' // need to setup karma junit reporter
    archive 'myArtefact-*.tgz'
  }
  stage('Artifactory') {
    def uploadSpec = """{
      "files": [
        {
          "pattern": "myArtefact-*.tgz",
          "target": "artifactory-npm-repo/myArtefact/"
        }
      ]
    }"""
    def server = Artifactory.server 'MyArtifactory'
    def buildInfo = server.upload(uploadSpec)
    buildInfo.retention maxBuilds: 2
    server.publishBuildInfo(buildInfo)
  }
}
But with this, you've got your build and artifacts in Artifactory.
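If the Artifactory plugin version in use has native npm support, Artifactory.newNpmBuild() can collect the dependency information that the plain upload spec misses. A rough sketch from memory, with placeholder repository names; the exact method names and parameters should be checked against the plugin documentation for your version:

node {
  def server = Artifactory.server 'MyArtifactory'
  // newNpmBuild wraps npm install/publish and records the resolved dependencies
  // in the build info, which a plain upload spec cannot do
  def rtNpm = Artifactory.newNpmBuild()
  rtNpm.resolver repo: 'npm-remote', server: server   // placeholder remote repo
  rtNpm.deployer repo: 'npm-local', server: server    // placeholder local repo
  def buildInfo

  stage('Install') {
    buildInfo = rtNpm.install path: '.'
  }
  stage('Publish') {
    rtNpm.publish path: '.', buildInfo: buildInfo
    server.publishBuildInfo buildInfo
  }
}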