How can I add another git pull from one jenkins pipeline stage - jenkins

I would like to ask how I can add another step for pulling a second repo in the Jenkins pipeline I created. As you can see, in the Jenkins pipeline settings I already specified a repo for pulling the UI to build. After that build is made, I need to pull another repo for the API and build both as one Docker image. I already tried this doc, but I'm having trouble getting the UI files combined into the API. Here is the pipeline script I used.
pipeline {
    agent { label 'slave-jenkins' }
    stages {
        stage('Workspace Cleanup') {
            steps {
                step([$class: 'WsCleanup'])
                checkout scm
            }
        }
        stage('Download and Build UI Files') {
            steps {
                sh '''#!/bin/bash
                echo "###########################"
                echo "PERFORMING NPM INSTALL"
                echo "###########################"
                npm install
                echo "###########################"
                echo "PERFORMING NPM RUN BUILD"
                echo "###########################"
                npm run build
                echo "###########################"
                echo "Downloading API Repo"
                echo "###########################"
                git branch: 'master',
                credentialsId: 'XXXXXXXXXXXX',
                url: 'ssh://git@XXXXXXe:7999/~lXXXXXX.git'
                echo ""
                '''
            }
        }
    }
}

You shouldn't include Jenkins Pipeline DSL inside a shell script; it must be a separate step, for example:
steps {
    // first step to run the shell script
    sh '''#!/bin/bash
    echo "###########################"
    echo "PERFORMING NPM INSTALL"
    echo "###########################"
    npm install
    echo "###########################"
    echo "PERFORMING NPM RUN BUILD"
    echo "###########################"
    npm run build
    echo "###########################"
    echo "Downloading API Repo"
    echo "###########################"
    '''
    // second step to check out the git repo
    git branch: 'master',
        credentialsId: 'XXXXXXXXXXXX',
        url: 'ssh://git@XXXXXXe:7999/~lXXXXXX.git'
}
But mixing two repos in one workspace (and two responsibilities in one job) probably isn't a good idea. It would be better to split your continuous delivery process into multiple jobs:
a job to build the UI;
a job to build the API;
a job to create the Docker image.
Then chain these jobs so that they are executed one after another and pass build artifacts to each other. Each part of your CD process will then satisfy the single-responsibility principle.
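The hand-off between chained jobs can be sketched with archived artifacts. A rough sketch, assuming job names build-ui, build-api, and build-docker-image and the Copy Artifact plugin (all job names and paths here are assumptions, not from the question):

```groovy
// In the UI job: archive the build output, then trigger the next job
// without waiting for it to finish.
archiveArtifacts artifacts: 'dist/**'
build job: 'build-docker-image', wait: false

// In the Docker-image job: collect the artifacts from both upstream
// jobs (requires the Copy Artifact plugin), then build one image.
copyArtifacts projectName: 'build-ui', selector: lastSuccessful()
copyArtifacts projectName: 'build-api', selector: lastSuccessful()
sh 'docker build -t myapp:latest .'
```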

Related

how to go inside a specific directory and run commands inside it in a jenkins pipeline

I am trying to run a Gradle command inside a Jenkins pipeline, and for that I should cd to the <location> where the Gradle files are.
I added a cd command inside my pipeline, but it is not working. I did this:
stage('build & SonarQube Scan') {
    withSonarQubeEnv('sonarhost') {
        sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic'
        sh 'echo ${PWD}'
        sh 'gradle tasks --all'
        sh 'gradle sonarqube --debug'
    }
}
But the cd is not working. I tried the dir step as suggested in the pipeline docs, but I want to cd inside the $WORKSPACE folder.
How can I fix this?
Jenkins runs each sh step in a fresh shell, so after the first sh the working directory resets to the previous location. The dir step is the correct approach; give it the path (relative paths resolve against the workspace) and nest the commands inside, similar to how you used withSonarQubeEnv:
dir('sonarqube-scanner-gradle/gradle-basic') {
    sh 'gradle tasks --all'
    sh 'gradle sonarqube --debug'
}
Alternatively, you can simply chain all the commands in one sh step:
sh 'cd $WORKSPACE/sonarqube-scanner-gradle/gradle-basic && echo ${PWD} && ...'
This is not recommended, but since everything runs in the same shell, it will work.

How to specify JDK version in Jenkinsfile pipeline script

I have a pipeline script to deploy applications to a server. I'm building the project using Maven, and I want Jenkins to use a specified JDK version for building it. My pipeline script looks like this:
pipeline {
    agent any
    tools {
        // Install the Maven version configured as "M3" and add it to the path.
        maven "Maven 3.6.3"
    }
    stages {
        stage('Build') {
            steps {
                // Run Maven on a Unix agent.
                sh "mvn clean package -DskipTests=true -U"
            }
            post {
                // If Maven was able to run the tests, even if some of the tests
                // failed, record the test results and archive the jar file.
                success {
                    archiveArtifacts "**/${war}"
                }
            }
        }
        stage('Deploy EQM Instance 1') {
            steps {
                sshagent(credentials: ['credentials']) {
                    sh "echo 1"
                    sh "echo Initializing deployment to Instance 1"
                    sh "scp target/${war} ${bastionHost}:/home/opc"
                    sh "echo 2.1"
                    sh "echo Uploaded war file to bastion instance"
                    sh "scp ${bastionHost}:/home/opc/${war} opc@${instanceDns}:/home/opc"
                    sh "echo 3.2"
                    sh "echo Uploaded war file from bastion instance to Instance 1 ${instanceDns}"
                    sh "echo Undeploying old war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo rm /opt/tomcat/webapps/${war}"
                    sh "echo 4.2.2"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo chown tomcat:tomcat -R ${war}"
                    sh "echo Deploying new war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo mv ${war} /opt/tomcat/webapps/"
                    sh "echo 4.3"
                }
            }
        }
    }
}
There are other JDK versions already configured on Jenkins, and I don't want to disturb the jobs that use them, so I want to specify the JDK version in the desired job's configuration.
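The declarative tools directive can pin the JDK per job without touching other jobs. A minimal sketch, assuming a JDK installation named 'jdk-11' has been configured in Jenkins' Global Tool Configuration (the name is an assumption; it must match your configured tool name):

```groovy
pipeline {
    agent any
    tools {
        // 'jdk-11' must match the name of a JDK configured under
        // Manage Jenkins -> Global Tool Configuration (assumed name).
        jdk 'jdk-11'
        maven 'Maven 3.6.3'
    }
    stages {
        stage('Build') {
            steps {
                // Verify which JDK is on the PATH for this job.
                sh 'java -version'
                sh 'mvn clean package -DskipTests=true -U'
            }
        }
    }
}
```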

Jenkins cannot find sh script in git repo

In my git repo I have a build script located at /Src/build.sh. I have set up a simple Jenkinsfile which runs this script:
node {
    stage 'Checkout'
    checkout scm
    stage 'Build'
    sh "chmod +x ./Src/build.sh"
    sh "./Src/build.sh"
}
Execution of chmod seems to be successful, but running build.sh gives me this error:
+ ./Src/build.sh
/var/lib/jenkins/workspace/wrksp@tmp/durable-d3f345e7/script.sh: 1:
/var/lib/jenkins/workspace/wrksp@tmp/durable-d3f345e7/script.sh: ./Src/build.sh: not found
I have no idea what is wrong.
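A "not found" for a script that plainly exists is often the shebang failing to resolve, typically because the file was checked out with Windows (CRLF) line endings, so the kernel looks for an interpreter literally named "/bin/sh\r". A small sketch of detecting and fixing that (the /tmp/crlf_demo.sh path is purely illustrative):

```shell
#!/bin/bash
# Simulate a script committed with Windows (CRLF) line endings.
printf '#!/bin/sh\r\necho hello\r\n' > /tmp/crlf_demo.sh
chmod +x /tmp/crlf_demo.sh

# A CRLF-terminated shebang makes the kernel search for "/bin/sh\r",
# which is reported as "not found".
if grep -q "$(printf '\r')" /tmp/crlf_demo.sh; then
    echo "CRLF line endings detected"
fi

# Strip the carriage returns; after this the script runs normally.
sed -i 's/\r$//' /tmp/crlf_demo.sh
/tmp/crlf_demo.sh
```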

How to create a Jenkins input that's non-blocking and based on previous command output

I have 2 issues that are both part of the same problem. I am running Terraform inside a Jenkinsfile; this all happens in a Docker container that runs on a specific node. I have a few different environments with the ec2_plugin, which are labeled 'environment_ec2'. It's done this way since we use Ansible, and I want to be able to execute Ansible locally in the VPC.
1) How do you create an input and stage that are only executed if a previous command returns a specific output?
2) How can I make this non blocking?
node('cicd_ec2') {
    stage('Prepare Environment') {
        cleanWs()
        checkout scm
    }
    withAWSParameterStore(credentialsId: 'jenkin_cicd', naming: 'relative', path: '/secrets/cicd/', recursive: true, regionName: 'us-east-1') {
        docker.image('jseiser/jenkins_devops:0.7').inside {
            stage('Configure Git Access') {
                sh 'mkdir -p ~/.ssh'
                sh 'mv config ~/.ssh/config'
                sh 'chmod 600 ~/.ssh/config'
                sh "echo '$BITBUCKET_CLOUD' > ~/.ssh/bitbucket_rsa"
                sh 'chmod 600 ~/.ssh/bitbucket_rsa'
                sh "echo '$CICD_CODE_COMMIT_KEY' > ~/.ssh/codecommit_rsa"
                sh 'chmod 600 ~/.ssh/codecommit_rsa'
                sh "echo '$IDAUTO_CICD_MGMT_PEM' > ~/.ssh/idauto-cicd-mgmt.pem"
                sh 'chmod 600 ~/.ssh/idauto-cicd-mgmt.pem'
                sh 'ssh-keyscan -t rsa bitbucket.org >> ~/.ssh/known_hosts'
                sh 'ssh-keyscan -t rsa git-codecommit.us-east-1.amazonaws.com >> ~/.ssh/known_hosts'
            }
            stage('Terraform') {
                sh './init-ci.sh'
                sh 'terraform validate'
                sh 'terraform plan -detailed-exitcode -out=create.tfplan'
            }
            input 'Deploy stack?'
            stage('Terraform Apply') {
                sh 'terraform apply -no-color create.tfplan'
            }
            stage('Ansible') {
                sh 'ansible-galaxy -vvv install -r requirements.yml'
                sh 'ansible-playbook -i ~/ vpn.yml'
            }
        }
    }
}
I only want to run the input and terraform apply, if the result of the below command is == 2.
terraform plan -detailed-exitcode
Since this all has to run on an EC2 instance, and it all has to use this container, I am not sure how I can do this input outside of a node block, as is usually recommended. If the input sits long enough, the instance may go down, and the rest of the code would run on a new instance/workspace where the information I need from the git repos and the terraform plan would not be present. The git repo that I check out contains the Terraform configuration, the Ansible configuration, and some SSH configuration so that Terraform and Ansible are able to pull in their modules/roles from private git repos. The create.tfplan that I would need to use IF Terraform has a change would also need to be passed around.
I am just really confused about how to get a good input, only ask for that input when I really need to run terraform apply, and how to make it non-blocking.
I had to adapt this from my work-in-progress, which is based on a declarative pipeline, but I hope it still mostly works:
def tfPlanExitCode
node {
    stage('Checkout') {
        checkout scm
    }
    stage('Plan') {
        // returnStatus captures the exit code (2 = changes present)
        // instead of failing the build on a non-zero exit.
        tfPlanExitCode = sh(returnStatus: true, script: 'terraform plan -out=create.tfplan -detailed-exitcode')
        stash 'workspace'
    }
}
if (tfPlanExitCode == 2) {
    input('Deploy stack?')
    stage('Apply') {
        node {
            unstash 'workspace'
            sh 'terraform apply -no-color create.tfplan'
        }
    }
}
The building blocks are:
don't allocate an executor while the input is waiting (for hours..);
stash your workspace contents (you can optionally specify which files to copy) and unstash them later on the agent that continues the build.
The visualization might be a bit screwed up when some builds have the Apply stage and some don't. That's why I'm using declarative pipelines, which allow stages to be skipped nicely and explicitly.
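A declarative variant of the same idea might look like the sketch below. This is an untested adaptation: the agent label is taken from the question, while the TF_PLAN_EXIT environment variable is an assumption. agent none at the top level keeps the input from holding an executor while it waits.

```groovy
pipeline {
    agent none
    stages {
        stage('Plan') {
            agent { label 'cicd_ec2' }
            steps {
                checkout scm
                script {
                    // -detailed-exitcode: 0 = no changes, 2 = changes present
                    env.TF_PLAN_EXIT = sh(returnStatus: true,
                        script: 'terraform plan -detailed-exitcode -out=create.tfplan').toString()
                }
                stash 'workspace'
            }
        }
        stage('Apply') {
            // Skipped entirely (and no one is prompted) unless the plan
            // reported pending changes.
            when { expression { env.TF_PLAN_EXIT == '2' } }
            input { message 'Deploy stack?' }
            agent { label 'cicd_ec2' }
            steps {
                unstash 'workspace'
                sh 'terraform apply -no-color create.tfplan'
            }
        }
    }
}
```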

Build only the necessary with Jenkins

I have a repo like this:
repo
- service1
- service2
- service3
I have put the Jenkinsfile in the main directory to build and do some other stuff, but I want to rebuild only service1 if I change something in the subfolder service1. How can I do that?
For now the pipeline is like this, but I want to make it better, because now I rebuild everything:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                script {
                    sh '''
                    cd security_manager;
                    STAGING=true;
                    sbt " -DSTAGING=$STAGING; reload; clean; compile; docker:publish";
                    '''
                    sh '''
                    cd storage_manager;
                    STAGING=true;
                    sbt " -DSTAGING=$STAGING; reload; clean; compile; docker:publish";
                    '''
                    sh '''
                    cd ;
                    STAGING=true;
                    sbt " -DSTAGING=$STAGING; reload; clean; compile; docker:publish";
                    '''
                }
            }
        }
    }
}
I would like to place some if/else between the sh scripts, but I don't know what condition to use. I want to use the git diff between $GIT_COMMIT and the last commit to check the files, but I can't put a shell script in an if-statement condition.
You only need to find the difference between the last and current commit (--name-only lists one changed path per line, and @ is git shorthand for HEAD):
def diff = sh(returnStdout: true, script: 'git diff --name-only @~..@')
Then it is up to you how to handle that. The easiest way is to extract the changed directory from every line of the diff.
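Building on that, a rough sketch of the conditional builds. The directory names are taken from the question's script; the surrounding logic is an assumption, not a tested setup:

```groovy
script {
    // One changed path per line, e.g. "security_manager/src/Foo.scala".
    def changed = sh(returnStdout: true,
                     script: 'git diff --name-only @~..@').trim()
    // Rebuild a service only when something under its folder changed.
    ['security_manager', 'storage_manager'].each { svc ->
        if (changed.split('\n').any { it.startsWith("${svc}/") }) {
            sh "cd ${svc} && STAGING=true sbt ' -DSTAGING=true; reload; clean; compile; docker:publish'"
        }
    }
}
```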
