How to copy Jenkins config files to a web server in a Jenkins pipeline

I have a file.properties stored as a Jenkins config file that I need to copy to a server during the Jenkins pipeline.
The pipeline code is more or less as shown below, just to give an idea.
How can I add a step that copies this config file from Jenkins to a destination server after the last step, "Deploy WAR to server", for example: sh "scp file.properties jenkins@destinationserver:/destination/path/file.properties"
node {
    stage ('Code Checkout') {
        git branch: 'master',
            credentialsId: 'b346fbxxxxxxxxxxxxxxxxxxx',
            url: 'https://xxxxxxx@bitbucket.org/gr/code.git'
    }
    stage ('Check Branch') {
        sh 'git branch'
    }
    stage ('Compile and Build WAR') {
        sh 'mvn clean compile war:war'
    }
    stage ('Deploy WAR to server') {
        sh "scp .war jenkins@serverIp:/var/lib/tomcat/.war"
    }
}

This is quite easy. You need to install the Config File Provider Plugin, and then you can generate the appropriate line by visiting https://localhost/jenkins/pipeline-syntax/. From there, choose configFileProvider in the dropdown and fill in the rest of the form.
The end result will be something like this:
configFileProvider(
[configFile(fileId: 'maven-settings-or-a-UUID-to-your-config-file', variable: 'MAVEN_SETTINGS')]) {
sh 'mvn -s $MAVEN_SETTINGS clean package'
}
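To copy the managed file to the destination server, as asked above, the same step can feed an scp call. A minimal sketch, where the fileId, user, host and path are placeholders you replace with your own values:
stage ('Copy properties to server') {
    // 'file-properties-id', the jenkins user and the host/path are placeholders
    configFileProvider([configFile(fileId: 'file-properties-id', variable: 'PROPS_FILE')]) {
        // single quotes: let the shell expand $PROPS_FILE, as in the mvn example above
        sh 'scp $PROPS_FILE jenkins@destinationserver:/destination/path/file.properties'
    }
}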

Related

Copy key file to folder using Jenkinsfile

I am using a scripted Jenkinsfile.
I have a .key file stored in Jenkins managed files (where all the env files are present), and I need to copy that file into the code folder.
For example, I want to store device.key in src/auth/keys and then run the tests on the code in the pipeline.
I am unable to find any way to do this.
node {
    def GIT_COMMIT_HASH
    stage('Checkout Source Code and Logging Into Registry') {
        echo 'Logging Into the Private ECR Registry'
        checkout scm
        sh "git rev-parse --short HEAD > .git/commit-id"
        GIT_COMMIT_HASH = readFile('.git/commit-id').trim()
        // NEED TO COPY device.key to /src/auth/key
    }
    stage('TEST') {
        nodejs(nodeJSInstallationName: 'node') {
            sh 'npm install'
            sh 'npm test'
        }
    }
}
How I solved this:
I installed the Config File Provider Plugin.
I added the files as custom files for each environment.
In the Jenkinsfile I replace the configuration file from the project with the one coming from Jenkins:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-Jenkins-stored-file', targetLocation: 'relative-path-to-destination-file-in-the-project')]) {
            // some block, maybe a friendly echo for debugging
        }
    }
}
Please see the plugin docs, as it is also capable of replacing tokens in XML, JSON and many other file types.
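Since the question uses a scripted Jenkinsfile, the same step also works there as a block. A minimal sketch, assuming the key is stored as a managed file whose hypothetical ID is 'device-key-id':
node {
    stage('Add device key') {
        // 'device-key-id' is a placeholder; targetLocation drops the managed file into the source tree
        configFileProvider([configFile(fileId: 'device-key-id', targetLocation: 'src/auth/keys/device.key')]) {
            sh 'ls -l src/auth/keys'
        }
    }
}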

Jenkins pipeline script to build a module in a subdirectory

I have a Maven project under a git URL, and I only want to build one of its submodules.
I wrote this in my pipeline script:
...
stage("mvn build") {
steps {
script {
sh "mvn package -DskipTests=true"
}
}
}
This raises an error: The goal you specified requires a project to execute but there is no POM in this directory (/xx/jenkins/workspace/biz-commons_deploy). So I added these commands:
sh "cd cmiot-services/comm" // subdir of biz-commons_deploy
def PWD = pwd();
echo "##=${PWD} "
sh "mvn package -DskipTests=true"
This does not work; it prints ##=/root/.jenkins/workspace/biz-commons_deploy and the error is the same as before.
How can I solve this problem, and why do the echo and the error show different workspace paths?
I got it working with sh "mvn -f cmiot-services/comm/pom.xml package -DskipTests=true", but I still don't know where these two workspace paths come from and why sh cd does not work.
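For reference, the -f workaround mentioned above, wrapped in the stage from the question, would look roughly like this (it just restates the command the question ends with):
stage("mvn build") {
    steps {
        // -f points Maven at the submodule's POM, so no cd is needed
        sh "mvn -f cmiot-services/comm/pom.xml package -DskipTests=true"
    }
}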
steps {
    sh '''
        # list items in the current directory to see where your pom.xml is
        ls -l
        # if you don't know the exact relative path of the folder where pom.xml resides,
        # run the job once with the following two lines commented out
        cd <folder where pom.xml resides>
        mvn package -DskipTests=true
    '''
}
As Yong answered, every sh step is independent; imagine Jenkins opening a new ssh connection on your slave each time.
For your script, instead of a workaround with sh, why not use the built-in dir step?
Something like this should do it:
stage("mvn build") {
steps {
script {
dir('cmiot-services/comm') {
sh "mvn package -DskipTests=true"
}
}
}
}
When you are executing a Jenkins Pipeline, the current directory is the Jenkins workspace directory.
You can add a step to clone the repo that your code is in (granted that the environment running the Jenkins instance is able to connect to your repo and clone it).
You can then navigate into the directory that has the pom.xml and finally execute the Maven command.
...
stage("Clone Repo") {
    steps {
        script {
            sh "git clone ssh://git@bitbucket.org:repo/app.git"
        }
    }
}
stage("mvn build") {
    steps {
        script {
            // cd and mvn run in the same sh step so the directory change persists
            sh """
                cd app/
                pwd
                mvn package -DskipTests=true
            """
        }
    }
}

Jenkins Multibranch Pipeline: How to checkout only once?

I have created a very basic Multibranch Pipeline on my local Jenkins via the Blue Ocean UI. From the default config I removed almost all behaviors except the one for discovering branches.
Within the Jenkinsfile I'm trying to set up the following scenario:
Checkout branch
(optionally) Merge it to master branch
Build Back-end
Build Front-end
Snippet from my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Setup') {
            agent {
                label "master"
            }
            steps {
                sh "git checkout -f ${env.BRANCH_NAME}"
            }
        }
        stage('Merge with master') {
            when {
                not {
                    branch 'master'
                }
            }
            agent {
                label "master"
            }
            steps {
                sh 'git checkout -f origin/master'
                sh "git merge --ff-only ${env.BRANCH_NAME}"
            }
        }
        stage('Build Back-end') {
            agent {
                docker {
                    image 'openjdk:8'
                }
            }
            steps {
                sh './gradlew build'
            }
        }
        stage ('Build Front-end') {
            agent {
                docker {
                    image 'saddeveloper/node-chromium'
                }
            }
            steps {
                dir ('./front-end') {
                    sh 'npm install'
                    sh 'npm run buildProd'
                    sh 'npm run testHeadless'
                }
            }
        }
    }
}
The pipeline itself and the build steps work fine, but the problem is that Jenkins adds a "Check out from version control" step before each stage. The step looks for new branches and fetches refs, but also checks out the current branch. Here is the relevant output from the full build log:
// stage Setup
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh git checkout -f my-branch
// stage Merge with master
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh git checkout -f origin/master
sh git merge --ff-only my-branch
// stage Build Back-end
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh ./gradlew build
// stage Build Front-end
> git checkout -f f067047bbdd3a5d5f9d1f2efae274bc175829595
sh npm install
sh npm run buildProd
sh npm run testHeadless
So as you see, it effectively resets the working directory to a particular commit before every stage: git checkout -f f067...595.
Is there any way to disable this default checkout behavior?
Or is there any viable way to implement such optional merging into the master branch?
Thanks!
By default, the git SCM checkout will be executed in a Jenkins pipeline. You can disable it by doing:
pipeline {
    agent none
    options {
        skipDefaultCheckout true
    }
    ...
Also, I'd recommend taking a look at the other useful pipeline options: https://jenkins.io/doc/book/pipeline/syntax/#options
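With the default checkout skipped, one stage still has to fetch the sources explicitly. A minimal sketch (a suggestion, not part of the original answer) of how the Setup stage from the question might do that single checkout:
stage('Setup') {
    agent {
        label "master"
    }
    steps {
        // skipDefaultCheckout disabled the implicit checkout, so do it once here
        checkout scm
        sh "git checkout -f ${env.BRANCH_NAME}"
    }
}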

How to change a Jenkins Declarative Pipeline environment variable?

I'm trying to create some Docker images. For that I want to use the version number specified in the Maven pom.xml file as the tag. However, I am rather new to declarative Jenkins pipelines and I can't figure out how to change my environment variable so that VERSION contains the right version in all stages.
This is my code:
#!groovy
pipeline {
    tools {
        maven 'maven 3.3.9'
        jdk 'Java 1.8'
    }
    environment {
        VERSION = '0.0.0'
    }
    agent any
    stages {
        stage('Checkout') {
            steps {
                git branch: 'master', credentialsId: '290dd8ee-2381-4c5b-8d33-5631d03ee7be', url: 'git@gitlab.crosslang.local:company/SOME-API.git'
                sh "git clean -f && git reset --hard origin/master"
            }
        }
        stage('Build and Test Java code') {
            steps {
                script {
                    def pom = readMavenPom file: 'pom.xml'
                    VERSION = pom.version
                }
                echo "${VERSION}"
                sh "mvn clean install -DskipTests"
            }
        }
        stage('Build Docker images') {
            steps {
                dir('whales-microservice/src/main/docker') {
                    sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
                    script {
                        docker.build "company/whales-microservice:${VERSION}"
                    }
                }
            }
        }
    }
}
The problem is the single quotes in the statement
sh 'cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar'
Single quotes don't expand variables in Groovy: http://docs.groovy-lang.org/latest/html/documentation/#_string_interpolation
So you have to double-quote your shell statement:
sh "cp ../../../target/whales-microservice-${VERSION}.jar whales-microservice.jar"
I just wanted to mention that if you have the Pipeline Utility Steps plugin installed, you can use readMavenPom() in the environment block, too. It looks like this:
environment {
    VERSION = readMavenPom().getVersion()
}
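Putting the two answers together, the Docker stage from the question could then consume the version like this. This is only a sketch based on the question's own paths and image name; env.VERSION refers to the variable set in the environment block:
stage('Build Docker images') {
    steps {
        dir('whales-microservice/src/main/docker') {
            // double quotes so Groovy interpolates the version before the shell runs
            sh "cp ../../../target/whales-microservice-${env.VERSION}.jar whales-microservice.jar"
            script {
                docker.build "company/whales-microservice:${env.VERSION}"
            }
        }
    }
}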

Jenkins Pipeline Error

I am using the Pipeline plugin in Jenkins, but I am unable to run shell commands. I am receiving the following error:
[develop - pipeline] Running shell script
nohup: failed to run command ‘sh’: No such file or directory
The node is an Ubuntu instance.
node ('aws-ondemand') {
    //println env.BUILD_NUMBER
    try {
        stage 'Checkout and Build'
        git url: 'git@github.com:MyAndroidRepo.git',
            branch: 'develop'
        sh 'git submodule init'
        sh 'git submodule update'
        sh './gradlew clean build'
    } catch (e) {
        //currentBuild.result = "FAILED"
        //notifyFailed()
        throw e
    }
}
Never mind. The script is fine. I was injecting env variables in the build step; I removed that and now it's working.
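That error (nohup: failed to run command 'sh') often indicates that an injected environment variable overwrote PATH. If extra variables are still needed, a hedged alternative is to extend PATH inside the pipeline with withEnv instead of replacing it (the /opt/gradle/bin path below is just an illustrative placeholder):
node('aws-ondemand') {
    // PATH+GRADLE prepends the given directory to PATH rather than replacing it,
    // so the 'sh' binary stays reachable for shell steps
    withEnv(['PATH+GRADLE=/opt/gradle/bin']) {
        sh './gradlew clean build'
    }
}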
