Copy key file to folder using Jenkinsfile - jenkins

I am using a scripted Jenkinsfile.
I have a .key file stored in Jenkins managed files (where all the env files are kept), and I need to copy that file into the code folder.
For example, I want to place device.key in src/auth/keys and then run the tests on the code in the pipeline.
I am unable to find any way to do this.
node {
    def GIT_COMMIT_HASH
    stage('Checkout Source Code and Logging Into Registry') {
        echo 'Logging Into the Private ECR Registry'
        checkout scm
        sh "git rev-parse --short HEAD > .git/commit-id"
        GIT_COMMIT_HASH = readFile('.git/commit-id').trim()
        // NEED TO COPY device.key to src/auth/keys
    }
    stage('TEST') {
        nodejs(nodeJSInstallationName: 'node') {
            sh 'npm install'
            sh 'npm test'
        }
    }
}

How I solved this:
I installed the Config File Provider Plugin.
I added the files as custom files for each environment.
In the Jenkinsfile I replace the configuration file from the project with the one coming from Jenkins:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-Jenkins-stored-file', targetLocation: 'relative-path-to-destination-file-in-the-project')]) {
            // some block, maybe a friendly echo for debugging
        }
    }
}
Please see the plugin docs; it is also capable of replacing tokens in XML and JSON files, among other formats.
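For a scripted pipeline like the one in the question, the same step can be called directly (no steps block needed). A minimal sketch, assuming the key was uploaded as a managed file with a hypothetical fileId of 'device-key-id':

node {
    stage('Checkout') {
        checkout scm
    }
    stage('TEST') {
        // 'device-key-id' is a placeholder; use the ID shown under Manage Jenkins > Managed files.
        // The provisioned file is only guaranteed to exist inside this block,
        // so run the steps that need it within the closure.
        configFileProvider([configFile(fileId: 'device-key-id', targetLocation: 'src/auth/keys/device.key')]) {
            nodejs(nodeJSInstallationName: 'node') {
                sh 'npm install'
                sh 'npm test'
            }
        }
    }
}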

Related

Groovy in Jenkins pipeline - create a file with content

I am using a Jenkins shared library and my Jenkinsfile has a stage like this:
stage('sonarqube') {
    when { branch 'master' }
    steps {
        generateUnitTestsReport()
    }
}
I want to keep the programmers' repos clean of scripts that create various reports, so my idea is to keep the script definitions in a shared library and then, during step execution, create a file with the content.
For instance (file generateUnitTestsReport.groovy in the shared library):
def call() {
    def SCRIPT_CONTENT = '''
#!/bin/bash
#SCRIPT CONTENT
'''
    sh '''echo''' + SCRIPT_CONTENT + ''' > ut-report.sh'''
    sh 'chmod +x ./ut-report.sh'
}
but it doesn't work like this. I also tried Groovy's new File, but no luck there either. How could this be done (note that this is a Jenkins slave node)?
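One way this is commonly handled (a sketch, not taken from the original thread): use the built-in writeFile step, which writes into the workspace of the node currently executing the pipeline, whereas Groovy's new File runs on the Jenkins controller rather than the agent.

def call() {
    def SCRIPT_CONTENT = '''#!/bin/bash
#SCRIPT CONTENT
'''
    // writeFile is a core Pipeline step and writes into the current agent's workspace
    writeFile file: 'ut-report.sh', text: SCRIPT_CONTENT
    sh 'chmod +x ./ut-report.sh'
}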

Jenkins pipeline script to build a module in a subdirectory

I have a Maven project in a Git repository and I want to build only one of its submodules.
In my pipeline script I wrote:
...
stage("mvn build") {
    steps {
        script {
            sh "mvn package -DskipTests=true"
        }
    }
}
An error arises: The goal you specified requires a project to execute but there is no POM in this directory (/xx/jenkins/workspace/biz-commons_deploy). So I added the commands:
sh "cd cmiot-services/comm" # subdir of biz-commons_deploy
def PWD = pwd();
echo "##=${PWD} "
sh "mvn package -DskipTests=true"
This does not work; it prints ##=/root/.jenkins/workspace/biz-commons_deploy, and the error is the same as before.
How can I solve this problem, and why do the echo and the error show different paths?
I made it work using sh "mvn -f cmiot-services/comm/pom.xml package -DskipTests=true", but I still don't know where these two paths come from and why the sh cd does not work.
steps {
    sh '''
        # list items in the current directory to see where your pom.xml is
        ls -l
        # if you don't know the exact relative path of the folder containing pom.xml,
        # run the job with the following two lines commented out first
        cd <folder where pom.xml resides>
        mvn package -DskipTests=true
    '''
}
As Yong answered, each sh step is independent; imagine Jenkins opening a new shell on your slave each time.
For your script, instead of working around this with sh, why not use the built-in dir step?
Something like this should do it:
stage("mvn build") {
steps {
script {
dir('cmiot-services/comm') {
sh "mvn package -DskipTests=true"
}
}
}
}
When you are executing a Jenkins pipeline, the current directory is the Jenkins workspace directory.
You can add a step to clone the repo that your code is in (granted that the environment running the Jenkins instance is able to connect to your repo and clone it).
You can then navigate into the directory that has the pom.xml and finally execute the Maven command.
...
stage("Clone Repo") {
    steps {
        script {
            sh "git clone ssh://git@bitbucket.org:repo/app.git"
        }
    }
}
stage("mvn build") {
    steps {
        script {
            // cd, pwd and mvn must run in the same sh step,
            // otherwise each sh starts back in the workspace root
            sh '''
                cd app/
                pwd
                mvn package -DskipTests=true
            '''
        }
    }
}

How to copy Jenkins config files in a Jenkins pipeline to a web server

I have some .properties files stored as Jenkins config files that I need to copy to a server during the Jenkins pipeline.
The pipeline code is more or less as shown below, just to give an idea.
How can I add a step that copies these config files from Jenkins to a destination server after the last step, 'Deploy WAR to server', for example: sh "scp file.properties jenkins@destinationserver:/destination/path/file.properties"?
node {
    stage ('Code Checkout') {
        git branch: 'master',
            credentialsId: 'b346fbxxxxxxxxxxxxxxxxxxx',
            url: 'https://xxxxxxx@bitbucket.org/gr/code.git'
    }
    stage ('Check Branch') {
        sh 'git branch'
    }
    stage('Compile and Build WAR') {
        sh 'mvn clean compile war:war'
    }
    stage ('Deploy WAR to server') {
        sh "scp .war jenkins@serverIp:/var/lib/tomcat/.war"
    }
}
This is quite easy. You need to install the Config File Provider Plugin, and then you can generate the appropriate line by visiting https://localhost/jenkins/pipeline-syntax/. From the dropdown there you can choose configFileProvider and fill in the rest of the form.
The end result will be something like this:
configFileProvider(
    [configFile(fileId: 'maven-settings-or-a-UUID-to-your-config-file', variable: 'MAVEN_SETTINGS')]) {
    sh 'mvn -s $MAVEN_SETTINGS clean package'
}
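To then copy the provisioned file to the destination server, as asked, one option is to run scp inside the configFileProvider block. A sketch, assuming the SSH Agent plugin is installed and that 'file-properties-id' and 'jenkins-ssh-key' are hypothetical IDs you would replace with your own:

stage('Copy properties to server') {
    configFileProvider([configFile(fileId: 'file-properties-id', targetLocation: 'file.properties')]) {
        // 'jenkins-ssh-key' is a placeholder for an SSH private-key credential ID
        sshagent(credentials: ['jenkins-ssh-key']) {
            sh 'scp file.properties jenkins@destinationserver:/destination/path/file.properties'
        }
    }
}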

Jenkins Pipeline Utility Steps - zip zipFile

I am trying to zip the folders which are created as output of my Jenkins pipeline job, using a pipeline script. By googling I found the Jenkins Pipeline Utility Steps - zip zipFile step (https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/#code-zip-code-create-zip-file) for zipping folders/files, but could not work out the exact pipeline syntax.
In my job workspace I have a folder named 'Test' which has two subfolders, 'Test1' and 'Test2'. Each subfolder contains .dll files. I would like to zip the entire 'Test' folder with all subfolders.
node('Jenkinks_1')
{
    echo "ZIP"
    zip zipFile: 'Test.zip', dir: 'C:\\workspace\\Build_Sample\\Test'
    echo "END - ZIP"
}
Below is the console output from Jenkins:
Started by user XXXXX
[Pipeline] node
Running on Jenkinks_1 in C:\workspace\Build_Sample
[Pipeline] {
[Pipeline] echo
ZIP
[Pipeline] echo
END - ZIP
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS
Looking for some guidance to zip the folders using pipeline syntax. Appreciate your inputs.
I wanted to zip some files as output of my jenkins pipeline job
First, try the same operation in stages and steps, as shown here:
pipeline {
    agent any
    stages {
        stage ('push artifact') {
            steps {
                sh 'mkdir archive'
                sh 'echo test > archive/test.txt'
                zip zipFile: 'test.zip', archive: false, dir: 'archive'
                archiveArtifacts artifacts: 'test.zip', fingerprint: true
            }
        }
        ...
    }
}
It uses archiveArtifacts to record the result.
If using an absolute path does not work, try a relative one ('..').
As seen by the OP Sri, zip zipFile is part of, and requires, the Jenkins Pipeline Utility Steps plugin.
See "Implemented Steps".
Regarding the syntax to be used for multi-criteria file selection, NicolasW notes in the comments that the documentation is vague ("use glob ant-style syntax"), but he got it to work with a basic comma-separated syntax. E.g.
zip zipFile: 'test.zip', archive: false, glob: 'config-/**/,scripts/**/*.*'
But, as noted by Tanvir in the comments, issue 44078 means you need to wrap the zip call in a script block:
script { zip zipFile: 'test.zip', archive: false, dir: 'archive' }
I was able to zip after installing the Pipeline Utility Steps plugin.
I came across this because zip was ... not installed on the host.
Reminder to self : If you need zip, install it first.
sudo yum install zip
You can also just use sh (the Jenkins server needs zip installed):
sh '''
zip -r algo.zip algo
'''
A full pipeline script looks like this:
node {
    stage('Clean') {
        cleanWs()
    }
    stage('Checkout') {
        git branch: 'develop', url: 'ssh://user@ip:29418/prj.git'
    }
    stage('Zip') {
        dir('algo-python') {
            sh '''
                zip -r algo.zip algo
            '''
        }
    }
    stage('Upload zip') {
        dir('algo-python') {
            sh '''
                source /etc/profile
                export HADOOP_USER_NAME=dev
                hdfs dfs -put -f algo.zip /user/dev/zipfile/
            '''
        }
    }
}

Pipeline step having trouble resolving a file path

I am having trouble getting a shell command to complete in a stage I have defined:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
            }
        }
    }
}
The shell command issues a protractor call which takes a config-file argument, but this file fails to be found when protractor tries to retrieve it.
If I look at the workspace directory where the repo is checked out by the checkout scm step, I can see that the test directory is present, along with the config file the sh step is referencing.
So I'm unsure why the file cannot be found.
I thought about trying to verify the files that can be seen around the time the protractor command is being issued.
So something like:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                def files = findFiles(glob: 'test/**/*.conf.js')
                sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
                echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
            }
        }
    }
}
But this doesn't work; I don't think findFiles can be used inside a step?
Can anyone offer any suggestions about what may be going on here?
Thanks
To do the debugging you were attempting (to see whether the file is actually there), you could wrap the findFiles in a script block (making sure your echo comes before the step that fails), or use a basic find in an sh step like this:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                // you could use the unix find command instead of groovy's findFiles
                sh 'find test -name "*.conf.js"'
                // if you're using a non-dsl-step (like findFiles), you must wrap it in a script
                script {
                    def files = findFiles(glob: 'test/**/*.conf.js')
                    echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
                    sh '''
                        npm install
                        protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091
                    '''
                }
            }
        }
    }
}
