Jenkins clone and zip repository

I'm new to Jenkins, and I'm trying to clone a repo, create a zip of it, and finally upload that zip to S3.
But I'm currently stuck at the zip step, since the default location of the cloned files is /var/lib/jenkins/<project-view>.
def getFilename() {
    char[] chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz".toCharArray()
    StringBuilder sb = new StringBuilder(3)
    Random random = new Random()
    for (int i = 0; i < 3; i++) {
        char c = chars[random.nextInt(chars.length)]
        sb.append(c)
    }
    String randomString = sb.toString()
    def now = new Date().format("yyyyMMdd")
    String output = now + randomString
    return output
}
pipeline {
    agent any
    environment {
        FILENAME = getFilename()
    }
    stages {
        stage('Zip') {
            steps {
                script {
                    echo "Creating zip archive ${env.FILENAME}.zip"
                    zip archive: true, dir: '', glob: '', zipFile: "${env.FILENAME}.zip"
                    archiveArtifacts artifacts: "${env.FILENAME}.zip", fingerprint: true
                }
            }
        }
        stage('Upload to AWS') {
            steps {
                withAWS(region: 'eu-west-2', credentials: '7b42d7b6-f11b-41b6-8c14-61dafbd256c7') {
                    sh 'echo "Uploading content with AWS creds"'
                    s3Upload(pathStyleAccessEnabled: true, payloadSigningEnabled: true, file: "${env.FILENAME}.zip", bucket: 'elasticbeanstalk-eu-west-2-246342104703')
                }
            }
        }
    }
}
I tried providing a specific dir to zip (/var/lib/jenkins/<project-view>), but was unable to zip the file. Any help for this newbie here šŸ™‚. Thanks in advance.

I guess your issue is that the S3 command is being executed in a different directory from the one where the zip file is located. Try wrapping the upload in a dir block:
dir('location of your zip') {
right after steps {. This ensures the withAWS command runs in the same directory as your zip file, so it should be able to find the zip.
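Putting that together, a minimal sketch of the upload stage might look like the following. The dir path, credentials ID, and bucket name here are placeholders you would replace with your own values:

```groovy
stage('Upload to AWS') {
    steps {
        // Change into the directory that contains the zip, so that the
        // relative path passed to s3Upload resolves correctly.
        dir('/var/lib/jenkins/<project-view>') {
            withAWS(region: 'eu-west-2', credentials: 'my-aws-creds-id') {
                s3Upload(file: "${env.FILENAME}.zip", bucket: 'my-bucket-name')
            }
        }
    }
}
```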

Related

Jenkins groovy - copy file based on string in params

I have a Jenkins job that needs to copy a file to a specific server per user choice. Until today everything worked, since I needed to copy the same file to whichever server the user chose.
Now I need to copy a specific file per server. In case the user chooses to deploy Server_lab1-1.1.1.1, the lab1.file.conf file should be copied; in case the user chooses to deploy Server_lab2-2.2.2.2, lab2.file.conf should be copied.
I'm guessing that I need to add a check to the function:
if the Servers parameter includes lab1, copy the lab1.file.conf file, and if the Servers parameter includes lab2, copy the lab2.file.conf file.
parameters {
    extendedChoice(name: 'Servers', description: 'Select servers for deployment', multiSelectDelimiter: ',',
        type: 'PT_CHECKBOX', value: 'Server_lab1-1.1.1.1, Server_lab2-2.2.2.2', visibleItemCount: 5)
}
stage('Copy nifi.flow.properties file') {
    steps { copy_file() }
}
def copy_file() {
    params.Servers.split(',').each { item ->
        server = item.split('-').last()
        sh "scp **lab1.file.conf or lab2.file.conf** ${ssh_user_name}@${server}:${spath}"
    }
}
Are you looking for something like the below?
def copy_file() {
    params.Servers.split(',').each { item ->
        def server = item.split('-').last()
        def fileName = item.contains('lab1') ? 'lab1.file' : 'lab2.file'
        sh "scp ${fileName} ${ssh_user_name}@${server}:${spath}"
    }
}
Update: the same with a classic if-else:
def copy_file() {
    params.Servers.split(',').each { item ->
        def server = item.split('-').last()
        def fileName = "default"
        if (item.contains('lab1')) {
            fileName = 'lab1.file'
        } else if (item.contains('lab2')) {
            fileName = 'lab2.file'
        } else if (item.contains('lab3')) {
            fileName = 'lab3.file'
        }
        sh "scp ${fileName} ${ssh_user_name}@${server}:${spath}"
    }
}

How to integrate Jenkins pipeline jobs and pass dynamic variables using Groovy?

I want to integrate Jenkins jobs using Groovy by passing dynamic variables based on the projects for which the job is triggered.
Can anyone please suggest how to proceed with this?
Looks like you would like to persist data between two Jenkins jobs, or between two runs of the same Jenkins job. In both cases I was able to do this using files: you can use the writeFile step in Groovy, or the redirection operator (>) in plain Bash.
In the first job, you can write to the file like so:
node {
    // write to file
    writeFile(file: 'variables.txt', text: 'myVar:value')
    sh 'ls -l variables.txt'
}
In the second job, you can read from that file and empty its contents after you read it.
stage('read file contents') {
    // read from the file
    println readFile(file: 'variables.txt')
}
The file can be anywhere on the filesystem. An example with a file created in the /tmp folder follows; you should be able to run this pipeline by copy-pasting it.
node {
    def fileName = "/tmp/hello.txt"
    stage('Preparation') {
        sh 'pwd && rm -rf *'
    }
    stage('write to file') {
        writeFile(file: fileName, text: "myvar:hello", encoding: "UTF-8")
    }
    stage('read file contents') {
        println readFile(file: fileName)
    }
}
You could also use this file as a properties file: update a property that exists and append ones that don't. A quick sample that does that looks like the below.
node {
    def fileName = "/tmp/hello.txt"
    stage('Preparation') {
        sh 'pwd && rm -rf *'
    }
    stage('write to file') {
        writeFile(file: fileName, text: "myvar:hello", encoding: "UTF-8")
    }
    stage('read file contents') {
        println readFile(file: fileName)
    }
    // Add property
    stage('Add property') {
        def existingContents = ''
        if (fileExists(fileName)) {
            existingContents = readFile(fileName)
        }
        def newProperty = "newvar:newValue"
        writeFile(file: fileName, text: existingContents + "\n" + newProperty)
        println readFile(file: fileName)
    }
}
You could just as easily delete a line that holds a property if you would like to get rid of it.
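As a sketch of that last point, one way to drop a property line (here, any line starting with the hypothetical key myvar:) using the same readFile/writeFile steps:

```groovy
stage('Remove property') {
    // Keep every line except the one holding the property to drop,
    // then write the remaining lines back. fileName is the same
    // variable defined in the pipeline above.
    def kept = readFile(file: fileName)
        .readLines()
        .findAll { !it.startsWith('myvar:') }
    writeFile(file: fileName, text: kept.join('\n'))
}
```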

Jenkins pipeline stage - loop over sub directories of target directory

I want to iterate over the contents of a folder that will contain a bunch of subdirectories, so that I can run shell commands on each one.
For now I'm just trying to prove I can access the contents of the directory, and have this so far:
stage('Publish Libs') {
    when {
        branch productionBranch
    }
    steps {
        echo "Publish Libs"
        dir('dist/libs') {
            def files = findFiles()
            files.each { f ->
                if (f.directory) {
                    echo "This is directory: ${f.name}"
                }
            }
        }
    }
}
But I'm getting this error:
org.jenkinsci.plugins.workflow.cps.CpsCompilationErrorsException: startup failed:
/var/lib/jenkins/jobs/al-magma/branches/master/builds/5/libs/o3-app-pipeline/vars/magmaPipeline.groovy: 178: Expected a step # line 178, column 25.
def files = findFiles()
What's the correct syntax here please?
Your code works great, thanks! I just needed to enclose it in a script block:
stage('Iterate directories') {
    steps {
        script {
            dir('mydir') {
                def files = findFiles()
                files.each { f ->
                    if (f.directory) {
                        echo "This is a directory: ${f.name}"
                    }
                }
            }
        }
    }
}

Unable to Create New file in Jenkins Pipeline

I am trying to create a new file in a Jenkins pipeline, but I'm getting an error.
Error:
java.io.FileNotFoundException: /var/lib/jenkins/workspace/Pipeline-Groovy/test.txt (No such file or directory)
But when I execute the commands below outside the pipeline, the new file is created:
def newFile = new File("/var/lib/jenkins/workspace/test/test.txt")
newFile.append("hello\n")
println newFile.text
If I use the same code in the pipeline, I get the above error:
pipeline {
    agent any
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
        timestamps()
    }
    stages {
        stage('Demo1-stage') {
            steps {
                deleteDir()
                script {
                    def Jobname = "${JOB_NAME}"
                    echo Jobname
                }
            }
        }
        stage('Demo-2stage') {
            steps {
                script {
                    def workspace = "${WORKSPACE}"
                    echo workspace
                    def newFile = new File("/var/lib/jenkins/workspace/Pipeline-Groovy/test.txt")
                    newFile.createNewFile()
                    sh 'ls -lrt'
                }
            }
        }
    }
}
It looks like your folder is not present. Do not give an absolute path while creating the file unless it is a requirement. I see that in your case you need a file in the workspace; always use ${WORKSPACE} to get the current working directory.
def newFile = new File("${WORKSPACE}/test.txt")
newFile.createNewFile()
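One caveat worth noting: java.io.File is plain Java I/O, so in a pipeline it runs on the Jenkins controller's JVM rather than on the agent, and may not touch the workspace you expect. A sketch of a safer alternative uses the writeFile step, which always operates in the current workspace on the agent:

```groovy
script {
    // writeFile runs in the agent's workspace, unlike new File(...),
    // which executes on the controller.
    writeFile(file: 'test.txt', text: '')
    sh 'ls -lrt'
}
```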

How to list all directories from within directory in jenkins pipeline script

I want to get all directories present in a particular directory from a Jenkins pipeline script.
How can I do this?
If you want a list of all directories under a specific directory, e.g. mydir, using the Jenkins Pipeline Utility Steps plugin, you can do this (assuming mydir is under the current directory):
dir('mydir') {
    def files = findFiles()
    files.each { f ->
        if (f.directory) {
            echo "This is directory: ${f.name}"
        }
    }
}
Just make sure you do NOT provide the glob option; providing it makes findFiles return file names only.
More info: https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/
I didn't find any plugin to list folders, so I used an sh/bat script in the pipeline; this also works irrespective of the operating system.
pipeline {
    agent any
    stages {
        stage('Find all folders from a given folder') {
            steps {
                script {
                    def foldersList = []
                    def osName = isUnix() ? "UNIX" : "WINDOWS"
                    echo "osName: " + osName
                    echo ".... JENKINS_HOME: ${JENKINS_HOME}"
                    if (isUnix()) {
                        def output = sh returnStdout: true, script: "ls -l ${JENKINS_HOME} | grep ^d | awk '{print \$9}'"
                        foldersList = output.tokenize('\n').collect() { it }
                    } else {
                        def output = bat returnStdout: true, script: "dir \"${JENKINS_HOME}\" /b /A:D"
                        foldersList = output.tokenize('\n').collect() { it }
                        foldersList = foldersList.drop(2)
                    }
                    echo ".... " + foldersList
                }
            }
        }
    }
}
I haven't tried this, but I would look at the findFiles step provided by the Jenkins Pipeline Utility Steps plugin and set glob to an Ant-style directory pattern, something like '**/*/'.
If you just want to log them, use
sh("ls -A1 ${myDir}")
for Linux/Unix. (Note: that's a capital letter A and the number one.)
Or, use
bat("dir /B ${myDir}")
for Windows.
If you want the list of files in a variable, you'll have to use
def dirOutput = sh(script: "ls -A1 ${myDir}", returnStdout: true)
or
def dirOutput = bat(script: "dir /B ${myDir}", returnStdout: true)
and then parse the output.
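As a sketch, one way to turn that raw output into a Groovy list (with dirOutput captured as above):

```groovy
// Trim the trailing newline, then split on line breaks;
// each entry is one file or directory name.
def dirs = dirOutput.trim().split(/\r?\n/).toList()
echo "Found ${dirs.size()} entries: ${dirs}"
```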
Recursively getting all the directories within a directory:
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    def directories = getDirectories("$WORKSPACE")
                    echo "$directories"
                }
            }
        }
    }
}

@NonCPS
def getDirectories(path) {
    def dir = new File(path)
    def dirs = []
    dir.traverse(type: groovy.io.FileType.DIRECTORIES, maxDepth: -1) { d ->
        dirs.add(d)
    }
    return dirs
}
A suggestion for the very end of a Jenkinsfile:
post {
    always {
        echo '\n\n-----\nThis build process has ended.\n\nWorkspace Files:\n'
        sh 'find ${WORKSPACE} -type d -print'
    }
}
Place the find wherever you think is better. Check more alternatives here.
