I have a shared library that accepts parameters, which I set up to compress files into a tar. The Jenkins pipeline looks like this:
stage("Package") {
    steps {
        compress_files("arg1", "arg2")
    }
}
The shared library compress_files looks like this:
#!/usr/bin/env groovy
// Process any number of arguments.
def call(String... args) {
    sh label: 'Create directory to store tar files.', returnStdout: true,
       script: """ mkdir -p "$WORKSPACE/${env.PROJECT_NAME}" """
    args.each {
        sh label: 'Creating project directory.', returnStdout: true,
           script: """ mkdir -p "$WORKSPACE/${env.PROJECT_NAME}" """
        sh label: 'Copying contents to project directory.', returnStdout: true,
           script: """ cp -rv ${it} "$WORKSPACE/${env.PROJECT_NAME}/." """
    }
    sh label: 'Compressing project directory to a tar file.', returnStdout: true,
       script: """ tar -czf "${env.PROJECT_NAME}.tar.gz" "${env.PROJECT_NAME}" """
    sh label: 'Remove the project directory.', returnStdout: true,
       script: """ rm -rf "$WORKSPACE/${env.PROJECT_NAME}" """
}
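Stripped of the Jenkins wrappers, the sequence those sh steps run can be sketched as plain shell. The workspace path, project name, and input file below are made-up stand-ins for `$WORKSPACE`, `env.PROJECT_NAME`, and one of the arguments:

```shell
#!/bin/sh
# Made-up stand-ins for the Jenkins $WORKSPACE and env.PROJECT_NAME.
WORKSPACE=$(pwd)
PROJECT_NAME=myproject

mkdir -p "$WORKSPACE/$PROJECT_NAME"               # create the project directory
echo hello > file1.txt                            # sample file playing the role of one argument
cp -rv file1.txt "$WORKSPACE/$PROJECT_NAME/."     # copy the argument's contents in
tar -czf "$PROJECT_NAME.tar.gz" "$PROJECT_NAME"   # compress the directory
rm -rf "$WORKSPACE/$PROJECT_NAME"                 # clean up the staging directory
tar -tzf "$PROJECT_NAME.tar.gz"                   # list the archive contents
```

The tarball survives the cleanup because the `rm -rf` only removes the staging directory, not the archive created next to it.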
A new requirement is to use an array instead of updating the argument values. Can we pass an array name in the Jenkinsfile stage, and if so, how?
Yes, it's possible. In the Jenkinsfile you can define the array inside or outside the stage() and make use of it, like this.
In a declarative pipeline:
def files = ["arg1", "arg2"] as String[]
pipeline {
    agent any
    stages {
        stage("Package") {
            steps {
                // script is optional
                script {
                    // you can manipulate the value of files here
                }
                compress_files(files)
            }
        }
    }
}
In a scripted pipeline:
node() {
    // You can define the value here as well
    // def files = ["arg1", "arg2"] as String[]
    stage("Package") {
        def files = ["arg1", "arg2"] as String[]
        compress_files(files)
    }
}
And in the shared library, the method will look like this:
// vars/compress_files.groovy
def call(String[] args) {
    args.each {
        // retrieve the value from ${it} and proceed with your logic
    }
}
or
def call(String... args) {
    args.each {
        // retrieve the value from ${it} and proceed with your logic
    }
}
I am trying to replace the DB cred based on the env name in Jenkins, but I am unable to achieve it.
I have a JSON config file like this, named 'JsonConfig':
{
    "production": {
        "DB_USERNAME": "userABC"
    },
    "development": {
        "DB_USERNAME": "userXYZ"
    }
}
and this is what I have in the Jenkinsfile:
def getEnvName() {
    if ("master".equals(env.BRANCH_NAME)) {
        return "production";
    }
    return env.BRANCH_NAME;
}

def config;
node() {
    configFileProvider([configFile(fileId: 'secret-credentials', targetLocation: 'JsonConfig')]) {
        config = readJSON file: 'JsonConfig'
    }
}
pipeline {
    agent any
    stages {
        stage("Setup") {
            when {
                beforeAgent true
                anyOf {
                    branch 'master'
                    branch 'development'
                }
            }
            steps {
                sh """
                sed -i 's#__DB_USERNAME__#config.${getEnvName()}.DB_USERNAME#' ./secret-data.yml
                cat ./secret-data.yml
                """
                // Alternative
                sh "sed -i 's#__DB_USERNAME__#${config.getEnvName().DB_USERNAME}#' ./secret-data.yml"
            }
        }
    }
}
If I statically pass the var name like this, then it works fine:
sh "sed -i 's#__DB_USERNAME__#${config.production.DB_USERNAME}#' ./k8s/secret-data.yml"
I want to make "production" dynamic so that it reads the value returned by the getEnvName() method.
The problematic line is
sh """
sed -i 's#__DB_USERNAME__#config.${getEnvName()}.DB_USERNAME#' ./secret-data.yml
"""
This will evaluate to the shell command
sed -i 's#__DB_USERNAME__#config.production.DB_USERNAME#' ./secret-data.yml
But you want it to evaluate to
sed -i 's#__DB_USERNAME__#userABC#' ./secret-data.yml
Since config is a Groovy object representing the parsed JSON file, we can access its properties dynamically using the subscript operator ([]):
sh """
sed -i 's#__DB_USERNAME__#${config[getEnvName()].DB_USERNAME}#' ./secret-data.yml
"""
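Outside of Jenkins, the interpolated command is just an ordinary sed substitution using # as the delimiter (so the replacement can contain slashes). The file content below is a made-up stand-in for secret-data.yml, and this assumes GNU sed as found on typical Linux agents:

```shell
#!/bin/sh
# Made-up placeholder file standing in for secret-data.yml.
printf 'DB_USERNAME: __DB_USERNAME__\n' > secret-data.yml
# After Groovy interpolates ${config[getEnvName()].DB_USERNAME},
# the shell receives the literal value:
sed -i 's#__DB_USERNAME__#userABC#' ./secret-data.yml
cat ./secret-data.yml
```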
Using curl in a Jenkins pipeline script I am getting the JSON file below. How can I get the values of the Dependencies list and trigger those jobs?
{
    "Dependencies": [
        "Dep1",
        "Dep2",
        "Dep3"
    ]
}
My current program looks like this and is not working. After getting the values I need to form the Jenkins jobs and trigger them from the pipeline.
pipeline {
    agent {
        label 'Dep-Demo'
    }
    stages {
        stage('Getting Dependencies json-object') {
            steps {
                script {
                    final String url = "https://***************/repository/files/sample.jason/raw?ref=main"
                    withCredentials([usernamePassword(credentialsId: 'api', passwordVariable: 'PASSWORD', usernameVariable: 'USELESS')]) {
                        sh ''' echo $PASSWORD '''
                        def response = sh(script: "curl -s --header \"PRIVATE-TOKEN: $PASSWORD\" $url", returnStdout: true)
                        echo "***** $response"
                        def jsonObj = readJSON text: response.join(" ")
                        echo jsonObj
                    }
                }
            }
        }
    }
}
Check the following example.
script {
    def jString = '''
    {
        "Dependencies": [
            "Dep1",
            "Dep2",
            "Dep3"
        ]
    }
    '''
    def jsonObj = readJSON text: jString
    jsonObj.Dependencies.each { dep ->
        echo "Building ${dep}"
        // Build the job
        build job: dep
    }
}
I have a Jenkinsfile that sits in a directory of a GitHub repository.
I am basically trying to automate the build, test, and publication of all Docker images that are built within the sub-directories of the root directory where the Jenkinsfile sits.
As far as I know right now, there is an issue with the dir block under stage('Build'). The following is my Jenkinsfile:
#!groovy
// Iterate over the data_pipeline directory.
// Store every directory pipeline in the variable map if a Dockerfile is found.
def dirMap = [
    "docker_hub_image_metadata",
    "github_metrics",
    "google_sheets_sync",
    "jira_syn",
    "sfdc_get_license_keys_from_clouds",
    "sfdc_sync",
    "sfdc_sync_reports_to_snowflake",
    "snowflake_oppty_history",
    "snowflake_telemetry_tasks",
    "support_analytics",
    "sync_ee_license_from_s3"
]
def builds = [:]
dirMap.each { dir ->
    builds << [
        "${dir}": { ->
            node {
                stage('Build') {
                    sh "echo 'Building Images for ${dir}...'"
                    sh "cd ${dir}"
                    sh "echo 'Switched into ${dir}'"
                    sh "pwd"
                    //dir("${dir}") {
                    sh "pwd"
                    //    def image = docker.build("mirantiseng/${dir}")
                    //}
                }
                stage('Test') {
                    sh "echo 'Testing Images for ${dir}...'"
                }
                if (currentBuild.currentResult == 'SUCCESS') {
                    stage('Push') {
                        sh "echo 'Pushing Images for ${dir}...'"
                    }
                }
            }
        }
    ]
}
parallel(builds)
parallel(builds)
As you can see, I have my sub-directories hardcoded as a list, and then that list gets injected into the builds map. However, I have an issue when I try to run the following inside stage('Build'):
dir("${dir}") {
    sh "pwd"
    def image = docker.build("mirantiseng/${dir}")
}
The last known build that succeeded did not include the above block.
To reproduce something similar, you can mimic it by placing the file into a directory with other directories and hardcoding them in the list. The file I am trying to get working with Jenkins looks like this:
#!groovy
// Iterate over the data_pipeline directory.
// Store every directory pipeline in the variable map if a Dockerfile is found.
def dirMap = [
    "docker_hub_image_metadata",
    "github_metrics",
    "google_sheets_sync",
    "jira_syn",
    "sfdc_get_license_keys_from_clouds",
    "sfdc_sync",
    "sfdc_sync_reports_to_snowflake",
    "snowflake_oppty_history",
    "snowflake_telemetry_tasks",
    "support_analytics",
    "sync_ee_license_from_s3"
]
def builds = [:]
dirMap.each { dir ->
    builds << [
        "${dir}": { ->
            node {
                stage('Build') {
                    sh "echo 'Building Images for ${dir}...'"
                    dir("${dir}") {
                        def image = docker.build("mirantiseng/${dir}")
                    }
                }
                stage('Test') {
                    sh "echo 'Testing Images for ${dir}...'"
                }
                if (currentBuild.currentResult == 'SUCCESS') {
                    stage('Push') {
                        sh "echo 'Pushing Images for ${dir}...'"
                    }
                }
            }
        }
    ]
}
parallel(builds)
The error I got with the above Jenkinsfile was:
+ echo 'Building Images for docker_hub_image_metadata...'
Building Images for docker_hub_image_metadata...
groovy.lang.MissingMethodException: No signature of method: java.lang.String.call() is applicable for argument types: (org.codehaus.groovy.runtime.GStringImpl, org.jenkinsci.plugins.workflow.cps.CpsClosure2) values: [docker_hub_image_metadata, org.jenkinsci.plugins.workflow.cps.CpsClosure2#dde112]
Possible solutions: wait(), any(), trim(), split(), collect(), grep()
I have defined a global variable in a Jenkins pipeline:
def BUILDNRO = '0'
pipeline { ...
Then I manipulate the variable with a shell script to enable running builds in parallel, using the job build number as an identifier so we don't mix different Docker swarms.
stage('Handle BUILD_NUMBER') {
    steps {
        script {
            BUILDNRO = sh(script: '''#!/bin/bash
                Build=`echo ${BUILD_NUMBER} | grep -o '..$'`
                # Check if the first character of Build is 0
                if [[ ${Build:0:1} == "0" ]]; then
                    # replace the first character of Build with 5
                    Build=`echo $Build | sed s/./5/1`
                fi
                echo $Build
                ''', returnStdout: true).trim()
        }
    }
}
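The shell logic in that script can be exercised on its own, outside Jenkins. The BUILD_NUMBER value here is made up, and the first-character test is written POSIX-portably (the original used bash's ${Build:0:1} substring syntax):

```shell
#!/bin/sh
# Made-up build number standing in for the Jenkins BUILD_NUMBER.
BUILD_NUMBER=105
Build=$(echo ${BUILD_NUMBER} | grep -o '..$')    # keep the last two digits -> "05"
# If the first character is 0, replace it with 5 so ids stay distinct.
if [ "$(echo $Build | cut -c1)" = "0" ]; then
    Build=$(echo $Build | sed s/./5/1)           # replace the first character with 5
fi
echo $Build                                      # prints 55
```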
I get the value out from the previous stage and try to use the global variable in the next stage:
stage('DOCKER: Init docker swarm') {
    steps {
        echo "BUILDNRO is: ${BUILDNRO}" // --> The value is here.
        sh '''#!/bin/bash
            echo Buildnro is: ${BUILDNRO} # --> This is empty.
            ...
        '''
    }
}
This gives an empty global variable. Why? In the previous stage there was a value in it.
EDIT 1.
Modified code blocks to reflect current status.
I managed to figure it out. Here is the solution and how I did it.
BUILDNRO is a Groovy variable; if you want to use it as a bash variable, it has to be passed in using withEnv. BUILD_NUMBER in the first stage is available to bash as an environment variable, hence it can be used directly in the script in the first stage.
def BUILDNRO = '0'
pipeline {
    ....
    stages {
        stage('Handle BUILD_NUMBER') {
            steps {
                script {
                    BUILDNRO = sh(script: '''#!/bin/bash
                        Build=`echo ${BUILD_NUMBER} | grep -o '..$'`
                        echo $Build
                        ''', returnStdout: true).trim()
                }
            }
        }
        stage('DOCKER: Init docker swarm') {
            steps {
                dir("prose_env/prose_api_dev_env") {
                    withEnv(["MYNRO=${BUILDNRO}"]) {
                        sh(returnStdout: false, script: '''#!/bin/bash
                            echo Buildnro is: ${MYNRO}
                        '''.stripIndent())
                    }
                }
            }
        }
    }
}
If you use single quotes (''') for the script in the sh step, Jenkins treats every variable as a bash variable. The solution is to use double quotes ("""), but then any bash variable has to be escaped. Below is an example covering your use case, with an escaped bash variable:
pipeline {
    agent any
    stages {
        stage('Handle BUILD_NUMBER') {
            steps {
                script {
                    BUILDNRO = sh(script: 'pwd', returnStdout: true).trim()
                    echo "BUILDNRO is: ${BUILDNRO}"
                }
            }
        }
        stage('DOCKER: Init docker swarm') {
            steps {
                sh """#!/bin/bash
                    echo Buildnro is: ${BUILDNRO}
                    variable=world
                    echo "hello \${variable}"
                """
            }
        }
    }
}
Output of the second stage:
Buildnro is: /var/lib/jenkins/workspace/stack1
hello world
I want to get all directories present in a particular directory from a Jenkins pipeline script. How can we do this?
If you want a list of all directories under a specific directory, e.g. mydir, you can do this with the Jenkins Pipeline Utility Steps plugin (assuming mydir is under the current directory):
dir('mydir') {
    def files = findFiles()
    files.each { f ->
        if (f.directory) {
            echo "This is directory: ${f.name}"
        }
    }
}
Just make sure you do NOT provide the glob option. Providing it makes findFiles return file names only.
More info: https://jenkins.io/doc/pipeline/steps/pipeline-utility-steps/
I didn't find any plugin to list folders, so I used an sh/bat script in the pipeline; this works irrespective of the operating system.
pipeline {
    agent any
    stages {
        stage('Find all folders under a given folder') {
            steps {
                script {
                    def foldersList = []
                    def osName = isUnix() ? "UNIX" : "WINDOWS"
                    echo "osName: " + osName
                    echo ".... JENKINS_HOME: ${JENKINS_HOME}"
                    if (isUnix()) {
                        def output = sh returnStdout: true, script: "ls -l ${JENKINS_HOME} | grep ^d | awk '{print \$9}'"
                        foldersList = output.tokenize('\n').collect { it }
                    } else {
                        def output = bat returnStdout: true, script: "dir \"${JENKINS_HOME}\" /b /A:D"
                        foldersList = output.tokenize('\n').collect { it }
                        foldersList = foldersList.drop(2)
                    }
                    echo ".... " + foldersList
                }
            }
        }
    }
}
I haven't tried this, but I would look at the findFiles step provided by the Jenkins Pipeline Utility Steps plugin and set glob to an Ant-style directory pattern, something like '**/*/'.
If you just want to log them, use
sh("ls -A1 ${myDir}")
for Linux/Unix. (Note: that's a capital letter A and the number one.)
Or, use
bat("dir /B ${myDir}")
for Windows.
If you want the list of files in a variable, you'll have to use
def dirOutput = sh(script: "ls -A1 ${myDir}", returnStdout: true)
or
def dirOutput = bat(script: "dir /B ${myDir}", returnStdout: true)
and then parse the output.
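For instance (the directory layout below is made up for illustration), the captured output is newline-separated, one entry per line, so in Groovy you would split it with something like dirOutput.trim().split('\n'). The same parsing in plain shell:

```shell
#!/bin/sh
# Made-up sample layout: one sub-directory and one file.
mkdir -p mydir/subdir
touch mydir/file.txt
# Capture the listing (capital A, number one: one entry per line).
dirOutput=$(ls -A1 mydir)
# One line per entry; count them (expect 2: "file.txt" and "subdir").
printf '%s\n' "$dirOutput" | wc -l
```

Note that ls alone cannot tell you which entries are directories; for that, the findFiles approach above or `ls -l | grep ^d` is needed.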
Recursively getting all the directories within a directory:
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script {
                    def directories = getDirectories("$WORKSPACE")
                    echo "$directories"
                }
            }
        }
    }
}

@NonCPS
def getDirectories(path) {
    def dir = new File(path)
    def dirs = []
    dir.traverse(type: groovy.io.FileType.DIRECTORIES, maxDepth: -1) { d ->
        dirs.add(d)
    }
    return dirs
}
A suggestion for the very end of the Jenkinsfile:
post {
    always {
        echo '\n\n-----\nThis build process has ended.\n\nWorkspace Files:\n'
        sh 'find ${WORKSPACE} -type d -print'
    }
}
Place the find wherever you think is best. Check more alternatives here.