Groovy in Jenkins pipeline - create a file with content

I am using a Jenkins shared library, and my Jenkinsfile has a stage like this:
stage('sonarqube') {
    when { branch 'master' }
    steps {
        generateUnitTestsReport()
    }
}
I want to keep the programmers' repos clean of scripts that generate various reports, so my idea is to keep the script definitions in a shared library and then, during step execution, create a file with the content.
For instance (file generateUnitTestsReport.groovy in the shared library):
def call() {
    def SCRIPT_CONTENT = '''
#!/bin/bash
#SCRIPT CONTENT
'''
    sh '''echo''' + SCRIPT_CONTENT + ''' > ut-report.sh'''
    sh 'chmod +x ./ut-report.sh'
}
but it doesn't work like this. I also tried Groovy's new File, but no luck there either. How could this be done (note that this runs on a Jenkins slave node)?
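One way this is commonly handled (a sketch, not from the original thread): keep the script under the shared library's resources/ directory, load it with the built-in libraryResource step, and write it into the workspace with writeFile. Unlike Groovy's new File, writeFile executes on the node running the step, so it works on slaves. The resource path scripts/ut-report.sh is illustrative:

def call() {
    // load the script shipped in the shared library's resources/ directory
    def scriptContent = libraryResource 'scripts/ut-report.sh'
    // writeFile runs on the current node, unlike java.io.File
    writeFile file: 'ut-report.sh', text: scriptContent
    sh 'chmod +x ./ut-report.sh'
}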


Copy key file to folder using Jenkinsfile

I am using a scripted Jenkinsfile.
I have a .key file stored in Jenkins' managed files (where all the env files are present), and I need to copy that file into the code folder; for example, I want to store device.key in src/auth/keys.
Then the pipeline will run the tests on the code.
I am unable to find any way to do this.
node {
    def GIT_COMMIT_HASH
    stage('Checkout Source Code and Logging Into Registry') {
        echo 'Logging Into the Private ECR Registry'
        checkout scm
        sh "git rev-parse --short HEAD > .git/commit-id"
        GIT_COMMIT_HASH = readFile('.git/commit-id').trim()
        // NEED TO COPY device.key to src/auth/keys
    }
    stage('TEST') {
        nodejs(nodeJSInstallationName: 'node') {
            sh 'npm install'
            sh 'npm test'
        }
    }
}
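One common alternative (not the approach the asker settled on below) is to store the key as a file credential and copy it into place with the Credentials Binding plugin; the credentialsId device-key is illustrative:

stage('Copy key file') {
    // the file credential is materialized at a temporary path exposed as $DEVICE_KEY
    withCredentials([file(credentialsId: 'device-key', variable: 'DEVICE_KEY')]) {
        sh 'mkdir -p src/auth/keys && cp "$DEVICE_KEY" src/auth/keys/device.key'
    }
}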
How I solved this:
I installed the Config File Provider Plugin.
I added the files as custom files for each environment.
In the Jenkinsfile I replace the configuration file from the project with the one coming from Jenkins:
stage('Add Config files') {
    steps {
        configFileProvider([configFile(fileId: 'ID-of-Jenkins-stored-file', targetLocation: 'relative-path-to-destination-file-in-the-project')]) {
            // some block, maybe a friendly echo for debugging
        }
    }
}
Please see the plugin docs; it is also capable of replacing tokens in XML, JSON, and many other file types.
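Applied to the original question, a sketch might look like this (device-key stands in for whatever ID the file was given in Manage Jenkins):

stage('Add Config files') {
    steps {
        // copies the Jenkins-managed file into the workspace at the target location
        configFileProvider([configFile(fileId: 'device-key', targetLocation: 'src/auth/keys/device.key')]) {
            echo 'device.key placed into src/auth/keys'
        }
    }
}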

How to inject environment variables (Properties File Path) in a Jenkins Pipeline

I want to use the functionality below (shown in the image) in Jenkins as code, but I'm failing to do so. Kindly help me replicate the functionality shown in the image in a Groovy script.
stage('Build Instance') {
    sh '''
    bash ./build.sh -Ddisable-rpm=false
    '''
    env "/fl/tar/ver.prop"
}
[Image: Jenkins GUI usage of Env Inject]
Got a simple workaround:
script {
    def props = readProperties file: '/fl/tar/ver.prop' // readProperties is a step in the Pipeline Utility Steps plugin
    env.WEATHER = props.WEATHER // assuming the key name is WEATHER in the properties file
}
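If every key in the properties file should become an environment variable, not just one, the same workaround extends with a loop over the returned map (still assuming the Pipeline Utility Steps plugin):

script {
    def props = readProperties file: '/fl/tar/ver.prop'
    // copy every key/value pair from the properties file into the build environment
    props.each { key, value ->
        env."${key}" = value
    }
}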

Jenkinsfile, prebuild script

I'm using the Jenkins pipeline. My use case is that the developers use a simple *.ini file, parsed by a Python script, to add or remove stages within the Jenkinsfile whenever they want. I don't want them to manually edit the Jenkinsfile because they won't know how it works.
The expected behaviour is:
When a build is triggered, I would like to first execute a Python script which might write into the Jenkinsfile to add/remove stages according to the *.ini file.
As far as I understand, when an event triggers a Jenkins build, the first thing it does is open the Jenkinsfile. However, I would like to know if it's possible to run some prebuild script before that?
Thanks
Edit: here's a simple view of a run of the pipeline (Blue Ocean UI).
The ini file might, for example, remove the Building Plan C step from the Compilation stage by removing the Groovy code for it in the Jenkinsfile.
Here is an example for reference.
node {
    git url: '', branch: '', credentialsId: ''
    def parseStr = sh(script: 'python parser.py xxx.ini', returnStdout: true).trim()
    // the python parser is expected to return a JSON string like:
    // {"run_stage1": false, "run_stage2": true}
    def parseObj = readJSON text: parseStr
    stage('stage 1') {
        if (parseObj.run_stage1) {
            echo 'stage1'
            // ...
        }
    }
    stage('stage 2') {
        if (parseObj.run_stage2) {
            echo 'stage2'
            // ...
        }
    }
}
The Jenkins pipeline provides the readJSON, readYaml, and readProperties steps to read JSON, YAML, and properties files.
If you choose one of those formats to replace the ini file, you can drop the Python parser and make your pipeline simpler.
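Following that suggestion, here is a sketch of the same pipeline without the Python parser; the file name stages.yaml and its keys are illustrative, and readYaml comes from the Pipeline Utility Steps plugin:

node {
    checkout scm
    // stages.yaml might contain:
    //   run_stage1: false
    //   run_stage2: true
    def cfg = readYaml file: 'stages.yaml'
    stage('stage 1') {
        if (cfg.run_stage1) {
            echo 'stage1'
        }
    }
    stage('stage 2') {
        if (cfg.run_stage2) {
            echo 'stage2'
        }
    }
}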

Calling functions in a Jenkinsfile with variables

I need some help calling a function in a Jenkinsfile along with a variable.
I have created a bash function to copy certain test results from a Jenkins slave to the Jenkins master's userContent directory.
I want to use this function across different jobs. Different jobs might have different report paths; instead of hardcoding the path inside the function, I want to pass it in from the Jenkinsfile as a variable.
Here is my function:
def call() {
sh '''
mkdir -p $JOB_NAME
foldername="$BUILD_NUMBER.$(date '+%d-%m-%Y')"
echo ${foldername}
mkdir -p $JOB_NAME/${foldername}
pwd
reportPath=""
dest="./$JOB_NAME/${foldername}"
cp -R ${reportPath}/*.xml ${dest}
scp -r $JOB_NAME jenkins#master_ip:/var/lib/jenkins/userContent/
'''
}
How do I call the function in the Jenkinsfile and pass the report path as a variable?
I guess you want to copy a Jenkins slave artifact to the master. You can use the Copy Artifact plugin: https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
You can use the following method for declarative pipeline jobs:
stages {
    stage('Copy Archive') {
        steps {
            script {
                step([$class: 'CopyArtifact',
                      projectName: 'Create_archive',
                      filter: 'packages/infra*.zip',
                      target: 'Infra'])
            }
        }
    }
}
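The parameter-passing part of the question can also be answered directly: a shared-library step is just a Groovy method and can take arguments. A minimal sketch, where the step name copyTestReports (i.e. vars/copyTestReports.groovy) and the report path are illustrative; ${reportPath} is interpolated by Groovy, while the escaped \$ variables are left for the shell to resolve:

def call(String reportPath) {
    sh """
    mkdir -p "\$JOB_NAME"
    foldername="\$BUILD_NUMBER.\$(date '+%d-%m-%Y')"
    dest="./\$JOB_NAME/\${foldername}"
    mkdir -p "\${dest}"
    cp -R ${reportPath}/*.xml "\${dest}"
    scp -r "\$JOB_NAME" jenkins@master_ip:/var/lib/jenkins/userContent/
    """
}

Then in the Jenkinsfile:

copyTestReports('target/surefire-reports')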

Pipeline step having trouble resolving a file path

I am having trouble getting a shell command to complete in a stage I have defined:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                sh '''
                npm install
                sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
            }
        }
    }
}
The shell command issues a protractor call which takes a config file argument, but this file fails to be found when protractor tries to retrieve it.
If I look at the workspace directory where the repo is checked out by the checkout scm step, I can see the test directory is present, along with the config file the sh step is referencing.
So I'm unsure why the file cannot be found.
I thought about trying to verify the files that can be seen around the time the protractor command is being issued.
So something like:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                def files = findFiles(glob: 'test/**/*.conf.js')
                sh '''
                npm install
                sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                '''
                echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
            }
        }
    }
}
But this doesn't work; I don't think findFiles can be used inside a step?
Can anyone offer any suggestions about what may be going on here?
Thanks
To do the debugging you were attempting (to see if the file is actually there), you could wrap the findFiles in a script block (making sure your echo comes before the step that fails), or use a basic find in an "sh" step like this:
stages {
    stage('E2E Tests') {
        steps {
            node('Protractor') {
                checkout scm
                // you could use the unix find command instead of groovy's findFiles
                // (the pattern is quoted so the shell does not expand it first)
                sh 'find test -name "*.conf.js"'
                // if you're using a non-dsl-step (like findFiles), you must wrap it in a script block
                script {
                    def files = findFiles(glob: 'test/**/*.conf.js')
                    echo """${files[0].name} ${files[0].path} ${files[0].directory} ${files[0].length} ${files[0].lastModified}"""
                    sh '''
                    npm install
                    sh 'protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091'
                    '''
                }
            }
        }
    }
}
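A side observation not made in the original thread: the sh block in the question nests a literal sh '...' Groovy step inside the shell script itself, so the shell ends up invoking the sh binary with the whole quoted string as its argument rather than running protractor from the workspace root. A cleaned-up version calls protractor directly:

sh '''
npm install
protractor test/protractor.conf.js --params.underTestUrl http://192.168.132.30:8091
'''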
