I am refactoring a Jenkins shared library that used classes, e.g.:
class myWorkflow {
    def script
    myWorkflow(script) { this.script = script }
}
and replacing them with script files in the /vars directory:
https://www.jenkins.io/doc/book/pipeline/shared-libraries/
#!/usr/bin/env groovy
// file: /vars/myFunc.groovy
def doSomething() {
    def myFlow = new myWorkflow(this)
}
Scripts in the /vars directory are very useful and lightweight compared to classes, and the resulting code is very functional in style. Here is my issue: the code I am transitioning from holds a reference to the Jenkinsfile script, e.g. script.sh "ls -la", but the global functions don't know their owner, i.e. the script they are running in and being called from. This makes it impossible to call some of the existing objects. In the above example, "this" is myFunc, not the Jenkinsfile's this.
Is there a this.owner or this.parent ?
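One hedged observation: inside a /vars global variable script, `this` does resolve the pipeline steps (sh, echo, and so on), so in many cases the vars script's own `this` can be passed where the Jenkinsfile's `this` was used before. A minimal sketch, assuming myWorkflow only calls pipeline steps on the reference it is given (the run() method is hypothetical):

```groovy
// /vars/myFunc.groovy -- sketch, not the definitive fix
def doSomething() {
    // "this" here is the myFunc script object, but it still resolves
    // pipeline steps such as sh and echo, so it can serve as the
    // "script" reference that the old class expects
    def myFlow = new myWorkflow(this)
    myFlow.run()   // hypothetical method on the existing class
}
```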
Related
I have some Groovy code (Jenkins Pipeline) to loop over a coverage directory and find any files matching a certain pattern. For any files found, I want to determine the name of their parent directory. These files are currently in sub-directories under a coverage directory.
def coverageReportFiles = findFiles(glob: 'coverage/**/coverage-summary.json')
for (file in coverageReportFiles) {
    echo "${file.directory}" // currently prints out 'false'
}
Are there any utilities with Jenkins Pipeline steps or anything native in groovy I can use to determine this?
It should be possible with file.parentFile.name
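Note that findFiles returns FileWrapper objects, and file.directory is a boolean flag (whether the entry is a directory), which is why it prints 'false'. If parentFile is not available on the wrapper, the parent directory name can also be derived from file.path; a sketch:

```groovy
// Sketch: derive the parent directory name from the relative path
// returned by findFiles (e.g. 'coverage/app1/coverage-summary.json')
def coverageReportFiles = findFiles(glob: 'coverage/**/coverage-summary.json')
for (file in coverageReportFiles) {
    def parts = file.path.tokenize('/')
    def parentName = parts.size() > 1 ? parts[-2] : ''
    echo "parent dir: ${parentName}"   // e.g. 'app1'
}
```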
I am creating a Jenkins pipeline for all our applications, where I want to build and deploy. I am able to achieve that, but all the deployment paths are hard-coded in the pipeline script.
We have around 8 applications and 5 environments, which means I need to specify 40 different deployment paths in the pipeline scripts.
I would like to know: is there a good way to store the deployment paths? I thought about storing them in XML and reading that while doing the build, but I am not sure about the implementation.
Looking for some ideas.
script {
    def msbuild = tool name: 'MSBuild', type: 'msbuild'
    def action = "${msbuild}\\msbuild.exe"
    def rootPath = "${WORKSPACE}\\test\\test"
    def slnPath = "${rootPath}\\test.sln"
    def binPath = "${rootPath}\\test\\bin"
    bat "nuget restore \"${slnPath}\""
    bat "\"${action}\" \"${slnPath}\""
    robocopy("\"${binPath}\" \"\\\\t.test.com\\test\" /MIR /xF")
}
What I would do is use a config repository, configured this way:
Each application is a different repository (example: app_config)
Each environment is a different file
The same environment file in each repository is called by the same name
Each environment file is a YAML file (key: value)
Then, in the Jenkins pipeline, I would fetch the repo, read the YAML using readYaml (check the command usage and name, it has been a while since I used it), and load it into a map.
Then you use the variables of the map, and that should help you.
The tricky part is how to match the code repositories and the config repositories. As I mentioned before, I would use the same name and append "_config".
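A minimal sketch of the approach, assuming a naming convention of <app>_config repositories, one YAML file per environment, and the Pipeline Utility Steps plugin for readYaml (the repo URL, env-var names, and keys below are made-up examples):

```groovy
// Hypothetical file prod.yaml in the app1_config repo:
//   deployPath: \\deployserver\prod\app1
node {
    dir('config') {
        // assumption: config repo named after the app with "_config" appended
        git url: "https://git.example.com/${env.APP_NAME}_config.git"
    }
    def cfg = readYaml file: "config/${env.DEPLOY_ENV}.yaml"
    echo "Deploying to ${cfg.deployPath}"
}
```

This keeps the 40 paths out of the pipeline script entirely: adding an environment means adding one YAML file per config repo, with no pipeline change.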
I would like to use the "input step" of Jenkins to upload a binary file to the current workspace.
However, the code below seems to upload the file to the Jenkins master, not to the workspace of the current job on the slave where the job is running.
Is there any way to fix that?
Preferably without having to add an executor on the master or clutter the master disk with files.
def inFile = input id: 'file1', message: 'Upload a file', parameters: [file(name: 'data.tmp', description: 'Choose a file')]
It seems Jenkins officially doesn't support upload of binary files yet, as you can see in JENKINS-27413. You can still make use of the input step to get a binary file into your workspace. We will be using a method to get this working, but we will not use it inside the Jenkinsfile, otherwise we will encounter errors related to In-process Script Approval. Instead, we will use a Global Shared Library, which is considered one of Jenkins' best practices.
Please follow these steps:
1) Create a shared library
Create a repository test-shared-library
Create a directory named vars in the above repository. Inside the vars directory, create a file copy_bin_to_wksp.groovy with the following content:
def inputGetFile(String savedfile = null) {
    def filedata = null
    def filename = null
    // Get file using input step, will put it in build directory
    // the filename will not be included in the upload data, so optionally allow it to be specified
    if (savedfile == null) {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload'), string(name: 'filename', defaultValue: 'demo-backend-1.0-SNAPSHOT.jar')]
        filedata = inputFile['library_data_upload']
        filename = inputFile['filename']
    } else {
        def inputFile = input message: 'Upload file', parameters: [file(name: 'library_data_upload')]
        filedata = inputFile
        filename = savedfile
    }
    // Read contents and write to workspace
    writeFile(file: filename, encoding: 'Base64', text: filedata.read().getBytes().encodeBase64().toString())
    // Remove the file from the master to avoid stuff like secret leakage
    filedata.delete()
    return filename
}
2) Configure Jenkins for accessing Shared Library in any pipeline job
Go to Manage Jenkins » Configure System » Global Pipeline Libraries section
Name the library whatever you want (in my case, my-shared-library)
Keep the default branch as master (this is the branch where I pushed my code)
No need to check/uncheck the check-boxes unless you know what you're doing
3) Access shared library in your job
In Jenkinsfile, add the following code:
@Library('my-shared-library@master') _
node {
    // Use any file name in place of demo-backend-1.0-SNAPSHOT.jar that I have used below
    def file_in_workspace = copy_bin_to_wksp.inputGetFile('demo-backend-1.0-SNAPSHOT.jar')
    sh "ls -ltR"
}
You're all set to run the job. :)
Note:
Make sure Script Security plugin is always up-to-date
How are Shared Libraries affected by Script Security?
Global Shared Libraries always run outside the sandbox. These libraries are considered "trusted:" they can run any methods in Java, Groovy, Jenkins internal APIs, Jenkins plugins, or third-party libraries. This allows you to define libraries which encapsulate individually unsafe APIs in a higher-level wrapper safe for use from any Pipeline. Beware that anyone able to push commits to this SCM repository could obtain unlimited access to Jenkins.
Folder-level Shared Libraries always run inside the sandbox. Folder-based libraries are not considered "trusted:" they run in the Groovy sandbox just like typical Pipelines.
Code Reference: James Hogarth's comment
I am using the Serenity Cucumber framework with Gradle, integrated with Jenkins. My requirement is that, as part of one Jenkins job, the feature files from one folder should run, and as part of the next Jenkins job, the feature files located in another folder should be executed. Can you please suggest how I can pass parameters to the Cucumber runner file at run time? For example:
Below is my runner file:
@RunWith(CucumberWithSerenity.class)
@CucumberOptions(features = "src/test/resources/Sanity/")
public class TestRunnerSerenity {
}
As part of the Sanity build, all my test cases from the src/test/resources/Sanity/ folder should get executed. How can I pass this folder path value to the Cucumber runner class at run time, so that I can maintain my builds without having multiple runner files and manual intervention?
Thanks in advance for your help.
There is some configuration to do in the code, in Gradle, and in Jenkins. Here it is step by step.
In the Gradle file, create a test task that forwards the folder as a system property.
task prodTest(type: Test) {
    // forward the -Ptest.folder project property to the tests as a system property
    systemProperty 'test.folder', findProperty('test.folder')
}
task runProdTest {
    dependsOn 'clean', 'cleanTest', 'prodTest'
}
In the Constants class, create a static final String:
public static final String TEST_FOLDER = System.getProperty("test.folder", "FolderName");
When you run locally, it will not fail when the external parameter is not available, because the default value ("FolderName") is used.
Go to the runner file:
@RunWith(CucumberWithSerenity.class)
@CucumberOptions(features = Constants.TEST_FOLDER)
public class TestRunnerSerenity {
}
Now the Jenkins configuration:
Go to the job configuration in Jenkins.
There should be an "Invoke Gradle script" build step. You can add the parameter under "Switches" to tell Jenkins which folder to use:
Switches = -Ptest.folder="src/test/resources/Sanity/"
Tasks = runProdTest
You're all set!
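For reference, the same invocation from a Pipeline job would look roughly like the sketch below (this assumes the Gradle wrapper is checked into the repo; use the plain gradle command otherwise):

```groovy
// Scripted-pipeline equivalent of the freestyle "Invoke Gradle script" step
node {
    checkout scm
    sh './gradlew runProdTest -Ptest.folder="src/test/resources/Sanity/"'
}
```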
I've got a Job Parameter, for example PROPERTIES_FILE.
I want to inject all Env Variables from that file into my job session, using the EnvInject plugin.
Is there a way to do that?
I managed to inject variables only for a hard-coded file path, not from a parameter.
The EnvFile plugin (not EnvInject) seems to support variables in the properties file path (it uses $WORKSPACE in its example).
Also, if you are on Windows, maybe all you need for the EnvInject plugin to work is to use $param for the variable, not %param%. While batch scripts need the %...% notation, Jenkins internally still uses $.
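If the job can be a Pipeline job rather than a freestyle one, an alternative worth mentioning is the readProperties step from the Pipeline Utility Steps plugin, which takes the path from a job parameter naturally. A sketch, reusing the PROPERTIES_FILE parameter from the question (the MY_VAR key is a made-up example):

```groovy
// Sketch: load key=value pairs from a file whose path comes from a job parameter
node {
    def props = readProperties file: params.PROPERTIES_FILE
    // expose each entry as an environment variable for subsequent steps
    withEnv(props.collect { k, v -> "${k}=${v}" }) {
        sh 'echo "MY_VAR is $MY_VAR"'   // MY_VAR assumed to be defined in the file
    }
}
```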