How to print the results of parameterized values in Jenkins?

I have a build with a parameter, and I also have a few dependencies on another project. I need to print the values of the parameters and save them in a specific folder. I used echo, but is there any other way, or a plugin?

If I understand correctly, you want to write the parameters that were passed to the build out to a file in a folder. Here's an example using Groovy in a scripted pipeline:
stage('Create and write output file') {
    def f = new File("${WORKSPACE}/output_file.txt")
    f.append("SomeParameter = ${params.SomeParameter}")
}
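Note that java.io.File is evaluated on the Jenkins controller, not on the agent running the build, so the snippet above only works when they are the same machine. A minimal sketch using the built-in writeFile step instead, which writes into the agent's workspace (the output folder name is an assumption):

node {
    dir('output') {
        // writeFile is a pipeline step, so it runs in the agent's workspace
        writeFile file: 'output_file.txt',
                  text: "SomeParameter = ${params.SomeParameter}"
        // optionally keep the file with the build record
        archiveArtifacts artifacts: 'output_file.txt'
    }
}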

Related

Jenkins - Add a dynamic label to a build

I am trying to achieve the following task in Jenkins:
1) Build a maven project
2) When running the test cases I print certain messages to the console output
3) Parse the console output of the build and determine if certain patterns exist in the output
4) If the pattern exists I want to label the build with a specific string
I have achieved steps 1-3. I am not able to create a dynamic label and tie it to a build. I have a Groovy script that parses the console output and determines if the pattern exists in the build's output.
Bamboo provides this feature to label a build based on regular expressions present in the build's console output.
Link - https://confluence.atlassian.com/bamboo0606/using-bamboo/jobs-and-tasks/configuring-jobs/configuring-miscellaneous-settings-for-a-job/configuring-automatic-labeling-of-job-build-results
I have gone through various existing Jenkins plugins but have not been successful in achieving this functionality. Is there a plugin to achieve this, or can I add additional lines to the Groovy script to create a dynamic build label?
Any help is appreciated.
You can use an if to set the agent:
def AGENT_LABEL = null
node('master') {
    stage('Checkout and set agent') {
        checkout scm
        // Or just use any other approach to figure out the agent label: read a file, etc.
        if (env.BRANCH_NAME == 'master') {
            AGENT_LABEL = "prod"
        } else {
            AGENT_LABEL = "dev"
        }
    }
}
pipeline {
    agent {
        label "${AGENT_LABEL}"
    }
    stages {
        stage('Build') {
            steps {
                // placeholder stage; the original answer omitted the stages section
                echo "Running on ${AGENT_LABEL}"
            }
        }
    }
}
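If by "label" you mean tagging the build itself based on console output (as the question asks), rather than selecting an agent, a minimal sketch using the Groovy Postbuild plugin's documented helpers (logContains, addShortText) or the pipeline's currentBuild object might look like this; the pattern and label text are placeholders:

// In a Groovy Postbuild script:
if (manager.logContains('.*MY_PATTERN.*')) {
    manager.addShortText('my-label') // shows a badge on the build in the job history
}

// Or in a pipeline script, renaming the build itself:
currentBuild.displayName = "#${BUILD_NUMBER} my-label"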

file parameter in declarative pipeline

I am developing a declarative pipeline and want to use the file parameter to read its content, but it's not working as expected:
parameters {
    file(fileLocation: 'list.txt', description: 'contains list of projects to be build')
}
I am getting the following error:
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: 12: Invalid parameter "fileLocation", did you mean "description"? @ line 12, column 14.
file(fileLocation:'release-list.txt', description:'contains list of projects to be build')
The following is another option, mentioned for the basic steps plugin:
readFile: Read file from workspace
Reads a file from a relative path (with root in current directory, usually workspace) and returns its content as a plain string.
file: Relative (/-separated) path to the file within the workspace to read. Type: String
encoding (optional): Type: String
It works in a script step like:
def myfile = readFile('list.txt')
echo "${myfile}"
But how do I use it directly in a declarative pipeline, as we use other basic steps like dir?
The correct arguments for the file parameter are name and description. So it should be:
file(name:'list.txt', description:'contains list of projects to be build')
However, there's an open Jenkins issue dating back to 2015 about the file parameter not working for pipelines, so I don't think even this will solve your issue: https://issues.jenkins-ci.org/browse/JENKINS-27413
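As a workaround until that issue is resolved, the readFile step from the question does work inside a declarative pipeline through a script block. A minimal sketch, assuming list.txt is already present in the workspace:

pipeline {
    agent any
    stages {
        stage('Read project list') {
            steps {
                script {
                    // readFile returns the file content as a plain string
                    def projects = readFile('list.txt').trim().split('\n')
                    projects.each { echo "Will build: ${it}" }
                }
            }
        }
    }
}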
The following syntax works:
parameters {
    file name: 'list.txt', description: 'contains list of projects to be build'
}
But the fileLocation parameter is still not accepted.
The syntax below is given in the Jenkins 2: Up & Running book, but it does not work:
parameters {
    file(fileLocation: 'list.txt', description: 'contains list of projects to be build')
}
Until the outstanding issue gets fixed, I believe we may have to stick to freestyle mode and handle things either in a downstream pipeline job or within the same job, leveraging the relevant plugin features.
Here is my attempt, which seems to work irrespective of file type (yes, it supports binaries as well): https://i.stack.imgur.com/vH7mQ.png
${list.txt} will point to the right file in your case.
Take a look at the plug-in https://plugins.jenkins.io/file-parameters/.
This plug-in adds support for file parameters in your Jenkinsfile: https://plugins.jenkins.io/file-parameters/#plugin-content-usage-in-declarative-pipeline
parameters {
    base64File 'small'
    stashedFile 'large'
}
https://github.com/jenkinsci/file-parameters-plugin
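For completeness, a sketch of how the base64-encoded parameter can be consumed in a stage, following the decode pattern from the plugin's documentation (the parameter name small comes from the snippet above):

pipeline {
    agent any
    parameters {
        base64File 'small'
    }
    stages {
        stage('Use file parameter') {
            steps {
                // The plugin exposes the upload as a base64-encoded
                // environment variable named after the parameter.
                sh 'echo "$small" | base64 -d > list.txt'
                sh 'cat list.txt'
            }
        }
    }
}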

Is it possible to include calls in Declarative Pipelines in the shared library (Jenkins)?

As you know, with shared libraries in Jenkins it is possible to call, from the Jenkinsfile, the content of a file in the vars folder.
For example, in the vars folder of the shared library we can have a file named build.groovy, and in the Jenkinsfile we can call it with:
build {
    parameter1 = "some param1"
    parameter2 = "some param2"
}
As described in this section.
I have no problem having Groovy files and calling them with the call() method in the Jenkinsfile.
But I want to customize a pipeline and make it as generic as possible.
So I want to call a Groovy file contained in the vars folder, but from within the same pipeline: calling genericStage.groovy from another file contained in the same vars folder of the shared library.
So what I have is a Groovy file in the vars folder, genericStage.groovy, and a pipeline that uses it:
pipeline {
    agent { label myNode }
    stages {
        stage("init") {
            //steps
        }
        genericStage {
            parameter1 = "some param"
        }
    }
}
And in genericStage.groovy:
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    stage(config.parameter1) {
        steps {
            //steps
        }
    }
}
But I get the error :
Expected a stage @ line 125, column 6.
genericStage{
^
1 error
at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:310)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1085)
So how can I make such calls within the same shared library, as we do from the Jenkinsfile?
In a shared library method defining a DSL-style step like this, you can either define a block of steps and script that does common processing you need to share between jobs or between stages in a job, or define a whole complete pipeline. So your genericStage.groovy can only contain what comes after the "//steps" comment; it cannot contain the stage and steps definitions as it does here. I write this type of custom library step quite a lot, in the style you have, but without trying to define the stage / steps inside it, and it works fine. What is happening here is that the pipeline validation / parser fails the main pipeline syntax before it even gets to handling your custom step, because the call is not wrapped in a stage and steps definition.
If you read the documentation link you have at the end, there is a section on defining a complete declarative pipeline in a shared library. It says at the end: "Only entire pipelines can be defined in shared libraries as of this time. This can only be done in vars/*.groovy, and only in a call method. Only one Declarative Pipeline can be executed in a single build, and if you attempt to execute a second one, your build will fail as a result." This suggests that a partial pipeline like yours won't work; you should have either just steps / script, or a complete pipeline.
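To make the working pattern concrete, here is a minimal sketch (the step name genericStep is illustrative): the shared-library file contains only step logic, and the Jenkinsfile wraps the call in its own stage and steps blocks.

// vars/genericStep.groovy in the shared library (hypothetical name)
def call(Closure body) {
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()
    // only steps / script here: no stage or steps wrapper
    echo "Running generic logic for ${config.parameter1}"
}

// Jenkinsfile
pipeline {
    agent any
    stages {
        stage('Generic') {
            steps {
                genericStep {
                    parameter1 = "some param"
                }
            }
        }
    }
}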

How to include multiple pipeline scripts into jenkinsfile

I have a Jenkins Job DSL file as below:
pipelineJob('My pipeline job') {
    displayName('display name')
    logRotator {
        numToKeep(10)
        daysToKeep(30)
        artifactDaysToKeep(7)
        artifactNumToKeep(1)
    }
    definition {
        cps {
            script(readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy'))
            script(readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy'))
        }
    }
}
With the above Job DSL file, the last script call replaces the other scripts.
Basically, I have split tasks into multiple Groovy files so that I won't repeat the same code in every Jenkinsfile and can reuse it for other jobs as well; for example, I can now use the clone_git_code.groovy script in dev builds as well as QA builds.
You have to use shared libraries (https://jenkins.io/doc/book/pipeline/shared-libraries/). You can define multiple Groovy files with classes that return a processed object, or simply create calls with methods where you define a step, and the execution will be sequential.
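A minimal sketch of that approach, with hypothetical names (cloneGitCode as a vars/ step, my-shared-lib as the library name configured in Jenkins):

// vars/cloneGitCode.groovy in the shared library
def call(String branch) {
    stage('Clone') {
        // reusable checkout logic shared by dev and QA jobs
        git branch: branch, url: 'https://example.com/my-repo.git'
    }
}

// Jenkinsfile
@Library('my-shared-lib') _
node {
    cloneGitCode('develop')
}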
I had this same issue when trying to include multiple scripts into a Jenkins job. After doing some research, I found the below solution to be the simplest:
definition {
    cps {
        script(
            ScriptsLibrary.pipelineTest('did it work?') +
            ScriptsLibrary.scmConf('repoURL_input', 'accessCredentials', 'activeBranch')
        )
    }
}
Add the "+" to concatenate the Strings. Got the job done for me :)
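The same concatenation trick should apply to the readFileFromWorkspace calls from the question, since each call returns the script content as a string (a sketch, untested):

definition {
    cps {
        script(
            readFileFromWorkspace('./cicd/pipelines/clone_git_code.groovy') +
            readFileFromWorkspace('./cicd/pipelines/install_dependencies_run_quality_checks.groovy')
        )
    }
}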

Jenkins, how to check regressions against another job

When you set up a Jenkins job various test result plugins will show regressions if the latest build is worse than the previous one.
We have many jobs for many projects on our Jenkins and we wanted to avoid having a 'job per branch' set up. So currently we are using a parameterized build to build eg different development branches using a single job.
But that means when I build a new branch any regressions are measured against the previous build, which may be for a different branch. What I really want is to measure regressions in a feature branch against the latest build of the master branch.
I thought we should probably set up a separate 'master' build alongside the parameterized 'branches' build. But I still can't see how I would compare results between jobs. Is there any plugin that can help?
UPDATE
I have started experimenting in the Script Console to see if I could write a post-build script... I have managed to get the latest build of the master branch in my parameterized job... I can't work out how to get to the test results from the build object though.
The data I need is available in JSON at
http://<jenkins server>/job/<job name>/<build number>/testReport/api/json?pretty=true
...if I could just get at this data structure it would be great!
I tried using JsonSlurper to load the JSON via HTTP, but I get a 403, I guess because my script has no auth session.
I suppose I could load the XML test results from disk and parse them in my script; it just seems a bit silly when Jenkins has already done this.
I eventually managed to achieve everything I wanted, using a Groovy script in the Groovy Postbuild Plugin.
I did a lot of exploring using the script console (http://<jenkins>/script); the Jenkins API class docs are also handy.
Everyone's use is going to be a bit different as you have to dig down into the build plugins to get the info you need, but here's some bits of my code which may help.
First get the build you want:
def getProject(projectName) {
    // in a postbuild action use `manager.hudson`
    // in the script web console use `Jenkins.instance`
    def project = manager.hudson.getItemByFullName(projectName)
    if (!project) {
        throw new RuntimeException("Project not found: $projectName")
    }
    project
}
// CloudBees folder plugin is supported, you can use natural paths:
project = getProject('MyFolder/TestJob')
build = project.getLastCompletedBuild()
The main test results (JUnit etc.) seem to be available directly on the build as:
result = build.getTestResultAction()
// eg
failedTestNames = result.getFailedTests().collect { test ->
    test.getFullName()
}
To get the more specialised results from e.g. the Violations plugin or Cobertura code coverage, you have to look for a specific build action.
// have a look what's available:
build.getActions()
You'll see a list of stuff like:
[hudson.plugins.git.GitTagAction@2b4b8a1c,
hudson.scm.SCMRevisionState$None@40d6dce2,
hudson.tasks.junit.TestResultAction@39c99826,
jenkins.plugins.show_build_parameters.ShowParametersBuildAction@4291d1a5]
These are instances; the part in front of the @ sign is the class name, so I used that to make this method for getting a specific action:
def final VIOLATIONS_ACTION = hudson.plugins.violations.ViolationsBuildAction
def final COVERAGE_ACTION = hudson.plugins.cobertura.CoberturaBuildAction

def getAction(build, actionCls) {
    def action = build.getActions().findResult { act ->
        actionCls.isInstance(act) ? act : null
    }
    if (!action) {
        throw new RuntimeException("Action not found in ${build.getFullDisplayName()}: ${actionCls.getSimpleName()}")
    }
    action
}

violations = getAction(build, VIOLATIONS_ACTION)
// you have to explore a bit more to find what you're interested in:
pylint_count = violations?.getReport()?.getViolations()?."pylint"

coverage = getAction(build, COVERAGE_ACTION)?.getResults()
// if you println it looks like a map but it's really an Enum of Ratio objects
// convert to something nicer to work with:
coverage_map = coverage.collectEntries { key, val -> [key.name(), val.getPercentageFloat()] }
With these building blocks I was able to put together a post-build script which compared the results for two 'unrelated' build jobs, then used the Groovy Postbuild plugin's helper methods to set the build status.
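Putting the pieces together, a regression check between two jobs might look like this (a sketch under the same assumptions as above; 'MyFolder/MasterJob' is a placeholder job name):

// failed tests on the latest master build
def masterBuild = getProject('MyFolder/MasterJob').getLastCompletedBuild()
def masterFailed = masterBuild.getTestResultAction()
        .getFailedTests().collect { it.getFullName() } as Set

// failed tests on the build this postbuild script is running for
def branchFailed = manager.build.getTestResultAction()
        .getFailedTests().collect { it.getFullName() } as Set

// regressions: tests failing here that are not failing on master
def regressions = branchFailed - masterFailed
if (regressions) {
    manager.addWarningBadge("Regressions vs master: ${regressions.size()}")
    manager.buildUnstable()
}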
Hope this helps someone else.
