I'm creating a post block in a Jenkins pipeline to publish test results using JUnit, HTML and Cobertura. The code looks like this:
    post {
        always {
            publishTestResults(
                script: this,
                junit: [
                    active: true,
                    allowEmptyResults: true,
                    archive: true,
                    pattern: '**/reports/mocha.xml',
                    updateResults: true
                ],
                cobertura: [
                    active: true,
                    allowEmptyResults: true,
                    archive: true,
                    pattern: '**/coverage/cobertura/cobertura-coverage.xml'
                ],
                html: [
                    active: true,
                    allowEmptyResults: true,
                    archive: true,
                    name: 'NYC/Mocha',
                    path: '**/coverage/html'
                ],
                lcov: [
                    active: true,
                    allowEmptyResults: true,
                    archive: true,
                    name: 'LCOV Coverage',
                    path: '**/coverage/lcov/lcov-report'
                ]
            )
            cobertura coberturaReportFile: 'coverage/cobertura/cobertura-coverage.xml'
            junit 'reports/mocha.xml'
            cleanWs()
            // deleteDir()
            script {
                FAILED_STAGE = env.STAGE_NAME
            }
        }
    }
}
The problem is that when I execute the job on Jenkins I receive this error message:
find . -wholename **/reports/mocha.xml -exec touch {} ;
touch: cannot touch './reports/mocha.xml': Permission denied
I suppose the issue is raised by the junit command. How can I solve this problem?
P.S.: The Jenkins server runs on Ubuntu. I tried to modify /etc/sudoers and add the following line to make Jenkins execute commands as root, but it still did not solve my problem:
jenkins ALL=(ALL) NOPASSWD: ALL
From checking the code at: https://github.com/SAP/jenkins-library/blob/5c13a0e2a20132336824c70b743c364bcb5341f4/vars/testsPublishResults.groovy#L136
it looks like you can avoid the issue by setting updateResults to false.
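As a sketch (only the junit section of the call is shown; the other sections stay as they are), that change would look like this:
junit: [
    active: true,
    allowEmptyResults: true,
    archive: true,
    pattern: '**/reports/mocha.xml',
    updateResults: false // skip the find/touch step that fails with "Permission denied"
],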
If you absolutely have to update the timestamp on the result file, you'll have to open a terminal session, go to the project workspace as the jenkins user, try to run touch ./reports/mocha.xml, and debug it from there.
I am stuck trying to get a Jenkinsfile to work. It keeps failing on an sh step and gives the following error:
process apparently never started in /home/jenkins/workspace
...
(running Jenkins temporarily with -Dorg.jenkinsci.plugins.durabletask.BourneShellScript.LAUNCH_DIAGNOSTICS=true might make the problem clearer)
I have tried adding
withEnv(['PATH+EXTRA=/usr/sbin:/usr/bin:/sbin:/bin'])
before the sh step in the Groovy file.
I also tried adding
/bin/sh
in Manage Jenkins -> Configure System, in the shell section.
I have also tried replacing the sh line in the Jenkinsfile with the following:
sh "docker ps;"
sh "echo 'hello';"
sh "./build.sh;"
sh """
#!/bin/sh
echo hello
"""
This is the part of the Jenkinsfile which I am stuck on:
node {
    stage('Build') {
        echo 'this works'
        sh 'echo "this does not work"'
    }
}
The expected output is "this does not work", but it just hangs and returns the error above.
What am I missing?
It turns out that the default workingDir value for the default jnlp k8s slave nodes is now set to /home/jenkins/agent, while I was using the old value /home/jenkins.
Here is the config that worked for me:
containerTemplate(name: 'jnlp', image: 'lachlanevenson/jnlp-slave:3.10-1-alpine', args: '${computer.jnlpmac} ${computer.name}', workingDir: '/home/jenkins/agent')
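For context, a minimal sketch of how that containerTemplate can sit inside a podTemplate; the label and the stage are assumptions, not part of the original config:
podTemplate(label: 'jnlp-agent', containers: [
    containerTemplate(name: 'jnlp',
        image: 'lachlanevenson/jnlp-slave:3.10-1-alpine',
        args: '${computer.jnlpmac} ${computer.name}',
        workingDir: '/home/jenkins/agent')
]) {
    node('jnlp-agent') {
        stage('Build') {
            // sh now starts because the agent's working directory matches the plugin default
            sh 'echo "this works"'
        }
    }
}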
It is possible to get the same trouble with a malformed PATH environment variable. This prevents the sh() method of the Pipeline plugin from calling the shell executable. You can reproduce it with a simple pipeline like this:
node('myNode') {
    stage('Test') {
        withEnv(['PATH=/something_invalid']) {
            /* it hangs and fails later with "process apparently never started" */
            sh('echo Hello!')
        }
    }
}
There are various ways to mangle PATH. For example, you might use withEnv(getEnv()) { sh(...) } where getEnv() is your own method that evaluates the list of environment variables depending on the OS and other conditions. If you make a mistake in the getEnv() method and PATH gets overwritten, you will reproduce the issue.
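As a sketch (getEnv() here is a hypothetical helper of your own, not a plugin step), the safe variant appends to PATH with the PATH+EXTRA syntax instead of replacing it:
// Hypothetical helper: build extra environment entries without clobbering PATH.
def getEnv() {
    // 'PATH+EXTRA=...' prepends to the existing PATH, so /bin and /usr/bin stay reachable
    // and the durable-task shell wrapper can still be launched.
    return ['PATH+EXTRA=/opt/mytools/bin', 'MY_FLAG=true']
}

node('myNode') {
    stage('Test') {
        withEnv(getEnv()) {
            sh('echo Hello!') // no longer hangs with "process apparently never started"
        }
    }
}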
I am running newman with newman-reporter-htmlextra in a Jenkins pipeline, generating an HTML report which I want to publish via the Jenkins HTML Publisher plugin.
This is the stage in the pipeline I'm using:
stage('Newman tests') {
    steps {
        script {
            dir("${JENKINS_HOME}/workspace/myproject") {
                sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
                junit "*.xml"
            }
        }
        publishHTML target: [
            allowMissing: false,
            alwaysLinkToLastBuild: false,
            keepAll: true,
            reportDir: '.',
            reportFiles: 'newman_result.html',
            reportName: 'Newman HTML Reporter'
        ]
    }
}
This runs and creates a Newman HTML Reporter entry in my Jenkins project.
However, when I open the report, it is empty.
Any ideas?
Many thanks in advance,
Christian
I guess you are accessing the wrong folder when you try to publish your HTML result.
You are creating your file outside of the regular Jenkins workspace:
dir("${JENKINS_HOME}/workspace/myproject") {
sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
junit "*.xml"
}
After you leave the script {} block you are back in the original Jenkins workspace, so reportDir: '.' is not the folder where your file is; no file means no HTML can be displayed.
You have three choices here:
Create the file in the regular Jenkins workspace.
Point the HTML Publisher plugin to the correct folder.
Put the publishHTML call into the scope of your dir {} block, as you did with the junit step (see the sketch below).
To easily find out which folder you are in at each point, run echo pwd(). Do this once in the scope of your dir {} block and once in the scope of the plugin, and it should become clear that the reportDir of your plugin is wrong.
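A sketch of the third option, reusing the paths from the question and keeping publishHTML inside the same dir block:
dir("${JENKINS_HOME}/workspace/myproject") {
    sh 'newman run "./Collections/my_collection.postman_collection.json" --reporters cli,junit,htmlextra --reporter-junit-export "newman_result.xml" --reporter-htmlextra-export "newman_result.html"'
    junit '*.xml'
    echo pwd() // prints the dir() path, confirming where reportDir: '.' resolves to
    publishHTML target: [
        allowMissing: false,
        alwaysLinkToLastBuild: false,
        keepAll: true,
        reportDir: '.', // now the folder that actually contains newman_result.html
        reportFiles: 'newman_result.html',
        reportName: 'Newman HTML Reporter'
    ]
}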
Yesterday I wrote and ran a Katalon test suite, and today I'm trying to integrate Katalon with Jenkins. I successfully set up Jenkins and created a new job for the Katalon testing as per these instructions, but when I went to build it, I got failing builds.
In particular, this is the error message I keep getting:
Recording test results
ERROR: Step ‘Publish JUnit test result report’ failed: No test report files were found. Configuration error?
Finished: FAILURE
I went ahead and copied the Reports folder structure from the project directory that I specified into the Jenkins workspace. Upon later inspection, I found that, when Jenkins was running the Katalon tests, the JUnit_Report.xml file was actually getting created in the project's Reports folder instead of at %JENKINS_HOME%\workspace\[project name]\Reports. I explicitly told it to generate test reports to Reports/LoginSuite/*/JUnit_Report.xml.
NOTE: I'm on a Windows machine.
How can I fix this so that I can display test results from Jenkins?!
UPDATE: I have revised my Windows shell code to the following:
C:
cd C:\Katalon
katalon -runMode=console -projectPath="C:\Users\mwarren\Katalon Studio\TestProject" -reportFolder="../../.jenkins/workspace/Katalon Studio Tests/Reports" -reportFileName="report" -retry=0 -testSuitePath="Test Suites/LoginSuite" -browserType="Chrome"
and it's still giving me the same error, even though the reports are now being created there.
The report folder is generated in the Jenkins job folder, so use:
Reports/**/JUnit_Report.xml
PEBKAC, apparently. I should have, from the get-go, listened to Jenkins and set my Test Report XMLs to */JUnit_Report.xml
Copy all reports to a temp folder, rename each XML file with the test case name, and then copy it back to the JUnit folder:
testCasesTxt = sh(
    script: 'sudo find $WORKSPACE -name "*.ts*" -type f -printf "%f\n"',
    returnStdout: true
).trim()
testCasesTxt = testCasesTxt.replace(".ts", "")
testCases = testCasesTxt.split("\n")
for (int i = 0; i < testCases.size(); i++) {
    script {
        try {
            wrap([$class: 'Xvfb']) {
                sh """
                    cd /opt/katalon
                    ./katalon -noSplash -consoleLog -runMode=console -projectPath=$WORKSPACE/"katalon-project.prj" -reportFolder="Reports" -reportFileName="report" -retry=0 -testSuitePath="Test Suites/${testCases[i]}" -executionProfile="qa" -browserType="Chrome"
                """
            }
        } catch (any) {
            currentBuild.result = 'FAILURE'
            throw any // rethrow exception to prevent the build from proceeding
        } finally {
            sh """
                cd /home/environment/tmp/
                cd Reports
                mkdir ${testCases[i]}
                cd $WORKSPACE
                cp -r Reports/ /home/environment/tmp/Reports/${testCases[i]}
                cd /home/environment/tmp/Reports/${testCases[i]}/Reports
                mv JUnit_Report.xml JUnit_Report_${testCases[i]}.xml
                cd $WORKSPACE
                cp -r "Data Files/" "/home/environment/tmp/"
            """
        }
    }
}
According to this thread, this can happen when trying to execute two Katalon instances at the same moment.
If that's the case, try changing the Jenkins number of executors to 1.
If your report is generated at \Reports\20200611_172240\TestSuite1\20200611_172240\JUnit_Report.xml inside your project folder, then configure the path as Reports/*/*/*/JUnit_Report.xml,
since the three folders after the report folder name change on every execution.
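If you run this from a pipeline rather than a freestyle job, the equivalent junit step would look roughly like this (a sketch; the pattern is taken from the answer above):
post {
    always {
        // The three wildcards cover the timestamped/suite folders that change on every run
        junit allowEmptyResults: true, testResults: 'Reports/*/*/*/JUnit_Report.xml'
    }
}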
Please use a Test report XMLs path like this:
**/target/surefire-reports/*.xml
A Maven clean install generates a new HTML file at the following location:
/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/DocsJmeterTests_20170601_151330/index.html
Here "DocsJmeterTests_20170601_151330" changes for every run, so I am trying to publish the HTML report with the HTML Publisher plugin. The following is my pipeline script:
node {
    build job: 'Docs_LoadTest'
    stage('Results') {
        publishHTML([allowMissing: false,
            alwaysLinkToLastBuild: true,
            keepAll: true,
            reportDir: '/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/*/',
            reportFiles: 'index.html',
            reportName: 'Docs Loadtest Dashboard'
        ])
    }
}
I get the following error while running the job:
[htmlpublisher] Archiving HTML reports...
[htmlpublisher] Archiving at BUILD level
/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/* to
/var/lib/jenkins/jobs/Docs_Pipeline/builds/10/htmlreports/Docs_Loadtest_Dashboard
ERROR: Specified HTML directory '/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/*' does not exist.
We even tried the following options, but they didn't work either:
/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/**/
/var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/DocsJmeterTests_*
The HTML Publisher plugin does not seem to understand wildcards. What you could do in your pipeline is use Linux's cp command, since that can work with wildcards.
This copies over the contents of all directories in the [Docs_LoadTest]/jmeter/reports folder to a jmeter_results folder in the local workspace:
sh 'cp -r /var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/*/. target/jmeter_results/'
Note that you must clean both the target folder in the Docs_LoadTest job and the one in your pipeline workspace between runs, or multiple reports will be copied over with this solution.
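A minimal way to do that cleanup in the pipeline right before the copy (a sketch):
// Throw away reports copied in earlier runs, then copy the fresh ones
sh 'rm -rf target/jmeter_results && mkdir -p target/jmeter_results'
sh 'cp -r /var/lib/jenkins/workspace/Docs_LoadTest/target/jmeter/reports/*/. target/jmeter_results/'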
A better solution would be to apply this trick in Docs_LoadTest itself and use the Archive Artifacts and Copy Artifact features. This works around having to hardcode the path to the other job and will work even if the pipeline executes on another slave than Docs_LoadTest. It does require the Copy Artifacts plugin.
Assuming Docs_LoadTest is a Freestyle job:
Add an Execute Shell Build step that copies the results to a fixed folder, e.g. jmeter_results:
mkdir -p target/jmeter_results/
cp -r target/jmeter/reports/*/. target/jmeter_results/
Then add an "Archive the artifacts" post-build step with the following files to archive:
target/jmeter_results/*
In your Pipeline:
Use the Copy Artifact step to copy the files to target/jmeter_results folder in the local workspace:
step ([$class: 'CopyArtifact',
projectName: 'Docs_LoadTest',
filter: 'target/jmeter_results/*']);
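On recent versions of the Copy Artifact plugin the same step can also be written with its shorthand symbol (a sketch, assuming your plugin version provides it):
copyArtifacts projectName: 'Docs_LoadTest', filter: 'target/jmeter_results/*'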
Change the call to the HTML publisher to use this folder:
publishHTML([allowMissing: false,
alwaysLinkToLastBuild: true,
keepAll: true,
reportDir: 'target/jmeter_results',
reportFiles: 'index.html',
reportName: 'Docs Loadtest Dashboard'
])
I was having a similar problem, except that I wanted to publish multiple reports.
What I ended up doing was adding a simple Groovy script to iterate through the files in the reports directory. You can use the same or a similar approach to get the file names.
stage('publish reports') {
    steps {
        unstash 'source'
        script {
            sh 'ls target/jmeter/reports > listFiles.txt'
            def files = readFile("listFiles.txt").split("\\r?\\n");
            sh 'rm -f listFiles.txt'
            for (i = 0; i < files.size(); i++) {
                publishHTML target: [
                    allowMissing: false,
                    alwaysLinkToLastBuild: false,
                    keepAll: true,
                    reportDir: 'target/jmeter/reports/' + files[i],
                    reportFiles: 'index.html',
                    reportName: files[i]
                ]
            }
        }
    }
}
Note: this example is used in a declarative pipeline. See the docs for the readFile function.
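If the Pipeline Utility Steps plugin is available, findFiles avoids the temporary listFiles.txt altogether; a sketch under that assumption:
script {
    // One index.html per timestamped report folder under target/jmeter/reports
    def reports = findFiles(glob: 'target/jmeter/reports/*/index.html')
    for (int i = 0; i < reports.size(); i++) {
        def dirName = reports[i].path.tokenize('/')[-2] // folder that contains index.html
        publishHTML target: [
            allowMissing: false,
            alwaysLinkToLastBuild: false,
            keepAll: true,
            reportDir: 'target/jmeter/reports/' + dirName,
            reportFiles: 'index.html',
            reportName: dirName
        ]
    }
}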
I simply tried the following:
stage('Test-Junit') {
    steps {
        sh 'gradle test'
    }
    post {
        always {
            script {
                def moduleNames = ["app", "core", ...]
                for (i = 0; i < moduleNames.size(); i++) {
                    publishHTML target: [
                        allowMissing: false,
                        alwaysLinkToLastBuild: false,
                        keepAll: true,
                        reportDir: moduleNames[i] + '/build/reports/tests/test',
                        reportFiles: 'index.html',
                        reportName: 'Test Report:' + moduleNames[i]
                    ]
                }
            }
        }
    }
}
This publishes a report for every module, so you can find them all in the left nav-bar of the project dashboard.
It is not exactly the same scenario, but I decided to publish my code because it was really hard to find clear documentation and accurate examples on how to publish different reports into one final consolidated report using the publishHTML plugin for Jenkins.
A bit of background: we execute different packages of tests, but some test cases can't run together because they could kill each other, so we need to execute them, from the same code, in different time frames, because we run test cases in parallel.
The solution was to execute by tags, so once the different executions of the Jenkins DSL pipeline happen, the builds produce just one report with different tabs on it.
So this is the final code that works for me:
pipeline {
    agent any
    stages {
        stage('Git') {
            steps {
                git .....
            }
        }
        stage('Exec-1') {
            steps {
                bat 'mvn -B clean verify -Dcucumber.filter.tags=#exec1 -Dserenity.outputDirectory=reports/site/serenity/exec1'
            }
        }
        stage('Exec-2') {
            steps {
                bat 'mvn -B clean verify -Dcucumber.filter.tags=#exec2 -Dserenity.outputDirectory=reports/site/serenity/exec2'
            }
        }
        stage('Exec-3') {
            steps {
                bat 'mvn -B clean verify -Dcucumber.filter.tags=#exec3 -Dserenity.outputDirectory=reports/site/serenity/exec3'
            }
        }
    }
    post {
        always {
            publishHTML target: [
                reportName: 'Test',
                reportDir: 'reports/site/serenity',
                reportFiles: 'exec1/index.html, exec2/index.html, exec3/index.html',
                reportTitles: 'Exec-1, Exec-2, Exec-3',
                keepAll: true,
                alwaysLinkToLastBuild: true,
                allowMissing: false
            ]
        }
    }
}
I'm trying to migrate my build pipelines to the "Pipeline plugin" using Groovy build scripts.
My pipelines are usually:
Test (gradle)
IntegrationTest (gradle)
Build (gradle)
Publish (artifactory)
I would like to use Gradle variables like version, group, etc. in my Jenkins build script to publish to the correct folders in Artifactory, something the Artifactory plugin took care of for me in the past. How can this be achieved?
For a single gradle project I use something like this:
node('master') {
    def version = 1.0
    def gitUrl = 'some.git'
    def projectRoot = ""
    def group = "dashboard/frontend/"
    def artifactName = "dashboard_ui"
    def artifactRepo = "ext-release-local"

    stage "git"
    git branch: 'develop', poll: true, url: "${gitUrl}"

    dir(projectRoot) {
        sh 'chmod +x gradlew'

        stage "test"
        sh './gradlew clean test'

        stage "build"
        sh './gradlew build createPom'

        stage "artifact"
        def server = Artifactory.server('artifactory_dev01')
        def uploadSpec = """{
            "files": [
                {
                    "pattern": "build/**.jar",
                    "target": "${artifactRepo}/$group/${artifactName}/${version}/${artifactName}-${version}.jar"
                },
                {
                    "pattern": "pom.xml",
                    "target": "${artifactRepo}/$group/${artifactName}/${version}/${artifactName}.pom"
                }
            ]
        }"""
        def buildInfo1 = server.upload spec: uploadSpec
        server.publishBuildInfo buildInfo1
    }
}
For future reference, here is an example with the more modern declarative pipeline:
pipeline {
    agent any
    stages {
        stage('somestage') {
            steps {
                script {
                    def version = sh(
                        script: "./gradlew properties -q | grep \"version:\" | awk '{print \$2}'",
                        returnStdout: true
                    ).trim()
                    sh "echo Building project in version: $version"
                }
            }
        }
    }
}
see also:
Gradle plugin project version number
How to do I get the output of a shell command executed using into a variable from Jenkinsfile (groovy)?
I think you actually have two different approaches to tackle this problem:
1. Get version/group from sh script
Find a way to get the version from the Gradle build tool (e.g. gradle getVersion(), but I'm not familiar with Gradle) and then use a shell script to read it. If the Gradle command to get the version were gradle getVersion(), you would do this in your pipeline:
def projectVersion = sh(script: "gradle getVersion()", returnStdout: true).trim() // trim() drops the trailing newline
def projectGroup = sh(script: "gradle getGroup()", returnStdout: true).trim()
and then just inject your $projectVersion and $projectGroup variables in your current pipeline.
2. Configure your Gradle build script to publish to Artifactory
This is the reverse approach, which I personally prefer: instead of giving Artifactory all your Gradle project information, just give Gradle your Artifactory settings and use a Gradle task to easily publish to Artifactory.
JFrog has good documentation for this solution in their Working with Gradle section. Basically, you will follow these steps:
Generate a compliant Gradle build script from Artifactory using the Gradle Build Script Generator and include it in your project build script.
Run the Gradle task gradle artifactoryPublish to publish your current artifact to Artifactory, as shown below.
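In the Jenkins pipeline itself that then reduces to a single step (a sketch; the stage name and the use of the Gradle wrapper are assumptions):
stage('publish') {
    // Gradle already knows the Artifactory URL, repository and credentials
    // from the generated build script, so the pipeline only invokes the task.
    sh './gradlew artifactoryPublish'
}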
For others who Googled their way here: if you have the Pipeline Utility Steps plugin and store what you need in your gradle.properties file, you can do something like this in the environment block:
MY_PROPS = readProperties file:"${WORKSPACE}/gradle.properties"
MY_VERSION = MY_PROPS['version']
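If the environment block gives you trouble, the same step also works inside a script block; a sketch, where the property names are assumptions about a typical gradle.properties:
script {
    // Pipeline Utility Steps: parse gradle.properties into a Map
    def props = readProperties(file: "${env.WORKSPACE}/gradle.properties")
    def version = props['version']
    def group = props['group']
    sh "echo Publishing ${group}:${version}"
}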