Currently, my Cypress tests are running in a Docker container in one stage:
stage('Run E2E tests') {
    steps {
        withCredentials([
            sshUserPrivateKey(credentialsId: '*********', keyFileVariable: 'SSH_KEY_FILE', usernameVariable: 'SSH_USER')
        ]) {
            sh """
                eval `ssh-agent -s`
                ssh-add ${SSH_KEY_FILE}
                ~/earthly \
                    --no-cache \
                    --config=.earthly/config.yaml \
                    +e2e
                eval `ssh-agent -k`
            """
        }
    }
}
And I am publishing the test report via publishHTML:
post {
    always {
        echo "-- Archive report artifacts"
        archiveArtifacts artifacts: 'results', allowEmptyArchive: true
        echo "-- Publish HTML test result report"
        publishHTML(target: [
            allowMissing: false,
            alwaysLinkToLastBuild: false,
            keepAll: true,
            reportDir: 'results/html/',
            reportFiles: 'mochawesome-bundle.html',
            reportName: "Test Result Report"
        ])
    }
}
But I need the build to fail if any test case fails in the Cypress mochawesome report.
What could be the solution for this?
Thanks in advance.
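One possible approach (a rough sketch, not a verified fix): publishHTML only renders the report, so the build has to be failed explicitly. Assuming the merged mochawesome JSON (e.g. results/mochareports/report.json) is copied back into the workspace and the Pipeline Utility Steps plugin (readJSON) is available, the post block could inspect the stats and call error:

post {
    always {
        // ... archiveArtifacts / publishHTML as above ...
        script {
            // Hypothetical path - adjust to wherever the merged mochawesome JSON lands
            def report = readJSON file: 'results/mochareports/report.json'
            def failures = report.stats.failures ?: 0
            if (failures > 0) {
                // Fail the build only after the HTML report has been published
                error "Cypress reported ${failures} failing test case(s)"
            }
        }
    }
}

Alternatively, if the +e2e Earthly target currently swallows the Cypress exit code (for example via || true or an || posttest chain), letting that non-zero exit status propagate would make the sh step, and therefore the stage, fail on its own.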
Related
I run my regression tests in a Docker container and I am trying to publish the test results in a Jenkins pipeline using HTML Publisher. This doesn't work properly; I get an error when trying to copy the result file from the Docker container (error type: no such file exists).
My Jenkinsfile looks like this:
//https://www.jenkins.io/doc/book/pipeline/syntax/
pipeline {
    agent any
    stages {
        stage('Deploy webstore') {
            steps {
                //start and run an application container using .yml file
                sh "docker compose -f webstore-compose.yml up -d"
            }
        }
        stage ('Regression Tests') {
            //setting up docker container for regression tests
            agent {
                docker {
                    image 'localhost:5000/dotnet_s3'
                    args '--add-host=host.docker.internal:host-gateway'
                    reuseNode true
                }
            }
            steps {
                //running tests located in /guiautomationtask directory in the top layer and logging into testResults.html file
                sh 'id; cd /guiautomationtask; dotnet test --logger "html;logfilename=testResults.html"'
                sleep(time: 10, unit: "SECONDS")
                /*To Do:
                copy logfile from container to local*/
                //console output
                echo "++++++++++++++++++++++++++++++++++ Display Test Results in the Console +++++++++++++++++++++++++++++++++++++++++"
                echo "Running build ${env.BUILD_ID} on jenkins ${env.JENKINS_URL}"
                echo "current docker container ID is ${hostname}"
                sh "id; cd /guiautomationtask; dotnet test -v normal"
                echo "++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++"
                /*sh "dotnet publish /guiautomationtask/GuiTest/GuiTest.csproj"
                sleep(time: 10, unit: "SECONDS")*/
            }
        }
        stage ('Publish results') {
            steps {
                //view test-logs via HTML Publisher plugin
                publishHTML(target: [
                    allowMissing: false,
                    alwaysLinkToLastBuild: true,
                    keepAll: false,
                    reportDir: "", //here should be report directory with saved html report file
                    reportFiles: "testResults.html",
                    reportName: 'HTML-Report',
                    //reportTitles: ''
                ])
                echo "artifacts saved in zip";
            }
        }
    }
    post {
        always {
            //stop an application container
            sh "docker compose -f webstore-compose.yml stop"
        }
    }
}
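One way to sidestep the copy problem entirely (a sketch, assuming the Docker agent mounts the Jenkins workspace into the container, which it does by default): have dotnet test write its HTML log straight into the workspace via --results-directory, then point publishHTML at that folder. The TestResults directory name is a placeholder:

stage('Regression Tests') {
    agent {
        docker {
            image 'localhost:5000/dotnet_s3'
            args '--add-host=host.docker.internal:host-gateway'
            reuseNode true
        }
    }
    steps {
        // The workspace is mounted into the container, so the log ends up on the Jenkins agent
        sh 'cd /guiautomationtask && dotnet test --logger "html;logfilename=testResults.html" --results-directory "$WORKSPACE/TestResults"'
    }
}
stage('Publish results') {
    steps {
        publishHTML(target: [
            allowMissing: false,
            alwaysLinkToLastBuild: true,
            keepAll: false,
            reportDir: 'TestResults',
            reportFiles: 'testResults.html',
            reportName: 'HTML-Report'
        ])
    }
}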
In my pipeline, I run test cases in a Docker container and then copy some directories from the container to the Jenkins workspace.
Not all of the directories necessarily exist in the container (for example, the screenshots directory may or may not exist depending on whether any tests failed). How can I check for directory or file existence before copying it from the Docker container?
Here is the part of the pipeline I mean:
post {
    always {
        echo 'Generating Test Reports ...'
        sh 'make posttest'
        echo('Copying Test Files ...')
        sh 'docker cp container-name:/app/results/mochareports/assets/videos ./results'
        sh 'docker cp container-name:/app/results/mochareports/assets/screenshots ./results'
        sh 'docker cp container-name:/app/results/mochareports/report.html ./results'
        echo 'Publish Test Reports ...'
        publishHTML(target: [allowMissing: false,
                             alwaysLinkToLastBuild: true,
                             keepAll: true,
                             reportDir: 'results/mochareports',
                             reportFiles: 'report.html',
                             reportName: 'Cypress Test Reports',
                             reportTitles: 'The test report'])
        echo 'Destroy Build'
        sh 'make destroy'
        cleanWs()
    }
}
The Groovy way is to use fileExists: https://www.jenkins.io/doc/pipeline/steps/workflow-basic-steps/#fileexists-verify-if-file-exists-in-workspace
if (fileExists('/app/results/mochareports/assets/screenshots/')) {
    ...
}
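Note that fileExists checks paths on the Jenkins agent (the workspace), not inside the container, and /app/results/... here is a container path. A rough sketch of a container-side check before the docker cp, reusing the container-name from the snippet above:

script {
    // 'test -d' inside the container returns 0 only if the directory exists
    def hasScreenshots = sh(
        returnStatus: true,
        script: 'docker exec container-name test -d /app/results/mochareports/assets/screenshots'
    ) == 0
    if (hasScreenshots) {
        sh 'docker cp container-name:/app/results/mochareports/assets/screenshots ./results'
    }
}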
I'm running my Cypress tests on Jenkins inside a dockerized container and I generate a Cypress mochawesome report, but I don't know how to display it inside Jenkins.
This is my cypress.json content:
{
    "integrationFolder": "test/specs",
    "supportFile": "test/support/index.js",
    "video": true,
    "reporter": "node_modules/cypress-multi-reporters",
    "reporterOptions": {
        "reporterEnabled": "mochawesome",
        "mochawesomeReporterOptions": {
            "reportDir": "results/mocha",
            "overwrite": false,
            "html": false,
            "json": true,
            "timestamp": "mmddyyyy_HHMMss",
            "showSkipped": true,
            "charts": true,
            "quite": true,
            "embeddedScreenshots": true
        }
    },
    "screenshotOnRunFailure": true,
    "screenshotsFolder": "results/mochareports/assets/screenshots",
    "videosFolder": "results/mochareports/assets/videos",
    "baseUrl": "http://testurl.com",
    "viewportWidth": 1920,
    "viewportHeight": 1080,
    "requestTimeout": 10000,
    "responseTimeout": 10000,
    "defaultCommandTimeout": 10000,
    "watchForFileChanges": true,
    "chromeWebSecurity": false
}
And here are the scripts I run locally:
"clean:reports": "rm -R -f results && mkdir results && mkdir results/mochareports",
"pretest": "npm run clean:reports",
"cypress:interactive": "cypress open",
"scripts:e2e": "cypress run",
"combine-reports": "mochawesome-merge results/mocha/*.json > results/mochareports/report.json",
"generate-report": "marge results/mochareports/report.json -f report -o results/mochareports -- inline",
"posttest": "npm run combine-reports && npm run generate-report",
"test:e2e": "npm run pretest && npm run scripts:e2e || npm run posttest",
I can view my generated report successfully in the local environment.
Here is my Jenkinsfile content:
#!groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                echo 'Checking out the PR'
                checkout scm
            }
        }
        stage('Build') {
            steps {
                echo 'Destroy Old Build'
                sh 'make destroy'
                echo 'Building'
                sh 'make upbuild_d'
            }
        }
        stage('Test') {
            steps {
                echo 'Running Tests'
                sh 'make test-e2e'
            }
        }
        stage('Destroy') {
            steps {
                echo 'Destroy Build'
                sh 'make destroy'
            }
        }
    }
}
The make test-e2e target actually runs the test:e2e script inside a Docker container; the tests run and I can see the reports get generated on Jenkins, but I don't know how to view them.
I need to view the report on a separate page inside Jenkins, and I also don't know why I can't access it via the Jenkins workspace.
By the way, I'm adding the results folder to .gitignore.
This is my local report preview
You can use the HTML publisher plugin for Jenkins for this:
https://plugins.jenkins.io/htmlpublisher/
Within your Jenkinsfile, add a stage to publish the HTML reports, e.g.:
publishHTML([
    allowMissing: false,
    alwaysLinkToLastBuild: false,
    keepAll: true,
    reportDir: 'cypress/cypress/reports/html',
    reportFiles: 'index.html',
    reportName: 'HTML Report',
    reportTitles: ''])
I used the HTML Publisher plugin as in the solution mentioned above; however, my problem was that my results file was in the Docker container, not in the Jenkins workspace. I fixed this by copying the folder from the Docker container to the Jenkins workspace:
docker cp container_name:/app/results ./results
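For reference, a minimal sketch of how that copy can sit in front of the publish step in a post block (the container name and report paths follow the snippets above and may differ in your setup):

post {
    always {
        // Pull the generated reports out of the test container into the workspace first
        sh 'docker cp container_name:/app/results ./results'
        publishHTML(target: [
            allowMissing: false,
            alwaysLinkToLastBuild: true,
            keepAll: true,
            reportDir: 'results/mochareports',
            reportFiles: 'report.html',
            reportName: 'Cypress Test Reports'
        ])
    }
}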
I have a Jenkins pipeline for a .NET Core REST API and I am getting an error on the command that executes the JMeter tests:
[Pipeline] { (Performance Test)
[Pipeline] sh
+ docker exec 884627942e26 bash
[Pipeline] sh
+ /bin/sh -c cd /opt/apache-jmeter-5.4.1/bin
[Pipeline] sh
+ /bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/golide/Reports/PerfHtmlReport
-n: 1: -n: ./jmeter: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Performance Test Report)
Stage "Performance Test Report" skipped due to earlier failure(s)
I have JMeter running as a Docker container on the server as per this guide (JMeter on Linux), and I am able to extract the reports, but the same command fails when I run it in the Jenkins context:
/bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/golide/Reports/PerfHtmlReport
This is my pipeline:
pipeline {
    agent any
    triggers {
        githubPush()
    }
    environment {
        NAME = "cassavagateway"
        REGISTRYUSERNAME = "golide"
        WORKSPACE = "/var/lib/jenkins/workspace/OnlineRemit_main"
        VERSION = "${env.BUILD_ID}-${env.GIT_COMMIT}"
        IMAGE = "${NAME}:${VERSION}"
    }
    stages {
        .....
        .....
        stage ("Publish Test Report") {
            steps {
                publishHTML target: [
                    allowMissing: false,
                    alwaysLinkToLastBuild: true,
                    keepAll: true,
                    reportDir: '/var/lib/jenkins/workspace/OnlineRemit_main/IntegrationTests/BuildReports/Coverage',
                    reportFiles: 'index.html',
                    reportName: 'Code Coverage'
                ]
                archiveArtifacts artifacts: 'IntegrationTests/BuildReports/Coverage/*.*'
            }
        }
        stage ("Performance Test") {
            steps {
                sh 'docker exec 884627942e26 bash'
                sh '/bin/sh -c cd /opt/apache-jmeter-5.4.1/bin'
                sh '/bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/Reports/HtmlReport'
                sh 'docker cp 884627942e26:/home/Reports/HtmlReport /var/lib/jenkins/workspace/FlexToEcocash_main/IntegrationTests/BuildReports/Coverage bash'
            }
        }
        stage ("Publish Performance Test Report") {
            steps {
                step([$class: 'ArtifactArchiver', artifacts: '**/*.jtl, **/jmeter.log'])
            }
        }
        stage ("Docker Build") {
            steps {
                sh 'cd /var/lib/jenkins/workspace/OnlineRemit_main/OnlineRemit'
                echo "Running ${VERSION} on ${env.JENKINS_URL}"
                sh "docker build -t ${NAME} /var/lib/jenkins/workspace/OnlineRemit_main/OnlineRemit"
                sh "docker tag ${NAME}:latest ${REGISTRYUSERNAME}/${NAME}:${VERSION}"
            }
        }
        stage("Deploy To K8S") {
            sh 'kubectl apply -f {yaml file name}.yaml'
            sh 'kubectl set image deployments/{deploymentName} {container name given in deployment yaml file}={dockerId}/{projectName}:${BUILD_NUMBER}'
        }
    }
}
My issues:
What do I need to change for that command to execute?
How can I incorporate a condition to break the pipeline if the tests fail?
Jenkins environment: Debian 10
Platform: .NET Core 3.1
The Shift-Left.jtl is a results file which JMeter will generate after executing the Shift-Left.jmx test plan.
By default it will be in CSV format; depending on what you're trying to achieve, you can:
Generate charts from the .CSV file
Generate HTML Reporting Dashboard
If you have the Jenkins Performance Plugin, you can get performance trend graphs, the possibility to automatically fail the build depending on various criteria, etc.
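As a rough sketch (untested, adapting the pipeline above): run the cd and the jmeter call in a single docker exec so they share one shell, write a .jtl results file, copy it back, and let the Performance Plugin's perfReport step fail the build when errors exceed a threshold. The container ID and paths follow the question; the threshold value is an assumption:

stage("Performance Test") {
    steps {
        // Separate sh steps do not share a shell, so chain cd and jmeter in one docker exec
        sh '''
            docker exec 884627942e26 sh -c "cd /opt/apache-jmeter-5.4.1/bin && \
                ./jmeter -n -t /home/getaccountperftest.jmx \
                    -l /home/Reports/LoadTestReport.jtl \
                    -e -o /home/Reports/PerfHtmlReport"
            docker cp 884627942e26:/home/Reports/LoadTestReport.jtl .
        '''
        // Performance Plugin: fail the build if the error percentage goes above 0
        perfReport sourceDataFiles: 'LoadTestReport.jtl', errorFailedThreshold: 0
    }
}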
Based on this post, I'm trying to test this pipeline code in my environment:
pipeline {
    agent any
    stages {
        stage ('push artifact') {
            steps {
                sh '[ -d archive ] || mkdir archive'
                sh 'echo test > archive/test.txt'
                sh 'rm -f test.zip'
                zip zipFile: 'test.zip', archive: false, dir: 'archive'
                archiveArtifacts artifacts: 'test.zip', fingerprint: true
            }
        }
        stage('pull artifact') {
            steps {
                sh 'pwd'
                sh 'ls -l'
                sh 'env'
                step([$class: 'CopyArtifact',
                      filter: 'test.zip',
                      projectName: '${JOB_NAME}',
                      fingerprintArtifacts: true,
                      selector: [$class: 'SpecificBuildSelector', buildNumber: '${BUILD_NUMBER}']
                ])
                unzip zipFile: 'test.zip', dir: './archive_new'
                sh 'cat archive_new/test.txt'
            }
        }
    }
}
but it gives the error message:
ERROR: Unable to find project for artifact copy: test
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
How can I fix this pipeline code?
If you enable authorization (like RBAC), you must grant the 'Copy Artifact' permission to the project. In the project configuration, under General -> Permission to Copy Artifact, check the box and set the projects that are allowed to copy the artifact.
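In a declarative pipeline, the same permission can be granted from the upstream job itself via the Copy Artifact plugin's copyArtifactPermission option (a sketch; 'my-downstream-job' is a placeholder, and '*' would allow any job):

pipeline {
    agent any
    options {
        // Grant the named job(s) permission to copy artifacts produced by this job
        copyArtifactPermission('my-downstream-job')
    }
    stages {
        stage('push artifact') {
            steps {
                sh 'echo test > test.txt'
                archiveArtifacts artifacts: 'test.txt', fingerprint: true
            }
        }
    }
}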
Rather than using projectName: '${JOB_NAME}', what worked for me is using projectName: env.JOB_NAME. That is, your complete copy-artifacts step would look like this:
step([$class: 'CopyArtifact',
      filter: 'test.zip',
      projectName: env.JOB_NAME,
      fingerprintArtifacts: true,
      selector: [$class: 'SpecificBuildSelector', buildNumber: env.BUILD_NUMBER]
])
Or using the more modern syntax:
copyArtifacts(
    filter: 'test.zip',
    projectName: env.JOB_NAME,
    fingerprintArtifacts: true,
    selector: specific(env.BUILD_NUMBER)
)