I have a Jenkins pipeline for a .NET Core REST API and I am getting an error on the command that executes the JMeter tests:
[Pipeline] { (Performance Test)
[Pipeline] sh
+ docker exec 884627942e26 bash
[Pipeline] sh
+ /bin/sh -c cd /opt/apache-jmeter-5.4.1/bin
[Pipeline] sh
+ /bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/golide/Reports/PerfHtmlReport
-n: 1: -n: ./jmeter: not found
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Performance Test Report)
Stage "Performance Test Report" skipped due to earlier failure(s)
I have JMeter running as a Docker container on the server (set up as per the guide "JMeter On Linux"), and I am able to extract the reports when I run the command manually, but the same command fails when run within the Jenkins context:
/bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/golide/Reports/PerfHtmlReport
This is my pipeline:
pipeline {
    agent any
    triggers {
        githubPush()
    }
    environment {
        NAME = "cassavagateway"
        REGISTRYUSERNAME = "golide"
        WORKSPACE = "/var/lib/jenkins/workspace/OnlineRemit_main"
        VERSION = "${env.BUILD_ID}-${env.GIT_COMMIT}"
        IMAGE = "${NAME}:${VERSION}"
    }
    stages {
        .....
        .....
        stage ("Publish Test Report") {
            steps {
                publishHTML target: [
                    allowMissing: false,
                    alwaysLinkToLastBuild: true,
                    keepAll: true,
                    reportDir: '/var/lib/jenkins/workspace/OnlineRemit_main/IntegrationTests/BuildReports/Coverage',
                    reportFiles: 'index.html',
                    reportName: 'Code Coverage'
                ]
                archiveArtifacts artifacts: 'IntegrationTests/BuildReports/Coverage/*.*'
            }
        }
        stage ("Performance Test") {
            steps {
                sh 'docker exec 884627942e26 bash'
                sh '/bin/sh -c cd /opt/apache-jmeter-5.4.1/bin'
                sh '/bin/sh -c ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/Reports/HtmlReport'
                sh 'docker cp 884627942e26:/home/Reports/HtmlReport /var/lib/jenkins/workspace/FlexToEcocash_main/IntegrationTests/BuildReports/Coverage bash'
            }
        }
        stage ("Publish Performance Test Report") {
            steps {
                step([$class: 'ArtifactArchiver', artifacts: '**/*.jtl, **/jmeter.log'])
            }
        }
        stage ("Docker Build") {
            steps {
                sh 'cd /var/lib/jenkins/workspace/OnlineRemit_main/OnlineRemit'
                echo "Running ${VERSION} on ${env.JENKINS_URL}"
                sh "docker build -t ${NAME} /var/lib/jenkins/workspace/OnlineRemit_main/OnlineRemit"
                sh "docker tag ${NAME}:latest ${REGISTRYUSERNAME}/${NAME}:${VERSION}"
            }
        }
        stage ("Deploy To K8S") {
            steps {
                sh 'kubectl apply -f {yaml file name}.yaml'
                sh 'kubectl set image deployments/{deploymentName} {container name given in deployment yaml file}={dockerId}/{projectName}:${BUILD_NUMBER}'
            }
        }
    }
}
My issues:
What do I need to change for that command to execute?
How can I incorporate a condition to fail the pipeline if the tests fail?
Jenkins environment: Debian 10
Platform: .NET Core 3.1
The Shift-Left.jtl is a results file which JMeter will generate after executing the Shift-Left.jmx test plan.
By default it will be in CSV format; depending on what you're trying to achieve, you can:
Generate charts from the .CSV file
Generate an HTML Reporting Dashboard
If you have the Jenkins Performance Plugin, you also get performance trend graphs, the possibility to automatically fail the build depending on various criteria, etc.
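Coming back to the two issues in the question: each sh step starts a fresh shell on the Jenkins agent, so the docker exec 884627942e26 bash line opens a shell in the container that exits immediately, and the subsequent cd and ./jmeter lines run on the agent itself, where there is no ./jmeter, hence the "not found" error. A minimal sketch of the stage with everything chained into a single docker exec call (container ID and paths reused from the question; note that the -o report directory must be empty or not yet exist):
stage ("Performance Test") {
    steps {
        // Run cd and jmeter in ONE shell inside the container; if jmeter
        // exits non-zero, this sh step fails and the pipeline stops here.
        sh 'docker exec 884627942e26 /bin/bash -c "cd /opt/apache-jmeter-5.4.1/bin && ./jmeter -n -t /home/getaccountperftest.jmx -l /home/golide/Reports/LoadTestReport.csv -e -o /home/golide/Reports/PerfHtmlReport"'
        // Copy the generated HTML report out of the container into the workspace
        // so it can be published/archived in the next stage.
        sh 'docker cp 884627942e26:/home/golide/Reports/PerfHtmlReport IntegrationTests/BuildReports/Coverage'
    }
}
Because sh fails the step when the command exits non-zero, a JMeter startup failure will already break the pipeline at this stage; for failing the build on test-level criteria (error percentage, response times), the Performance Plugin mentioned above can additionally mark the build unstable or failed based on configured thresholds.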
Related
I have a pipeline script to deploy applications to a server. I'm building the project using Maven, and I want Jenkins to use a specific JDK version for building the project. My pipeline script looks like this:
pipeline {
    agent any
    tools {
        // Install the Maven version configured as "M3" and add it to the path.
        maven "Maven 3.6.3"
    }
    stages {
        stage('Build') {
            steps {
                // Run Maven on a Unix agent.
                sh "mvn clean package -DskipTests=true -U"
            }
            post {
                // If Maven was able to run the tests, even if some of the tests
                // failed, record the test results and archive the jar file.
                success {
                    archiveArtifacts "**/${war}"
                }
            }
        }
        stage('Deploy EQM Instance 1') {
            steps {
                sshagent(credentials: ['credentials']) {
                    sh "echo 1"
                    sh "echo Initializing deployment to Instance 1"
                    sh "scp target/${war} ${bastionHost}:/home/opc"
                    sh "echo 2.1"
                    sh "echo Uploaded war file to bastion instance"
                    sh "scp ${bastionHost}:/home/opc/${war} opc@${instanceDns}:/home/opc"
                    sh "echo 3.2"
                    sh "echo Uploaded war file from bastion instance to Instance 1 ${instanceDns}"
                    sh "echo Undeploying old war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo rm /opt/tomcat/webapps/${war}"
                    sh "echo 4.2.2"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo chown tomcat:tomcat -R ${war}"
                    sh "echo Deploying new war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo mv ${war} /opt/tomcat/webapps/"
                    sh "echo 4.3"
                }
            }
        }
    }
}
There are other jobs already configured on Jenkins and I don't want to disturb them, so I want to specify the JDK version in this particular job's configuration.
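One way to do that in a declarative pipeline is to add a jdk entry next to maven in the tools block, so only this job is affected. A minimal sketch, assuming a JDK installation named "jdk-11" (a hypothetical name) has been defined under Manage Jenkins → Global Tool Configuration:
pipeline {
    agent any
    tools {
        // Both names must match installations configured in
        // Manage Jenkins -> Global Tool Configuration.
        maven "Maven 3.6.3"
        jdk "jdk-11"          // hypothetical tool name; use whatever is configured on your controller
    }
    stages {
        stage('Build') {
            steps {
                sh 'java -version'   // confirm which JDK ended up on PATH
                sh 'mvn clean package -DskipTests=true -U'
            }
        }
    }
}
Because tools is scoped to this Jenkinsfile, the other jobs keep whatever JDK they are already using.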
My Jenkins runs in Docker, and I wrote a demo to connect to a remote server using the ssh-agent plugin.
Here is my pipeline:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                sshagent (credentials: ['hehu']) {
                    sh 'ssh -o StrictHostKeyChecking=no -l yunwei xxx.xxx.xx.25 -a'
                    sh 'pwd'
                    sh 'whoami'
                }
            }
        }
    }
}
Output
It looks like the pwd and whoami commands still run in the Jenkins Docker container, not on my server. I have no idea how to use this plugin, and I can't find any usage examples in the ssh-agent documentation.
You should use:
sh 'ssh -o StrictHostKeyChecking=no -l yunwei x.x.x.x "pwd && whoami && cmd..."'
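The sshagent wrapper only makes the credential available; every sh step still runs on the Jenkins node, so only what is passed as an argument to ssh executes on the remote host. A sketch of the Hello stage with the remote commands quoted so the && chain is evaluated on the server rather than by the local shell (host and credential ID taken from the question):
stage('Hello') {
    steps {
        sshagent (credentials: ['hehu']) {
            // Everything inside the double quotes runs on xxx.xxx.xx.25;
            // a bare sh 'pwd' after the ssh call would run on the Jenkins node.
            sh 'ssh -o StrictHostKeyChecking=no -l yunwei xxx.xxx.xx.25 "pwd && whoami"'
        }
    }
}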
I am setting up a Jenkins pipeline to deploy a PHP application. The application uses Composer, so I am running composer install -o in the script to ensure that all dependencies are present. The test setup also ensures that vendor/autoload.php is generated (it is referenced in the phpunit.xml bootstrap config).
My scripts are based on https://modess.io/jenkins-php/ and http://jenkins-php.org/
My issue is that the vendor folder and the generated config.inc.php are not included in the deployment.
The Jenkins log shows that my deployment line sh "cp -rp ${SOURCE_DIR}/* ${DEPLOY_DIR}" is expanded into cp -rp src/globals.template.inc.php src/index.php src/phpinfo.php /usr/nasShare/htdocs/sometest, which does not include the mentioned files. (In fact, those are the only files in the src directory in SCM; looking in the working directory on the Jenkins server, the generated files are there...)
Jenkinsfile
#!groovy
pipeline {
    agent any
    environment {
        SOURCE_DIR="src"
        TEMPLATE_FILE="globals.template.inc.php"
        CONFIG_FILE="globals.inc.php"
    }
    stages {
        stage ('Testing') {
            steps {
                echo "Running ant clean"
                sh 'ant clean'
                echo "running composer"
                sh "composer install -o -d ${SOURCE_DIR}"
                echo ""
                sh 'ant quick-build'
            }
        }
        stage ('Staging') {
            steps {
                echo "Building config file"
                script {
                    def inptext = readFile file: "${SOURCE_DIR}/${TEMPLATE_FILE}"
                    inptext = inptext.replaceAll(~/¤GIT_BRANCH¤/, "${GIT_BRANCH_NAME}")
                    inptext = inptext.replaceAll(~/¤GIT_COMMIT¤/, "${sh(returnStdout: true, script: "git log -n 1 --pretty=format:'%h'").trim()}")
                    inptext = inptext.replaceAll(~/¤GIT_TAG¤/, "${sh(returnStdout: true, script: "git -C . describe --tags").trim()}")
                    writeFile file: "${SOURCE_DIR}/${CONFIG_FILE}", text: inptext
                }
            }
        }
        stage ('Remote Deploy') {
            agent any
            when {
                //https://stackoverflow.com/a/44231270/1725871
                environment name: 'DEPLOY_TYPE', value: 'remote'
            }
            steps {
                echo "Deploying via SSH on ${SSH_SERVER_NAME}:${DEPLOY_DIR}"
                //TODO: rename backup file
                sh "ssh ${SSH_USERNAME}@${SSH_SERVER_NAME} tar -cvpzf BACKUP_FNAME ${DEPLOY_DIR}/*"
                sh "ssh ${SSH_USERNAME}@${SSH_SERVER_NAME} rm -R ${DEPLOY_DIR}/*"
                sh "scp -rp ${SOURCE_DIR}/* ${SSH_USERNAME}@${SSH_SERVER_NAME}:${DEPLOY_DIR}"
                //TODO: delete backup on success
            }
        }
        stage ('Local Deploy') {
            agent any
            when {
                environment name: 'DEPLOY_TYPE', value: 'local'
            }
            steps {
                echo "Deploying to ${DEPLOY_DIR}"
                //TODO: backup existing files
                sh "rm -R ${DEPLOY_DIR}/*"
                sh "cp -rp ${SOURCE_DIR}/* ${DEPLOY_DIR}"
            }
        }
    }
}
I found the error of my ways. Adding ${WORKSPACE}/ to my SOURCE_DIR variable solved it; now all the expected files are being copied.
Working environment block from the Jenkinsfile:
environment {
    SOURCE_DIR="${WORKSPACE}/src"
    TEMPLATE_FILE="globals.template.inc.php"
    CONFIG_FILE="globals.inc.php"
}
I am trying to remove the junit directory located in the workspace of my Jenkins job using a scripted pipeline, which looks somewhat like this:
node {
    stage('Build') {
        checkout scm
        app = docker.build("...")
    }
    stage('Test') {
        app.withRun("--name = ${CONTAINER_ID} ...") {
            // sh "mkdir -p junit"
            // sh "rm -rf junit/"
            dir "junit" {
                deleteDir
            }
            sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
            sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
        }
    }
    junit 'junit/*.xml'
}
However, I am getting the following (red herring?) error:
java.lang.ClassCastException:
hudson.tasks.junit.pipeline.JUnitResultsStep.testResults expects class
java.lang.String but received class
org.jenkinsci.plugins.workflow.cps.CpsClosure2
However, when I use the shell steps:
sh "mkdir -p junit"
sh "rm -rf junit/"
It works as expected. What am I doing wrong?
Try to use parentheses:
dir("junit") {
    deleteDir()
}
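In context, the Test stage from the question would then look something like this (a sketch that keeps the question's container name and paths):
stage('Test') {
    app.withRun("--name=${CONTAINER_ID} ...") {
        // Called as a regular method: dir(path) { closure }
        dir("junit") {
            deleteDir()
        }
        sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
        sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
    }
}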
I want to hide the commands that Jenkins sh steps echo in a pipeline.
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh "ls -al /"
            }
        }
    }
}
Current result:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Load Lib)
[Pipeline] sh
[Test] Running shell script
+ ls -al /
I want to hide the "Running shell script" line and the echoed "+ ls -al /" command in the output.
Please help.
This is definitely related to Echo off in Jenkins Console Output
For pipeline, what this means is:
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh '''
                    set +x
                    # commands here will not be echoed
                    ls -al
                    set -x
                '''
            }
        }
    }
}
''' indicates a multi-line shell script. set +x turns off command echoing, and set -x turns it back on again.
You can override this behaviour for the whole script by putting the following at the top of the build step:
#!/bin/bash +x
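For example (a sketch; the shebang must be the very first line of the script, with no leading newline, so that Jenkins does not prepend its default -x tracing):
sh '''#!/bin/bash +x
ls -al /
echo "done"
'''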