How to run a task if tests fail in Jenkins - jenkins

I have a site in production, and a simple Playwright test that browses to the site and does some basic checks to make sure it's up.
I'd like to have this job running in Jenkins every 5 minutes, and if the tests fail I want to run a script that will restart the production server. If the tests pass, I don't want to do anything.
What's the easiest way of doing this?
I have the MultiJob plugin, which I thought I could use to trigger the restart on the failed test step, but it doesn't seem to be able to trigger specifically on failure.

Something like the following will do the Job for you. I'm assuming you have a second Job that will take care of the restart.
pipeline {
    agent any
    triggers {
        cron('*/5 * * * *')
    }
    stages {
        stage("Run the Test") {
            steps {
                echo "Running the Test"
                // Returning exit code 1 so Jenkins will treat this step as failed
                sh '''
                    echo "RUN SOMETHING"
                    exit 1
                '''
            }
        }
    }
    post {
        success {
            echo "Success: Do nothing"
        }
        failure {
            echo 'I failed :(, Execute restart Job'
            // Executing the restart Job.
            build job: 'RestartJob'
        }
    }
}
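The 'RestartJob' referenced above is assumed to already exist. As a rough sketch (not from the original answer), it could be as simple as a pipeline that restarts the service over SSH; the host, credentials ID and restart command below are placeholders for whatever your production setup uses, and the sshagent step requires the SSH Agent plugin:
pipeline {
    agent any
    stages {
        stage('Restart production server') {
            steps {
                // Placeholder credentials ID, host and restart command
                sshagent(credentials: ['prod-server-ssh-key']) {
                    sh 'ssh user@prod.example.com "sudo systemctl restart my-site"'
                }
            }
        }
    }
}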

Related

Create process dump when Jenkins pipeline step runs into timeout

We run unit tests on Jenkins and one of our tests freezes sometimes.
We have timeouts defined in the Jenkins pipeline and the freeze triggers the timeout and that kills the testing process.
Is there a way (via Jenkins pipelines, maybe via Groovy) to execute a command (e.g. create a process dump of the testing process) as soon as we run into a timeout, but (of course) before the timeout kills the testing process?
You can wrap your test execution in a try-catch and do whatever you need after catching the timeout exception. Here is a sample pipeline.
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                script {
                    try {
                        timeout(unit: 'SECONDS', time: 5) {
                            echo "Running your Tests here!!!!"
                            sleep 10
                        }
                    } catch (e) {
                        echo "The tests errored out!!! " + e.getCauses()
                        if (e.getCauses()[0] instanceof org.jenkinsci.plugins.workflow.steps.TimeoutStepExecution$ExceededTimeout) {
                            echo "This is a timeout, do whatever you want..."
                        }
                    }
                }
            }
        }
    }
}

Jenkins cucumber reports

I'm using Cucumber reports plugin in my declarative pipeline like that:
cucumber '**/cucumber.json'
I'm able to check if some tests fail through the link on the sidebar, but do I need to do something to mark the stage containing the cucumber.json check as failed if some cucumber reports fail? The problem is that the build and stage are both green and successful even though there are some failed cucumber reports.
Jenkins version is 2.176.3
Cucumber reports version is 4.10.0
The cucumber command you are using just generates the report, regardless of the test result.
So yes, you have to make your pipeline fail somehow; the problem you are facing is that your test command never returns a failure, so the pipeline never fails.
The way to go is to make the command that runs the tests return a non-zero exit code (exit 1) if something went wrong in your tests. That would make your pipeline stage go red.
If you run your tests using Maven, this is handled automatically by 'mvn test' (or whatever goal you use).
Otherwise, if you cannot do that, you will have to use something like an sh script
that returns the exit code (0 pass / 1 fail), or a Groovy function inside a 'script' block that sets the pipeline's currentBuild.result value:
def checkTestResult() {
    // Check some file to see if tests went fine or not
    return 'SUCCESS' // or 'FAILURE'
}
...
stage {
    script {
        currentBuild.result = checkTestResult()
        if (currentBuild.result == 'FAILURE') {
            sh "exit 1" // Force pipeline exit with build result failed
        }
    }
}
...
I recommend using the cucumber command in an 'always' post-build action of your declarative pipeline,
as it is a step you will likely want to execute at the end of the pipeline whether it passes or fails. See the following example:
pipeline {
    stages {
        stage('Get code') {
            // Whatever
        }
        stage('Run tests') {
            steps {
                sh "mvn test" // run_tests.sh or groovy code
            }
        }
    }
    post {
        always {
            cucumber '**/cucumber.json'
        }
    }
}
It is possible to set buildStatus: 'FAILURE' to mark the build as failed if a report is marked as failed.
cucumber fileIncludePattern: '**/cucumber.json', buildStatus: 'FAILURE'
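Combining the two suggestions, a minimal sketch of the post section could look like this (assuming your plugin version supports the buildStatus parameter):
post {
    always {
        // Publish the reports and mark the build FAILURE if any cucumber report failed
        cucumber fileIncludePattern: '**/cucumber.json', buildStatus: 'FAILURE'
    }
}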

How to start up a server in jenkins to run acceptance tests in parallel?

As part of the build, these are the stages I have:
1. Compile and package the project
2. Parallel jobs:
   - Run unit tests
   - Start up the server by executing the jar file created in stage 1
   - Wait for the server to be ready and run acceptance tests against it
The question is: when the acceptance tests are finished, how do I signal the other parallel branch to shut down the server? At the moment the jar execution process keeps running even if I kill the build from Jenkins.
Also, in the event of the build being cancelled manually, how can we capture that event and signal the server to shut itself down using scripted pipeline/groovy?
Sample code:
node('jenkins-jenkins-slave') {
    stage('Parallel jobs') {
        parallel
        stage('Preparing server for acceptance test') {
            script {
                def propertiesString = ''
                def jvmOptions = ''
                def properties = readProperties file: "${propertiesFile}".toString()
                properties.each { k, v ->
                    propertiesString = propertiesString + "-D${k}=${v} ".toString()
                }
                sh "java ${jvmOptions} -Dloader.path=${loaderPath} ${propertiesString}-jar ${jarFile} --spring.profiles.active=${activeProfiles}"
            }
        },
        stage('Acceptance tests') {
            script {
                def scriptName = 'wait_until_env_ready.sh'
                def exists = fileExists "$scriptName"
                if (!exists) {
                    def scriptContent = libraryResource "${scriptName}"
                    writeFile file: "${scriptName}", text: scriptContent
                    sh "chmod +x ${scriptName}"
                }
                sh "./wait_until_env_ready.sh http://localhost:8090/manage/health"
                sh "mvn -B test -Dmaven.repo.local=/root/.m2/repository -DskipTests=true -DskipITs=false"
                // How to signal the above stage to shut down the jar execution?
            }
        }
    }
}
What you are looking for is a post { always { ... } } section.
You also need some way to terminate the server, for example by finding its PID and killing it within that section.
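In a scripted pipeline (as in the question), the equivalent of a post { always { ... } } section is a try/finally block. A minimal sketch of the idea, assuming the jar is started in the background and its PID written to a file (file names, URLs and commands are placeholders):
node('jenkins-jenkins-slave') {
    try {
        stage('Start server') {
            // nohup + & lets the jar keep running while the pipeline moves on
            sh 'nohup java -jar app.jar > server.log 2>&1 & echo $! > server.pid'
        }
        stage('Acceptance tests') {
            sh './wait_until_env_ready.sh http://localhost:8090/manage/health'
            sh 'mvn -B test -DskipTests=true -DskipITs=false'
        }
    } finally {
        // Runs even when the tests fail or the build is aborted
        sh 'kill $(cat server.pid) || true'
    }
}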

jenkins complicated buildflow, is it possible?

I would like to have a Jenkins build flow that looks like this.
After the build is triggered all slaves run the same job in parallel (a setup job).
If any slaves fail this job they should not continue on.
All the slaves that pass that job should grab a job out of a pool of jobs that need to be completed, and once a slave completes a job it should go back and grab another job from the pool.
I only started working with Jenkins a few weeks ago, and the way I have it set up now, each job picked up by a slave has to run the setup job first. This really slows down build times because I have about 30 jobs and the setup takes ~2 minutes.
I am using Jenkins as an automated testing platform and all the jobs in the job pool can run independently of each other. I have 5 slaves currently and ~30 jobs.
The following should do the trick:
def jobPool = new ArrayDeque()
jobPool.add({
    echo "Doing stuff on ${env.NODE_NAME}"
});
jobPool.add({
    echo "Doing other stuff on ${env.NODE_NAME}, a little slower"
    sleep 4
});
jobPool.add({
    echo "Doing more stuff on ${env.NODE_NAME}, even slower"
    sleep 10
});
jobPool.add({
    echo "Doing stuff quick on ${env.NODE_NAME}"
});
jobPool.add({
    echo "Doing stuff quicker on ${env.NODE_NAME}"
});

def par = [:]
for (x in ["master", "urban"]) {
    def nodeName = x; // needed due to variable scoping
    par[nodeName] = {
        node(nodeName) {
            try {
                echo "Doing setup on ${env.NODE_NAME}!"
                // Do your setup
                echo "Done with setup"
            } catch (Exception e) {
                echo "Will not use this node as it failed setup!"
                return;
            }
            while (true) {
                // echo "${jobPool.size()}"
                def subTask = jobPool.poll()
                // echo "${jobPool.size()} ${subTask}"
                if (subTask == null) {
                    break;
                }
                // Might want a try/catch around the next line if you want to continue when a job fails
                subTask()
            }
        }
    }
}
parallel par
if (!jobPool.isEmpty()) {
    error "Not all tasks were done!"
}
Simply add your "job pool jobs" to the jobPool variable and modify the setup part.
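If you want a node to keep pulling tasks even when one of them fails (the try/catch hinted at in the comment above), a minimal sketch could replace the direct subTask() call with something like this:
// Inside the while (true) loop, instead of calling subTask() directly
try {
    subTask()
} catch (Exception e) {
    // Record the failure but keep processing the remaining pool entries
    echo "Sub-task failed on ${env.NODE_NAME}: ${e}"
    currentBuild.result = 'UNSTABLE'
}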
It seems like you want separate stages in the same job. This is made much easier in Jenkins 2's pipelines. There are some pictures here:
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Stage+View+Plugin
The Groovy code ends up looking like this:
node {
    stage 'Checkout'
    svn 'https://svn.mycorp/trunk/'

    stage 'Build'
    sh 'make all'

    stage 'Test'
    sh 'make test'
}

Scripted Jenkins pipeline: continue on fail

Is there a way to continue execution of the scripted pipeline even if the previous stage failed? I need to run specific commands (cleanup) when the build fails before the whole job fails.
The accepted answer wouldn't fail the stage or even mark it as unstable. It is now possible to fail a stage, continue the execution of the pipeline and choose the result of the build:
pipeline {
    agent any
    stages {
        stage('1') {
            steps {
                sh 'exit 0'
            }
        }
        stage('2') {
            steps {
                catchError(buildResult: 'SUCCESS', stageResult: 'FAILURE') {
                    sh "exit 1"
                }
            }
        }
        stage('3') {
            steps {
                sh 'exit 0'
            }
        }
    }
}
In the example above, all stages will execute, the pipeline will be successful, but stage 2 will show as failed.
As you might have guessed, you can freely choose the buildResult and stageResult, in case you want it to be unstable or anything else. You can even fail the build and continue the execution of the pipeline.
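For example, if you would rather mark the stage and the build unstable instead of failed, something like this should work on recent plugin versions:
catchError(buildResult: 'UNSTABLE', stageResult: 'UNSTABLE') {
    sh 'exit 1'
}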
Just make sure your Jenkins is up to date, since this is a fairly new feature.
The usual approach is to wrap your steps within a try block.
try {
    sh "..."
} catch (err) {
    echo "something failed"
}

// cleanup
sh "rm -rf *"
To ease the pain and make the pipeline code more readable, I've encapsulated this in another method here in my global library code.
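As a sketch (the method name and cleanup command are hypothetical, not the author's actual library code), such a wrapper in a shared library could look like this:
// vars/runWithCleanup.groovy in a shared library (hypothetical name)
def call(Closure body) {
    try {
        body()
    } catch (err) {
        echo "Step failed: ${err}"
        currentBuild.result = 'FAILURE'
    } finally {
        // Cleanup runs whether the wrapped steps passed or failed
        sh 'rm -rf *'
    }
}
It could then be used from a Jenkinsfile as runWithCleanup { sh 'make test' }.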
Another approach, created especially because of this very issue, is declarative pipelines (blog, presentation).
post {
    always {
        cleanWs()
    }
}
This will always clean up the workspace, even if the rest of the pipeline fails.
