How to print stdout & stderr of failed job in GNU parallel? - gnu-parallel

I'm invoking GNU parallel on a bunch of commands using parallel --keep-order --line-buffer --halt 2. Every once in a while one of the commands fails, and GNU parallel prints:
parallel: This job failed:
<failing command comes here>
Is there any way to print the stdout and stderr of ONLY the failed job whenever this happens?

neno (no-error-no-output) does that:
neno 'echo stdout; echo stderr >&2; false'
neno 'echo stdout; echo stderr >&2; true'
https://gitlab.com/ole.tange/tangetools/-/tree/master/neno
So:
parallel --halt 2 neno ...
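If installing neno is not an option, a similar effect can be approximated with a small wrapper script that buffers a job's combined output and replays it only when the job fails (a sketch, not part of GNU parallel itself; the `only-on-fail` name is made up):

```shell
#!/bin/sh
# only-on-fail: run a command, buffer its stdout+stderr,
# and print the buffer only if the command exits non-zero.
tmp=$(mktemp)
"$@" > "$tmp" 2>&1
rc=$?
if [ "$rc" -ne 0 ]; then
    cat "$tmp"
fi
rm -f "$tmp"
exit "$rc"       # preserve the exit code so --halt 2 still triggers
```

Used as `parallel --halt 2 ./only-on-fail ...`, successful jobs stay silent and a failing job dumps its output just before GNU parallel reports it.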

Related

How to run a shell script in the background in Jenkins

I have the command below, but it is not working: I see the process being created and then killed automatically.
BUILD_ID=dontKillMe nohup /Folder1/job1.sh > /Folder2/Job1.log 2>&1 &
Jenkins Output:
[ssh-agent] Using credentials user1 (private key for user1)
[job1] $ /bin/sh -xe /tmp/jenkins19668363456535073453.sh
+ BUILD_ID=dontKillMe
+ nohup /Folder1/job1.sh
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 8765 killed;
[ssh-agent] Stopped.
Finished: SUCCESS
I ran the command without nohup, so you may want to try:
sh """
BUILD_ID=dontKillMe /Folder1/job1.sh > /Folder2/Job1.log 2>&1 &
"""
In my case, I did not need to redirect STDERR to STDOUT because the process I was running captured all errors and displayed them on STDOUT directly.
We use daemonize for that. It properly starts the program in the background and does not kill it when the parent process (bash) exits.
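For reference, a hypothetical equivalent of the original nohup line, based on daemonize's documented flags (check daemonize(1) for your version; `-E` passes an environment variable, `-o`/`-e` redirect stdout/stderr):

```shell
daemonize -E BUILD_ID=dontKillMe \
          -o /Folder2/Job1.log -e /Folder2/Job1.err \
          /Folder1/job1.sh
```

Because daemonize detaches from the controlling session itself, Jenkins' process-tree killer no longer sees the script as a child of the build.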

Groovy shell script with a sed command in a Jenkins Pipeline

So writing Groovy with basic shell scripts seems to be much more difficult than it really should be.
I have a pipeline that needs to replace an entry in a file after running a packer command. It seems sensible to do this in the same shell script as the packer command as the variables are not available outside of the shell script even when exported.
The problem is that the sed command needs escape upon escape and still doesn't work. So this is what the Jenkins Pipeline Syntax generator suggested:
parallel (
"build my-application" : {
sh '''#!/bin/bash
export PATH=$PATH:~/bin
cd ${WORKSPACE}/platform/packer
packer build -machine-readable template.json | tee packer.out
AMI_APP=$(grep amazon-ebs,artifact,0,id,eu-west-2:ami- packer.out | awk -F: \'{ print $NF }\')
[[ ! ${AMI_APP} ]] && exit 1
sed -i.bak \'s!aws_ami_app = \\".*\\"!aws_ami_app = \\"\'"${AMI_APP}"\'\\"!\' ${WORKSPACE}/platform/terraform/env-${ENV}/env.auto.tfvars
'''
},
"build some-more-apps" : {
sh ''' *** same again different name ***
'''
}
)
What is the correct way to get a variable in a sed command working in a bash script run from Groovy?
Any tips for the correct syntax going forward with Jenkins, Groovy and bash - any documentation that actually helps?
EDIT
The original sed command that is running in a Jenkins Job shell is:
sed -i.bak 's!aws_ami_app = \".*\"!aws_ami_app = \"'"${AMI_APP}"'\"!' ${WORKSPACE}/platform/terraform/env-${ENV}/env.auto.tfvars
That's because you put the shell script inside ''', which does not trigger Groovy string interpolation.
So you don't need to escape any characters; write the script exactly as you would type it at a shell prompt.
Below is an example:
sh '''#!/bin/bash +x
echo "aws_ami_app = docker.xy.com/xy-ap123/conn:7et45u.1.23" > test.txt
echo "cpu = 512" >> test.txt
cat test.txt
AMI_APP=docker.xy.com/xy-ap123/conn:7et45u.1.25
sed -i 's,aws_ami_app.*,aws_ami_app = '"$AMI_APP"',' test.txt
cat test.txt
'''
Output in jenkins console:
[Pipeline] sh
[poc] Running shell script
aws_ami_app = docker.xy.com/xy-ap123/conn:7et45u.1.23
cpu = 512
aws_ami_app = docker.xy.com/xy-ap123/conn:7et45u.1.25
cpu = 512

Is it possible to send all output of the sh DSL command in the Jenkins pipeline to a file?

I'm trying to de-clutter my Jenkins output. Thanks to Is it possible to capture the stdout from the sh DSL command in the pipeline, I know I can send the output of each sh command to a file. However, the commands themselves will still be written to the Jenkins output instead of the file. For example:
sh '''echo "Hello World!"
./helloworld
./seeyoulater
'''
As is, this results in the Jenkins output looking like this:
echo "Hello World!"
Hello World!
./helloworld
<helloworld output, possibly many lines>
./seeyoulater
<seeyoulater output, possibly many lines>
However, if I send the output to a file, I get Jenkins output like this:
echo "Hello World!" > output.log
./helloworld >> output.log
./seeyoulater >> output.log
and output.log looking like this:
Hello World!
<helloworld output>
<seeyoulater output>
This leads to my Jenkins output being less cluttered, but output.log ends up not having any separators between the script outputs. I suppose I could have echo <command> right before each command, but that just means my Jenkins output gets more cluttered again.
Is there any way to send the entire output of the sh DSL command to a file? Basically something like sh '''<commands here>''' > output.log is what I'm looking for.
I wasn't able to find a solution, but I did find a workaround. As mentioned in the sh command documentation, the default is to run using the -xe flags. The -x flag is why the commands are shown in the Jenkins output. The remedy is to add set +x:
sh '''set +x
echo "Hello World!" > output.log
./helloworld >> output.log
./seeyoulater >> output.log
'''
The set +x shows up in the Jenkins output, but the rest of the commands do not. From there, it's just a matter of adding enough echo statements in there to make output.log sufficiently readable.
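One way to keep output.log readable without re-cluttering the Jenkins output is a small helper function that writes a separator line naming each command before running it (a sketch; the `run` helper name is made up):

```shell
set +x
log=output.log
: > "$log"                       # truncate the log at the start

# run: print a separator naming the command into the log,
# then run the command, appending all its output to the log.
run() {
    printf '=== %s ===\n' "$*" >> "$log"
    "$@" >> "$log" 2>&1
}

run echo "Hello World!"
run ./helloworld
run ./seeyoulater
```

Only `set +x` and the function definition appear in the Jenkins output, while output.log gets a header before each command's output.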

How do I fail a Jenkins build if a Docker Pipeline Plugin withRun command returns a non-zero exit code?

I'm using the Docker Pipeline Plugin to execute my build scripts via Docker containers. I noticed that if I had a script return a non-zero exit code when executing within an inside() command, Jenkins would mark the pipeline execution as a failure. This example Jenkinsfile illustrates that scenario:
docker.image('alpine').inside() {
sh 'exit 1'
}
However, if I use the withRun() command, a similar Jenkinsfile will not cause the build to fail, even though the docker ps -l command shows that the container exited with a non-zero status:
node() {
sh 'touch ./test.sh'
sh 'echo "exit 1" >> ./test.sh'
sh 'chmod 755 ./test.sh'
docker.image('alpine').withRun("-v ${WORKSPACE}:/newDir", '/bin/sh /newDir/test.sh') {container ->
sh "docker logs ${container.id} -f"
sh 'docker ps -l'
}
}
Is there a way to make withRun() fail the build if the container exits with a non-zero code?
One possible solution:
docker.withRegistry("https://${REGISTRY}", 'creds-id') {
stage("RUN CONTAINER"){
Image = docker.image("${IMAGE}-${PROJECT}:${TAG}")
try {
c = Image.run("-v /mnt:/mnt")
sh "docker logs -f ${c.id}"
def out = sh script: "docker inspect ${c.id} --format='{{.State.ExitCode}}'", returnStdout: true
sh "exit ${out}"
} finally {
c.stop()
}
}
}
I couldn't find any more information on exit codes from the withRun() command, so I ended up just executing a docker run command from an sh step:
node() {
sh 'touch ./test.sh'
sh 'echo "exit 1" >> ./test.sh'
sh 'chmod 755 ./test.sh'
sh "docker run --rm -v ${WORKSPACE}:/newDir alpine /bin/sh /newDir/test.sh"
}
How about running a script that exits based upon the output from docker wait?
sh "exit \$(docker wait ${container.id})"
docker wait prints the container's exit code; exiting with that code makes a non-zero container status fail the build, per the sh step docs:
Normally, a script which exits with a nonzero status code will cause the step to fail with an exception.

How to configure Jenkins pipeline so that if there are multiple shell scripts and if one fails the jenkins jobs still runs instead of exiting

I want to configure a Jenkins pipeline job so that it should be able to run multiple shell script jobs. Even if one shell script fails the job should run the other two before failing the job.
You need to tweak your shell script, not the Jenkins pipeline, to achieve what you want!
Try this in your shell script:
shell script command > /dev/null 2>&1 || true
Whether it fails or passes, execution will continue to the next shell script.
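A minimal demonstration of the trick: Jenkins sh steps run with -e by default, so a failing command would normally abort the script, but `|| true` swallows the non-zero status and the next command still runs.

```shell
set -e                                     # as in a default Jenkins sh step
ls /nonexistent > /dev/null 2>&1 || true   # failure swallowed by || true
echo "next script still runs"
```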
You can always wrap the potentially failing sh step in a try/catch:
node {
sh "echo test"
try {
sh "/dev/null 2>&1"
} catch (error) {
echo "$error"
}
sh "echo test1"
}
Above runs successfully and produces
Started by user Blazej Checinski
[Pipeline] node
Running on agent2 in /home/build/workspace/test
[Pipeline] {
[Pipeline] sh
[test] Running shell script
+ echo test
test
[Pipeline] sh
[test] Running shell script
+ /dev/null
/home/build/workspace/test#tmp/durable-b4fc2854/script.sh: line 2: /dev/null: Permission denied
[Pipeline] echo
hudson.AbortException: script returned exit code 1
[Pipeline] sh
[test] Running shell script
+ echo test1
test1
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
Finished: SUCCESS