I want to hide the command that a Jenkins sh step executes in my pipeline:
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh "ls -al /"
            }
        }
    }
}
Current result:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Load Lib)
[Pipeline] sh
[Test] Running shell script
+ ls -al /
I want to hide the "Running shell script" line and the echoed ls -al / command in the output. Please help.
This is definitely related to Echo off in Jenkins Console Output. For a pipeline, this translates to:
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh '''
                    set +x
                    # commands here are not echoed
                    ls -al
                    set -x
                '''
            }
        }
    }
}
''' indicates a multi-line command. set +x turns off command echoing, and set -x turns it back on again.
You can suppress echoing for the whole script by putting the following shebang at the very top of the build step:
#!/bin/bash +x
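For example, a minimal sketch of that whole-script variant; note that a custom shebang also drops the default -e (fail-fast) flag, so re-enable it if you need it:
sh '''#!/bin/bash +x
# the shebang must be the very first line of the script; nothing below is echoed
set -e
ls -al /
'''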
Related
I'm very new to using Docker and Jenkinsfiles in Jenkins.
Currently, I want to run a Docker container (pyinstaller-windows) on Linux, so I wrote the following Jenkinsfile to test it:
pipeline {
    agent none
    stages {
        stage('Deliver') {
            agent {
                docker {
                    image 'cdrx/pyinstaller-windows:python3'
                }
            }
            steps {
                sh 'cd app/'
                sh 'pip install -r requirements.txt'
                sh 'cd gui'
                sh './gui_to_exe.sh' // containing pyinstaller command
            }
            post {
                success {
                    archiveArtifacts 'appName.exe'
                }
            }
        }
    }
}
After running it in Jenkins, I received the following error message:
cdrx/pyinstaller-windows:python3 cat
$ docker top 8...5 -eo pid,comm
[Pipeline] // withDockerContainer
[Pipeline] }
[Pipeline] // withEnv
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // stage
[Pipeline] End of Pipeline
java.io.IOException: Failed to run top '8...5'. Error: Error response
from daemon: Container 8...5 is not running.
at org.jenkinsci.plugins.docker.workflow.client.DockerClient.listProcess(DockerClient.java:152)
at org.jenkinsci.plugins.docker.workflow.WithContainerStep$Execution.start(WithContainerStep.java:201)
at org.jenkinsci.plugins.workflow.cps.DSL.invokeStep(DSL.java:322)
at org.jenkinsci.plugins.workflow.cps.DSL.invokeMethod(DSL.java:196)
at org.jenkinsci.plugins.workflow.cps.CpsScript.invokeMethod(CpsScript.java:124)
at org.codehaus.groovy.runtime.callsite.PogoMetaClassSite.call(PogoMetaClassSite.java:47)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:116)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.methodCall(DefaultInvoker.java:20)
at org.jenkinsci.plugins.docker.workflow.Docker$Image.inside(Docker.groovy:140)
at org.jenkinsci.plugins.docker.workflow.Docker.node(Docker.groovy:66)
at org.jenkinsci.plugins.docker.workflow.Docker$Image.inside(Docker.groovy:125)
at org.jenkinsci.plugins.docker.workflow.declarative.DockerPipelineScript.runImage(DockerPipelineScript.groovy:54)
at
...
What am I doing wrong here?
Check the following minimal pipeline; I fixed the issues I observed. The image's default ENTRYPOINT runs pyinstaller and exits immediately, so the container is already dead when Jenkins tries to exec into it; overriding the entrypoint lets Jenkins keep the container alive with its cat command. Other than that, I'm not sure where you are getting the app directory from. You probably want to mount your sources into the container if they live on the host machine.
pipeline {
    agent any
    stages {
        stage('Deliver') {
            agent {
                docker {
                    image 'cdrx/pyinstaller-windows:python3'
                    args "--entrypoint=''"
                }
            }
            steps {
                echo "Something"
                sh '''
                    cd app/
                    pip install -r requirements.txt
                    cd gui
                    ./gui_to_exe.sh
                '''
            }
            post {
                success {
                    archiveArtifacts 'appName.exe'
                }
            }
        }
    }
}
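If the sources do live on the host rather than in the checked-out workspace, a sketch of mounting them via args (the host path /path/on/host is hypothetical; Jenkins bind-mounts the workspace itself automatically):
docker {
    image 'cdrx/pyinstaller-windows:python3'
    // keep the entrypoint override and add a bind mount for the host sources
    args "--entrypoint='' -v /path/on/host:/src"
}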
I have a Jenkins DSL job for a Java build, and I am stuck on a strange problem.
The job name is DSL, and I saw that a workspace named DSL was created. But when the job runs, it adds another workspace named DSL@2, and the problem is that I cannot get the final jar file from the DSL workspace.
pipeline {
    agent any
    stages {
        stage('Build') {
            agent {
                docker {
                    image 'maven:latest'
                    args '-v /home/ubuntu/jenkins/jenkins_home/.m2:/root/.m2'
                }
            }
            steps {
                git branch: "${params.branch}", url: "git@github.org/repo.git"
                sh 'mvn clean install -Dmaven.test.skip=true -Dfindbugs.skip=true'
                sh "ls -la target/name.jar"
            }
        }
        stage('Copy Artifacts') {
            steps {
                // print "$params.IP"
                // sh '${params.IP}"
                sh "ls -la && pwd"
                sh "scp target/name.jar ubuntu@${params.IP}:/home/ubuntu/target/name.jar_2"
            }
        }
    }
}
Output of the job:
Compiling 19 source files to /var/jenkins_home/workspace/dsl@2/auth-client/target/classes
DSL@2 means you either have the job configured for concurrent builds and two builds running at the same time, OR you hit this bug: https://issues.jenkins-ci.org/browse/JENKINS-30231
To address your issue:
you are building stage('Build') inside a docker container created from the maven image.
However, stage('Copy Artifacts') runs OUTSIDE of that container.
To fix it, you need to move agent{} to the pipeline{} level, like this:
pipeline {
    agent {
        docker {
            image 'maven:latest'
            args '-v /home/ubuntu/jenkins/jenkins_home/.m2:/root/.m2'
        }
    }
    stages {
        stage('Build') {
            steps {
                git branch: "${params.branch}", url: "git@github.org/repo.git"
                sh 'mvn clean install -Dmaven.test.skip=true -Dfindbugs.skip=true'
                sh "ls -la target/name.jar"
            }
        }
        stage('Copy Artifacts') {
            steps {
                sh "ls -la && pwd"
                sh "scp target/name.jar ubuntu@${params.IP}:/home/ubuntu/target/name.jar_2"
            }
        }
    }
}
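If you ever do need a separate agent per stage instead, here is a sketch (not part of the original answer) that hands the jar across workspaces with stash/unstash:
stage('Build') {
    agent { docker { image 'maven:latest' } }
    steps {
        sh 'mvn clean install -Dmaven.test.skip=true'
        // make the jar available to later stages running on other agents/workspaces
        stash name: 'app-jar', includes: 'target/name.jar'
    }
}
stage('Copy Artifacts') {
    agent any
    steps {
        unstash 'app-jar'
        sh "scp target/name.jar ubuntu@${params.IP}:/home/ubuntu/target/name.jar_2"
    }
}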
I am trying to SSH into a remote host and then execute certain commands in the remote host's shell. Following is my pipeline code:
pipeline {
    agent any
    environment {
        // comment added
        APPLICATION = 'app'
        ENVIRONMENT = 'dev'
        MAINTAINER_NAME = 'jenkins'
        MAINTAINER_EMAIL = 'jenkins@email.com'
    }
    stages {
        stage('clone repository') {
            steps {
                // cloning repo
                checkout scm
            }
        }
        stage('Build Image') {
            steps {
                script {
                    sshagent(credentials : ['jenkins-pem']) {
                        sh "echo pwd"
                        sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no'
                        sh "echo pwd"
                        sh 'sudo -i -u root'
                        sh 'cd /opt/docker/web'
                        sh 'echo pwd'
                    }
                }
            }
        }
    }
}
Upon running this job, it executes sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no' successfully, but it stops there and does not execute any further commands. I want the commands written after the ssh command to be executed in the remote host's shell. Any help is appreciated.
Each sh step runs in its own shell on the Jenkins agent, so the commands after the ssh step never reach the remote host. I would try passing them to ssh directly, something like this:
sshagent(credentials : ['jenkins-pem']) {
    sh "echo pwd"
    sh 'ssh -t -t ubuntu@xx.xxx.xx.xx -o StrictHostKeyChecking=no "echo pwd && sudo -i -u root && cd /opt/docker/web && echo pwd"'
}
I resolved this issue with a heredoc:
script {
    sh """ssh -tt login@host << EOF
your command
exit
EOF"""
}
stage("DEPLOY CONTAINER"){
steps {
script {
sh """
#!/bin/bash
sudo ssh -i /path/path/keyname.pem username#serverip << EOF
sudo bash /opt/filename.sh
exit 0
<< EOF
"""
}
}
}
There is a better way to run commands on a remote host over SSH. I know this is a late answer, but I just explored this, so I would like to share it; it should help others resolve this problem easily.
I found a helpful blog post on how to run multiple commands on a remote host over SSH; as mentioned there, you can also chain the commands conditionally.
The basic syntax is:
ssh username@hostname "command1; command2; commandN"
Now, how do you run commands on a remote host over SSH in a Jenkins pipeline?
Here is the solution:
pipeline {
    agent any
    environment {
        /*
        define your commands in a variable
        */
        remoteCommands =
            """java --version;
            java --version;
            java --version"""
    }
    stages {
        stage('Login to remote host') {
            steps {
                sshagent(['ubnt-creds']) {
                    /*
                    Pass the variable as an argument to the ssh command
                    */
                    sh 'ssh -tt username@hostname $remoteCommands'
                }
            }
        }
    }
}
First (and optionally), you can define a variable that holds all the commands separated by semicolons, and then pass it as an argument to the ssh command.
Alternatively, you can pass your commands directly to the ssh command:
sh "ssh -tt username@hostname 'command1;command2;commandN'"
I have used it in my code and it's working great!
Happy Learning :)
Jenkins (2.162), updated modules. I need to add private GitHub dependencies for cargo build, so I need to store an SSH key in the Jenkins container before cargo build.
I did:
stage('Build') {
    steps {
        script {
            dir('api') {
                withCredentials([string(credentialsId: 'GitKeyText', variable: 'ID_RSA')]) {
                    sh '''
                        set +x
                        eval `ssh-agent -s`
                        mkdir ~/.ssh
                        echo ${ID_RSA} >~/.ssh/id_rsa
                        chmod go-r ~/.ssh/id_rsa
                        ssh-add
                        cargo build
                    '''
                }
            }
            input message: "wait"
        }
    }
}
All looks good, and this sequence of commands works well when run manually inside the docker container. But the Jenkins job kept failing at ssh-add without any error message, just ERROR: script returned exit code 1 at the end of the Jenkins console log.
Update: I added an echo statement to the code and changed set +x to set -x; there is no output from ssh-add (console output):
.....
+ echo before ssh-add
before ssh-add
+ ssh-add
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // script
Post stage
.....
I used the Jenkins SSH Agent Plugin instead, and everything works as intended:
script {
    dir('contract_api') {
        sshagent(['GitSSHcred']) {
            sh 'cargo build'
        }
    }
}
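(The manual variant most likely failed because the unquoted echo ${ID_RSA} collapses the key's newlines, so ssh-add receives a malformed key.) If you ever need the key as a file rather than an agent, here is a sketch using an sshUserPrivateKey credential binding; the credential ID git-ssh-key is hypothetical:
script {
    dir('contract_api') {
        withCredentials([sshUserPrivateKey(credentialsId: 'git-ssh-key', keyFileVariable: 'KEY_FILE')]) {
            // KEY_FILE is a temp key file Jenkins creates and cleans up;
            // note cargo needs net.git-fetch-with-cli = true for git to honor GIT_SSH_COMMAND
            sh 'GIT_SSH_COMMAND="ssh -i $KEY_FILE" cargo build'
        }
    }
}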
I am trying to remove the junit directory located in the workspace of my Jenkins job, using a scripted pipeline that looks somewhat like this:
node {
    stage('Build') {
        checkout scm
        app = docker.build("...")
    }
    stage('Test') {
        app.withRun("--name = ${CONTAINER_ID} ...") {
            // sh "mkdir -p junit"
            // sh "rm -rf junit/"
            dir "junit" {
                deleteDir
            }
            sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
            sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
        }
    }
    junit 'junit/*.xml'
}
However, I am getting the following (red herring?) error:
java.lang.ClassCastException:
hudson.tasks.junit.pipeline.JUnitResultsStep.testResults expects class
java.lang.String but received class
org.jenkinsci.plugins.workflow.cps.CpsClosure2
However, when I use the shell steps instead:
sh "mkdir -p junit"
sh "rm -rf junit/"
it works as expected. What am I doing wrong?
Try using parentheses, and call deleteDir as a method:
dir("junit") {
    deleteDir()
}
Without parentheses, Groovy effectively parses dir "junit" { deleteDir } as a call to a step named junit taking the closure as its argument, which is why the junit step complains that testResults expects a String but received a closure. deleteDir likewise needs the () to be invoked as a step.
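In context, the fixed Test stage from the question reads (everything else unchanged):
stage('Test') {
    app.withRun("--name = ${CONTAINER_ID} ...") {
        // wipe any stale results before collecting fresh ones
        dir("junit") {
            deleteDir()
        }
        sh "docker exec ${CONTAINER_ID} /bin/bash -c 'source venv/bin/activate && python run.py test -x junit'"
        sh "docker cp ${CONTAINER_ID}:/home/foo/junit junit"
    }
}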