ssh-add makes the Jenkins pipeline job fail - docker

Jenkins (2.162), updated modules. I need to add private GitHub dependencies for cargo build, so I need to store an SSH key inside the Jenkins container before cargo build runs.
I did:
stage('Build') {
    steps {
        script {
            dir('api') {
                withCredentials([string(credentialsId: 'GitKeyText', variable: 'ID_RSA')]) {
                    sh '''
                        set +x
                        eval `ssh-agent -s`
                        mkdir ~/.ssh
                        echo ${ID_RSA} >~/.ssh/id_rsa
                        chmod go-r ~/.ssh/id_rsa
                        ssh-add
                        cargo build
                    '''
                }
            }
            input message: "wait"
        }
    }
}
Everything looks good, and this sequence of commands works fine when run manually inside the docker container. But the Jenkins job kept failing at ssh-add without any error message, just ERROR: script returned exit code 1 at the end of the Jenkins console log.
Update 1:
I added an echo statement to the code and changed set +x to set -x. There is no output from ssh-add (console output):
.....
+ echo before ssh-add
before ssh-add
+ ssh-add
[Pipeline] }
[Pipeline] // withCredentials
[Pipeline] }
[Pipeline] // dir
[Pipeline] }
[Pipeline] // script
Post stage
.....

I used the Jenkins SSH Agent Plugin instead. Everything works as intended:
script {
    dir('contract_api') {
        sshagent(['GitSSHcred']) {
            sh 'cargo build'
        }
    }
}
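For comparison, the original withCredentials approach can probably be salvaged too. A hedged sketch, on the assumption that the stored secret text really contains the full multi-line key: unquoted echo ${ID_RSA} word-splits the variable and flattens the key's newlines into spaces, which yields an invalid key file and a non-zero exit from ssh-add; quoting preserves the line breaks:
withCredentials([string(credentialsId: 'GitKeyText', variable: 'ID_RSA')]) {
    sh '''
        set +x
        eval `ssh-agent -s`
        mkdir -p ~/.ssh
        # Quotes preserve the key's newlines; unquoted expansion flattens them
        printf '%s\n' "$ID_RSA" > ~/.ssh/id_rsa
        chmod 600 ~/.ssh/id_rsa
        ssh-add ~/.ssh/id_rsa
        cargo build
    '''
}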

Related

How to specify JDK version in Jenkinsfile pipeline script

I have a pipeline script to deploy applications to a server. I'm building the project using Maven, and I want Jenkins to use a specific JDK version for the build. My pipeline script looks like this:
pipeline {
    agent any
    tools {
        // Install the Maven version configured as "M3" and add it to the path.
        maven "Maven 3.6.3"
    }
    stages {
        stage('Build') {
            steps {
                // Run Maven on a Unix agent.
                sh "mvn clean package -DskipTests=true -U"
            }
            post {
                // If Maven was able to run the tests, even if some of the tests
                // failed, record the test results and archive the jar file.
                success {
                    archiveArtifacts "**/${war}"
                }
            }
        }
        stage('Deploy EQM Instance 1') {
            steps {
                sshagent(credentials: ['credentials']) {
                    sh "echo 1"
                    sh "echo Initializing deployment to Instance 1"
                    sh "scp target/${war} ${bastionHost}:/home/opc"
                    sh "echo 2.1"
                    sh "echo Uploaded war file to bastion instance"
                    sh "scp ${bastionHost}:/home/opc/${war} opc@${instanceDns}:/home/opc"
                    sh "echo 3.2"
                    sh "echo Uploaded war file from bastion instance to Instance 1 ${instanceDns}"
                    sh "echo Undeploying old war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo rm /opt/tomcat/webapps/${war}"
                    sh "echo 4.2.2"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo chown tomcat:tomcat -R ${war}"
                    sh "echo Deploying new war file"
                    sh "ssh ${bastionHost} -tt ssh opc@${instanceDns} sudo mv ${war} /opt/tomcat/webapps/"
                    sh "echo 4.3"
                }
            }
        }
    }
}
There are other jobs already configured on Jenkins and I don't want to disturb them, so I want to specify the JDK version in this job's configuration only.
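One way to do this without touching other jobs is the tools directive that already selects Maven: it also accepts a jdk entry, scoped to the pipeline that declares it. A minimal sketch; the installation name "jdk11" is an assumption and must match a JDK configured under Manage Jenkins -> Global Tool Configuration:
pipeline {
    agent any
    tools {
        maven "Maven 3.6.3"
        // Name of a JDK installation from Global Tool Configuration
        jdk "jdk11"
    }
    stages {
        stage('Build') {
            steps {
                // JAVA_HOME and PATH now point at the selected JDK
                sh "java -version"
                sh "mvn clean package -DskipTests=true -U"
            }
        }
    }
}
Because tools is scoped to this Jenkinsfile, other jobs on the same Jenkins keep their own JDKs.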

I need to hide the arguments from the Jenkins log

I need to build a docker image with a set of build arguments through Jenkins, and I also need to pass my private key as an argument. But while building the image, the private key shows up in my Jenkins log. I need to get rid of it and keep only the image build logs. Can anyone please help me with this?
docker build --build-arg SSH_PRIVATE_KEY=$(cat ~/.ssh/id_rsa) -t ${REGISTRY}/${APPLICATION_NAME}:PR-${CHANGE_ID} .
As mentioned by @David:
You shouldn't pass ssh keys into your build sequence like this at all
But to answer your question: modify the shell script as below and it will not display the contents of your ssh key.
set +x
docker build --build-arg SSH_PRIVATE_KEY="$(cat ~/.ssh/id_rsa)" -t ssha .
set -x
You will not be able to see the ssh key during the build, but it will be set; you can verify with:
docker run --rm ssha bash -c "cat ~/.ssh/id_rsa"
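Hiding the argument from the log does not make this safe, though: a --build-arg value is recorded in the image and can be recovered with docker history. A safer pattern, in line with David's warning, is a BuildKit secret mount. A minimal sketch, assuming Docker 18.09+ with BuildKit; the secret id sshkey and the alpine base image are illustrative:
# syntax=docker/dockerfile:1
FROM alpine
# The key is readable only during this RUN step; it is never
# committed to an image layer, so docker history cannot expose it.
RUN --mount=type=secret,id=sshkey,target=/root/.ssh/id_rsa \
    ls -l /root/.ssh/id_rsa
Build it with:
DOCKER_BUILDKIT=1 docker build --secret id=sshkey,src=$HOME/.ssh/id_rsa -t ssha .
Nothing about the key appears in the build output or in the final image.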
Try using the Mask Passwords plugin (MaskPasswordsBuildWrapper); it masks all the specified variables throughout the Jenkins console log.
Plugin Link: https://wiki.jenkins.io/display/JENKINS/Mask+Passwords+Plugin
script {
    san7ket = 'lololol'
    a = 'another-secret'
    wrap([$class: 'MaskPasswordsBuildWrapper',
          // regex '(.)' masks every single character, which is why both
          // echoes below come out fully starred in the console log
          varMaskRegexes: [[regex: '(.)']],
          varPasswordPairs: [[var: 'san7ket', password: san7ket],
                             [var: 'a', password: a]]]) {
        // some block
        echo san7ket
        echo a
    }
}
Output:
[Pipeline] script
[Pipeline] {
[Pipeline] wrap
[Pipeline] {
[Pipeline] echo
********************************************************
[Pipeline] echo
****************************************************************************************
[Pipeline] }
[Pipeline] // wrap
[Pipeline] }
[Pipeline] // script

Jenkins pipeline not correctly using sshagent credentials

I have this code snippet that has to use a custom private key from the Jenkins credentials using the ssh-agent plugin.
This doesn't seem to work, but it also doesn't print very useful output.
Any ideas on how to debug this?
stage('Test Git') {
    steps {
        sshagent(credentials: ['denpal']) {
            sh 'git commit --allow-empty -m "test withCredentials"'
            sh 'git push origin feature/Jenkinsfile'
        }
    }
}
Console output:
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test Git)
[Pipeline] sshagent
[ssh-agent] Using credentials git (denpal)
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-WEsIsQvX4CFc/agent.12163
SSH_AGENT_PID=12166
Running ssh-add (command line suppressed)
[Pipeline] // sshagent
[Pipeline] }
I had the same problem trying to push code to my repo from Jenkins.
I found the solution here: https://www.reddit.com/r/docker/comments/b8lmc4/jenkins_pipeline_not_correctly_using_sshagent/
I replaced the sshagent code block with:
withCredentials([sshUserPrivateKey(credentialsId: 'myCredentials', keyFileVariable: 'KEY_FILE')]) {
    sh "eval `ssh-agent -s` && ssh-add ${KEY_FILE} && ssh-add -L && git push -u origin develop"
}
It worked for me. Note that everything runs in a single sh step: each sh step starts a fresh shell, so the environment variables exported by ssh-agent would not survive into a separate step.

Hide command executed, only show output

I want to hide the command that the sh step echoes in a Jenkins pipeline:
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh "ls -al /"
            }
        }
    }
}
Current result:
[Pipeline] {
[Pipeline] stage
[Pipeline] { (Load Lib)
[Pipeline] sh
[Test] Running shell script
+ ls -al /
I want to hide the "Running shell script" and "+ ls -al /" lines in the output.
Please help.
This is definitely related to Echo off in Jenkins Console Output
For pipeline, what this means is:
pipeline {
    agent any
    stages {
        stage('Load Lib') {
            steps {
                sh '''
                    set +x
                    # commands here are not echoed
                    ls -al
                    set -x
                '''
            }
        }
    }
}
''' indicates a multi-line command. set +x turns off command echoing, and set -x turns it back on again.
You can override this behaviour for the whole script by putting the following at the top of the build step:
#!/bin/bash +x
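For example (the shebang must be the very first characters of the script, since it replaces the default /bin/sh -xe invocation):
sh '''#!/bin/bash +x
ls -al /
echo "commands in this script are not echoed"
'''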

Jenkins, Host key verification failed, script returned exit code 255

I have a build server with Jenkins 2.73.3 and other servers where I deploy my apps.
I have also set up a credential to connect from the build server to the other servers.
But every time I add another server it is a struggle: I set up the authorized key on the new server and it works from the command line, but not from Jenkins.
Here is a little recipe that fails:
pipeline {
    agent any
    stages {
        stage('Set conditions') {
            steps {
                sshagent(['xxxx-xxxx-xxxx-xxxx-xxxx']) {
                    sh "ssh user@product.company.com 'echo $HOME'"
                }
            }
        }
    }
}
And here is the Log failure:
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
[check] Running shell script
+ ssh user@product.company.com echo /var/lib/jenkins
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 12567 killed;
[ssh-agent] Stopped.
Host key verification failed.
[Pipeline] }
[Pipeline] // sshagent
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 255
Finished: FAILURE
It seems the solution was to add the StrictHostKeyChecking option to the ssh command:
sh "ssh -o StrictHostKeyChecking=no user@product.company.com 'echo $HOME'"
