I am attempting to use sshagent in Jenkins to pass my private key into the terraform container to allow terraform to source a module in a private repo.
stage('TF Plan') {
    steps {
        container('terraform') {
            sshagent (credentials: ['6c92998a-bbc4-4f27-b925-b50c861ef113']) {
                sh 'ssh-add -L'
                sh 'terraform init'
                sh 'terraform plan -out myplan'
            }
        }
    }
}
When running the job it fails with the following output:
[ssh-agent] Using credentials (id_rsa_jenkins)
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
Executing shell script inside container [terraform] of pod [gcp-tf-builder-h79rb-h5f3m]
Executing command: "ssh-agent"
exit
SSH_AUTH_SOCK=/tmp/ssh-2xAa2W04uQV6/agent.20; export SSH_AUTH_SOCK;
SSH_AGENT_PID=21; export SSH_AGENT_PID;
echo Agent pid 21;
SSH_AUTH_SOCK=/tmp/ssh-2xAa2W04uQV6/agent.20
SSH_AGENT_PID=21
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/agent/workspace/demo@tmp/private_key_2729797926.key (user@workstation.local)
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
+ ssh-add -L
ssh-rsa REDACTED user@workstation.local
[Pipeline] sh
+ terraform init
Initializing modules...
- module.demo_proj
Getting source "git::ssh://git@bitbucket.org/company/terraform-module"
Error downloading modules: Error loading modules: error downloading 'ssh://git@bitbucket.org/company/deploy-kickstart-project': /usr/bin/git exited with 128: Cloning into '.terraform/modules/e11a22f40c64344133a98e564940d3e4'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
[Pipeline] }
Executing shell script inside container [terraform] of pod [gcp-tf-builder-h79rb-h5f3m]
Executing command: "ssh-agent" "-k"
exit
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 21 killed;
[ssh-agent] Stopped.
I've triple-checked and I am definitely using the correct key pair. I am able to git clone the repo locally from my Mac with no issues.
An important note is that this Jenkins deployment is running within Kubernetes. The Master stays up and uses the Kubernetes plugin to spawn agents.
What does the "Host key verification failed." error mean? From my research it can be caused by known_hosts not being set up properly. Is ssh-agent responsible for that?
Turns out it was an issue with known_hosts not being set. As a workaround, we added this to our Jenkinsfile:
environment {
    GIT_SSH_COMMAND = "ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no"
}
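Note that this disables host key verification entirely. A safer alternative (a minimal sketch, assuming the terraform image ships ssh-keyscan and has a writable home directory) is to pin Bitbucket's host key before terraform init, ideally verifying the scanned key against Bitbucket's published fingerprints rather than trusting first use:
stage('TF Plan') {
    steps {
        container('terraform') {
            sshagent (credentials: ['6c92998a-bbc4-4f27-b925-b50c861ef113']) {
                // Record bitbucket.org's host key so strict checking can stay on.
                sh 'mkdir -p ~/.ssh && ssh-keyscan bitbucket.org >> ~/.ssh/known_hosts'
                sh 'terraform init'
                sh 'terraform plan -out myplan'
            }
        }
    }
}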
Related
I am trying to use a container within my Jenkins pipeline; however, I can't get ssh-agent to work inside it. I am on v1.19 of the plugin. When I run the code below, I get
Host key verification failed. fatal: Could not read from remote
repository.
Please make sure you have the correct access rights and the repository
exists.
However, if I run the code from outside the image, it works perfectly, proving that the user has the correct permissions.
node('nodeName') {
    cleanWs()
    ws("short") {
        withDockerRegistry([credentialsId: 'token', url: "https://private.repo.com"]) {
            docker.image("img:1.0.0").inside("-u root:root --network=host") {
                sshagent(credentials: ["bitbucket_token"]) {
                    sh "mkdir ~/.ssh"
                    sh 'ssh-keyscan bitbucket.company.com >> ~/.ssh/known_hosts'
                    sh 'git clone ssh://git@bitbucket.company.com:PORT/repo.git'
                }
            }
        }
    }
}
Here is the output:
[Pipeline] sshagent
[ssh-agent] Using credentials jenkins (bitbucket_token)
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ docker exec abcdef123456 ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-qwertyu/agent.15
SSH_AGENT_PID=22
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/short@tmp/private_key_8675309.key (/home/jenkins/short@tmp/private_key_8675309.key)
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
+ mkdir /root/.ssh
[Pipeline] sh
+ ssh-keyscan bitbucket.company.com
# bitbucket.company.com:22 SSH-2.0-OpenSSH_6.6.1
# bitbucket.company.com:22 SSH-2.0-OpenSSH_6.6.1
# bitbucket.company.com:22 SSH-2.0-OpenSSH_6.6.1
[Pipeline] sh
+ git clone ssh://git@bitbucket.company.com:PORT/repo.git
Cloning into 'repo'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
[Pipeline] }
$ docker exec --env ******** --env ******** abcdef123456 ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 22 killed;
[ssh-agent] Stopped.
[Pipeline] // sshagent
I'm completely stumped by this.
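An editorial note, not from the original thread: the clone URL uses a non-default port, yet the ssh-keyscan output above shows port 22 being scanned. For non-default ports, ssh stores and matches known_hosts entries in the form [host]:port, so a port-22 entry never matches the clone. A sketch that scans the right port and turns on verbose ssh output (PORT is the question's placeholder; GIT_SSH_COMMAND assumes git 2.3+):
sshagent(credentials: ["bitbucket_token"]) {
    // ssh-keyscan -p writes entries of the form "[host]:port ...", which is
    // what ssh looks up when cloning over ssh://host:PORT/...
    sh 'mkdir -p ~/.ssh && ssh-keyscan -p PORT bitbucket.company.com >> ~/.ssh/known_hosts'
    // ssh -v prints which known_hosts files are consulted and why a key is rejected.
    sh 'GIT_SSH_COMMAND="ssh -v" git clone ssh://git@bitbucket.company.com:PORT/repo.git'
}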
I installed the Jenkins SSH Agent plugin. I created an SSH private key on the Linux server I am trying to connect to (using the ssh-keygen -t rsa command). Then, under Jenkins credentials, I added an SSH Username with private key entry with all the required fields. In the Jenkinsfile I added a simple command to run over ssh:
pipeline {
    agent any
    stages {
        stage('---doingsomething---') {
            steps {
                sshagent (credentials: ['jbowner-195']) {
                    sh 'ssh -o StrictHostKeyChecking=no -l jbowner 10.10.23.195 uname -a'
                }
            }
        }
    }
}
When I press the build button, the process starts and never ends. There is no error and no timeout.
Here is the piece of output on which Jenkins gets stuck:
[Pipeline] }
[Pipeline] // stage
[Pipeline] withEnv
[Pipeline] {
[Pipeline] stage
[Pipeline] { (---echoing---)
[Pipeline] sshagent
[ssh-agent] Using credentials jbowner (jbowner 10.10.23.195)
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-XSQPEUHOqZQR/agent.10226
SSH_AGENT_PID=10229
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/.jenkins/workspace/Eformgenerator-Prod@tmp/private_key_5151715321960722060.key (/home/jenkins/.jenkins/workspace/Eformgenerator-Prod@tmp/private_key_5151715321960722060.key)
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
+ ssh -o StrictHostKeyChecking=no -l jbowner 10.10.23.195 uname -a
Any ideas?
The Jenkins ssh plugin didn't work for me. My solution is:
generate RSA keys on the source machine using ssh-keygen -t rsa
ssh-copy-id username@destination_ip, then enter the destination password. This adds the destination IP as a known host and adds the source key to the destination machine as an authorized key.
Then, instead of using the Jenkins ssh agent, I used a standard ssh command like this:
pipeline {
    agent any
    stages {
        stage('---echoing---') {
            steps {
                sh 'ssh -o StrictHostKeyChecking=no jbowner@10.10.23.195 uptime'
            }
        }
    }
}
This works because the servers now trust each other via the ssh key.
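For reference, the one-time setup from the two steps above looks like this when run on the Jenkins host as the user the agent runs under (username and IP taken from the question):
# Generate an RSA key pair on the source (Jenkins) machine.
ssh-keygen -t rsa
# Push the public key to the destination; this prompts for the destination
# password once and also records the destination's host key in known_hosts.
ssh-copy-id jbowner@10.10.23.195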
I have this code snippet that has to use a custom private key from the Jenkins credentials using the ssh-agent-plugin.
This doesn't seem to work, and it doesn't print very useful output either.
Any ideas how to debug this?
stage('Test Git') {
    steps {
        sshagent(credentials: ['denpal']) {
            sh 'git commit --allow-empty -m "test withCredentials"'
            sh 'git push origin feature/Jenkinsfile'
        }
    }
}
This is the output I get:
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (Test Git)
[Pipeline] sshagent
[ssh-agent] Using credentials git (denpal)
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-WEsIsQvX4CFc/agent.12163
SSH_AGENT_PID=12166
Running ssh-add (command line suppressed)
[Pipeline] // sshagent
[Pipeline] }
I had the same problem trying to push code to my repo from Jenkins.
I found the solution here: https://www.reddit.com/r/docker/comments/b8lmc4/jenkins_pipeline_not_correctly_using_sshagent/
I replaced the sshagent code block with:
withCredentials([sshUserPrivateKey(credentialsId: 'myCredentials', keyFileVariable: 'KEY_FILE')]) {
    sh "eval `ssh-agent -s` && ssh-add ${KEY_FILE} && ssh-add -L && git push -u origin develop"
}
It worked for me.
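One editorial addition: newer Jenkins versions warn when a credential is expanded via Groovy string interpolation, as ${KEY_FILE} is above. Letting the shell expand the variable instead avoids that:
withCredentials([sshUserPrivateKey(credentialsId: 'myCredentials', keyFileVariable: 'KEY_FILE')]) {
    // Single quotes: $KEY_FILE is expanded by the shell from the environment,
    // not by Groovy, so the key path is never baked into the script text.
    sh 'eval `ssh-agent -s` && ssh-add "$KEY_FILE" && ssh-add -L && git push -u origin develop'
}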
Use case:
I have a Jenkins pipeline to update my development environment.
My dev env is an AWS EC2 instance with Docker Compose.
The automation was written along the lines of:
withAWS(profile: 'default') {
    sh "ssh -o StrictHostKeyChecking=no -i ~/my-key.pem user@123.456.789 /bin/bash -c 'run some command like docker pull'"
}
Now, I have other test environments, and they all have some sort of docker-compose file, configuration and property files, which requires me to go over all of them when something needs to change.
To help with that, I created a new repository to keep all the different environment configurations. My plan is to have a clone of this repo in each of my development and test environments, so when I need to change something, I can just do it locally, push it, and have my Jenkins pipeline update the repository in whichever environment it is updating.
My Jenkins has an ssh credential for my repo (it is used in another job that clones the repo and runs tests on the source code), so I want to use that same credential.
Question: can I somehow, through ssh'ing into another machine, use the Jenkins ssh-agent credentials to clone/update a Bitbucket repository?
Edit:
I changed the pipeline to:
script {
    def hgCommand = "hg clone ssh://hg@bitbucket.org/my-repo"
    sshagent(['12345']) {
        sh "ssh -o StrictHostKeyChecking=no -i ~/mykey.pem admin@${IP_ADDRESS} /bin/bash -c '\"${hgCommand}\"'"
    }
}
And I am getting:
[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent] Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-FOburguZZlU0/agent.662
SSH_AGENT_PID=664
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/workspace/abc@tmp/private_key_12345.key (rsa w/o comment)
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
[test-env-config] Running shell script
+ ssh -o StrictHostKeyChecking=no -i /home/jenkins/mykey.pem admin@123.456.789 /bin/bash -c "hg clone ssh://hg@bitbucket.org/my-repo"
remote: Warning: Permanently added the RSA host key for IP address '765.432.123' to the list of known hosts.
remote: Permission denied (publickey).
abort: no suitable response from remote hg!
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 664 killed;
[ssh-agent] Stopped.
First, some background to understand the reasoning (this is pure ssh, nothing Jenkins- or Mercurial-specific): the ssh-agent utility works by creating a UNIX domain socket that is then used by ssh. The ssh command attempts to communicate with the agent if it finds the environment variable SSH_AUTH_SOCK. In addition, ssh can be instructed to forward the agent via -A. For more details, see the man pages of ssh-agent and ssh.
So, assuming that your withAWS context makes the environment variable SSH_AUTH_SOCK (set by the plugin) available, I think it should be enough to:
add -A to your ssh invocation
in the 'run some command like docker pull' part, add the hg clone command, making sure you use the ssh:// scheme in the Mercurial URL (see the sketch below).
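Put together, the suggestion looks roughly like this (a sketch under the assumptions above, reusing the credential ID, key path, and variables from the question; host key handling is covered by the security observation that follows):
script {
    def hgCommand = "hg clone ssh://hg@bitbucket.org/my-repo"
    sshagent(['12345']) {
        // -A forwards the agent (holding the key that sshagent loaded) to the
        // remote host, so hg running there can authenticate against Bitbucket.
        sh "ssh -A -i ~/mykey.pem admin@${IP_ADDRESS} /bin/bash -c '\"${hgCommand}\"'"
    }
}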
Security observation: -o StrictHostKeyChecking=no should be used as a last resort. Since, in your example, the IP address of the target is fixed, you should do the following:
remove the -o StrictHostKeyChecking=no
one-shot: get the host fingerprint of 123.456.789 (for example, by ssh-ing into it and then looking for the associated line in your $HOME/.ssh/known_hosts). Save that line in a file, say 123.456.789.fingerprint
make the file 123.456.789.fingerprint available to Jenkins when it invokes your sample code. This can be done by committing the file to the repo that contains the Jenkins pipeline; that is safe, since it contains no secrets.
Finally, change your ssh invocation to something like ssh -o UserKnownHostsFile=/path/to/123.456.789.fingerprint ...
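A minimal sketch of those steps; ssh-keyscan is an alternative to manually copying the line out of $HOME/.ssh/known_hosts (verify the fingerprint out of band before trusting it):
# One-shot, from any machine that can reach the host: capture its host key.
ssh-keyscan 123.456.789 > 123.456.789.fingerprint
# Commit that file next to the Jenkinsfile; then the pipeline can use:
ssh -o UserKnownHostsFile=/path/to/123.456.789.fingerprint admin@123.456.789 ...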
I have a Jenkins instance running on my Raspberry Pi 3, and I also have my (simple) Apache web server running on the same Raspberry Pi.
I've got a Jenkins pipeline that fetches a git repo, builds it, and puts (via scp) the build files on my web server.
I have an ssh private/public key set up, but isn't it a bit silly to need an ssh key when Jenkins is hosted on the same 'machine' with the same IP address?
Anyway, on my Raspberry Pi I have set up the authorized_keys file and the known_hosts file with the public key, and I've added the private key to Jenkins via the ssh-agent plugin.
Here is my Jenkinsfile that defines the pipeline:
node {
    stage('Checkout') {
        checkout scm
    }
    stage('install') {
        nodejs(nodeJSInstallationName: 'nodeJS10.5.0') {
            sh "npm install"
        }
    }
    stage('build') {
        nodejs(nodeJSInstallationName: 'nodeJS10.5.0') {
            sh "npm run build"
        }
    }
    stage('connect ssh and remove files') {
        sshagent (credentials: ["0527982f-7794-45d0-99b0-135c868c5b36"]) {
            sh "ssh pi@123.456.789.123 -p 330 rm -rf /var/www/html/*"
        }
    }
    stage('upload new files') {
        sshagent (credentials: ["0527982f-7794-45d0-99b0-135c868c5b36"]) {
            sh "scp -P 330 -r ./build/* pi@123.456.789.123:/var/www/html"
        }
    }
}
Here is the output from the second-to-last stage, which is failing:
[Pipeline] }
[Pipeline] // nodejs
[Pipeline] }
[Pipeline] // stage
[Pipeline] stage
[Pipeline] { (connect ssh and remove files)
[Pipeline] sh
[Deploy_To_Production] Running shell script
+ ssh pi@123.456.789.123 -p 330 rm -rf /var/www/html/asset-manifest.json /var/www/html/css /var/www/html/favicon.ico /var/www/html/fonts /var/www/html/images /var/www/html/index.html /var/www/html/manifest.json /var/www/html/service-worker.js /var/www/html/static /var/www/html/vendor
Host key verification failed.
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // node
[Pipeline] End of Pipeline
ERROR: script returned exit code 255
Finished: FAILURE
Note: I've changed my IP address and ssh port for security reasons.
Manually, I can ssh to my Raspberry Pi and execute the commands from my laptop (both from the same domain and another one).
I've also port-forwarded the local IP so that I can connect to it via SSH when I'm not home.
I guess I'm doing something wrong with the SSH keys, but I'm no expert whatsoever!
Can anyone help?
I need 4 more reputation points to comment, so I must write an answer. :)
Try using -v to debug the ssh connection:
stage('connect ssh and remove files') {
    sshagent (credentials: ["0527982f-7794-45d0-99b0-135c868c5b36"]) {
        sh "ssh -v pi@123.456.789.123 -p 330 rm -rf /var/www/html/*"
    }
}
On the other hand, Host key verification failed means that the host key of the remote host was changed, or that you don't have the host key of the remote host. So first, try just ssh -v pi@123.456.789.123 as the Jenkins user, from the Jenkins host.
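If the host key is simply missing, it can also be added non-interactively (an editorial sketch, not part of the original answer; check the key's fingerprint before trusting it). Run this as the user the Jenkins agent runs under, with -p matching the custom port:
ssh-keyscan -p 330 123.456.789.123 >> ~/.ssh/known_hosts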
The issue was indeed that the host key verification was failing. I think this was due to the host not being trusted.
But the real issue was pointed out by @3sky (see the other answer): I needed to log in as the jenkins user and try to ssh to my Raspberry Pi (both of which are on the same machine).
So these are the steps I did:
Log in via ssh to my Raspberry Pi:
ssh -v pi@123.456.789.123 -p 330
Then I switched to the jenkins user. After some googling I found out how:
sudo su -s /bin/bash jenkins
Then I ssh'ed again to my own machine (where I was already ssh'ed in), so that I got the prompt for trusting this host once and for all:
ssh -v pi@123.456.789.123 -p 330
This solved my issue! Big thanks to 3sky for helping out!