I have Jenkins deployed on a remote Windows machine, and it connects via SSH to a remote Linux server. In the Jenkins settings, in the “Exec command” section, I entered some simple commands as a test:
ls –lah
pwd
hostname
But in the Jenkins console, I get a response like this:
SSH: ls [-lah]
SSH: Failed command: [ls]
SSH: FAILED: Message [No such file]
SSH: Unsupported command [pwd]
SSH: Unsupported command [hostname]
Does anyone know why these errors occur, and how these commands can be executed through Jenkins?
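A note on this output: in ls –lah the dash is a typographic en dash (–), not an ASCII hyphen (-), so ls treats –lah as a file name to list and fails with exactly the "No such file" message shown; retyping the option with a plain hyphen fixes that part. The "Unsupported command" messages for pwd and hostname suggest (an assumption, not something the log proves) that the remote SSH service is not handing the exec request to a normal shell, e.g. an SFTP-only or otherwise restricted server. A quick way to check from a terminal, outside Jenkins (user and host below are placeholders):
ssh user@linux-server 'ls -lah; pwd; hostname'
If that one-liner succeeds with the same account Jenkins uses, the server side is fine and the en dash was the only problem.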
Jenkins is installed on an Amazon Linux EC2 machine.
Docker is installed on another Amazon Linux EC2 machine.
I created a Maven job to build a .war file, and I use the Publish over SSH plugin to copy the .war file to a Tomcat container created on the Docker server.
1. Send files or execute commands over SSH
SSH Server Name: docker-server
Source files: webapp/target/*.war
Remove prefix: webapp/target
Remote directory: //opt//docker
Exec command:
docker stop tomcat-server;
docker rm -f tomcat-server;
docker image rm -f tomcat-server;
cd /opt/docker;
docker build -t tomcat-server
2. Send files or execute commands over SSH
SSH Server Name: docker-server
Exec command: docker run -d name tomcat-server -p 8090:8080 tomcat-server
I got an error when executing the job:
SSH: Connecting from host [ip-172-31-3-34.us-west-1.compute.internal]
SSH: Connecting with configuration [docker-server]
SSH: EXEC: completed after 200
SSH: Disconnecting configuration [docker-server]
ERROR: Exception when publishing, exception message [Exec exit status not zero. Status [1]]
Build step 'Send files or execute commands over SSH' changed build result to UNSTABLE
SSH: Connecting from host [ip-172-31-3-34.us-west-1.compute.internal]
SSH: Connecting with configuration [docker-server]
SSH: EXEC: completed after 1,202
SSH: Disconnecting configuration [docker-server]
ERROR: Exception when publishing, exception message [Exec exit status not zero. Status [125]]
Finished: UNSTABLE
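Two likely culprits stand out here, both in the Docker commands rather than in Jenkins itself. First, docker build is missing its build context argument, so the last command of the first exec block fails and the block exits with status 1. Second, docker run is missing the dashes on --name, so Docker parses name as the image to run, cannot find such an image, and exits with 125 (the docker run status for daemon errors). A corrected sketch of the two exec commands, keeping the names from the question:
Exec command (step 1):
docker stop tomcat-server;
docker rm -f tomcat-server;
docker image rm -f tomcat-server;
cd /opt/docker;
docker build -t tomcat-server .
Exec command (step 2):
docker run -d --name tomcat-server -p 8090:8080 tomcat-server
Note also that with ;-separated commands the exec's exit status is that of the last command only, so the earlier docker stop/rm failures (e.g. on the very first run, when no container exists yet) are masked rather than fatal.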
I have a Jenkins project which pulls and containerises changes from the given repo and then uses an Ansible playbook to deploy to the host machine/s. There are over 10 different server groups in my /etc/ansible/hosts file, all of which can be pinged successfully using ansible -m ping all and SSH'd into from the Jenkins machine.
I spun up a new VM, added it to the hosts file and used ssh-copy-id to add the Jenkins machine's public key. I received a pong from my Ansible ping and successfully SSH'd into the machine. When I run the project I receive the following error:
TASK [Gathering Facts] *********************************************************
fatal: [my_machine]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: Host key verification failed.", "unreachable": true}
The Jenkins project is virtually identical to my other projects, and the VM is set up the same way as my other ones.
In the end I had to add host_key_checking = False to my /etc/ansible/ansible.cfg file, but that is just a temporary fix.
Other answers online seem to show that the issue is with the SSH key, but I don't believe this is true in my case, as I can SSH into the machine. I would like to understand how to get rid of this error message and deploy without disabling host key checking.
The remote host is in ~/.ssh/known_hosts.
Any help would be appreciated.
SSH to a remote host verifies that host's key. If you ssh to a new machine, you are asked whether to add/trust the key; if you answer "yes", the key is saved in ~/.ssh/known_hosts.
The message "Host key verification failed" implies that the remote host's key is missing from, or has changed in, the known_hosts file on the machine that runs the Ansible script.
I normally resolve this problem by issuing an ssh to the remote host and adding the key to the ~/.ssh/known_hosts file.
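One Jenkins-specific detail worth checking: jobs usually run as a different system user (commonly jenkins) than the account you used for the manual ssh test, and each user has their own ~/.ssh/known_hosts. A minimal sketch, assuming the Jenkins user is jenkins with home /var/lib/jenkins and my_machine is the new VM's inventory name:
ssh-keyscan -H my_machine | sudo -u jenkins tee -a /var/lib/jenkins/.ssh/known_hosts
ssh-keyscan fetches the host key non-interactively, and -H hashes the hostname, matching the format ssh itself writes; after that the Gathering Facts step should pass without host_key_checking = False.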
For me, it helped to disable the host SSH key check in the Jenkins job configuration.
I am running a docker registry locally on my machine, and I can pull my image from it successfully:
docker pull 192.168.174.205:5001/myimg:latest
I am also running a jenkins container on my machine, but Jenkins cannot pull any image from the local registry. I use a Blue Ocean container (on the same machine) to start a pipeline, and it outputs:
+ docker pull 192.168.174.205:5001/insureio:latest
Error response from daemon: Get https://192.168.174.205:5001/v2/: net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
script returned exit code 1
TMI
Specs
Docker version 1.13.1, build 4ef4b30/1.13.1
Jenkins ver. 2.204.2
host CentOS Linux 7 (Core)
Reference
I have been working from the instructions on
https://docs.docker.com/registry/deploying/
https://jenkins.io/doc/book/pipeline/docker/#custom-registry
Settings
My /etc/docker/daemon.json file reads {"insecure-registries" : ["192.168.174.205:5001"]}.
The local registry gives a 200 response:
curl http://192.168.174.205:5001/v2/_catalog
{"repositories":["mying"]}
My pipeline script is:
node {
    stage('Build') {
        docker.withRegistry('http://192.168.174.205:5001') {
            docker.image('insureio:latest').inside('') {
                sh 'make test'
            }
        }
    }
}
Since both Jenkins and your registry are containers, Jenkins is going to be looking for the 192.168.174.205 IP address from within its own network space.
If you're just trying things out, I would suggest doing a docker inspect <your registry container> | grep -i ipaddress to find its IP address (by default it should be in the region of 172.17.XXX.XXX) and configuring your pipeline to use that address.
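If you prefer to skip the grep, docker inspect can print just the address with a Go template (the container name registry below is a placeholder for whatever your registry container is called):
docker inspect -f '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}' registry
The URL in docker.withRegistry would then point at that 172.17.x.x address instead of the host IP.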
I am running both the Jenkins master and slave as Docker containers, using the jenkins/jenkins:lts and jenkins/ssh-slave images on Ubuntu. These are the steps I followed:
Ran ssh-keygen inside the jenkins-master container (docker exec -it container_id bash) to generate the SSH keys
Added the generated public key to the authorized_keys file inside the ssh-slave container using a Dockerfile
Added the private key to the Jenkins credentials as per this link
I have looked at many questions related to this issue on Stack Overflow, but I am stuck with the following error:
[02/08/19 20:31:06] [SSH] Opening SSH connection to ###.##.#.#:22.
[02/08/19 20:31:06] [SSH] SSH host key matches key in Known Hosts file. Connection will be allowed.
ERROR: Server rejected the 1 private key(s) for jenkins (credentialId:worker-ssh/method:publickey)
[02/08/19 20:31:06] [SSH] Authentication failed.
Authentication failed.
[02/08/19 20:31:06] Launch failed - cleaning up connection
[02/08/19 20:31:06] [SSH] Connection closed.
Slave Template in Jenkins:
Name: jenkins-worker
Usage: Use this node as much as possible
Launch method: Launch agent via SSH
Hostname: my ip extracted from ifconfig
Host key verification strategy: Known hosts file verification strategy (.ssh/known_hosts contains an entry for the host IP provided)
Dockerfile for ssh-slave
#Docker version 18.09.1
FROM jenkins/ssh-slave
COPY /.ssh/id_rsa.pub /.ssh/authorized_keys
RUN chmod 744 /.ssh/authorized_keys
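The most likely problem is where the key lands and who owns it. In the jenkins/ssh-slave image the agent logs in as the jenkins user, whose home is /home/jenkins, so an authorized_keys file under /.ssh is never consulted, and sshd is also strict about file permissions and ownership. A revised sketch, assuming id_rsa.pub sits in the build context and the image's user/home are as described:
FROM jenkins/ssh-slave
# Place the key where sshd looks for the jenkins login, not under /
COPY id_rsa.pub /home/jenkins/.ssh/authorized_keys
# sshd rejects keys whose file is writable by others or wrongly owned
RUN chown jenkins:jenkins /home/jenkins/.ssh/authorized_keys && \
    chmod 600 /home/jenkins/.ssh/authorized_keys
The jenkins/ssh-slave image also supports passing the public key at run time via the JENKINS_SLAVE_SSH_PUBKEY environment variable, which avoids baking the key into the image at all.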
I have a Red Hat machine in the AWS cloud. I installed Ansible and Docker (the experimental version, as the community edition cannot currently be installed on Red Hat). Now I am running a simple command to check whether Docker works:
ansible local -m shell -a "docker pull hello-world"
I'm getting the following error:
localhost | FAILED | rc=1 >>
Using default tag: latest
Warning: failed to get default registry endpoint from daemon (Cannot connect to the Docker daemon. Is the docker daemon running on this host?). Using system default: https://index.docker.io/v1/
Cannot connect to the Docker daemon. Is the docker daemon running on this host?
When I use
sudo ansible local -m shell -a "docker pull hello-world"
I get:
localhost | UNREACHABLE! => {
"changed": false,
"msg": "Failed to connect to the host via ssh: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).\r\n",
"unreachable": true
}
I have tested Ansible by copying a file onto the local host and it works fine, whereas with Docker I'm facing this issue. Is there anything I am missing, or anything that needs to be set up for Docker's experimental version?
You don't want to run Ansible through sudo; rather, you want to tell Ansible that it should run the command using sudo. That can be done by adding the -s flag:
ansible local -s -m shell -a "docker pull hello-world"
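One caveat: in newer Ansible releases the -s/--sudo flag is deprecated in favour of -b/--become, so the equivalent modern invocation would be:
ansible local -b -m shell -a "docker pull hello-world"
Either way, Ansible escalates privileges on the target for the command itself. Running the whole ansible process under sudo, by contrast, makes it connect with root's SSH configuration and keys, which is likely why it failed above with "Permission denied (publickey,...)".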