Background: my Jenkins is deployed via Docker on target server A. There are 4 intranet target servers (inner servers): A, B, C and D. I am trying to deploy code with the Publish over SSH plugin to target server C. When I try to establish a connection from Jenkins to target server C, I get:
jenkins.plugins.publish_over.BapPublisherException: Message [Auth fail]]
My final solution: put target server C's own id_rsa.pub into target server C's authorized_keys. I don't know why this works.
Here are some of the things I tried:
1. Put the Jenkins root user's id_rsa.pub into the target server root user's authorized_keys.
2. Put the jenkins user's id_rsa.pub into the target server root user's authorized_keys.
3. Create a jenkins user on the target server and repeat 1 and 2, this time putting the keys into the target server jenkins user's authorized_keys.
With each of these, ssh and scp work correctly, but Jenkins **Publish over SSH** still can't connect. Even though I solved the problem, I want to know the reason. Thanks for tolerating my grammar.
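One likely piece of the puzzle (an assumption, not confirmed by the question): the Publish over SSH plugin does not pick up the ~/.ssh keys that command-line ssh and scp use; it authenticates only with the key configured in its own global settings, which is why manual ssh can succeed while the plugin fails. A sketch of that configuration, with the username and placeholders as assumptions:

```
Manage Jenkins -> Configure System -> Publish over SSH
  Key:  <paste the PRIVATE key here; its .pub half must be in
         authorized_keys on server C>
  SSH Servers:
    Name:      serverC
    Hostname:  <server C address>
    Username:  jenkins   (assumed; must be the account whose
                          authorized_keys holds the public key)
```

If the key pasted there happened to originate from server C itself, that would explain why adding server C's own id_rsa.pub to its authorized_keys made the connection work.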
Related
I am running Jenkins server on AWS EC2.
I tried changing the Jenkins URL to myaddress:8080/jenkins in the Manage Jenkins -> Configure System section.
I also tried editing JENKINS_LISTEN_ADDRESS to 0.0.0.0/jenkins/ and JENKINS_PREFIX to /jenkins in the jenkins.service file under /lib/systemd/system.
But the server is still running at myaddress:8080 instead of myaddress:8080/jenkins.
Jenkins Initial Settings
--prefix=$PREFIX
Runs Jenkins to include the $PREFIX at the end of the URL. For example, set --prefix=/jenkins to make Jenkins accessible at http://myServer:8080/jenkins
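On a systemd-based install, edits made directly under /lib/systemd/system are overwritten by package upgrades; the usual route is a drop-in override. A sketch, assuming your unit honors the JENKINS_PREFIX variable mentioned in the question (older packages read /etc/default/jenkins instead, so verify which applies):

```
# /etc/systemd/system/jenkins.service.d/override.conf
[Service]
Environment="JENKINS_PREFIX=/jenkins"
```

Then reload and restart: `sudo systemctl daemon-reload && sudo systemctl restart jenkins`.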
While trying to execute a shell command on a remote server from Jenkins, I am getting a "Host key verification failed." error.
I have installed Jenkins in a Docker container on my Mac. Now, from the Jenkins UI, I want to run a shell command on a remote server (which is reachable).
To do that, I already added the server (10.206.y.z) on the Jenkins configure page (by providing the hostname, i.e. 10.206.y.z, the username, and the key generated on the 10.206.y.z server in the SSH Servers section) and tested the connection. The connection test passed, and I saved the configuration.
To execute a shell command on 10.206.y.z, I created a freestyle project, and in the Execute Shell section I am passing ssh root1@10.206.y.z 'hostname'.
If I run the project by clicking 'Build Now', I get the error below and the build fails.
Running as SYSTEM
Building in workspace /var/jenkins_home/workspace/TestProject
[TestProject] $ /bin/sh -xe /tmp/jenkins4234161871485451783.sh
+ ssh root1@10.206.y.z hostname
Host key verification failed.
Build step 'Execute shell' marked build as failure
Finished: FAILURE
Any help please?
I think the problem is the key. You are using the key generated on the destination machine (10.206.y.z).
You should use the key generated on the Jenkins server and copy it to the destination server. For that you can use ssh-copy-id.
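A sketch of both halves of that setup, run from inside the Jenkins container as the user that executes builds (the target address follows the question; the remote commands are shown commented because they need the live target):

```shell
# Run inside the Jenkins container (e.g. docker exec -it <container> bash):
mkdir -p ~/.ssh && chmod 700 ~/.ssh
# 1. Create a key for this user if it has none yet:
#      ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# 2. Install the PUBLIC half on the target, authorizing Jenkins there:
#      ssh-copy-id root1@10.206.y.z
# 3. Record the target's host key, so non-interactive builds stop
#    failing with "Host key verification failed":
#      ssh-keyscan 10.206.y.z >> ~/.ssh/known_hosts
```

Note that the "Host key verification failed." message itself is about step 3 (the known_hosts entry), which the plugin's "Test Configuration" does not create for command-line ssh.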
I am trying to get Jenkins to execute an Ansible playbook.
But I am getting an unreachable-host error which I don't get otherwise.
fatal: [vogo-alpha.cloudapp.net]: UNREACHABLE! => {"changed": false, "msg": "Authentication failure.", "unreachable": true}
I have set this variable in the Ansible hosts file:
ansible_ssh_private_key_file=/home/luvpreet/.ssh/id_rsa
I think it is because the jenkins user is running those playbooks and cannot read this private key file. I tried to create a home folder for the jenkins user, but it was not successful.
It works if I switch to the user luvpreet and then run the playbooks.
How do I switch to another user via the Jenkins shell?
OR
Is there any other way this problem can be solved ?
There are a couple of possible reasons why your setup fails. Most likely it is because Ansible is trying to ssh to your target machine as the jenkins user, which doesn't exist on said machine. I'd approach the problem from a different angle.
First, I'd install the Ansible plugin for Jenkins. This allows you to use the built-in credentials located at "Manage Jenkins > Manage Credentials". There you can copy and paste your key (or point to a key file located on the Jenkins server) and set the username that will ssh to the target machine. In your job configuration, choose "Invoke Ansible Playbook" for your build step rather than a shell step. There will be a "Credentials" parameter where you can specify the SSH key you added earlier. The rest should be pretty self-explanatory.
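If you stay with a plain shell step instead of the plugin, a common fix is to give the jenkins user its own readable copy of the key. The sketch below stages the paths from the question under a scratch directory so it is safe to run anywhere; the real-host commands are in comments:

```shell
# Scratch stand-ins for /home/luvpreet/.ssh and the jenkins home:
ROOT=$(mktemp -d)
SRC="$ROOT/home/luvpreet/.ssh"
DST="$ROOT/var/lib/jenkins/.ssh"
mkdir -p "$SRC" "$DST"
printf 'stand-in for the real id_rsa\n' > "$SRC/id_rsa"

cp "$SRC/id_rsa" "$DST/id_rsa"
chmod 600 "$DST/id_rsa"   # ssh refuses keys readable by group/others
# On the real host, additionally:
#   chown -R jenkins:jenkins /var/lib/jenkins/.ssh
# ...and point the inventory at the copy:
#   ansible_ssh_private_key_file=/var/lib/jenkins/.ssh/id_rsa
ls -l "$DST/id_rsa"
```

The /var/lib/jenkins home directory is an assumption (the common default for package installs); adjust it to wherever your jenkins user actually lives.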
I have a setup of 2 VMs: VM1 with Jenkins, VM2 with GitLab.
On VM2 I have created a repo as user root with public access at http://192.168.0.32/root/sparkjava_hello_world (access OK)
and generated the access token.
On VM1:
- I installed the GitLab plugin in Jenkins
- I copied the public key of user jenkins into authorized_keys of user git on VM2: from a user jenkins shell, ssh git@VM2 works, no password asked
- I created the GitLab API credential and pasted the access token into it
- I configured the GitLab URL in the Manage Jenkins -> Configure System menu (it responds OK)
BUT when I set up the git source git@192.168.0.32:root/sparkjava_hello_world.git in my Jenkins job, it doesn't work:
Failed to connect to repository : Command "/usr/bin/git ls-remote -h git#192.168.0.32:root/sparkjava_hello_world.git HEAD" returned status code 128:
stdout:
stderr: fatal: 'root/sparkjava_hello_world.git' does not appear to be a git repository
fatal: Could not read from remote repository.
I assume the SSH connection to VM2 is OK, since this is not a connection-refused message.
I also tried "ssh://git@192.168.0.32:root/sparkjava_hello_world.git"; it doesn't work either.
What did I miss, or do wrong?
thanks for help :)
Check that on VM2 you do have (as defined by default in a typical gitlab.yml) a /home/git/repositories/root/sparkjava_hello_world.git
Try an interactive ssh session on VM2 (from VM1), and do the ls-remote there:
ssh git@192.168.0.32
git ls-remote /home/git/repositories/root/sparkjava_hello_world.git
For Jenkins, what you need is to use your public key (~/.ssh/id_rsa.pub) as:
a deploy key on the GitLab side
a credential on the Jenkins side (see this tutorial)
Make sure to deploy that deploy key on your GitLab project (project settings/deploy keys), and then your Jenkins will be able to access your GitLab project (using that ssh key as credential).
Note: the normal use of a GitLab user key (like user xxx) in VM1 would be:
to define a user xxx in GitLab
to associate its public key in its user settings / SSH keys (that will modify ~git/.ssh/authorized_keys for you, adding a forced command line; this link is for gitolite, but it applies to GitLab too)
That means an ssh -T git@192.168.0.32 should not open an interactive session, but generate the message:
Welcome to GitLab, xxx
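A side note on the two URL forms tried in the question (standard Git URL syntax, not specific to this setup): in the scp-like form the colon introduces the path, but in an ssh:// URL a colon can only introduce a port number, so the path must follow a slash instead:

```
# scp-like form (colon separates host from path):
git@192.168.0.32:root/sparkjava_hello_world.git
# ssh:// form (colon would mean a port; use a slash):
ssh://git@192.168.0.32/root/sparkjava_hello_world.git
```

Mixing the two, as in ssh://git@host:root/repo.git, makes Git try to parse "root" as a port, which fails.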
I have a compressed file in my Jenkins workspace folder. I am trying to transfer that compressed file over SSH, but I got SSH: Transferred 0 file(s).
My configuration is below:
Transfer set source file: my-files.zip
Remove Prefix:
Remote directory: /home/my-files
My Console Output:
Started by user Mizanur Rahman
SSH: Transferred 0 file(s)
Build step 'Send files or execute commands over SSH' changed build result to SUCCESS
Finished: SUCCESS
The first test would be to try to replicate that transfer manually, from the server executing the job, as the user running the job.
On that server, try at least a:
ssh -Tv xxx@remote.com
That will confirm whether you (as the user running the Jenkins job) can actually establish a secure shell session. Replace xxx and remote.com with the remote user and remote server targeted by that session.
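If the SSH session itself is fine, another cause of "Transferred 0 file(s)" worth checking (an assumption, since the console output above doesn't prove it either way) is the source pattern: the transfer set's "Source files" field is an Ant-style glob resolved relative to the workspace root, so a file produced in a subdirectory silently matches nothing:

```
Source files:     my-files.zip      # only matches at the workspace root
# If the zip lands in a subdirectory, include it and strip the prefix:
#   Source files:   build/my-files.zip
#   Remove prefix:  build
Remote directory: /home/my-files
```

The build still finishes as SUCCESS in that case, which matches the console output in the question.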