Jenkins - file transfer to sudo user directory on the target server

I am trying to transfer all .sh files from one unix server to another using Jenkins.
The files are getting transferred, but they land in my unix home directory; I need them to go to the sudo user's directory.
For example:
The source server name is "a" and the target server name is "u".
We are using sell4 as the sudo user on the target server, so the files should land in the home directory of the sell4 user.
I have used the below command:
Building in workspace /var/lib/jenkins/workspace/EDB-ExtractFilefromSVN
SSH: Connecting from host [a]
SSH: Connecting with configuration [u] ...
SSH: EXEC: STDOUT/STDERR from command [sudo scp *.sh sell4@u:/usr/app/TomcatDomain/ScoringTools_ACCDomain04/] ...
sudo: scp: command not found
SSH: EXEC: completed after 201 ms
SSH: Disconnecting configuration [u] ...
ERROR: Exception when publishing, exception message [Exec exit status not zero. Status [1]]
Gitcolony notification failed - java.lang.IllegalArgumentException: Invalid url:
Finished: UNSTABLE
Can you please suggest what I am doing wrong here?
EDIT: added the shell screenshot.
Ah, so it's some kind of plugin. It seems like you want to run local sudo to log in as a remote server user. It won't work this way; you can't open the door to a bathroom and expect to walk into a garden.
sudo changes your local user to root; it does nothing on the remote server.
Do not use sudo with the scp command; instead, follow these answers:
https://unix.stackexchange.com/questions/66021/changing-user-while-scp
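A minimal sketch of the idea, using the target path from the log above (this assumes the sell4 account accepts your SSH key and has write access to that path; "youruser" is a placeholder): either copy the files as sell4 directly, or copy them to your own home directory first and move them into place with sudo on the remote side.
# Option 1: scp directly as the remote user (no sudo involved)
scp *.sh sell4@u:/usr/app/TomcatDomain/ScoringTools_ACCDomain04/
# Option 2: copy to your own remote home dir, then sudo on the remote host
scp *.sh youruser@u:~
ssh youruser@u 'sudo mv ~/*.sh /usr/app/TomcatDomain/ScoringTools_ACCDomain04/'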

Related

Error in jenkins when trying to restart apache2 on a remote host

In Jenkins, in the "Publish over SSH" plugin, after copying the file, I try to run the following command in the "Exec command" block:
sudo service apache2 restart
An error appears:
ERROR: Exception when publishing, exception message [Exec exit status not zero. Status 1]
Build step 'Send files or execute commands over SSH' changed build result to UNSTABLE
Finished: UNSTABLE
Also tried to use the following command:
sudo systemctl restart apache2.service
The connection is made as a specific user, the file is successfully written, but the command is not executed (I checked the status of the service on the host). The user has sudo rights, and the password prompt is disabled for him.
The restart commands execute successfully when run directly on the host itself.
Sorry for my bad English. I am trying to learn Jenkins.
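A hedged diagnostic sketch for this symptom (host and user names below are placeholders): the Publish over SSH "Exec command" runs without a terminal unless the plugin's "Exec in pty" option is checked, and a Defaults requiretty line in /etc/sudoers makes sudo refuse to run from such sessions even when NOPASSWD is set.
# On the target host: look for a requiretty default, which blocks
# sudo in non-interactive SSH sessions like the plugin's Exec command
sudo grep -n requiretty /etc/sudoers
# From the Jenkins machine: -n makes sudo fail instead of prompting,
# reproducing what the plugin's session sees
ssh deployuser@targethost 'sudo -n systemctl restart apache2.service'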

Why Jenkins says "Server rejected the 1 private key(s)" while launching the agent?

I am successfully able to connect to the remote machine using SSH, but when I launch the agent from Jenkins it throws the following error:
ERROR: Server rejected the 1 private key(s) for user1 (credentialId:xxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx/method:publickey)
[01/19/17 05:35:15] [SSH] Authentication failed.
hudson.AbortException: Authentication failed.
at hudson.plugins.sshslaves.SSHLauncher.openConnection(SSHLauncher.java:1219)
at hudson.plugins.sshslaves.SSHLauncher$2.call(SSHLauncher.java:714)
at hudson.plugins.sshslaves.SSHLauncher$2.call(SSHLauncher.java:709)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
[01/19/17 05:35:15] Launch failed - cleaning up connection
[01/19/17 05:35:15] [SSH] Connection closed.
I can establish an SSH connection from the master machine to the node machine using user1; however, when I try to launch the agent using user1 from Jenkins, it rejects the private key. Is there any solution to overcome this issue?
I solved this issue by following these steps:
From the target slave node's console
Switch to the root user:
sudo su
Add a jenkins user with the home /var/lib/jenkins (Note: I am keeping my home directory in /var/lib/jenkins):
useradd -d /var/lib/jenkins jenkins
From the Jenkins Master
Copy the /var/lib/jenkins/.ssh/id_rsa.pub key from the Jenkins user on the master
From the target slave node's console
Create an authorized_keys file for the Jenkins user
mkdir /var/lib/jenkins/.ssh
touch /var/lib/jenkins/.ssh/authorized_keys
Paste the key from the Jenkins master into the file using vim. Save with :wq!
Make sure the files have the correct owner and permissions:
chown -R jenkins /var/lib/jenkins/.ssh
chmod 600 /var/lib/jenkins/.ssh/authorized_keys
chmod 700 /var/lib/jenkins/.ssh
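As a hedged alternative to creating authorized_keys by hand, ssh-copy-id can do the copy and set the expected permissions in one step (this assumes password authentication is temporarily enabled for the jenkins user on the agent; "agent-host" is a placeholder):
# Run on the Jenkins master; installs the master's public key into the
# agent's ~/.ssh/authorized_keys and fixes permissions along the way
sudo -H -u jenkins ssh-copy-id -i /var/lib/jenkins/.ssh/id_rsa.pub jenkins@agent-host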
I solved this issue by following the steps below:
1) Make sure you are on the correct path on both the slave and master machines. You also need to sign in to the machines as the right user. Say I need to create a new global Jenkins user "jenkins" and I want my keys to be in the path "/home/jenkins/.ssh/"; add the "jenkins" user to the machines first.
2) Now create the .ssh folder and generate the ssh keys using the steps given in https://support.cloudbees.com/hc/en-us/articles/222978868-How-to-Connect-to-Remote-SSH-Slaves-
3) Make sure you do the above steps 1 & 2 on your master machines as well.
4) You need to have ssh keys on both master and slave machines in the same path and with the same "jenkins" user permissions.
5) Finally, ssh between the two machines in both directions to check the bidirectional connectivity from your terminal (see the sketch after this list).
6) Configure Jenkins credentials and nodes. Make sure you give the same remote root directory - "/home/jenkins" - in your node configuration, and select the "manually trusted key verification strategy", as suggested in https://linuxacademy.com/community/posts/show/topic/16008-jenkins-adding-a-slave
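A minimal sketch of the check in step 5 (the IP addresses are placeholders):
# From the master: confirm you can reach the slave as jenkins
ssh jenkins@SLAVE_IP 'hostname'
# From the slave: confirm the reverse direction
ssh jenkins@MASTER_IP 'hostname'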
My solution was:
$ sudo su
$ useradd -d /var/lib/jenkins jenkins
$ passwd jenkins
$ chown -R jenkins /var/lib/jenkins/.ssh/*
$ chmod 700 .ssh
It worked after tinkering around for 2 hours...
Changing the type of ssh key from 'rsa' to 'ed25519' worked for me:
ssh-keygen -t ed25519
The master needed to be added to the list of known hosts for me.
What you need to do is SSH to the master from your local machine, then use the master's private key to SSH to the slave. If you can do this manually, then Jenkins will be able to do it as well.
I used the master's private key as the credential in Jenkins, followed @Aamir's answer, and finally had some success.

Jenkins - unable to copy build artifacts to desired root folder on target machine

We are just starting with Jenkins for CI - so we're definitely in the newbie phase here. Here's what we are trying to do:
We are using the Publish Over SSH Jenkins plugin to transfer the build artifacts to a target server and into a specific root folder - let's call it /var/mycompany/myapp - and this is where the problem is happening.
We have configured the Publish Over SSH plugin to use a key in order to make the connection to the target server. In Manage Jenkins > Configure System > SSH Servers section we have also configured our target ssh server (name/hostname/username/remote folder etc). The connection is successful when tested.
The build job has been configured to "Send artifacts over SSH" as a post-build action. Now this works fine as long as I send the files to my user's home folder location on the target server (i.e. it will create the necessary ~/var/mycompany/myapp folder structure and transfer all the files). BUT - if I change the "Publish over SSH" config to use a Remote Directory of "/", the job fails with the following error:
SSH: Connecting from host [myjenkins]
SSH: Connecting with configuration [stage-tester] ...
SSH: Creating session: username [myusername], hostname [x.x.x.x], port [22]
SSH: Connecting session ...
SSH: Connected
SSH: Opening SFTP channel ...
SSH: SFTP channel open
SSH: Connecting SFTP channel ...
SSH: Connected
SSH: cd [/]
SSH: OK
SSH: cd [var]
SSH: OK
SSH: cd [mycompany]
SSH: OK
SSH: mkdir [myapp]
SSH: FAILED: Message [Permission denied]
At first this made sense to us, as the key was created by a specific user who was not a sudoer on the target Linux/Fedora server. So we made the user a member of a sudoer group, expecting that this would solve the problem. It hasn't fixed the issue - we continue to get the "permission denied" error. So the question is: how do we go about gaining access to the server root for our user/key?
Any advice is appreciated.
Thanks!
In the end this is the solution I used:
Using instructions found here, I set up a "jenkins-chef" ssh keypair in the /var/lib/jenkins/.ssh folder and transferred the public key to the jenkins user's authorized_keys file on my target deployment server.
chown the keypair in /var/lib/jenkins/.ssh (from step 1) to the "jenkins" user so that the Jenkins ssh server configuration in step 3 can read the key.
Set up an SSH server reference in Jenkins: under Manage Jenkins > Configure System > SSH Servers, I added a new ssh server ref pointing at my target deployment server, using the 'jenkins' username and the path to the new key (/var/lib/jenkins/.ssh/jenkins-chef). Also, we wanted to publish to a folder off of the target server's root (/) folder, and it turned out to be important that we specify the "remote directory" as '/' (root).
In my Jenkins job I added a new build step using the 'send files or execute commands over ssh' plugin. I configured it to use the ssh server I defined in step 3:
source files: **/*
exec command: sudo chef-client
remote directory: var/my-apps/my-app
Note: if I attempted to specify the remote directory with an initial root slash (/var/my-apps/my-app), it would copy to the Jenkins user's home folder and simply create the specified folder structure there. That's why specifying '/' in step 3 was important.
That's it from the Jenkins side. However the first time I tried to run this I got this error:
SSH: EXEC: STDOUT/STDERR from command [sudo chef-client] ...
sudo: no tty present and no askpass program specified
This was because I was attempting to run 'sudo' which issues a password challenge on the target server. To avoid this I made the following change to the sudoers file on the target server using the 'visudo' command: at the bottom of the file I added this line to give the jenkins user the ability to run sudo without being prompted for a password:
jenkins ALL=(ALL) NOPASSWD:ALL
Once that was done it all worked as expected.
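A quick way to confirm the sudoers change took effect (a hedged check; "target-server" is a placeholder): sudo -n fails instead of prompting, so it reproduces what the plugin's non-interactive session sees.
# Prints "NOPASSWD ok" only if the jenkins user can sudo without a prompt
ssh jenkins@target-server 'sudo -n true && echo NOPASSWD ok'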
As you are working with Linux machines, you can try the SCP Plugin.
In the global configuration, you just have to define the target server.
You can use your Jenkins public ssh key to manage the authentication, or a user/password.
Next, in the Jenkins job, you can create a post-build task to copy the relevant artifacts to your server.

ssh connection failing when pushing on a Gitlab repo

I have installed GitLab. Suppose I installed it in /home/myuser/gitlab.
I created a new project
I was told to create a repo "test", which I put in /home/myuser/gitlab/test.
I added some SSH key in /home/myuser/.ssh
Then I initialized a Git repo in /home/myuser/gitlab/test.
Following instructions, I added a remote git@localhost:root/testing.git,
but when I try to push, I get this error message:
$ git push -u origin master
ssh: connect to host localhost port 22: Connection refused
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
I installed GitLab on OS X and I have other SSH keys in /home/myhome/.ssh. I have set up the user email and name inside /home/myuser/gitlab/.git/config (and set those globally just for testing), and the server is launched from /home/myuser/gitlab. Does anybody have an idea where this error comes from?
If I run ssh git@localhost, I get
/home/myhome/.ssh/config line 4: garbage at end of line; "home".
where in this file I have some settings for a remote server for another project. I think it is the problem but I don't really know how to fix it.
Update: here's the content of my ~/.ssh/config file:
Host remote_test_server
    Hostname remote_test_user@ftp.remote_test_server
    IdentityFile ~/.ssh/id_rsa_stf.pub
    User <your home acct>
/home/myhome/.ssh/config line 4: garbage at end of line; "home".
That would prevent any ssh command from functioning properly, because of a parasite echo done by the remote session.
Check your .profile or other rc files, and see if any echo is done in those.
Or at least, test with ssh -T git@localhost, in order to disable any TTY allocation.
Also check the content of your .ssh/config file, which doesn't seem to be properly formatted.
See this example of a config file:
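(A reconstruction of the example, based on the corrections listed just below:)
Host remote_test_server
    Hostname ftp.remote_test_server
    User remote_test_user
    IdentityFile ~/.ssh/id_rsa_stf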
User should be the login name of the account used for the ssh session on the remote server.
It should not be the homedir path.
IdentityFile should reference the private key (~/.ssh/id_rsa_stf), not the public one!
Hostname should reference the remote server 'ftp.remote_test_server', not the user#remoteServer.

Jenkins Host key verification failed

I have a problem with Jenkins: when setting up "git", it shows the following error:
Failed to connect to repository : Command "git ls-remote -h https://person@bitbucket.org/person/projectmarket.git HEAD" returned status code 128:
stdout:
stderr: fatal: Authentication failed
I have tested with ssh:
git@bitbucket.org:person/projectmarket.git
This is error:
Failed to connect to repository : Command "git ls-remote -h git@bitbucket.org:person/projectmarket.git HEAD" returned status code 128:
stdout:
stderr: Host key verification failed.
fatal: The remote end hung up unexpectedly
I've also done these steps with the "SSH key".
Log in as jenkins:
sudo su jenkins
Copy your github key to the Jenkins .ssh folder:
cp ~/.ssh/id_rsa_github* /var/lib/jenkins/.ssh/
Rename the keys:
mv id_rsa_github id_rsa
mv id_rsa_github.pub id_rsa.pub
but the git repository is still not working in Jenkins.
Thanks for the help!
Change to the jenkins user and run the command manually:
git ls-remote -h git@bitbucket.org:person/projectmarket.git HEAD
You will get the standard SSH warning when first connecting to a new host via SSH:
The authenticity of host 'bitbucket.org (207.223.240.181)' can't be established.
RSA key fingerprint is 97:8c:1b:f2:6f:14:6b:5c:3b:ec:aa:46:46:74:7c:40.
Are you sure you want to continue connecting (yes/no)?
Type yes and press Enter. The host key for bitbucket.org will now be added to the ~/.ssh/known_hosts file and you won't get this error in Jenkins anymore.
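If the jenkins account has no interactive shell, a hedged non-interactive alternative is to pre-seed its known_hosts with ssh-keyscan (the path assumes the default /var/lib/jenkins home):
# Append bitbucket.org's host key to the jenkins user's known_hosts;
# the sh -c wrapper makes the redirect run as jenkins, not as root
sudo -u jenkins sh -c 'ssh-keyscan bitbucket.org >> /var/lib/jenkins/.ssh/known_hosts'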
Jenkins is a service account; it doesn't have a shell by design. It is generally accepted that service accounts shouldn't be able to log in interactively.
To resolve "Jenkins Host key verification failed", do the following steps. I have used mercurial with Jenkins.
1) Execute the following command in a terminal:
$ sudo su -s /bin/bash jenkins
and provide the password.
2) Generate a public/private key pair using the following command:
ssh-keygen
You will see output like:
Generating public/private rsa key pair.
Enter file in which to save the key (/var/lib/jenkins/.ssh/id_rsa):
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
3) Press Enter --> do not give any passphrase --> press Enter.
The key has been generated.
4) Print the public key: cat /var/lib/jenkins/.ssh/id_rsa.pub
5) Copy the key from id_rsa.pub.
6) Exit from bash.
7) SSH to your repository server: ssh user@yourrepository
8) vi .ssh/authorized_keys
9) Paste the key.
10) Exit.
11) Manually log in to the mercurial server.
Note: please do log in manually, otherwise Jenkins will again give the error "host verification failed".
12) Once that is done manually, go to Jenkins and start the build.
Enjoy!!!
Good luck.
Or you can use:
ssh -oStrictHostKeyChecking=no host
This is insecure (open to man-in-the-middle attacks), but it is the easiest solution.
The better way is to generate correct mappings between host and IP address, so ssh will not complain:
#!/bin/bash
# Refresh the known_hosts entries for a fixed list of git hosts,
# keyed by both domain name and resolved IP address.
for domain in "github.com" "bitbucket.org"; do
    # Drop any existing entry for this domain
    sed -i "/$domain/d" ~/.ssh/known_hosts
    # Resolve the current IP and scan the host key for both names
    ip=$(nslookup "$domain" | awk '/^Address: / { print $2 ; exit }')
    ssh-keyscan "$domain,$ip" >> ~/.ssh/known_hosts
done
Excerpt from gist.
I think many people didn't recognize this: since Jenkins 2.361 at the latest, host key verification can be configured in the Jenkins UI itself (see the Git Host Key Verification Configuration answer below).
By the way, "No verification" is for sure not the best option.
I had the same problem; I fixed it like this:
Reset the permissions on id_rsa* so that only the current user has access (no group, no other):
chmod o-rwx ~/.ssh/id*
chmod g-rwx ~/.ssh/id*
ls -lart ~/.ssh/
-rw------- 1 jenkins nogroup 398 avril 3 09:34 id_rsa.pub
-rw------- 1 jenkins nogroup 1675 avril 3 09:34 id_rsa
Then clear ~/.ssh/known_hosts.
Now connect as jenkins:
sudo su jenkins
Try the Jenkins commands:
git ls-remote -h git@bitbucket.org:user/project.git HEAD
If no problem appears, Jenkins will now be able to connect to the repo (for me ^^ at least).
As for the workaround (e.g. Windows slave), define the following environment variable in global properties:
GIT_SSH_COMMAND="ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no"
Note: If you don't see the option, you probably need EnvInject plugin for it.
Log in as jenkins using: "sudo su -s /bin/bash jenkins"
git clone the desired repo which causes the key error.
It will ask you to add the key by showing yes/no (enter yes or y).
That's it!
You can now re-run the Jenkins job.
I hope this will fix your issue.
When using https://bitbucket.org/YYYY/XX.git,
you should delete the username@ prefix.
Make sure we are not editing any of the default sshd_config properties to skip the error.
Host verification failed - definitely a missing entry of the hostname in the known_hosts file.
Login to the server where the process is failing and do the following:
Sudo to the user running the process
ssh-copy-id destinationuser@destinationhostname
It will prompt like this the first time; say yes, and it will also ask for the password the first time:
The authenticity of host 'sample.org (205.214.640.91)' can't be established.
RSA key fingerprint is 97:8c:1b:f2:6f:14:6b:5c:3b:ec:aa:46:46:74:7c:40.
Are you sure you want to continue connecting (yes/no)? *yes*
Password prompt? Give the password.
Now, from the server where the process is running, do ssh destinationuser@destinationhostname. It should log in without a password.
Note: Do not change the default permissions of files in the user's .ssh directory, you will end up with different issues
I ran into this issue and it turned out the problem was that the jenkins service wasn't being run as the jenkins user. So running the commands as the jenkins user worked just fine.
Copy host keys from both bitbucket and github:
ssh root@deployserver 'echo "$(ssh-keyscan -t rsa,dsa bitbucket.org)" >> /root/.ssh/known_hosts'
ssh root@deployserver 'echo "$(ssh-keyscan -t rsa,dsa github.com)" >> /root/.ssh/known_hosts'
The best way is to just use your git URL in 'https' URL format in the Jenkinsfile or wherever you want:
git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
SSH
If you are trying it with SSH, then the host key verification error can occur for several reasons. Follow these steps to overcome all of them.
Set the environment variable HOME and provide the address of the root directory of the .ssh folder. E.g., if your .ssh folder is kept inside the Name folder: C:/Users/Name.
Now make sure that the public SSH key is also provided in the repository link, whether it is github or bitbucket or any other.
Open git bash and try cloning the project from the repository. This will add your repository URL to the known_hosts file, which is auto-created in the .ssh folder.
Now open Jenkins and create a new job, then click on Configure.
Provide the cloning URL in Source Code Management under Git. The URL should start with git@github.com/......... or ssh://proje........
Under Credentials you need to add the username and password of the repository from which you are cloning the project. Select that credential.
Now apply and save the configuration.
Bingo! Start building the project. I hope you will not get any host key verification error this time!
Try
ssh-keygen -R hostname
-R hostname removes all keys belonging to hostname from a known_hosts file. This option is also useful for deleting hashed hosts.
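For example, a minimal usage sketch for dropping a stale bitbucket.org entry and reconnecting so the fresh key can be accepted:
# Remove any cached key for bitbucket.org, then test the connection
ssh-keygen -R bitbucket.org
ssh -T git@bitbucket.org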
Using ssh-keyscan should be much easier:
ssh-keyscan bitbucket.org >> ~/.ssh/known_hosts
This command will put all the required host keys into ~/.ssh/known_hosts. You will need to run it on your Jenkins machine. You can also create a job, put that command into the "Execute shell" section of that job's configuration, and then execute the job.
The issue is with /var/lib/jenkins/.ssh/known_hosts. It exists in the first case, but not in the second one. This means you are either running on a different system, or the second case is somehow jailed in a chroot or otherwise separated from the rest of the filesystem (which is a good idea when running random code from Jenkins).
The next steps are finding out how the chroots for this user are created and modifying the known_hosts inside that chroot. Or just go other ways of ignoring known hosts, such as ssh-keyscan, StrictHostKeyChecking=no, or so.
After ssh-keygen, one probably only needs to copy the public key to the remote host with:
ssh-copy-id -i ~/.ssh/mykey user@host
There is a safe and relatively easy way to accomplish this, which should also work if you have separate worker nodes/clouds (like docker/kubernetes).
Adding host keys to Jenkins configuration
First go to a console and execute ssh-keyscan your_git_server.url
Copy the output of that command
Then navigate to https://YOUR_JENKINS_URL/manage/configureSecurity/
Scroll down to Git Host Key Verification Configuration
Paste the output of the command into the window. It should look like this:
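The pasted entries follow the standard known_hosts format, roughly like the following (the base64 key material is elided here; use the actual ssh-keyscan output):
bitbucket.org ssh-ed25519 AAAA...
github.com ssh-ed25519 AAAA...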
Both bitbucket and github have pages about their keys and servers. Read them and ensure that you are adding the proper keys and not some random keys
Getting the ssh-keyscan via your Jenkins installation
If for some reason you do not have ssh-keyscan, you can go to the script console (https://YOUR_JENKINS_URL/manage/script) and paste in the following script:
// Run ssh-keyscan from the Jenkins script console and capture its output
def sout = new StringBuilder(), serr = new StringBuilder()
def proc = 'ssh-keyscan bitbucket.org'.execute()
proc.consumeProcessOutput(sout, serr)
// Give the scan up to a second to finish, then kill it
proc.waitForOrKill(1000)
println "copy this to jenkins>\n$sout"
//println "err> $serr"
