While the Jenkins job is running, it asks for credentials like:
[sshexec] Enter password for datasource user
Please let me know how we can proceed further on this.
There is a plugin in Jenkins designed for that: the Credentials Plugin
https://wiki.jenkins.io/display/JENKINS/Credentials+Plugin
You set up your data within this plugin, and then you can reuse it later in your build, the same way as if it were a regular shell variable.
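For illustration, a minimal sketch of reading such a bound credential in an Execute shell step, assuming the credential is bound to an environment variable named SSH_PASSWORD via the Credentials Binding plugin (the variable name and the sshpass call are assumptions, not from the original answer):
# SSH_PASSWORD is injected by the Credentials Binding plugin (assumed binding name);
# sshpass must be installed on the build agent
sshpass -p "$SSH_PASSWORD" ssh user@server 'echo connected'
Alternatively, an expect script can feed the same variable to an interactive password prompt: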
#!/usr/bin/expect -f
# assumes the password was exported by the build, e.g. as SSH_PASSWORD (assumed name)
set your_password $env(SSH_PASSWORD)
spawn ssh id@server
match_max 100000
expect "*?assword:*"
send -- "$your_password\r"
send -- "\r"
interact
But if I may provide a recommendation, this is not the best way to connect over SSH.
You should use an SSH key, which will let you get rid of the password step.
You generate your key:
ssh-keygen -t rsa -b 4096
You push it to your server:
ssh-copy-id id@server
And then you can log-in without any password needed:
ssh id@server
I want to use curl with a username and password that I set in the .bashrc, i.e.:
curl -u $jenkuser:$jenkpass foobar.org
but this isn't working for me. So what is a good way to set secret credentials that I don't want in my repo/Jenkinsfile?
Create a Jenkins project with an Execute shell build step. In that shell you can run the curl command; to set credentials, there is an option named This build is parameterized, where you can create a Password Parameter. These parameters can be used in the shell with the curl command, as sketched below.
This way is secure because the password is stored in encrypted format.
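A minimal sketch of that shell step, assuming the job defines Password Parameters named JENKUSER and JENKPASS (the parameter names and URL are placeholders):
# JENKUSER and JENKPASS are Password Parameters on the job;
# Jenkins exposes them to the shell as environment variables
curl -u "$JENKUSER:$JENKPASS" https://foobar.org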
I was trying to log in to a Docker lab host using Putty but was not able to log in. The error is:
'no authentication method available server sent public key'.
If I pass both username@server name, I get the error.
If I only pass the server name, I get to the login screen, but then when I enter my username the error pops up.
I tried searching the web but couldn't find anything.
Please can anyone help me.
You can refer to the article "PWD + SSH = ❤" (with "PWD" = "Play With Docker").
The full command should be:
ssh -p 1022 <instance_ip_with_dashes>-<short_session_id>@pwdhost
But that requires having, on the client side, in $HOME/.ssh:
id_rsa: the private key
id_rsa.pub: the public key registered in <instance_ip_with_dashes>-<short_session_id> home.
Since copying a private key is not a good practice, you can do the opposite (in a Git Bash session, not using Putty):
ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa
Copy the generated public key to ~<instance_ip_with_dashes>-<short_session_id>/.ssh/authorized_keys (a sketch follows below).
Then your SSH session will be possible.
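As a hedged sketch of that copy step, run inside the instance's PWD web terminal (the key material is a placeholder for your own id_rsa.pub content):
mkdir -p ~/.ssh
# paste the content of your client's ~/.ssh/id_rsa.pub below
echo "ssh-rsa AAAA...your-public-key... you@client" >> ~/.ssh/authorized_keys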
Note: All that is managed for you if you are using the docker-machine pwd driver, as shown in the article above:
As you can see, in that case, a docker-machine ssh is enough.
http://github.com/play-with-docker/play-with-docker/issues/285
Actually, I had to create or generate keys on the client in order to start the communication between the client and the server. But I was copying the private key of the server, silly me.
Key generation can be done in 2 ways:
1. If you are using Git, run ssh-keygen.
2. If you want to log in using Putty, first generate keys using puttygen and then attach the private .ppk key while SSHing.
Trying to connect to the server using the Jenkins SSH plugin and executing some commands. It's connected, but the sesu command is not working. Jenkins is unable to recognize the sesu command; it says sesu: not found, no such file or directory. When trying with Putty, the sesu command works. The Jenkins version is 2.7. Please help me with this. Thanks in advance. :)
Use the locate sesu command. It will show the path of the sesu command. Use that full path instead of just sesu, e.g. /opt/CA/AccessControl/bin/sesu.
As we can't enter a password in Jenkins at run-time, the other option for accessing the server is to generate an SSH key, and use that key to access the server from Jenkins.
Use Putty or similar tools to create the key.
Firstly, log in to your server using credentials.
Then switch to the user who has all access rights, using sesu or another switch-user command like sudo su.
Execute the steps below after that:
ssh-keygen -t rsa
Hit Enter for all prompts without entering input.
Then, once the key is created, type cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys and hit [Enter].
Use the created RSA private key after all these steps in the SSH plugin, and you will be able to access the Unix server from Jenkins and execute commands on that server.
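Putting those steps together, a hedged sketch of the whole sequence as run on the target server (paths are the ssh-keygen defaults):
# run as the user Jenkins will log in as
ssh-keygen -t rsa                              # accept all defaults, empty passphrase
cp ~/.ssh/id_rsa.pub ~/.ssh/authorized_keys    # or append with >> if the file already exists
chmod 600 ~/.ssh/authorized_keys               # sshd requires strict permissions
cat ~/.ssh/id_rsa                              # paste this private key into the Jenkins SSH plugin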
I am using the Credentials plugin in Jenkins to manage credentials for Git and database access for my team's builds. I would like to copy the credentials from one Jenkins instance to another, independent Jenkins instance. How would I go about doing this?
UPDATE: TL;DR: Follow the link provided below in a comment by Filip Stachowiak; it is the easiest way to do it. In case it doesn't work for you, read on.
Copying the $HUDSON_HOME/credentials.xml is not the solution, because Jenkins encrypts passwords and these can't be decrypted by another instance unless both share a common key.
So, either you use the same encryption keys in both Jenkins instances (see Where's the encryption key stored in Jenkins?), or you can do the following:
Create the same user/password you need to share in the 2nd Jenkins instance, so that a valid password is generated.
What is really important is that the user IDs in both credentials.xml files are the same. For that (see the credentials.xml example below), for user jenkins the identifier <id>c4855f57-5107-4b69-97fd-298e56a9977d</id> must be the same in both credentials.xml files.
<com.cloudbees.plugins.credentials.SystemCredentialsProvider plugin="credentials#1.22">
<domainCredentialsMap class="hudson.util.CopyOnWriteMap$Hash">
<entry>
<com.cloudbees.plugins.credentials.domains.Domain>
<specifications/>
</com.cloudbees.plugins.credentials.domains.Domain>
<java.util.concurrent.CopyOnWriteArrayList>
<com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl>
<scope>GLOBAL</scope>
<id>c4855f57-5107-4b69-97fd-298e56a9977d</id>
<description>Para SVN</description>
<username>jenkins</username>
<password>J1ztA2vSXHbm60k5PjLl5jg70ZooSFKF+kRAo08UVts=
</password>
</com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl>
</java.util.concurrent.CopyOnWriteArrayList>
</entry>
</domainCredentialsMap>
</com.cloudbees.plugins.credentials.SystemCredentialsProvider>
I was also facing the same problem. What worked for me: I copied credentials.xml, config.xml and the secrets folder from the existing Jenkins to the new instance. After a restart of Jenkins, things worked fine.
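A sketch of that copy, assuming a default Linux layout with $JENKINS_HOME at /var/lib/jenkins on both machines (the host name is a placeholder):
# run on the old server
cd /var/lib/jenkins
scp -r credentials.xml config.xml secrets user@new-server:/var/lib/jenkins/
# then restart Jenkins on the new server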
This is what worked for me.
Create a job in Jenkins that takes the credentials and writes them to the output. If Jenkins replaces the password in the output with ****, just obfuscate it first (add a space between each character, reverse the characters, Base64-encode it, etc.).
I used a PowerShell job to Base64-encode it:
[convert]::ToBase64String([text.encoding]::Default.GetBytes($mysecret))
And then used PowerShell to convert the Base64 string back to a regular string:
[text.encoding]::Default.GetString([convert]::FromBase64String("bXlzZWNyZXQ="))
After trying quite a few things for several days, this is the best solution I found for migrating my secrets from a Jenkins 2.176 to a new, clean Jenkins 2.249.1: jenkins-cli was the best approach for me.
The process is quite simple: just dump the credentials from the old instance to a local machine, or a Docker pod with Java installed, as an XML file (unencrypted), and then upload it to the new instance.
Before starting you should verify the following:
Access to the credentials section on both Jenkins instances
Download the jenkins-cli.jar from one of the instances (https://www.your-jenkins-url.com/cli/)
Have your user and password/token at hand.
Notice: In case your Jenkins uses an OAuth service, you will need to create a token for your user. Once logged into Jenkins, click your profile at the top right; there you can verify your username and generate the token.
Now for the special sauce; you have to execute both parts from the same machine/pod:
Notice: If your instances are using valid certificates and you want to secure your connection, you must remove the -noCertificateCheck flag from both commands.
# OLD JENKINS DUMP #
export USER=madox@example.com
export TOKEN=f561banana6ead83b587a4a8799c12c307
export SERVER=https://old-jenkins-url.com/
java -jar jenkins-cli.jar -noCertificateCheck -s $SERVER -auth $USER:$TOKEN list-credentials-as-xml "system::system::jenkins" > /tmp/jenkins_credentials.xml
# NEW JENKINS IMPORT #
export USER=admin
export TOKEN=admin
export SERVER=https://new-jenkins-url.com/
java -jar jenkins-cli.jar -noCertificateCheck -s $SERVER -auth $USER:$TOKEN import-credentials-as-xml "system::system::jenkins" < /tmp/jenkins_credentials.xml
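Optionally, you can verify the import with the CLI's list-credentials command (same connection parameters as above):
# NEW JENKINS VERIFY (optional) #
java -jar jenkins-cli.jar -noCertificateCheck -s $SERVER -auth $USER:$TOKEN list-credentials "system::system::jenkins"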
If you have the credentials.xml available and the old Jenkins instance still running, there is a way to decrypt individual credentials so you can enter them in the new Jenkins instance via the UI.
The approach is described over at the DevOps stackexchange by kenorb.
This does not convert all the credentials for an easy, automated migration, but it helps when you have only a few credentials to migrate (manually).
To summarize, you visit the /script page over at the old Jenkins instance, and use the encrypted credential from the credentials.xml file in the following line:
println(hudson.util.Secret.decrypt("{EncryptedCredentialFromCredentialsXml=}"))
To migrate all credentials to a new server, from Jenkins: Migrating credentials:
Stop Jenkins on new server.
new-server # /etc/init.d/jenkins stop
Remove the identity.key.enc file on new server:
new-server # rm identity.key.enc
Copy secret* and credentials.xml to new server.
current-server # cd /var/lib/jenkins
current-server # tar czvf /tmp/credentials.tgz secret* credentials.xml
current-server # scp /tmp/credentials.tgz $user@$new-server:/tmp/
new-server # cd /var/lib/jenkins
new-server # tar xzvf /tmp/credentials.tgz -C ./
Start Jenkins.
new-server # /etc/init.d/jenkins start
Migrating users from a Jenkins instance to another Jenkins on a new server:
I tried following https://stackoverflow.com/a/35603191, which led to https://itsecureadmin.com/2018/03/26/jenkins-migrating-credentials/. However, I did not succeed in following these steps.
Further, I experimented with exporting the /var/lib/jenkins/users (or {JENKINS_HOME}/users) directory to the new instance on the new server. After restarting Jenkins on the new server, it looks like all the user credentials are available on the new server.
Additionally, I cross-checked that the users can log in to the new Jenkins instance. It works for now.
PS: This code is for Red Hat servers.
Old server:
cd /var/lib/jenkins
or cd into wherever your Jenkins home is
tar cvzf users.tgz ./users
New server:
cd /var/lib/jenkins
scp <user>@<oldserver>:/var/lib/jenkins/users.tgz /var/lib/jenkins/.
sudo tar xvzf users.tgz
systemctl restart jenkins
Did you try to copy the $JENKINS_HOME/users folder and the $JENKINS_HOME/credentials.xml file to the other Jenkins instance?
Follow the instructions here: http://nerdwin15.com/2013/04/continuous-integration-with-stash-and-jenkins/
I have Jenkins and Stash "connected"; however, running the builds hangs at:
Fetching upstream changes from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
FATAL: Failed to fetch from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
hudson.plugins.git.GitException: Failed to fetch from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
So from what I gather, the problem is that when I run this command on Jenkins (which is running on Windows):
$ git clone ssh://git@git.xyz.com:7999/gp/gp-xyz.git
Cloning into 'gp-xyz'...
Enter passphrase for key '/c/Documents and Settings/userMe/.ssh/id_rsa':
I have to enter a password here. How can I configure Windows to store the SSH key so that I can clone like the build server does?
What I tried is:
userMe@jenkins /C
$ ssh -T git@git.xyz.com:7999
ssh: git.xyz.com:7999: no address associated with name
userMe@jenkins /C
$ ssh -T git@git.xyz.com
git@git.xyz.com's password:
Permission denied, please try again.
git@git.xyz.com's password:
However, this confuses me, because Stash is running on port 7999, and there is no actual user named git on Stash, but it won't let me change that?
Use OpenSSH to set up private and public keys on your Windows host.
You can use a tool like OpenSSH to generate a DSA/RSA-2 key with a no-passphrase option. (Do not set up an RSA-1 key, as Stash has issues with RSA-1.)
After that, add your public key to the list of keys in your Stash user profile.
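A minimal sketch of that generation step (the key size and file path are common defaults, assumed here):
# generate an RSA-2 key with an empty passphrase
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
# print the public key, to paste into your Stash profile
cat ~/.ssh/id_rsa.pub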
Regarding your other questions:
By default, the Stash HTTP protocol runs on port 7990, and the SSH protocol is served on port 7999.
git is the default user ID used by Stash behind the scenes to talk to the underlying Git repository.
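To avoid typing the port each time with plain ssh, you can pin these settings in your SSH client config (a sketch; the host is the example from the question):
# ~/.ssh/config
Host git.xyz.com
    User git
    Port 7999
    IdentityFile ~/.ssh/id_rsa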