When I use sendmail on a local AWS EC2 instance, Gmail regards it as spam - docker

I want to send an e-mail when Jenkins builds my project.
My work process is:
create an EC2 instance (Amazon Linux) and access it over ssh (ssh ec2-user@(EC2 ip))
install docker and the Jenkins docker image > run the Jenkins container
install sendmail (I will use the sendmail server on EC2 localhost, not smtp.gmail)
Jenkins Location > jenkins url : http://13.1xx.xx.xxx:8080/ > system admin email address : blablabal@mycompany.com
jenkins > Configure System > E-mail Notification > Test configuration by sending test e-mail > insert my e-mail myemail@gmail.com
Email was successfully sent
but when I check my email, it is regarded as spam.
I tried the sendmail command on the EC2 instance's command line:
# In AWS EC2 instance
$ vi test.txt (save some text)
$ sendmail myemail@gmail.com < test.txt
and it was regarded as spam.
I thought it might affect every email sent by sendmail, so I tried sendmail on my local MacBook:
# MY LOCAL MAC BOOK
$ vi test.txt (save some text)
$ sendmail myemail@gmail.com < test.txt
but it was regarded as a normal e-mail, not spam.
Can somebody please help me? Thank you.
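Mail sent straight from an EC2 instance is often distrusted because the instance's public IP usually has no reverse-DNS record and the sending domain's SPF record does not list it. A minimal way to check both, reusing the question's placeholders for the real IP and domain:
# Placeholders from the question; substitute your real EC2 public IP and sending domain.
$ dig -x 13.1xx.xx.xxx +short      # reverse-DNS (PTR) record for the EC2 IP
$ dig TXT mycompany.com +short     # look for a v=spf1 record that covers the EC2 IP
If either lookup comes back empty, that alone can push Gmail toward the spam folder.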

Related

Send emails from a container in GitLab CI

I am using an openSUSE Tumbleweed container in a GitLab CI pipeline which runs some script. In that script, I need to send an email at some point with certain content.
In the container, I am installing postfix and configuring that relay server in /etc/postfix/main.cf.
The following command works on my laptop using that same relay server:
echo "This is the body of the email" | mail -s "This is the subject" -r sender#email.com receiver#email.com
but it doesn't work from the container, even with the same postfix configuration.
I've seen some tutorials that show how to use the postfix/smtp configuration from the host, but since this is a container running in GitLab CI, that's not applicable.
So I finally opted for a Python solution and call the script from bash; this way I really don't need to configure postfix, SMTP, or anything else. You just export your variables in bash (or use argparse) and run this script. Of course, you need a relay server without auth (normally on port 25).
import os
import smtplib
from email.mime.text import MIMEText

# Relay settings and addresses come from environment variables exported in bash
smtp_server = os.environ.get('RELAY_SERVER')
port = int(os.environ.get('RELAY_PORT', '25'))
sender_email = os.environ.get('SENDER_EMAIL')
receiver_email = os.environ.get('RECEIVER_EMAIL')  # may be a comma-separated list

mimetext = MIMEText("this is the body of the email")
mimetext['Subject'] = "this is the subject of the email"
mimetext['From'] = sender_email
mimetext['To'] = receiver_email

# Plain SMTP against an unauthenticated relay (normally port 25)
server = smtplib.SMTP(smtp_server, port)
server.ehlo()
server.sendmail(sender_email, receiver_email.split(','), mimetext.as_string())
server.quit()
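Invoking it from the CI job's bash step might look like this (the script file name and relay host are hypothetical):
# Hypothetical file name and relay host; adjust to your environment.
$ export RELAY_SERVER=relay.example.com
$ export RELAY_PORT=25
$ export SENDER_EMAIL=sender@email.com
$ export RECEIVER_EMAIL=receiver@email.com
$ python3 send_mail.py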

How can we pass a runtime parameter like a password in a Jenkins job

While the Jenkins job is running, it asks for a credential like:
[sshexec] Enter password for datasource user
Please let me know how we can proceed further on this.
There is a plugin for Jenkins designed for that: the Credentials plugin
https://wiki.jenkins.io/display/JENKINS/Credentials+Plugin
You set up your data within this plugin, and then you can re-use it later in your build, the same way as if it were a regular shell variable.
If you still have to answer the password prompt interactively, an expect script can do it:
spawn ssh id@server
match_max 100000
# wait for the password prompt, then send the stored password
expect "*?assword:*"
send -- "$your_password\r"
send -- "\r"
interact
But if I may provide a recommendation, this is not the best way to connect over SSH.
You should use an SSH key, which will let you get rid of the password step.
You generate your key:
ssh-keygen -t rsa -b 4096
You push it to your server:
ssh-copy-id id@server
And then you can log in without any password needed:
ssh id@server
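To confirm the key-based login works non-interactively (a quick check, reusing the id@server placeholder from above):
$ ssh -o BatchMode=yes id@server true
If this returns without prompting for a password, Jenkins will be able to connect the same way.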

How to export credentials from one jenkins instance to another?

I am using the Credentials plugin in Jenkins to manage credentials for git and database access for my team's builds. I would like to copy the credentials from one Jenkins instance to another, independent Jenkins instance. How would I go about doing this?
UPDATE: TL;DR: follow the link provided below in a comment by Filip Stachowiak; it is the easiest way to do it. In case it doesn't work for you, go on reading.
Copying the $HUDSON_HOME/credentials.xml is not the solution, because Jenkins encrypts passwords and these can't be decrypted by another instance unless both share a common key.
So, either you use the same encryption keys in both Jenkins instances (Where's the encryption key stored in Jenkins?) or you can do the following:
Create the same user/password you need to share in the 2nd Jenkins instance, so that a valid password is generated.
What is really important is that the user ids in both credentials.xml files are the same. For that (see the credentials.xml example below), for user jenkins the identifier <id>c4855f57-5107-4b69-97fd-298e56a9977d</id> must be the same in both credentials.xml files.
<com.cloudbees.plugins.credentials.SystemCredentialsProvider plugin="credentials@1.22">
  <domainCredentialsMap class="hudson.util.CopyOnWriteMap$Hash">
    <entry>
      <com.cloudbees.plugins.credentials.domains.Domain>
        <specifications/>
      </com.cloudbees.plugins.credentials.domains.Domain>
      <java.util.concurrent.CopyOnWriteArrayList>
        <com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl>
          <scope>GLOBAL</scope>
          <id>c4855f57-5107-4b69-97fd-298e56a9977d</id>
          <description>Para SVN</description>
          <username>jenkins</username>
          <password>J1ztA2vSXHbm60k5PjLl5jg70ZooSFKF+kRAo08UVts=</password>
        </com.cloudbees.plugins.credentials.impl.UsernamePasswordCredentialsImpl>
      </java.util.concurrent.CopyOnWriteArrayList>
    </entry>
  </domainCredentialsMap>
</com.cloudbees.plugins.credentials.SystemCredentialsProvider>
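A quick way to verify that the ids line up on both instances (a sketch; paths assume the default Jenkins home):
$ grep -o '<id>[^<]*</id>' /var/lib/jenkins/credentials.xml | sort > ids.txt
# run the same command on the other instance, then diff the two files
$ diff ids.txt ids_other.txt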
I was also facing the same problem. What worked for me is that I copied credentials.xml, config.xml and the secrets folder from the existing Jenkins to the new instance. After a restart of Jenkins, things worked fine.
This is what worked for me.
Create a job in Jenkins that takes the credentials and writes them to the output. If Jenkins replaces the password in the output with ****, just obfuscate it first (add a space between each character, reverse the characters, base64-encode it, etc.).
I used a PowerShell job to base64-encode it:
[convert]::ToBase64String([text.encoding]::Default.GetBytes($mysecret))
And then used PowerShell to convert the base64 string back to a regular string:
[text.encoding]::Default.GetString([convert]::FromBase64String("bXlzZWNyZXQ="))
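On a Linux box the same round trip can be done in the shell (an equivalent sketch using GNU coreutils):
$ echo -n 'mysecret' | base64        # prints bXlzZWNyZXQ=
$ echo 'bXlzZWNyZXQ=' | base64 -d    # prints mysecret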
After trying quite a few things for several days, this is the best solution I found for migrating my secrets from a Jenkins 2.176 to a new, clean Jenkins 2.249.1: jenkins-cli was the best approach for me.
The process is quite simple: just dump the credentials from the old instance to a local machine, or a Docker pod with Java installed, as an XML file (unencrypted), and then upload it to the new instance.
Before starting you should verify the following:
Access to the credentials section on both Jenkins instances
Download the jenkins-cli.jar from one of the instances (https://www.your-jenkins-url.com/cli/)
Have User and Password/Token at hand.
Notice: In case your Jenkins uses an OAuth service, you will need to create a token for your user. Once logged into Jenkins, if you click your profile at the top right you can verify your username and generate a password/token.
Now for the special sauce, you have to execute both parts from the same machine/pod:
Notice: If your instances are using valid certificates and you want to secure your connection, you must remove the -noCertificateCheck flag from both commands.
# OLD JENKINS DUMP # 
export USER=madox@example.com
export TOKEN=f561banana6ead83b587a4a8799c12c307
export SERVER=https://old-jenkins-url.com/
java -jar jenkins-cli.jar -noCertificateCheck -s $SERVER -auth $USER:$TOKEN list-credentials-as-xml "system::system::jenkins" > /tmp/jenkins_credentials.xml
# NEW JENKINS IMPORT # 
export USER=admin
export TOKEN=admin
export SERVER=https://new-jenkins-url.com/
java -jar jenkins-cli.jar -noCertificateCheck -s $SERVER -auth $USER:$TOKEN import-credentials-as-xml "system::system::jenkins" < /tmp/jenkins_credentials.xml
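Because the dump contains the credentials unencrypted, it is worth removing it once the import succeeds (a suggested cleanup, not part of the original recipe):
$ rm -f /tmp/jenkins_credentials.xml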
If you have the credentials.xml available and the old Jenkins instance still running, there is a way to decrypt individual credentials so you can enter them in the new Jenkins instance via the UI.
The approach is described over at the DevOps Stack Exchange by kenorb.
This does not convert all the credentials for an easy, automated migration, but it helps when you have only a few credentials to migrate (manually).
To summarize, you visit the /script page over at the old Jenkins instance, and use the encrypted credential from the credentials.xml file in the following line:
println(hudson.util.Secret.decrypt("{EncryptedCredentialFromCredentialsXml=}"))
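To pull the encrypted values out of credentials.xml for pasting into that line, something like the following works when each value sits on a single line (a sketch; the tag name varies by credential type):
$ grep -o '<password>[^<]*</password>' credentials.xml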
To migrate all credentials to a new server, from Jenkins: Migrating credentials:
Stop Jenkins on new server.
new-server # /etc/init.d/jenkins stop
Remove the identity.key.enc file on new server:
new-server # rm identity.key.enc
Copy secret* and credentials.xml to the new server.
current-server # cd /var/lib/jenkins
current-server # tar czvf /tmp/credentials.tgz secret* credentials.xml
current-server # scp /tmp/credentials.tgz $user@$new-server:/tmp/
new-server # cd /var/lib/jenkins
new-server # tar xzvf /tmp/credentials.tgz -C ./
Start Jenkins.
new-server # /etc/init.d/jenkins start
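A quick sanity check after the restart (assuming the default Jenkins home) is that the key material actually arrived:
new-server # ls /var/lib/jenkins/secret* /var/lib/jenkins/credentials.xml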
Migrating users from one Jenkins instance to another Jenkins on a new server -
I tried following https://stackoverflow.com/a/35603191, which led to https://itsecureadmin.com/2018/03/26/jenkins-migrating-credentials/. However, I did not succeed in following these steps.
Further, I experimented with exporting the /var/lib/jenkins/users (or {JENKINS_HOME}/users) directory to the new instance on the new server. After restarting Jenkins on the new server, it looks like all the user credentials are available there.
Additionally, I cross-checked that the users can log in to the new Jenkins instance. It works for now.
PS: This code is for Red Hat servers.
Old server:
cd /var/lib/jenkins
or cd into wherever your Jenkins home is
tar cvzf users.tgz ./users
New server:
cd /var/lib/jenkins
scp <user>@<oldserver>:/var/lib/jenkins/users.tgz /var/lib/jenkins/.
sudo tar xvzf users.tgz
systemctl restart jenkins
Did you try to copy the $JENKINS_HOME/users folder and the $JENKINS_HOME/credentials.xml file to the other Jenkins instance?

Issue Connecting Stash and Jenkins with SSH

I followed the instructions here: http://nerdwin15.com/2013/04/continuous-integration-with-stash-and-jenkins/
I have Jenkins and Stash "connected"; however, running the builds hangs at:
Fetching upstream changes from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
FATAL: Failed to fetch from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
hudson.plugins.git.GitException: Failed to fetch from ssh://git@git.xyz.com:7999/gp/gp-xyz.git
So from what I gather, the problem is that if I run this command on Jenkins (which is running on Windows):
$ git clone ssh://git@git.xyz.com:7999/gp/gp-xyz.git
Cloning into 'gp-xyz'...
Enter passphrase for key '/c/Documents and Settings/userMe/.ssh/id_rsa':
I have to enter a password here. How can I configure Windows to store the SSH key so that I can clone like the build server does?
What I tried is:
userMe@jenkins /C $ ssh -T git@git.xyz.com:7999
ssh: git.xyz.com:7999: no address associated with name
userMe@jenkins /C $ ssh -T git@git.xyz.com
git@git.xyz.com's password: Permission denied, please try again.
git@git.xyz.com's password:
However, this confuses me, because Stash is running on port 7999 and there is no actual user named git on Stash, but it won't let me change that?
Use OpenSSH to set up private and public keys on your Windows host.
You can use a tool like OpenSSH to generate a DSA/RSA-2 key and set the no-password option. (Do not set up an RSA-1 key, as Stash has issues with RSA-1.)
After that, add your public key to the list of keys in your Stash user profile.
Regarding your other questions:
By default, Stash's HTTP protocol runs on port 7990 and the SSH protocol is served on port 7999.
git is a default userid used by Stash behind the scenes to talk to the underlying git repository.
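Note that the ssh client does not accept the host:port suffix used in git URLs, which is why the first test in the question failed with "no address associated with name"; the port has to be passed with -p instead (reusing the hostname from the question):
$ ssh -p 7999 -T git@git.xyz.com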

SSH connection failing when pushing to a GitLab repo

I have installed GitLab. Suppose I installed it in /home/myuser/gitlab.
I created a new project.
I was told to create a repo "test", which I put in /home/myuser/gitlab/test.
I added some SSH keys in /home/myuser/.ssh.
Then I initialized a Git repo in /home/myuser/gitlab/test.
Following the instructions, I added a remote git@localhost:root/testing.git,
but when I try to push, I get this error message:
$ git push -u origin master
ssh: connect to host localhost port 22: Connection refused
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
I installed GitLab on OS X and I have other SSH keys in /home/myhome/.ssh. I have set up the user email and name inside /home/myuser/gitlab/.git/config (and set those globally just for testing), and the server is launched from /home/myuser/gitlab. Does anybody have an idea where this error comes from?
If I run ssh git@localhost, I get
/home/myhome/.ssh/config line 4: garbage at end of line; "home".
where in this file I have some settings for a remote server for another project. I think it is the problem but I don't really know how to fix it.
Update: Here's the content of my ~/.ssh/config file:
Host remote_test_server
Hostname remote_test_user@ftp.remote_test_server
IdentityFile ~/.ssh/id_rsa_stf.pub
User <your home acct>
/home/myhome/.ssh/config line 4: garbage at end of line; "home".
That would prevent any ssh command from functioning properly, because of a parasite echo done by the remote session.
Check your .profile or other .rc files, and see if any echo is done in those.
Or at least, test with ssh -T git@localhost, in order to disable any TTY allocation.
Also check the content of your .ssh/config file, which doesn't seem to be properly formatted.
See the example of a corrected config file after the points below:
User should be the login name of the account used for the ssh session on the remote server. It should not be the homedir path.
IdentityFile should reference the private key (~/.ssh/id_rsa_stf), not the public one!
Hostname should reference the remote server 'ftp.remote_test_server', not the user@remoteServer.
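Putting those three fixes together, the corrected ~/.ssh/config would look something like this (a reconstruction from the points above, using the question's placeholder names):
Host remote_test_server
    Hostname ftp.remote_test_server
    User remote_test_user
    IdentityFile ~/.ssh/id_rsa_stf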
